Received: March 27, 2023  Revised: April 27, 2023
Abstract: Knowledge distillation is widely adopted in semantic segmentation to reduce computation cost. Previous knowledge distillation methods for semantic segmentation focus on pixel-wise feature alignment and intra-class feature variation distillation, neglecting to transfer knowledge of the inter-class distance, which is important for semantic segmentation. To address this issue, this study proposes an inter-class distance distillation (IDD) method to transfer the inter-class distance in the feature space from the teacher network to the student network. Furthermore, since semantic segmentation is a position-dependent task, this study exploits a position information distillation module to help the student network encode more position information. Extensive experiments on three popular semantic segmentation datasets (Cityscapes, Pascal VOC, and ADE20K) show that the proposed method improves the accuracy of semantic segmentation models and achieves strong performance.

Keywords: knowledge distillation | semantic segmentation | model compression
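The core idea of inter-class distance distillation can be illustrated with a minimal NumPy sketch: compute per-class feature centroids for teacher and student, form the pairwise inter-class distance matrix for each, and penalize the difference between the two matrices. This is only an illustration of the concept under simplifying assumptions (flattened per-pixel features, Euclidean distance, a plain MSE penalty); the function names `class_centroids`, `inter_class_distance`, and `idd_loss` are hypothetical and do not come from the paper's implementation.

```python
import numpy as np

def class_centroids(features, labels, num_classes):
    """Mean feature vector per class.

    features: (N, D) array of per-pixel features, labels: (N,) class ids.
    Classes absent from the batch keep a zero centroid.
    """
    centroids = np.zeros((num_classes, features.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            centroids[c] = features[mask].mean(axis=0)
    return centroids

def inter_class_distance(centroids):
    """Pairwise Euclidean distance matrix between class centroids, shape (C, C)."""
    diff = centroids[:, None, :] - centroids[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def idd_loss(teacher_feats, student_feats, labels, num_classes):
    """MSE between teacher and student inter-class distance matrices."""
    d_t = inter_class_distance(class_centroids(teacher_feats, labels, num_classes))
    d_s = inter_class_distance(class_centroids(student_feats, labels, num_classes))
    return ((d_t - d_s) ** 2).mean()
```

Minimizing this loss pushes the student to reproduce the teacher's class-separation structure in feature space, rather than matching features pixel by pixel.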
Citation:
DENG Wen-Ge, WANG Ya-Jun, SUI Li-Lin, SUN Guo-Dong, ZHANG Zheng-Bo. Distilling Inter-class Distance for Semantic Segmentation. COMPUTER SYSTEMS APPLICATIONS, 2023, 32(10): 235-241