Computer Systems & Applications, 2021, 30(12): 248-254
Compression Method for Stepwise Neural Network Based on Relational Distillation
LIU Hao, ZHANG Xiao-Bin
(School of Computer Science, Xi'an Polytechnic University, Xi'an 710048, China)
Received: February 24, 2021    Revised: March 15, 2021
Abstract: In relational knowledge distillation, the distillation effect degrades when the gap in depth between the teacher network and the student network is too large. To address this problem, a stepwise neural network compression method based on relational distillation is proposed. The key idea is to insert an intermediate network between the teacher and the student and perform relational distillation step by step; in addition, individual (per-sample) information is added in each distillation step to further optimize and strengthen the learning ability of the student model, thereby achieving neural network compression. Experimental results show that, on the CIFAR-10 and CIFAR-100 image classification datasets, the classification accuracy of the proposed method improves by about 0.2% over that of the original relational knowledge distillation method.
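To make the idea concrete, the sketch below shows one plausible way to combine a relational (pairwise-distance) distillation term with an individual per-sample term, applied in a teacher-to-intermediate-to-student cascade. This is a minimal illustration, not the authors' exact implementation: the loss weights alpha and beta, the temperature T, and the assumption that each network returns a (logits, embedding) pair are all assumptions made for this example.

```python
# Minimal PyTorch sketch of stepwise relational distillation with an added
# individual (per-sample) term. Hyperparameters and model interfaces are
# illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn.functional as F

def pairwise_distances(e):
    # e: (batch, dim) embeddings -> normalized (batch, batch) distance matrix
    d = torch.cdist(e, e, p=2)
    mean = d[d > 0].mean()  # normalize by mean nonzero distance
    return d / (mean + 1e-8)

def rkd_distance_loss(student_emb, teacher_emb):
    # Relational term: match pairwise-distance structure with a Huber loss
    with torch.no_grad():
        t = pairwise_distances(teacher_emb)
    s = pairwise_distances(student_emb)
    return F.smooth_l1_loss(s, t)

def individual_kd_loss(student_logits, teacher_logits, T=4.0):
    # Individual term: match temperature-softened output distributions
    p_t = F.softmax(teacher_logits / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * T * T

def distill_one_step(teacher, student, loader, optimizer,
                     alpha=1.0, beta=1.0, device="cpu"):
    # One distillation stage; for the stepwise scheme, run this twice:
    # teacher -> intermediate network, then intermediate -> student.
    teacher.eval()
    student.train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        with torch.no_grad():
            t_logits, t_emb = teacher(x)  # assumes (logits, embedding) output
        s_logits, s_emb = student(x)
        loss = (F.cross_entropy(s_logits, y)
                + alpha * rkd_distance_loss(s_emb, t_emb)
                + beta * individual_kd_loss(s_logits, t_logits))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Choosing an intermediate network whose depth lies between the teacher's and the student's is what narrows the depth gap at each stage; the per-sample term then supplements the purely relational signal with direct output supervision.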
Foundation item: Natural Science Foundation of Shaanxi Province (2019JQ-849)
Citation: LIU Hao, ZHANG Xiao-Bin. Compression Method for Stepwise Neural Network Based on Relational Distillation. Computer Systems & Applications, 2021, 30(12): 248-254