Stepwise Neural Network Compression Method Based on Relational Distillation
Abstract:

    In relational knowledge distillation, the distillation effect degrades when the gap between the teacher network and the student network is too large. To address this problem, this study proposes a stepwise neural network compression method based on relational distillation. The core of the method is to insert an intermediate network between the teacher and the student so that relational distillation proceeds step by step. In addition, individual (instance-level) information is incorporated into each distillation stage to further optimize and enhance the learning ability of the student model. Experimental results show that the proposed method improves classification accuracy on the CIFAR-10 and CIFAR-100 image classification datasets by about 0.2% over the original relational knowledge distillation method.
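
    The paper itself provides no code here; the following is a minimal sketch (assuming PyTorch) of how one stage of the described stepwise scheme could look: a relational (distance-wise) distillation loss combined with an individual per-sample logit loss and the usual cross-entropy on the labels. The function names (`rkd_distance_loss`, `distill_step`) and the loss weights are illustrative assumptions, not taken from the paper, and while the original relational knowledge distillation formulation computes relations on penultimate-layer embeddings, this sketch uses logits for brevity.

```python
# Illustrative sketch of one stage of stepwise relational distillation.
# "teacher" may be the original teacher or an intermediate network;
# "student" may be an intermediate network or the final student.
import torch
import torch.nn.functional as F


def pairwise_distances(e):
    """Euclidean distance matrix for a batch of embeddings, normalized by its mean."""
    d = torch.cdist(e, e, p=2)
    mean_d = d[d > 0].mean()
    return d / (mean_d + 1e-8)


def rkd_distance_loss(student_out, teacher_out):
    """Relational (distance-wise) loss: match the pairwise-distance structure."""
    with torch.no_grad():
        t_d = pairwise_distances(teacher_out)
    s_d = pairwise_distances(student_out)
    return F.smooth_l1_loss(s_d, t_d)


def distill_step(student, teacher, loader, optimizer, device,
                 alpha=1.0, beta=0.5, temperature=4.0, epochs=1):
    """One stage of stepwise distillation: relational term + individual
    (per-sample, softened-logit) term + cross-entropy on ground-truth labels."""
    teacher.eval()
    student.train()
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            with torch.no_grad():
                t_logits = teacher(x)
            s_logits = student(x)
            ce = F.cross_entropy(s_logits, y)
            rkd = rkd_distance_loss(s_logits, t_logits)
            kd = F.kl_div(F.log_softmax(s_logits / temperature, dim=1),
                          F.softmax(t_logits / temperature, dim=1),
                          reduction="batchmean") * temperature ** 2
            loss = ce + alpha * rkd + beta * kd
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return student


# Stepwise use: teacher -> intermediate -> student (weights are illustrative):
# intermediate = distill_step(intermediate, teacher, loader, opt_mid, device)
# student      = distill_step(student, intermediate, loader, opt_stu, device)
```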

Get Citation

LIU Hao, ZHANG Xiao-Bin. Stepwise neural network compression method based on relational distillation. Computer Systems & Applications, 2021, 30(12): 248-254.

History
  • Received: February 24, 2021
  • Revised: March 15, 2021
  • Online: December 10, 2021