Received: December 01, 2020    Revised: January 04, 2021
Abstract: Constructing a convolutional neural network consumes substantial human resources, and training it demands considerable computing power. Replacing the pooling operations in a convolutional neural network with dilated convolution effectively enlarges the receptive field and reduces computational complexity, but dilated convolution loses spatial hierarchy and information continuity. This study proposes a parallel asymmetric dilated convolution module that recovers the information lost by dilated convolution and can be embedded in existing convolutional neural networks in place of 3×3 convolutions during training, thereby accelerating network convergence and improving network performance. Experimental results show that the proposed module significantly improves the classification performance of various classical networks on CIFAR-10 and other datasets.
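The dilated convolution discussed in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a naive single-channel "valid" dilated convolution in NumPy. It shows that a 3×3 kernel with dilation rate 2 covers a 5×5 receptive field while still using only nine parameters, and the skipped positions between kernel taps are exactly the source of the gaps in spatial continuity that the proposed module aims to fill.

```python
import numpy as np

def dilated_conv2d(x, kernel, rate=1):
    """Naive single-channel 'valid' 2D dilated convolution.

    With dilation rate r, the taps of a k x k kernel are spaced r pixels
    apart, so the effective receptive field grows to k + (k-1)(r-1)
    while the parameter count stays at k*k.
    """
    k = kernel.shape[0]
    k_eff = k + (k - 1) * (rate - 1)          # effective kernel footprint
    H, W = x.shape
    out = np.zeros((H - k_eff + 1, W - k_eff + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Strided slicing picks every rate-th pixel: the pixels
            # skipped here are the "holes" of the dilated convolution.
            patch = x[i:i + k_eff:rate, j:j + k_eff:rate]
            out[i, j] = np.sum(patch * kernel)
    return out

x = np.arange(49, dtype=float).reshape(7, 7)
k = np.ones((3, 3))
y1 = dilated_conv2d(x, k, rate=1)   # ordinary 3x3 conv: 5x5 output
y2 = dilated_conv2d(x, k, rate=2)   # 5x5 receptive field: 3x3 output
```

With rate 2, each output pixel depends on a 5×5 window but ignores 16 of its 25 positions; stacking parallel branches with complementary tap patterns (e.g. asymmetric 1×3 and 3×1 kernels) is one way to cover the skipped positions.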
Funding: National Natural Science Foundation of China (61801159, 61571174); Starlight Program of Hangzhou Normal University
Citation:
ZHANG Zhi-Jie, YU Fei, GE Qing-Qing, ZHAO Bao-Qi, SUN Jun-Mei, LI Xiu-Mei. Parallel Asymmetric Dilated Convolution Module. Computer Systems & Applications, 2021, 30(9): 206-211