Received: July 13, 2019    Revised: August 20, 2019
Abstract: Existing deep-learning-based image super-resolution reconstruction methods tend to produce pseudo textures when recovering fine detail, and they do not fully exploit the rich local-feature-layer information in the original low-resolution image. To address these problems, a super-resolution reconstruction method based on an attentive generative adversarial network is proposed. The generator is built from an attention recursive network, into which a dense residual block structure is also introduced. First, the generator extracts the local-feature-layer information of the image through a self-encoding (autoencoder) structure and raises the resolution; the discriminator then corrects the result; finally, the image is reconstructed at high resolution. Experimental results show that, compared with a variety of networks oriented toward peak-signal-to-noise-ratio-based super-resolution evaluation, the proposed network exhibits stable training performance, improves the visual quality of the images, and is strongly robust.
Keywords: super-resolution reconstruction; generative adversarial network; attention network; residual network; feature extraction
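Since the abstract only outlines the architecture, the following is a minimal PyTorch sketch (not the authors' released code) of how a generator combining dense residual blocks with an attention mechanism might be organized. The layer widths, the squeeze-and-excitation-style channel attention, and the pixel-shuffle upsampling are illustrative assumptions; the discriminator and the adversarial training loop are omitted.

```python
# Minimal sketch under stated assumptions; not the paper's implementation.
import torch
import torch.nn as nn

class DenseResidualBlock(nn.Module):
    """Dense connections inside the block, residual connection around it."""
    def __init__(self, channels: int = 64, growth: int = 32):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = channels
        for _ in range(4):
            self.convs.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, 3, padding=1),
                nn.LeakyReLU(0.2, inplace=True)))
            in_ch += growth
        self.fuse = nn.Conv2d(in_ch, channels, 1)  # fuse dense features back

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))  # reuse all earlier features
        return x + self.fuse(torch.cat(feats, dim=1))    # residual connection

class ChannelAttention(nn.Module):
    """Assumed attention variant: reweight channels by global context."""
    def __init__(self, channels: int = 64, reduction: int = 16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid())

    def forward(self, x):
        return x * self.gate(x)

class Generator(nn.Module):
    """Shallow encoder -> attentive dense-residual trunk -> pixel-shuffle upsampling."""
    def __init__(self, scale: int = 4, channels: int = 64, n_blocks: int = 8):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)  # shallow feature extraction
        trunk = []
        for _ in range(n_blocks):
            trunk += [DenseResidualBlock(channels), ChannelAttention(channels)]
        self.trunk = nn.Sequential(*trunk)
        up = []
        for _ in range(scale // 2):  # x4 = two x2 pixel-shuffle stages
            up += [nn.Conv2d(channels, channels * 4, 3, padding=1),
                   nn.PixelShuffle(2),
                   nn.LeakyReLU(0.2, inplace=True)]
        self.up = nn.Sequential(*up)
        self.tail = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, lr):
        feat = self.head(lr)
        feat = feat + self.trunk(feat)  # global residual over the trunk
        return self.tail(self.up(feat))

if __name__ == "__main__":
    g = Generator()
    sr = g(torch.randn(1, 3, 32, 32))  # a 32x32 low-resolution patch
    print(sr.shape)                    # torch.Size([1, 3, 128, 128])
```

The dense connections let each convolution reuse all earlier features in the block, which is one plausible reading of the "dense residual block" the abstract mentions, while the attention gate reweights channels so that informative local-feature maps contribute more to the upsampled output.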
Citation:
DING Ming-Hang, DENG Ran-Ran, SHAO Heng. Image Super-Resolution Reconstruction Method Based on Attentive Generative Adversarial Network. COMPUTER SYSTEMS APPLICATIONS, 2020, 29(2): 205-211.