Computer Systems & Applications, 2022, 31(12): 135-146
Lightweight Segmentation for High Resolution Remote Sensing Image Based on Improved U-Net
(School of Computer Science, Chengdu University of Information Technology, Chengdu 610225, China)
Received: March 17, 2022    Revised: April 14, 2022
Abstract: To address the low segmentation efficiency of traditional image segmentation methods, the complex and diverse features of remote sensing images, and the limited segmentation performance in complex scenes, an improved U-Net model is proposed on the basis of the U-Net architecture; it extracts remote sensing image features well while remaining efficient. First, EfficientNetV2 is used as the encoder of U-Net to strengthen feature extraction and to improve training and inference efficiency. Then, structural re-parameterization of convolutions is applied in the decoder and combined with a channel attention mechanism, improving network performance with almost no increase in inference time. Finally, a multi-scale convolution fusion module is introduced to improve the network's ability to extract features of objects at different scales and to make better use of context information. Experiments show that the improved network improves both the segmentation performance and the segmentation efficiency on remote sensing images.
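
To make the described decoder-side ideas concrete, the following is a minimal PyTorch sketch (not the authors' released code) of a RepVGG-style re-parameterizable convolution block combined with squeeze-and-excitation channel attention, plus a multi-scale convolution fusion module built from parallel dilated convolutions. The class names, channel sizes, and the specific choices of SE attention and dilated branches are illustrative assumptions; the EfficientNetV2 encoder (e.g., as provided by the timm library) is omitted here.

# Minimal sketch under the assumptions stated above; not the paper's implementation.
import torch
import torch.nn as nn

class SEAttention(nn.Module):
    """Squeeze-and-excitation channel attention (one common choice)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).flatten(1)).view(b, c, 1, 1)
        return x * w  # re-weight channels

class RepConvBlock(nn.Module):
    """Training-time 3x3 + 1x1 + identity branches; at inference they can be
    fused into a single 3x3 convolution (structural re-parameterization)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv3 = nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
                                   nn.BatchNorm2d(out_ch))
        self.conv1 = nn.Sequential(nn.Conv2d(in_ch, out_ch, 1, bias=False),
                                   nn.BatchNorm2d(out_ch))
        self.identity = nn.BatchNorm2d(out_ch) if in_ch == out_ch else None
        self.act = nn.ReLU(inplace=True)
        self.se = SEAttention(out_ch)

    def forward(self, x):
        y = self.conv3(x) + self.conv1(x)
        if self.identity is not None:
            y = y + self.identity(x)
        return self.se(self.act(y))

class MultiScaleFusion(nn.Module):
    """Parallel dilated 3x3 convolutions fused by a 1x1 convolution to mix
    context information at several receptive-field scales."""
    def __init__(self, channels, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, 3, padding=d, dilation=d) for d in dilations)
        self.fuse = nn.Conv2d(channels * len(dilations), channels, 1)

    def forward(self, x):
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))

# Example: pass a decoder feature map through both modules.
x = torch.randn(1, 64, 32, 32)
y = MultiScaleFusion(64)(RepConvBlock(64, 64)(x))

At inference time the 3x3, 1x1, and identity branches of RepConvBlock can be algebraically folded into a single 3x3 convolution (the standard RepVGG fusion), which is why structural re-parameterization adds almost no inference cost.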
Foundation item: Key Research and Development Project of the Sichuan Science and Technology Program (2020YFG0442, 2020YFG0453)
Citation:
HU Wei, WEN Wu, WEI Min. Lightweight Segmentation for High Resolution Remote Sensing Image Based on Improved U-Net. Computer Systems & Applications, 2022, 31(12): 135-146