Received: August 25, 2023    Revised: October 09, 2023
Chinese Abstract (translated): Some mainstream arbitrary style transfer models still have limitations in preserving the saliency information and detailed features of the content image, and the generated images often suffer from blurred content and distorted details. To address these problems, this paper proposes an arbitrary style transfer model that effectively preserves the detailed features of the content image. The model flexibly fuses the shallow-to-deep multi-layer image features extracted by the encoder, and a new feature fusion module is proposed that fuses content features and style features with high quality. In addition, a new loss function is proposed that preserves the global structure of content and style well and eliminates artifacts. Experimental results show that the proposed arbitrary style transfer model balances style and content well, preserves the complete semantic information and detailed features of the content image, and generates stylized images with better visual effects.
Abstract: Some mainstream arbitrary style transfer models still have limitations in preserving the saliency information and detailed features of content images, resulting in problems such as content blurring and loss of details in the generated images. To address these problems, this study proposes an arbitrary style transfer model that effectively preserves the detailed features of content images. The model flexibly fuses the shallow-to-deep multi-layer image features extracted by the encoder. A new feature fusion module is proposed, which fuses content features and style features with high quality. In addition, a new loss function is proposed, which preserves the global structure of content and style well and eliminates artifacts. Experimental results show that the proposed arbitrary style transfer model can effectively balance style and content, preserve the complete semantic information and detailed features of the content image, and generate stylized images with better visual effects.
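The abstract describes the pipeline only at a high level: an encoder that exposes shallow-to-deep features, a module that fuses content and style features, and a decoder that reconstructs the stylized image. As a rough illustration only, the following PyTorch sketch uses a frozen VGG-19 encoder and a plain AdaIN-style statistics alignment as a stand-in for the paper's proposed fusion module; the class and function names (MultiLevelEncoder, adain, Decoder) are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of an arbitrary style transfer pipeline: a frozen encoder that
# returns shallow-to-deep features, a simple AdaIN-style fusion (a stand-in for the
# paper's proposed fusion module), and a decoder that reconstructs an RGB image.
import torch
import torch.nn as nn
from torchvision.models import vgg19

class MultiLevelEncoder(nn.Module):
    """Frozen VGG-19 encoder returning features at relu1_1, relu2_1, relu3_1, relu4_1."""
    def __init__(self):
        super().__init__()
        # weights=None keeps the sketch self-contained; pretrained ImageNet weights
        # would normally be loaded for style transfer.
        features = vgg19(weights=None).features
        self.slices = nn.ModuleList([
            features[:2], features[2:7], features[7:12], features[12:21]
        ])
        for p in self.parameters():
            p.requires_grad_(False)

    def forward(self, x):
        feats = []
        for s in self.slices:
            x = s(x)
            feats.append(x)
        return feats  # ordered shallow -> deep

def adain(content, style, eps=1e-5):
    """Align channel-wise mean/std of content features to those of style features."""
    c_mean = content.mean(dim=(2, 3), keepdim=True)
    c_std = content.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style.mean(dim=(2, 3), keepdim=True)
    s_std = style.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content - c_mean) / c_std + s_mean

class Decoder(nn.Module):
    """Small mirror decoder mapping relu4_1-level features (512 ch) back to an RGB image."""
    def __init__(self):
        super().__init__()
        def block(cin, cout):
            return nn.Sequential(nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(),
                                 nn.Upsample(scale_factor=2, mode="nearest"))
        self.net = nn.Sequential(block(512, 256), block(256, 128),
                                 block(128, 64), nn.Conv2d(64, 3, 3, padding=1))

    def forward(self, f):
        return self.net(f)

if __name__ == "__main__":
    enc, dec = MultiLevelEncoder().eval(), Decoder()
    content = torch.rand(1, 3, 256, 256)
    style = torch.rand(1, 3, 256, 256)
    c_feats, s_feats = enc(content), enc(style)
    fused = adain(c_feats[-1], s_feats[-1])  # this sketch fuses only the deepest level
    stylized = dec(fused)
    print(stylized.shape)  # torch.Size([1, 3, 256, 256])
```

In a full implementation the decoder would be trained with the paper's content, style, and structure-preserving losses, and the fusion step would combine all feature levels rather than only the deepest one; the sketch shows only the overall data flow.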
keywords: image arbitrary style transfer; preserving detailed features; multi-layer image features; feature fusion; loss function; attention mechanism
Foundation items: Natural Science Research Program of Jiangsu Higher Education Institutions (19KJB520032); Research Support Project for Doctoral Teachers of Jiangsu Normal University (20XSRS018); Postgraduate Research & Practice Innovation Program of Jiangsu Province (KYCX22_2859)
Citation:
JIANG Heng-Chang, ZHANG Du-Zhen. Image Arbitrary Style Transfer with Preserving Detailed Features. Computer Systems Applications, 2024, 33(3): 118-125