Received: October 04, 2022    Revised: November 04, 2022
Abstract: Relief shading is an important part of large-scale battlefield simulation. To address the weak texture detail produced by existing relief shading techniques, this study proposes a relief shading enhancement method for large-scale battlefields that combines elevation curvature and ambient occlusion. First, a terrain curvature map is generated by analyzing the curvature of digital elevation data; overlaying this map on satellite imagery highlights geomorphic feature lines. Second, an ambient occlusion computation based on depthwise separable convolution is proposed, which enhances the visual rendering of battlefield terrain in gullies. Finally, the curvature map, the ambient occlusion, and the satellite imagery are fused to produce a real-time relief shading effect. Experiments show that the proposed method yields better visual results on low-zoom-level global satellite imagery, allowing the observer to grasp the overall trend of the three-dimensional terrain while still analyzing texture features in terrain details.
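The first step, deriving a curvature map from digital elevation data, can be sketched with finite differences. This is a minimal illustration assuming a regular DEM grid; the function name, the finite-difference scheme, and the sign convention (positive on ridges, negative in valleys) are assumptions for the sketch, not the paper's implementation:

```python
import numpy as np

def plan_curvature(dem, cell=30.0):
    """Approximate plan (horizontal) curvature of a DEM via central
    differences. dem: 2-D elevation array; cell: grid spacing in metres.
    Sign conventions for curvature vary between references."""
    dem = dem.astype(float)
    zy, zx = np.gradient(dem, cell)          # first derivatives dz/dy, dz/dx
    zyy = np.gradient(zy, cell, axis=0)      # d2z/dy2
    zxy = np.gradient(zy, cell, axis=1)      # d2z/dxdy
    zxx = np.gradient(zx, cell, axis=1)      # d2z/dx2
    # Small epsilon avoids division by zero on perfectly flat cells
    denom = (zx**2 + zy**2) ** 1.5 + 1e-8
    return -(zxx * zy**2 - 2.0 * zxy * zx * zy + zyy * zx**2) / denom
```

On a planar slope the second derivatives vanish, so the curvature map is zero everywhere; feature lines such as ridges and gully edges show up as non-zero bands, which is what makes the overlay with satellite imagery useful.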
Keywords: battlefield simulation; relief shading; plan curvature; ambient occlusion; depthwise separable convolution
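The second step relies on depthwise separable convolution, which factors a standard convolution into a per-channel spatial filter followed by a 1×1 channel-mixing step, cutting the arithmetic cost considerably. A loop-based NumPy sketch for clarity (the function name and shapes are illustrative assumptions, not the paper's GPU implementation):

```python
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_weights):
    """x: (H, W, C) feature map; dw_kernels: (k, k, C), one spatial kernel
    per input channel; pw_weights: (C, C_out) 1x1 mixing weights."""
    H, W, C = x.shape
    k = dw_kernels.shape[0]
    pad = k // 2
    # Edge padding keeps the output the same spatial size as the input
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.empty_like(x, dtype=float)
    for c in range(C):                        # depthwise: filter each channel
        for i in range(H):
            for j in range(W):
                out[i, j, c] = np.sum(xp[i:i+k, j:j+k, c] * dw_kernels[:, :, c])
    return out @ pw_weights                   # pointwise: 1x1 channel mixing
```

For a k×k kernel with C input and C output channels, the factored form costs roughly k²·C + C² multiply-adds per pixel instead of k²·C² for a full convolution, which is why it suits a real-time shading pass over large terrain tiles.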
Funding: Key Program of the National Natural Science Foundation of China (U20A20161)
Citation:
SUN Sheng-Zhe, LI Hui. Method of Enhancing Relief Shading for Large-scale Battlefield. Computer Systems & Applications, 2023, 32(5): 11-19.