Computer Systems & Applications, 2022, 31(2): 335-341
Virtual Try-on Technique Based on Augmented Reality and Face Pose Estimation
HUANG Yi-Qi, HUANG Qi-Bao, YANG Min-Qiang
(1. School of Information Science and Engineering, Lanzhou University, Lanzhou 730000, China; 2. Economics and Management School, University of the Cordilleras, Baguio 2600, Philippines)
Received: April 4, 2021    Revised: April 29, 2021
Abstract: Thanks to augmented reality technology, virtual information can be embodied and integrated in the real world, and applications of virtual information detached from physical objects keep expanding into real-world scenarios. On this basis, this study proposes an efficient, real-time virtual try-on technique that can be applied in a variety of practical settings. In e-commerce, for example, a user can select the model file of a desired style online and try it on virtually before purchasing the product, using the result to support the buying decision. The proposed method uses the estimated face pose parameters to map the model file into a graphical state that can be overlaid on the real-time video stream; the overlay is applied in a specific region and rendered back into the video frame, and the overlaid model adapts to changes in head position. Experimental results show that the proposed method performs well in terms of face distance and position, graphics rendering, and real-time try-on.
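
To make the pipeline in the abstract concrete, the sketch below illustrates one way such a try-on loop can be built: facial landmarks are detected on each video frame, the head pose (rotation and translation) is estimated with a PnP solver against a generic 3D face model, the corners of a virtual glasses plane are projected into the image with that pose, and a transparent glasses image is warped and alpha-blended onto the frame so that it follows the head. This is only a minimal illustration under stated assumptions, not the authors' implementation: MediaPipe Face Mesh, OpenCV's solvePnP, the landmark indices, the generic 3D reference coordinates, and the "glasses.png" asset are all assumptions introduced here.

# Minimal virtual try-on sketch (assumed tooling: OpenCV + MediaPipe; not the paper's code).
import cv2
import numpy as np
import mediapipe as mp

# Generic 3D reference points of a face in model coordinates (illustrative values).
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),        # nose tip
    (0.0, -63.6, -12.5),    # chin
    (-43.3, 32.7, -26.0),   # left eye outer corner
    (43.3, 32.7, -26.0),    # right eye outer corner
    (-28.9, -28.9, -24.1),  # left mouth corner
    (28.9, -28.9, -24.1),   # right mouth corner
], dtype=np.float64)
# MediaPipe Face Mesh indices roughly matching the points above (assumed mapping,
# would need checking/calibration in practice).
LANDMARK_IDS = [1, 152, 263, 33, 291, 61]

# Four 3D corners of a "glasses plane" across the eyes (model coordinates, illustrative).
GLASSES_3D = np.array([
    (-55.0, 45.0, -20.0), (55.0, 45.0, -20.0),
    (55.0, 15.0, -20.0), (-55.0, 15.0, -20.0),
], dtype=np.float64)

def blend_overlay(frame, rgba_img, dst_quad):
    # Warp the RGBA overlay onto dst_quad and alpha-blend it into the frame in place.
    h, w = rgba_img.shape[:2]
    src = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float32)
    M = cv2.getPerspectiveTransform(src, dst_quad.astype(np.float32))
    warped = cv2.warpPerspective(rgba_img, M, (frame.shape[1], frame.shape[0]))
    alpha = warped[:, :, 3:4].astype(np.float32) / 255.0
    frame[:] = (warped[:, :, :3] * alpha + frame * (1.0 - alpha)).astype(np.uint8)

glasses = cv2.imread("glasses.png", cv2.IMREAD_UNCHANGED)  # placeholder RGBA asset
face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_face_landmarks:
        lm = result.multi_face_landmarks[0].landmark
        image_points = np.array([(lm[i].x * w, lm[i].y * h) for i in LANDMARK_IDS],
                                dtype=np.float64)
        # Simple pinhole camera approximation: focal length ~ image width.
        cam = np.array([[w, 0, w / 2], [0, w, h / 2], [0, 0, 1]], dtype=np.float64)
        dist = np.zeros((4, 1))
        ok_pnp, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points, cam, dist,
                                          flags=cv2.SOLVEPNP_ITERATIVE)
        if ok_pnp:
            # Project the glasses plane with the estimated pose and blend it onto the frame.
            quad, _ = cv2.projectPoints(GLASSES_3D, rvec, tvec, cam, dist)
            blend_overlay(frame, glasses, quad.reshape(4, 2))
    cv2.imshow("virtual try-on", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()

In a full system, the flat textured quad would presumably be replaced by rendering the selected 3D model file with a graphics pipeline under the same estimated pose, which is what allows the overlay to remain consistent as the head moves closer to or farther from the camera.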
Citation:
HUANG Yi-Qi, HUANG Qi-Bao, YANG Min-Qiang. Virtual Try-on Technique Based on Augmented Reality and Face Pose Estimation. Computer Systems & Applications, 2022, 31(2): 335-341.