Computer Systems & Applications, 2019, 28(11): 208-212
Dynamic Gesture Recognition Method Based on Leap Motion
(1.Image Information Institute, College of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China;2.Chengdu Xitu Technology Co. Ltd., Chengdu 610065, China)
Received: April 25, 2019    Revised: May 21, 2019
Abstract: With the development of Virtual Reality (VR) technology and rising demands on the performance and experience of human-computer interaction, gesture recognition has become one of the key technologies affecting interaction in VR, and its accuracy urgently needs to be improved. To address the poor performance of current gesture recognition methods on gestures with similar motions, a multi-feature dynamic gesture recognition method is proposed. The method first uses the Leap Motion controller to track dynamic gestures and acquire data, then adds the displacement vector angle and the inflection-point count to the feature extraction process, next trains a Hidden Markov Model (HMM) for each dynamic gesture, and finally recognizes a test gesture according to its matching score against the trained models. The experimental results show that the proposed multi-feature method improves the recognition rate of similar gestures.
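As a rough illustration of the pipeline summarized above (not the authors' code), the sketch below computes a displacement-vector turning angle and a cumulative inflection-point count from a sequence of Leap Motion palm positions, then trains one Gaussian HMM per gesture and classifies a test sequence by the best-scoring model. The function names, the 30° turning-angle threshold, the number of hidden states, and the use of the hmmlearn library are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from hmmlearn import hmm  # any HMM library with Baum-Welch training would do

def extract_features(palm_positions, angle_threshold_deg=30.0):
    """Per-frame features from Leap Motion palm positions (n_frames, 3) in mm:
    [displacement length, turning angle (rad), cumulative inflection count].
    angle_threshold_deg is a hypothetical threshold above which a turn
    is counted as an inflection point."""
    p = np.asarray(palm_positions, dtype=float)
    d = np.diff(p, axis=0)                       # displacement vectors between frames
    norms = np.linalg.norm(d, axis=1)
    # Angle between consecutive displacement vectors (turning angle).
    cos_theta = np.einsum("ij,ij->i", d[:-1], d[1:]) / (norms[:-1] * norms[1:] + 1e-8)
    angles = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    # Count an inflection point whenever the turn exceeds the threshold.
    inflections = np.cumsum(angles > np.deg2rad(angle_threshold_deg))
    return np.column_stack([norms[1:], angles, inflections])

def train_gesture_model(sequences, n_states=5):
    """Train one Gaussian HMM on a list of feature sequences for one gesture."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
    model.fit(X, lengths)
    return model

def recognize(feature_seq, models):
    """Return the gesture label whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda name: models[name].score(feature_seq))
```

In this sketch, recognition follows the matching-score idea from the abstract: each candidate gesture model scores the test feature sequence, and the highest-scoring model determines the predicted gesture.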
Foundation items: Science and Technology Program of Sichuan Province (2018HH0143); Project of Sichuan Provincial Department of Education (18ZB0355)
Citation:
GAO Yu, HE Xiao-Hai, WU Xiao-Hong, WANG Zheng-Yong, ZHANG Yu-Kun. Dynamic Gesture Recognition Method Based on Leap Motion. Computer Systems & Applications, 2019, 28(11): 208-212