Gesture Interaction Method in Virtual Classroom Based on Leap Motion
    Abstract:

    With the rapid development of virtual reality technology, somatosensory sensors such as Leap Motion have emerged and are widely used in human-computer interaction. This study proposes a Leap Motion gesture interaction method based on a deep neural network to address the Leap Motion controller's low recognition rate and slow recognition speed at the edge of its recognition range. Interactive gestures are defined, and a three-dimensional interaction system is designed and applied to a virtual scene. Specifically, the system captures data with Leap Motion, uses the deep neural network to extract features from the acquired infrared images, and performs gesture classification and recognition. The changes in hand coordinates between two adjacent frames acquired by Leap Motion are then used to identify dynamic gestures, and these dynamic gestures drive the interaction functions in the virtual scene. Experiments show that the proposed method outperforms Leap Motion's built-in gesture recognition in both recognition speed and accuracy, and that it maintains a high recognition rate at the edge of Leap Motion's recognition range.
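    The frame-delta idea in the abstract — deriving a dynamic gesture from the change in hand coordinates between two adjacent frames — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the threshold, gesture labels, and the plain `(x, y, z)` palm-position tuples (in place of actual Leap Motion SDK frame objects) are all assumptions.

    ```python
    # Sketch of classifying a dynamic gesture from the displacement of the
    # palm between adjacent frames. Threshold and labels are illustrative.

    def classify_swipe(prev_pos, curr_pos, threshold=20.0):
        """Return a swipe direction from two adjacent palm positions (x, y, z in mm)."""
        dx = curr_pos[0] - prev_pos[0]
        dy = curr_pos[1] - prev_pos[1]
        if abs(dx) < threshold and abs(dy) < threshold:
            return "static"  # movement too small: treat as a static pose
        if abs(dx) >= abs(dy):
            return "swipe_right" if dx > 0 else "swipe_left"
        return "swipe_up" if dy > 0 else "swipe_down"

    def track_gestures(palm_positions, threshold=20.0):
        """Fold a stream of per-frame palm positions into a gesture sequence."""
        gestures = []
        for prev, curr in zip(palm_positions, palm_positions[1:]):
            label = classify_swipe(prev, curr, threshold)
            if label != "static" and (not gestures or gestures[-1] != label):
                gestures.append(label)  # suppress consecutive duplicates
        return gestures

    frames = [(0, 0, 0), (5, 2, 0), (40, 3, 0), (80, 5, 0), (82, 60, 0)]
    print(track_gestures(frames))  # → ['swipe_right', 'swipe_up']
    ```

    In a real system the per-frame positions would come from the Leap Motion tracking stream, and the recognized gesture would be mapped to an interaction command (e.g. page turning or object rotation) in the virtual scene.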

Citation:

Hu FL, Gao QL, Wang XH, Li QM. Gesture interaction method in virtual classroom based on Leap Motion. Computer Systems & Applications (计算机系统应用), 2022, 31(8): 160–168.

History
  • Received: October 30, 2021
  • Revised: December 2, 2021
  • Online: June 1, 2022
Copyright: Institute of Software, Chinese Academy of Sciences