Abstract: With the rapid development of virtual reality technology, somatosensory sensors such as Leap Motion have emerged and are now widely used in human-computer interaction. This study proposes a Leap Motion gesture interaction method based on a deep neural network to address the Leap Motion somatosensory controller's low recognition rate and slow recognition speed at the edge of its recognition range. In addition to defining the interactive gestures, a three-dimensional interactive system is designed and applied to a virtual scene. Specifically, the system captures data with Leap Motion, uses the deep neural network to extract features from the acquired infrared images, and performs gesture classification and recognition. Changes in the hand coordinates between two adjacent frames acquired by Leap Motion are then used to determine dynamic gestures. Finally, these dynamic gestures drive the interaction functions in the virtual scene. Experimental verification shows that the proposed gesture recognition method outperforms Leap Motion's built-in gesture recognition in both recognition speed and recognition accuracy. Moreover, it maintains a high recognition rate at the edge of Leap Motion's recognition range.
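The dynamic-gesture step described above, determining a gesture from the change in hand coordinates between two adjacent frames, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the movement threshold and the direction labels are assumptions introduced here for clarity.

```python
# Minimal sketch of the adjacent-frame dynamic-gesture idea from the abstract.
# The 0.02 m threshold and the swipe labels are illustrative assumptions,
# not values taken from the paper.

def classify_dynamic_gesture(prev_pos, curr_pos, threshold=0.02):
    """Classify a swipe from the palm-position change between two frames.

    prev_pos, curr_pos: (x, y, z) palm coordinates in metres.
    Returns a direction label, or None if the hand barely moved.
    """
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    dz = curr_pos[2] - prev_pos[2]
    # Ignore jitter below the movement threshold.
    if max(abs(dx), abs(dy), abs(dz)) < threshold:
        return None
    # Pick the dominant axis of motion and map its sign to a label.
    axis, delta = max((("x", dx), ("y", dy), ("z", dz)),
                      key=lambda t: abs(t[1]))
    if axis == "x":
        return "swipe_right" if delta > 0 else "swipe_left"
    if axis == "y":
        return "swipe_up" if delta > 0 else "swipe_down"
    return "push" if delta > 0 else "pull"
```

In practice such a per-frame decision would be smoothed over several consecutive frames before triggering an interaction, so that sensor jitter at the edge of the recognition range does not fire spurious gestures.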