The Influence of the Threshold of the Size of the Graphic Element on the General Dynamic Gesture Behavior
Nowadays, the image clarity and realism of augmented reality and virtual reality are constantly improving. However, interaction in 3D space still relies on handheld controllers or other mechanical devices. How to interact with interfaces and objects in three-dimensional space in a natural way is therefore a question that current researchers must consider. This paper explores the influence of the size of a graphic element in an interactive interface on dynamic gesture behavior through a user behavior experiment. The researchers observe the gestures users make when interacting with objects of different sizes in virtual space, and then analyze the correlation between object size and dynamic gesture, obtaining the relationship between the rendered size of the graphic element and dynamic gesture behavior.
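The size-gesture correlation described in the abstract could, for instance, be quantified with a Pearson correlation between element size and an observed gesture measure. The sketch below is illustrative only: the data, and the choice of "grasp ratio" as the gesture metric, are assumptions, not values from the paper.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: rendered graphic-element sizes (cm) and the share of
# participants who used a whole-hand grasp rather than a two-finger pinch.
sizes = [2, 4, 6, 8, 10, 12]
grasp_ratio = [0.10, 0.25, 0.45, 0.60, 0.80, 0.90]

r = pearson_r(sizes, grasp_ratio)
print(f"Pearson r = {r:.3f}")
```

A coefficient near +1 would indicate that larger elements consistently elicit whole-hand gestures, which is the kind of size-to-gesture relationship the experiment sets out to measure.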
Keywords: Virtual reality · Natural gesture interaction · 3D interactive space · Graphic element rendering size · Dynamic gesture recognition · Leap Motion · HTC Vive
The authors wish to thank the Science and Technology on Avionics Integration Laboratory and the Aeronautical Science Fund (No. 20185569008). This work was supported by the Fundamental Research Funds for the Central Universities and the National Natural Science Foundation of China (Nos. 71871056 and 71471037).