Abstract
Hand gesture recognition is important for interaction in virtual reality (VR) environments. Traditional vision-based approaches suffer from occlusion, so wearable devices can serve as an effective complement. This study presents a hand grasp recognition method for VR that fuses signals from force myography (FMG), a muscle-activity-based hand gesture recognition technique, with data from a Leap Motion controller. We conducted an experiment in which participants grasped virtual objects while wearing a VR headset and an FMG band on the wrist, with the Leap Motion positioned either on the desk or on the headset (two experimental settings). FMG signals, Leap Motion data, and the fusion of both were used to train and test a simple but effective linear discriminant analysis (LDA) classifier, as well as three other mainstream classification algorithms. In both experimental settings, fusing the two signals yielded a significant improvement in classification accuracy over using the Leap Motion alone.
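The fusion approach described above can be illustrated with a minimal sketch: concatenate per-sample feature vectors from the two modalities and train an LDA classifier on the fused representation. The channel counts, feature dimensions, and synthetic data below are assumptions for illustration only, not the authors' actual setup.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical dimensions: an 8-channel FMG band and a 15-dimensional
# Leap Motion feature vector (e.g., fingertip positions), 3 grasp classes.
n_samples, n_fmg, n_leap, n_classes = 300, 8, 15, 3

labels = rng.integers(0, n_classes, n_samples)
# Synthetic class-dependent signals standing in for real recordings.
fmg = rng.normal(labels[:, None], 1.0, (n_samples, n_fmg))
leap = rng.normal(labels[:, None], 1.5, (n_samples, n_leap))

# Feature-level fusion: concatenate the two modalities per sample.
fused = np.hstack([fmg, leap])

clf = LinearDiscriminantAnalysis()
clf.fit(fused[:200], labels[:200])               # train split
accuracy = clf.score(fused[200:], labels[200:])  # test split
print(f"fused-feature LDA accuracy: {accuracy:.2f}")
```

The same pipeline applied to `fmg` or `leap` alone gives the single-modality baselines against which the fusion gain reported in the abstract would be measured.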
Acknowledgement
This research was supported by the Natural Sciences and Engineering Research Council of Canada (NSERC), the Canadian Institutes of Health Research (CIHR), and the Canada Research Chair (CRC) program. The authors thank Mary Yu, Tingyu Hu, and Wenxuan Song for helping with the VR content development and data collection.
Cite this article
Jiang, X., Xiao, Z.G. & Menon, C. Virtual grasps recognition using fusion of Leap Motion and force myography. Virtual Reality 22, 297–308 (2018). https://doi.org/10.1007/s10055-018-0339-2