Design and Assessment of Two Handling Interaction Techniques for 3D Virtual Objects Using the Myo Armband

  • Yadira Garnica Bonome
  • Abel González Mondéjar
  • Renato Cherullo de Oliveira
  • Eduardo de Albuquerque
  • Alberto Raposo
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10909)


Hand gesture recognition using electromyography (EMG) signals has attracted increasing attention due to the rise of inexpensive wearable devices that record accurate EMG data. One of the outstanding devices in this area is the Myo armband, equipped with eight EMG sensors and a nine-axis inertial measurement unit. The use of the Myo armband in virtual reality, however, is very limited, because it can recognize only five pre-set gestures. In this work we do not use these gestures; instead, we use the raw data provided by the device to measure the force applied during a gesture, and we use the Myo's vibrations as a feedback system, aiming to improve the user experience. We propose two techniques designed to explore the capabilities of the Myo armband as an input and feedback tool in a virtual reality environment (VRE). The objective is to evaluate the usability of the Myo as an input and output device for the selection and manipulation of 3D objects in virtual reality environments. The proposed techniques were evaluated in tests with ten users. We analyzed the usefulness, efficiency, effectiveness, learnability, and satisfaction of each technique. Both techniques obtained high usability grades, demonstrating that the Myo armband can be used to perform selection and manipulation tasks and that it can enrich the experience, making it more realistic, by measuring the strength applied to a gesture and by providing vibration feedback.
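The abstract mentions estimating the force applied to a gesture from the Myo's raw EMG stream. One common way to do this (a sketch, not necessarily the authors' implementation) is to take the mean absolute value (MAV) of the eight EMG channels over a short sliding window and normalize it against a per-user calibration maximum. All names below (`ForceEstimator`, `calib_max`, the window size) are illustrative assumptions:

```python
from collections import deque

EMG_CHANNELS = 8   # the Myo armband has eight surface EMG sensors
WINDOW = 50        # samples per channel (~0.25 s at the Myo's 200 Hz EMG rate)

class ForceEstimator:
    """Toy force-from-EMG estimator: windowed mean absolute value (MAV),
    normalized by a maximum-effort calibration value."""

    def __init__(self, calib_max):
        # calib_max: MAV recorded while the user performs a
        # maximum-strength gesture during calibration
        self.calib_max = calib_max
        self.window = deque(maxlen=WINDOW)

    def push(self, sample):
        """sample: sequence of 8 raw EMG values (the Myo emits int8, -128..127)."""
        self.window.append(sample)

    def force(self):
        """Normalized force estimate in [0, 1]."""
        if not self.window:
            return 0.0
        mav = sum(abs(v) for s in self.window for v in s) / (
            len(self.window) * EMG_CHANNELS)
        return min(mav / self.calib_max, 1.0)
```

A VRE could poll `force()` each frame and map the value to, e.g., how firmly a virtual object is grasped, with the armband's vibration command triggered when the estimate crosses a grip threshold.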


Keywords: 3D interaction · Virtual reality · Gesture-based control · Myo armband



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Yadira Garnica Bonome (1)
  • Abel González Mondéjar (1)
  • Renato Cherullo de Oliveira (1)
  • Eduardo de Albuquerque (1)
  • Alberto Raposo (1)
  1. Department of Informatics, PUC-Rio, Rio de Janeiro, Brazil
