Motion estimation framework and authoring tools based on MYOs and Bayesian probability

  • Sang-Geol Lee
  • Yunsick Sung
  • Jong Hyuk Park

Abstract

Diverse user interfaces are now being developed based on the natural user interface/experience (NUI/NUX) paradigm. Examples include Leap Motion, which measures finger motions to produce finger-based commands, and MYO, which measures arm motions for arm-based commands. However, these types of motion sensors are still too expensive to use in commercial applications. Moreover, multiple motion sensors sometimes need to be used concurrently in order to estimate user motions accurately. Thus, either the cost of motion sensors or the number of sensors used needs to be reduced. This paper proposes a motion estimation framework that estimates unmeasured motions from Bayesian probability and measured motions, where motions are defined by sets of MYO sensor values. The Bayesian probabilities are calculated in advance by measuring co-related motions and counting the occurrences of these measured co-related motions. As a result, the number of MYOs needed is reduced. In experiments conducted with MYOs, the processes used to calculate the Bayesian probabilities and to estimate unmeasured motions were validated. Comparing the estimated motions with the corresponding measured motions showed that the difference between the two was small: the proposed motion estimation framework estimates unmeasured motions with an average error of 0.05, a 25% improvement over the traditional method.
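The abstract describes a count-and-normalize approach: co-related motions are measured in advance, their joint occurrences are counted, and the resulting conditional probabilities are used to estimate the motion of a limb that carries no sensor. The sketch below illustrates that idea only in outline; it assumes sensor values are quantized into discrete bins and that P(co-related | measured) is taken as normalized co-occurrence counts. The bin count, value range, and all function names are illustrative assumptions, not the authors' implementation.

import numpy as np
from collections import defaultdict

N_BINS = 10  # hypothetical quantization resolution for sensor values


def quantize(value, lo=-1.0, hi=1.0, n_bins=N_BINS):
    """Map a continuous sensor value into one of n_bins discrete bins."""
    idx = int((value - lo) / (hi - lo) * n_bins)
    return min(max(idx, 0), n_bins - 1)


def build_conditional_table(measured, corelated):
    """Count joint occurrences of (measured bin, co-related bin) pairs and
    normalize each row into conditional probabilities P(co-related | measured)."""
    counts = defaultdict(lambda: np.zeros(N_BINS))
    for m, c in zip(measured, corelated):
        counts[quantize(m)][quantize(c)] += 1
    return {m_bin: row / row.sum() for m_bin, row in counts.items()}


def estimate_unmeasured(table, measured_value, lo=-1.0, hi=1.0):
    """Estimate the unmeasured sensor value as the expectation of bin centers
    under P(co-related | measured); fall back to the measured value when the
    measured bin was never observed during training."""
    m_bin = quantize(measured_value)
    if m_bin not in table:
        return measured_value
    centers = lo + (np.arange(N_BINS) + 0.5) * (hi - lo) / N_BINS
    return float(np.dot(table[m_bin], centers))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic training data: the co-related motion loosely follows the measured one.
    measured = rng.uniform(-1, 1, 500)
    corelated = np.clip(measured + rng.normal(0, 0.1, 500), -1, 1)

    table = build_conditional_table(measured, corelated)
    print(estimate_unmeasured(table, 0.3))  # estimate for a new measured reading

In the paper's setting the inputs would be MYO sensor values from instrumented and uninstrumented arm positions gathered during a calibration phase; the synthetic data above merely stands in for that calibration set.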

Keywords

Bayesian probability · MYO · Motion sensors · NUI/NUX

Notes

Acknowledgments

This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2016-H8501-16-1014) supervised by the IITP (Institute for Information & communications Technology Promotion).

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, Pusan National University, Busan, South Korea
  2. Faculty of Computer Engineering, Keimyung University, Daegu, South Korea
  3. Department of Computer Science and Engineering, Seoul National University of Science and Technology, Seoul, South Korea