3D Hand Movement Measurement Framework for Studying Human-Computer Interaction

  • Toni Kuronen
  • Tuomas Eerola
  • Lasse Lensu
  • Jukka Häkkinen
  • Heikki Kälviäinen
Conference paper
Part of the Lecture Notes in Networks and Systems book series (LNNS, volume 95)

Abstract

In order to develop better touch and gesture user interfaces, it is important to be able to measure how humans move their hands while interacting with technical devices. Recent advances in high-speed imaging technology and image-based object tracking have made it possible to accurately measure hand movements from videos, without the need for data gloves or other sensors that would constrain natural hand motion. In this paper, we propose a complete framework for measuring hand movements in 3D in human-computer interaction situations. The framework covers the composition of the measurement setup, the selection of object tracking methods, post-processing of the motion trajectories, 3D trajectory reconstruction, and the characterization and visualization of the movement data. We demonstrate the framework in a context where 3D touch screen usability is studied with 3D stimuli.
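The abstract outlines a pipeline of tracking, trajectory post-processing, and 3D reconstruction. As a rough illustration of two of these stages, the following is a minimal sketch in Python. It assumes a rectified, parallel two-camera geometry, which is a common simplification and not necessarily the paper's actual multi-camera setup; the function names and parameters are illustrative, not taken from the paper.

```python
def smooth_trajectory(points, window=3):
    """Centered moving-average filter to suppress tracker jitter
    in a 1D coordinate track (post-processing stage)."""
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        smoothed.append(sum(points[lo:hi]) / (hi - lo))
    return smoothed

def triangulate_rectified(x_left, x_right, y, focal_px, baseline_m):
    """Depth from disparity for rectified cameras: Z = f * b / d.
    Returns metric (X, Y, Z) in the left camera's frame."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    z = focal_px * baseline_m / disparity
    return (z * x_left / focal_px, z * y / focal_px, z)

# Example: smooth a noisy coordinate track, then triangulate one point.
track = [0.0, 0.0, 3.0, 0.0, 0.0]
print(smooth_trajectory(track))                              # spike averaged out
print(triangulate_rectified(50.0, 40.0, 20.0, 1000.0, 0.1))  # metric 3D point
```

In a full setup, per-camera calibration and general two-view triangulation would replace the rectified-geometry shortcut, and stronger smoothers (e.g. a Savitzky-Golay filter) are commonly used for trajectory post-processing.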

Keywords

High-speed video · Hand tracking · Trajectory processing · 3D reconstruction · Video synchronization · Human-computer interaction

Notes

Acknowledgements

The authors would like to thank Dr. Jari Takatalo for his efforts in implementing the experiments and producing the data for the research.


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Toni Kuronen (1)
  • Tuomas Eerola (1)
  • Lasse Lensu (1)
  • Jukka Häkkinen (2)
  • Heikki Kälviäinen (1)
  1. Computer Vision and Pattern Recognition Laboratory, LUT University, Lappeenranta, Finland
  2. Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland