Usability of Foot-Based Interaction Techniques for Mobile Solutions

  • Taeyong Kim
  • Jeffrey R. Blum
  • Parisa Alirezaee
  • Andre G. Arnold
  • Pascal E. Fortin
  • Jeremy R. Cooperstock
Chapter
Part of the EAI/Springer Innovations in Communication and Computing book series (EAISICC)

Abstract

Although hand-based interaction dominates mobile applications, it can be unsuitable for motor-impaired individuals or in situations such as musical performance or surgery, where the hands are otherwise occupied. The alternative of foot-based interaction, the subject of this chapter, has been shown to perform reasonably well under such conditions, with benefits in terms of diversity of input techniques, wide applicability, and social acceptability. This chapter also describes potential applications of foot-based interfaces, with an emphasis on factors related to usability. We aim to inspire designers and developers to consider interaction through the feet as a replacement for, or complement to, more traditional application designs.

Notes

Acknowledgments

The authors would like to acknowledge the long-standing support of this research from McGill University’s Center for Intelligent Machines, the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT), and the Natural Sciences and Engineering Research Council of Canada (NSERC).


Copyright information

© Springer International Publishing AG, part of Springer Nature 2019

Authors and Affiliations

  • Taeyong Kim (email author)
  • Jeffrey R. Blum
  • Parisa Alirezaee
  • Andre G. Arnold
  • Pascal E. Fortin
  • Jeremy R. Cooperstock

All authors: Shared Reality Lab, McGill University, Montreal, Canada