
The intuitive grasp interface: design and evaluation of micro-gestures on the steering wheel for driving scenario

  • Yiqi Xiao
  • Renke He
Long Paper

Abstract

Gestural input is now widely applied to in-car interactive systems. Emerging sensing technologies allow for micro-gestures that require less effort and can be performed while the driver is grasping the steering wheel, thereby reducing the attention the driver must devote to human–vehicle interaction. The mobility of the hands and fingers has to be considered when designing micro-gestures for driving scenarios, so the newly defined gestures may differ considerably from familiar multi-touch or body gestures. This paper presents a set of micro-gestures designed for intuitively commanding the in-car information system, taking both gesture meanings and physical limitations into account. The study reports the results of evaluating the feasibility of the gesture set through its performance in a dual-task situation. The learnability and intuitiveness of the micro-gestures, as well as their effects on the driving task, are evaluated, and the effects of different grasping postures on the performance of novice drivers are also compared. It is concluded that micro-gestures designed specifically for the grasp interface offer advantages in multi-tasking and were appreciated by the users who participated in the evaluation test.

Keywords

Micro-gesture · Grasp · Gesture set · Evaluation · Dual-task

Notes

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

The study protocol conforms to the ethical standards of the institutional committee and to the declaration of ethical principles adopted by the 18th WMA General Assembly, Helsinki, Finland, June 1964, and its later amendments. The Queen Mary Ethics of Research Committee approved the study protocol, which includes all procedures performed in this study, under reference number QMREC1749a.

Informed consent

Informed consent was obtained from all individual participants involved in the study. This paper does not include any studies with animals performed by any of the authors.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. University of Shanghai for Science and Technology, Shanghai, People's Republic of China
  2. Hunan University, Changsha, People's Republic of China
