
User-defined gesture interaction for in-vehicle information systems

  • Huiyue Wu
  • Yu Wang
  • Jiayi Liu
  • Jiali Qiu
  • Xiaolong (Luke) Zhang

Abstract

The gesture elicitation study, a technique that emerged from the field of participatory design, has been widely applied to novel interaction and sensing technologies in recent years. However, traditional gesture elicitation studies often suffer from gesture disagreement and legacy bias, and may fail to produce optimal gestures for a target system. This paper reports a research project on user-defined gestures for interacting with in-vehicle information systems. The main contribution of our research is a three-stage participatory design method for deriving more reliable gestures than traditional elicitation methods. Using this method, we generated a set of user-defined gestures for secondary tasks in an in-vehicle information system. Drawing on our research, we develop a set of design guidelines for freehand gesture design. We also highlight the implications of this work for gesture elicitation across all gestural interfaces.
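To make the notion of gesture disagreement concrete: elicitation studies typically quantify consensus per referent (command) with an agreement score. The sketch below is a minimal Python illustration of the widely used agreement rate AR(r) from Vatavu and Wobbrock's formalization of agreement analysis; it is not the authors' three-stage method, and the gesture labels and counts are hypothetical.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for a single referent, following
    Vatavu and Wobbrock's formalization:
        AR(r) = sum_i |P_i|*(|P_i| - 1) / (|P|*(|P| - 1)),
    where P is the multiset of gestures proposed for the referent
    and the P_i are its groups of identical proposals."""
    n = len(proposals)
    if n < 2:
        # Convention for degenerate cases: a single proposal is
        # trivially self-consistent; no proposals yield no agreement.
        return 1.0 if n == 1 else 0.0
    groups = Counter(proposals)  # group identical gesture labels
    return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

# Hypothetical example: of 20 participants asked for a "next track"
# gesture, 12 propose swipe-left, 5 wave, and 3 point.
print(agreement_rate(["swipe-left"] * 12 + ["wave"] * 5 + ["point"] * 3))
# -> (12*11 + 5*4 + 3*2) / (20*19) = 158/380 ≈ 0.416
```

Low AR values across many referents are one symptom of the disagreement problem the abstract refers to: no single proposed gesture dominates for a given command.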

Keywords

User-defined gestures · Elicitation study · Legacy bias · Gesture disagreement · Gesture-based user interaction for in-vehicle information systems

Acknowledgements

The authors would like to thank the anonymous reviewers for their insightful comments. This work was supported by the National Natural Science Foundation of China under Grant Nos. 61772564 and 61772468.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. School of Communication and Design, Sun Yat-sen University, Guangzhou, China
  2. Guangdong Key Laboratory for Big Data Analysis and Simulation of Public Opinion, Guangzhou, China
  3. Department of Medical Oncology, Sun Yat-sen University Cancer Center, State Key Laboratory of Oncology in South China, Collaborative Innovation Center for Cancer Medicine, Guangzhou, China
  4. College of Information Sciences and Technology, Pennsylvania State University, State College, USA
