Evaluating User-Elicited Gestures for Physical Peripheral Interaction with Smart Devices

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1217)


We evaluated user-defined gestures for interacting with personal devices positioned in the periphery of attention. The evaluation comprised two phases: selecting user-elicited gestures and assessing their learnability and intuitiveness in a dual-task test. We first used image schemas to interpret user gestures and grouped them into several cross-cultural mental models of gesture definition; for each mental model we then selected the most usable gesture. A user test subsequently evaluated the performance of the selected gestures while participants concentrated on a concurrent typing task on a PC. The gestures proved comparable in ease of use and in how much they distracted users from the main task, partly verifying the feasibility of the evaluation method.


Keywords: User-elicited gestures · Peripheral interaction · Evaluation · Image schemas



Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. College of Communication and Art Design, University of Shanghai for Science and Technology, Shanghai, People’s Republic of China
  2. School of Design, Hunan University, Changsha, People’s Republic of China
