Two-Handed Interactive Menu: An Application of Asymmetric Bimanual Gestures and Depth Based Selection Techniques

  • Hani Karam
  • Jiro Tanaka
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8521)

Abstract

In this paper, we propose a Two-Handed Interactive Menu as an evaluation of asymmetric bimanual gestures. The menu is split into two parts, one for each hand. Actions are started with the non-dominant hand and continued with the dominant one. Handedness is taken into consideration, and a different interface is generated depending on the user's handedness. The results of our experiments show that two hands are more efficient than one; however, handedness itself did not affect the results in a significant way. We also introduce the Three Fingers Click, a selection mechanism that explores the possibility of using a depth-sensing camera to create a reliable clicking mechanism. Though difficult to maintain, our Three Fingers Click gesture is shown in the experiments to be reliable and efficient.
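The abstract describes a depth-based click: a selection fires when a specific finger posture is detected by a depth-sensing camera. As a rough illustration only, the sketch below shows one plausible way such a detector could work; the class name, thresholds, and per-frame depth inputs are hypothetical assumptions, not taken from the paper. A click is registered when exactly three fingertips move sufficiently closer to the camera than the palm, with hysteresis so a held posture does not fire repeatedly.

```python
# Hypothetical sketch of a depth-threshold "three fingers click" detector.
# Assumed inputs (not from the paper): per-frame palm and fingertip depths
# in metres, as a depth camera such as a Kinect might report them.

CLICK_DELTA = 0.04    # fingertips must be >= 4 cm in front of the palm to count
RELEASE_DELTA = 0.02  # all fingertips must retreat under 2 cm before re-arming


class ThreeFingerClickDetector:
    def __init__(self):
        self._armed = True  # ready to fire the next click

    def update(self, palm_depth, fingertip_depths):
        """Return True exactly once per click gesture.

        palm_depth: camera-to-palm distance (m).
        fingertip_depths: camera-to-fingertip distances (m), one per finger.
        """
        # Fingertips extended toward the camera beyond the click threshold.
        extended = [d for d in fingertip_depths
                    if palm_depth - d >= CLICK_DELTA]
        if self._armed and len(extended) == 3:
            self._armed = False
            return True
        # Re-arm only once every fingertip has retreated (hysteresis).
        if not self._armed and all(palm_depth - d < RELEASE_DELTA
                                   for d in fingertip_depths):
            self._armed = True
        return False


detector = ThreeFingerClickDetector()
frames = [
    (0.60, [0.59, 0.59, 0.59, 0.59, 0.59]),  # flat hand: no click
    (0.60, [0.55, 0.55, 0.55, 0.59, 0.59]),  # three fingers forward: click
    (0.60, [0.55, 0.55, 0.55, 0.59, 0.59]),  # posture held: no repeat
    (0.60, [0.59, 0.59, 0.59, 0.59, 0.59]),  # retracted: detector re-arms
]
print([detector.update(p, f) for p, f in frames])  # [False, True, False, False]
```

The hysteresis band (4 cm to fire, 2 cm to release) is one way to address the "difficult to maintain" aspect the abstract mentions: small depth jitter around a single threshold would otherwise produce spurious repeated clicks.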

Keywords

bimanual gestures, depth-based click

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Hani Karam 1
  • Jiro Tanaka 1

  1. Department of Computer Science, University of Tsukuba, Tsukuba, Japan