Seeing, Hearing and Feeling Through the Body: The Emerging Science of Human-Somatosensory Interactions

  • Maria Karam
  • Patrick Langdon
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9176)

Abstract

The human skin's potential to serve as a versatile, practical, and highly effective channel for receiving information has been explored for over a century. Yet while the body's ability to perceive and process information is relatively well understood, touch is only one of the senses of the somatosensory system. Although this communication potential is great, the body remains largely unrepresented in the computer interactions we have come to accept in our everyday lives. In this paper, the domain of physical display systems and interactions is surveyed with a view to developing a framework that offers a more principled and cohesive perspective on this multidisciplinary field of interaction research and development. The paper presents a brief survey of physical displays and proposes a framework that combines critical parameters from four areas of research to help understand this field of somatosensory-based computer interactions: application, physiology, technology, and psychology.
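
One way to picture the proposed framework is as a record that combines one critical parameter from each of the four research areas named above. The following minimal Python sketch is purely illustrative: the class name, field names, and example values are assumptions chosen for exposition, not the authors' actual parameter set.

    # Hypothetical sketch: one entry in the four-dimensional framework
    # (application, physiology, technology, psychology) described above.
    from dataclasses import dataclass

    @dataclass
    class SomatosensoryDisplayProfile:
        application: str   # task the display supports, e.g. navigation or alerting
        physiology: str    # body site and sensory channel being stimulated
        technology: str    # actuation mechanism that delivers the stimulus
        psychology: str    # perceptual factors governing how the stimulus is read

    # Example: a vibrotactile waist belt for waypoint navigation, one of
    # the physical display types this survey covers (values are illustrative).
    belt = SomatosensoryDisplayProfile(
        application="pedestrian waypoint navigation",
        physiology="vibrotactile stimulation of the torso",
        technology="vibration motors arranged in a waist belt",
        psychology="direction encoded by the locus of vibration",
    )
    print(belt)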

Keywords

Somatosensory interactions · Tactile displays · Haptic interfaces · Sensory substitution · Framework development · Touch-based displays · Kinesthesia


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Department of Informatics, King's College London, London, UK
  2. Engineering Design Centre, University of Cambridge, Cambridge, UK
