
Assessment of Binaural–Proprioceptive Interaction in Human-Machine Interfaces

Chapter in: The Technology of Binaural Listening

Part of the book series: Modern Acoustics and Signal Processing (MASP)

Abstract

Binaural models help to predict human localization under the assumption that the underlying localization process is based on acoustic signals alone, that is, on unimodal information. But what happens if this localization process takes place in an environment that provides bimodal or even multimodal sensory input? Is the auditory modality still taken into account? Can binaural models help to predict human localization in bimodal or multimodal scenes? The chapter first addresses binaural-visual localization and demonstrates that binaural models are definitely required for modeling human localization even when visual information is available. The main part of the chapter is dedicated to binaural-proprioceptive localization. First, an experiment is described in which proprioceptive localization performance was measured quantitatively. Second, the influence of binaural signals on proprioception was investigated, to reveal whether synthetically generated spatial sound can improve human proprioceptive localization. The results demonstrate that it is indeed possible to guide proprioception auditorily. In conclusion, binaural models can be used to model not only human binaural-visual localization but also human binaural-proprioceptive localization. Binaural-modeling algorithms thus play an important role in further technical developments.
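To make concrete what such a binaural model computes, the sketch below estimates the interaural time difference (ITD) as the lag that maximizes the cross-correlation between the left- and right-ear signals, in the spirit of the Jeffress coincidence model [22] and the binaural-processing overview in [8]. It is a minimal sketch in Python/NumPy, not the chapter's actual model; the function name, the 0.8-ms lag limit, and the test parameters are illustrative assumptions.

```python
import numpy as np

def estimate_itd(left, right, fs, max_itd=0.8e-3):
    """Estimate the interaural time difference (in s) between two ear signals.

    A positive value means the sound reaches the left ear first,
    i.e. the right-ear signal is a delayed copy of the left-ear one.
    """
    max_lag = int(round(max_itd * fs))          # human ITDs stay below ~0.8 ms
    lags = np.arange(-max_lag, max_lag + 1)
    core = slice(max_lag, len(left) - max_lag)  # keep shifted indices in range
    # Cross-correlate over the physiologically plausible lag range only
    xcorr = [np.dot(left[core], right[core.start + lag:core.stop + lag])
             for lag in lags]
    return lags[int(np.argmax(xcorr))] / fs

# Example: a 500-Hz tone, delayed by 0.3 ms at the right ear
fs = 44100
t = np.arange(0, 0.05, 1.0 / fs)
left = np.sin(2 * np.pi * 500 * t)
right = np.sin(2 * np.pi * 500 * (t - 0.3e-3))
print(estimate_itd(left, right, fs))            # ~2.9e-4 s: source on the left
```

In a lookup-table back end, the estimated ITD would then be mapped to an azimuth estimate; the chapter's bimodal question is whether such an auditory estimate still shapes the percept when visual or proprioceptive cues are available.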


Notes

  1. Manufactured by SensAble Technologies.

  2. CHAI 3D (Stanford University) and Pure Data (open-source project), respectively. For details on the C++ haptic-rendering framework, in particular the algorithms for collision detection, force control, and force response, the reader is referred to [13]. The details of Pure Data are outlined in [33].
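To make the force-response idea concrete, here is a minimal sketch of the penalty-based rendering commonly used in such haptic frameworks: once the device tip penetrates a virtual surface, a spring-damper force proportional to the penetration depth is commanded to the device. This is a generic illustration under assumed parameters; the function name, stiffness, and damping values are not taken from [13] or the chapter.

```python
import numpy as np

def contact_force(tip_pos, surface_point, surface_normal,
                  stiffness=700.0, damping=1.0, tip_vel=None):
    """Spring-damper contact force (N) for a point tip against a plane."""
    n = surface_normal / np.linalg.norm(surface_normal)
    depth = np.dot(surface_point - tip_pos, n)    # > 0: tip is inside the surface
    if depth <= 0.0:
        return np.zeros(3)                        # no contact -> no force
    force = stiffness * depth * n                 # virtual spring pushes the tip out
    if tip_vel is not None:
        force -= damping * np.dot(tip_vel, n) * n # damp motion along the normal
    return force

# Example: tip 2 mm below a horizontal surface at z = 0
print(contact_force(np.array([0.0, 0.0, -0.002]),
                    np.array([0.0, 0.0, 0.0]),
                    np.array([0.0, 0.0, 1.0])))   # -> [0.  0.  1.4] N
```

In a real device loop this force would be recomputed at the haptic update rate (typically around 1 kHz) and clamped to the device's maximum output force.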

References

  1. D. Alais and D. Burr. The ventriloquist effect results from near-optimal bimodal integration. Curr. Biol., 14:257–262, 2004.

  2. S. Argentieri, A. Portello, M. Bernard, P. Danès, and B. Gas. Binaural systems in robotics. In J. Blauert, editor, The Technology of Binaural Listening, chapter 9. Springer, Berlin-Heidelberg-New York NY, 2013.

  3. R. Baumgartner, P. Majdak, and B. Laback. Assessment of sagittal-plane sound-localization performance in spatial-audio applications. In J. Blauert, editor, The Technology of Binaural Listening, chapter 4. Springer, Berlin-Heidelberg-New York NY, 2013.

  4. A. Berkhout, D. de Vries, and P. Vogel. Acoustic control by wave field synthesis. J. Acoust. Soc. Am., 93:2764–2778, 1993.

  5. J. Blauert. Sound localization in the median plane. Acustica, 22:205–213, 1969.

  6. J. Blauert. Spatial Hearing, revised edition. The MIT Press, Cambridge, MA, 1997.

  7. J. Blauert and J. Braasch. Binaural signal processing. In Proc. Intl. Conf. Digital Signal Processing, pages 1–11, 2011.

  8. J. Braasch. Modelling of binaural hearing. In J. Blauert, editor, Communication Acoustics, chapter 4, pages 75–108. Springer, 2005.

  9. S. Brewster. Using non-speech sound to overcome information overload. Displays, 17:179–189, 1997.

  10. F. Clark and K. Horch. Kinesthesia. In K. Boff, L. Kaufman, and J. Thomas, editors, Handbook of Perception and Human Performance, chapter 13, pages 1–62. Wiley-Interscience, 1986.

  11. F. J. Clark. How accurately can we perceive the position of our limbs? Behav. Brain Sci., 15:725–726, 1992.

  12. C. Colwell, H. Petrie, and D. Kornbrot. Use of a haptic device by blind and sighted people: Perception of virtual textures and objects. In I. Placencia and E. Porrero, editors, Improving the Quality of Life for the European Citizen: Technology for Inclusive Design and Equality, pages 243–250. IOS Press, Amsterdam, 1998.

  13. F. Conti, F. Barbagli, D. Morris, and C. Sewell. CHAI 3D documentation, 2012. Last viewed on 2012-09-29.

  14. M. O. Ernst and M. S. Banks. Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415:429–433, 2002.

  15. A. Faeth, M. Oren, and C. Harding. Combining 3-D geovisualization with force feedback driven user interaction. In Proc. Intl. Conf. Advances in Geographic Information Systems, pages 1–9, Irvine, CA, 2008.

  16. H. Fastl and E. Zwicker. Psychoacoustics: Facts and Models. Springer, 2007.

  17. B. Gardner and K. Martin. HRTF measurements of a KEMAR dummy-head microphone. MIT Media Lab Perceptual Computing Technical Report 280, pages 1–7, 1994.

  18. T. Haidegger, J. Sándor, and Z. Benyó. Surgery in space: The future of robotic telesurgery. Surg. Endosc., 25:681–690, 2011.

  19. L. A. Hall and D. I. McCloskey. Detections of movements imposed on finger, elbow and shoulder joints. J. Physiol., 335:519–533, 1983.

  20. K. L. Holland, R. L. Williams II, R. R. Conatser Jr., J. N. Howell, and D. L. Cade. The implementation and evaluation of a virtual haptic back. Virtual Reality, 7:94–102, 2004.

  21. G. Jansson, M. Bergamasco, and A. Frisoli. A new option for the visually impaired to experience 3D art at museums: Manual exploration of virtual copies. Vis. Impair. Res., 5:1–12, 2003.

  22. L. A. Jeffress. A place theory of sound localization. J. Comp. Physiol. Psychol., 41:35–39, 1948.

  23. L. Jones and I. Hunter. Differential thresholds for limb movement measured using adaptive techniques. Percept. Psychophys., 52:529–535, 1992.

  24. M. Keehner and R. K. Lowe. Seeing with the hands and with the eyes: The contributions of haptic cues to anatomical shape recognition in surgery. In Proc. Symposium Cognitive Shape Processing, pages 8–14, 2009.

  25. V. Khalidov, F. Forbes, M. Hansard, E. Arnaud, and R. Horaud. Audio-visual clustering for multiple speaker localization. In Proc. Intl. Worksh. Machine Learning for Multimodal Interaction, pages 86–97, Utrecht, The Netherlands, 2008. Springer-Verlag.

  26. S. J. Lederman and L. A. Jones. Tactile and haptic illusions. IEEE Trans. Haptics, 4:273–294, 2011.

  27. S. J. Lederman and R. L. Klatzky. Haptic identification of common objects: Effects of constraining the manual exploration process. Percept. Psychophys., 66:618–628, 2004.

  28. C. Magnusson and K. Rassmus-Gröhn. Audio haptic tools for navigation in non-visual environments. In Proc. Intl. Conf. Enactive Interfaces, pages 17–18, Genoa, Italy, 2005.

  29. C. Magnusson and K. Rassmus-Gröhn. A virtual traffic environment for people with visual impairment. Vis. Impair. Res., 7:1–12, 2005.

  30. D. Malham and A. Myatt. 3D sound spatialization using ambisonic techniques. Comput. Music J., 19:58–70, 1995.

  31. I. Oakley, M. R. McGee, S. Brewster, and P. Gray. Putting the feel in 'look and feel'. In Proc. Intl. Conf. Human Factors in Computing Systems, pages 415–422, The Hague, The Netherlands, 2000.

  32. A. M. Okamura. Methods for haptic feedback in teleoperated robot-assisted surgery. Industrial Robot, 31:499–508, 2004.

  33. Open-source project. Pure Data documentation, 2012. Last viewed on 2012-09-29.

  34. V. Pulkki. Virtual sound source positioning using vector base amplitude panning. J. Audio Eng. Soc., 45:456–466, 1997.

  35. W. Qi. Geometry based haptic interaction with scientific data. In Proc. Intl. Conf. Virtual Reality Continuum and its Applications, pages 401–404, Hong Kong, 2006.

  36. SensAble Technologies. Specifications for the PHANTOM Omni® haptic device, 2012. Last viewed on 2012-09-29.

  37. G. Sepulveda-Cervantes, V. Parra-Vega, and O. Dominguez-Ramirez. Haptic cues for effective learning in 3-D maze navigation. In Proc. Intl. Worksh. Haptic Audio Visual Environments and Games, pages 93–98, Ottawa, Canada, 2008.

  38. L. Shams, Y. Kamitani, and S. Shimojo. Illusions: What you see is what you hear. Nature, 408:788, 2000.

  39. M. Stamm, M. Altinsoy, and S. Merchel. Identification accuracy and efficiency of haptic virtual objects using force-feedback. In Proc. Intl. Worksh. Perceptual Quality of Systems, Bautzen, Germany, 2010.

  40. H. Z. Tan, M. A. Srinivasan, B. Eberman, and B. Cheng. Human factors for the design of force-reflecting haptic interfaces. Dynamic Systems and Control, 55:353–359, 1994.

  41. R. J. van Beers, D. M. Wolpert, and P. Haggard. When feeling is more important than seeing in sensorimotor adaptation. Curr. Biol., 12:834–837, 2002.

  42. F. L. Van Scoy, T. Kawai, M. Darrah, and C. Rash. Haptic display of mathematical functions for teaching mathematics to students with vision disabilities: Design and proof of concept. In S. Brewster and R. Murray-Smith, editors, Proc. Intl. Worksh. Haptic Human Computer Interaction, volume 2058 of Lecture Notes in Computer Science, pages 31–40, Glasgow, UK, 2001. Springer-Verlag.

  43. P. Xiang, D. Camargo, and M. Puckette. Experiments on spatial gestures in binaural sound display. In Proc. Intl. Conf. Auditory Display, pages 1–5, Limerick, Ireland, 2005.


Acknowledgments

The authors wish to thank the Deutsche Forschungsgemeinschaft for supporting this work under the contract DFG 156/1-1. The authors are indebted to S. Argentieri, P. Majdak, S. Merchel, A. Kohlrausch and two anonymous reviewers for helpful comments on an earlier version of the manuscript.

Author information

Corresponding author

Correspondence to M. E. Altinsoy.


Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Stamm, M., Altinsoy, M.E. (2013). Assessment of Binaural–Proprioceptive Interaction in Human-Machine Interfaces. In: Blauert, J. (ed.) The Technology of Binaural Listening. Modern Acoustics and Signal Processing. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-37762-4_17


  • DOI: https://doi.org/10.1007/978-3-642-37762-4_17

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-37761-7

  • Online ISBN: 978-3-642-37762-4

  • eBook Packages: Engineering (R0)
