Abstract
Binaural models help to predict human localization under the assumption that the localization process is based on acoustic signals alone, that is, on unimodal information. But what happens when localization takes place in an environment that offers bimodal or even multimodal sensory input? Is the auditory modality still considered in the localization process? Can binaural models help to predict human localization in bimodal or multimodal scenes? This chapter first focuses on binaural-visual localization and demonstrates that binaural models are indeed required for modeling human localization even when visual information is available. The main part of the chapter is dedicated to binaural-proprioceptive localization. First, an experiment is described in which proprioceptive localization performance was measured quantitatively. Second, the influence of binaural signals on proprioception was investigated, to reveal whether synthetically generated spatial sound can improve human proprioceptive localization. The results demonstrate that it is indeed possible to guide proprioception auditorily. In conclusion, binaural models can be used for modeling not only human binaural-visual but also human binaural-proprioceptive localization. Binaural-modeling algorithms thus play an important role in further technical developments.
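The bimodal integration the abstract refers to is often formalized as variance-weighted (maximum-likelihood) cue combination, in which each modality's estimate is weighted by its reliability. The following sketch is illustrative only, not the chapter's own model; the function and parameter names are assumptions.

```python
def fuse_estimates(mu_a, var_a, mu_v, var_v):
    """Variance-weighted (maximum-likelihood) fusion of two unimodal
    position estimates, e.g. auditory (mu_a, var_a) and visual
    (mu_v, var_v). Each cue is weighted by its reliability (inverse
    variance), so the less noisy cue dominates the fused percept."""
    rel_a = 1.0 / var_a          # reliability of the auditory cue
    rel_v = 1.0 / var_v          # reliability of the visual cue
    w_a = rel_a / (rel_a + rel_v)
    mu = w_a * mu_a + (1.0 - w_a) * mu_v   # fused position estimate
    var = 1.0 / (rel_a + rel_v)            # fused variance, never larger than either cue's
    return mu, var

# Example: a noisy auditory estimate at 10 deg (variance 4) combined
# with a sharp visual estimate at 0 deg (variance 1) yields a percept
# near the visual location, i.e. the "ventriloquist" pattern.
mu, var = fuse_estimates(10.0, 4.0, 0.0, 1.0)  # mu = 2.0, var = 0.8
```

Under this scheme, audition can dominate proprioception whenever the auditory estimate is the more reliable one, which is the premise behind auditorily guiding proprioception.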
Notes
- 1. Manufactured by SensAble Technologies.
Acknowledgments
The authors wish to thank the Deutsche Forschungsgemeinschaft for supporting this work under the contract DFG 156/1-1. The authors are indebted to S. Argentieri, P. Majdak, S. Merchel, A. Kohlrausch and two anonymous reviewers for helpful comments on an earlier version of the manuscript.
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this chapter
Stamm, M., Altinsoy, M.E. (2013). Assessment of Binaural–Proprioceptive Interaction in Human-Machine Interfaces. In: Blauert, J. (eds) The Technology of Binaural Listening. Modern Acoustics and Signal Processing. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-37762-4_17
DOI: https://doi.org/10.1007/978-3-642-37762-4_17
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-37761-7
Online ISBN: 978-3-642-37762-4