Abstract
In 360\(^\circ \) videos, sound plays a crucial role: it not only contributes to the participant’s sense of Presence (the feeling of being in the virtual environment) but can also provide viewers with a peripheral awareness of their surroundings; audio can therefore guide user attention toward desired points. In this sense, the sonic elements of a 360\(^\circ \) video assume an interactive role, as sounds become notifying elements or icons. In this paper, we describe Cue Control, an audio editor that facilitates the creation of soundtracks for 360\(^\circ \) videos. The user controls the location of sonic elements by positioning sounds in the virtual 3D space along the desired timeline; Cue Control automatically creates a cue list of the spatial soundtrack events for playback. The software also supports different interactive playback modalities, adapting the cue list to the user’s viewpoint. We conducted a small pilot study in which Cue Control was used to assemble the soundtracks of two 360\(^\circ \) videos. Based on the data gathered, we present preliminary reflections on the use of sound to guide users’ attention toward points of interest in 360\(^\circ \) videos.
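The abstract does not specify Cue Control’s cue-list format. The following Python sketch illustrates one plausible schema for such a list, under assumed conventions: each cue pairs a sound asset with a start time on the video timeline and a direction (azimuth/elevation) in the virtual scene, plus a flag distinguishing world-anchored sounds from viewpoint-relative ones. All names (`SoundCue`, `CueList`, `due`) are hypothetical, not the authors’ API.

```python
from dataclasses import dataclass, field

@dataclass
class SoundCue:
    """One spatialized sound event in the cue list (hypothetical schema)."""
    sound_file: str              # audio asset to play
    start_time: float            # seconds into the 360° video timeline
    azimuth: float               # horizontal angle in degrees (0 = initial front)
    elevation: float             # vertical angle in degrees
    fixed_to_world: bool = True  # True: anchored in the scene; False: follows the viewpoint

@dataclass
class CueList:
    cues: list = field(default_factory=list)

    def add(self, cue: SoundCue) -> None:
        """Insert a cue and keep the list ordered by start time."""
        self.cues.append(cue)
        self.cues.sort(key=lambda c: c.start_time)

    def due(self, t: float, window: float = 0.05) -> list:
        """Return cues whose start time falls within [t, t + window)."""
        return [c for c in self.cues if t <= c.start_time < t + window]

# Example: a door creak to the viewer's left at 3.5 s, a bird call
# behind-right at 12 s, intended to draw attention toward that point.
cl = CueList()
cl.add(SoundCue("bird.wav", start_time=12.0, azimuth=135.0, elevation=10.0))
cl.add(SoundCue("door.wav", start_time=3.5, azimuth=-90.0, elevation=0.0))
```

A playback engine would poll `due()` each frame and, for viewpoint-adaptive modes, offset each cue’s azimuth by the viewer’s current head yaw before spatial rendering.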
Acknowledgments
The project has been developed as part of MITIExcell (M1420-01-0145-FEDER-000002). The author Paulo Bala wishes to acknowledge FCT for supporting his research through the Ph.D. Grant PD/BD/128330/2017.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Bala, P., Masu, R., Nisi, V., Nunes, N. (2018). Cue Control: Interactive Sound Spatialization for 360\(^\circ \) Videos. In: Rouse, R., Koenitz, H., Haahr, M. (eds) Interactive Storytelling. ICIDS 2018. Lecture Notes in Computer Science(), vol 11318. Springer, Cham. https://doi.org/10.1007/978-3-030-04028-4_36
DOI: https://doi.org/10.1007/978-3-030-04028-4_36
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04027-7
Online ISBN: 978-3-030-04028-4
eBook Packages: Computer Science, Computer Science (R0)