Cue Control: Interactive Sound Spatialization for 360° Videos

  • Paulo Bala
  • Raul Masu
  • Valentina Nisi
  • Nuno Nunes
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11318)

Abstract

In 360° videos, the role of sound becomes crucial: it not only contributes to the participant's sense of Presence (the feeling of being in the virtual environment) but can also give viewers periodic awareness of their surroundings; audio can therefore guide user attention toward desired points. In this sense, the sonic elements of a 360° video assume an interactive role, as sounds become notifying elements or icons. In this paper, we describe Cue Control, an audio editor that facilitates the creation of soundtracks for 360° videos. The user controls the location of sonic elements by positioning sounds in the virtual 3D space along the desired timeline; Cue Control automatically creates a cue list of the spatial soundtrack events for playback. The software also supports different interactive playback modalities, adapting the cue list to the user's viewpoint. We conducted a small pilot study in which Cue Control was used to assemble the soundtracks of two 360° videos. Based on the data gathered, we present preliminary reflections on the use of sound to guide users' attention toward points of interest in 360° videos.
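The abstract does not detail how the cue list is represented or adapted to the viewer. The sketch below is a minimal, hypothetical illustration only; the class, fields, and file name are assumptions and not the authors' implementation. It shows one plausible way to store timed, spatially positioned sound events and to recompute a cue's azimuth relative to the viewer's head orientation at playback time.

```python
# Hypothetical sketch, not the authors' implementation: a cue list of
# spatialized sound events for a 360° video, plus a helper that adapts a
# cue's position to the viewer's current head orientation at playback time.
from dataclasses import dataclass
from typing import List

@dataclass
class SoundCue:
    start_time: float   # seconds into the 360° video timeline
    sound_file: str     # path to the audio asset (hypothetical example file)
    azimuth: float      # horizontal placement in degrees, 0 = front of the video
    elevation: float    # vertical placement in degrees, 0 = eye level

def azimuth_relative_to_viewer(cue: SoundCue, head_yaw: float) -> float:
    """Azimuth of the cue relative to where the viewer is looking,
    wrapped to [-180, 180) so a spatial renderer can pan it correctly."""
    return (cue.azimuth - head_yaw + 180.0) % 360.0 - 180.0

def due_cues(cue_list: List[SoundCue], t: float, window: float = 0.1) -> List[SoundCue]:
    """Cues whose start time falls inside the current playback window."""
    return [c for c in cue_list if t <= c.start_time < t + window]

# Example: a knock placed behind the viewer's initial orientation.
cues = [SoundCue(start_time=12.5, sound_file="door_knock.wav",
                 azimuth=150.0, elevation=0.0)]
for cue in due_cues(cues, t=12.45):
    print(azimuth_relative_to_viewer(cue, head_yaw=30.0))  # -> 120.0
```

In a sketch like this, the relative azimuth is recomputed continuously so that each sound stays anchored to a fixed point in the 360° scene rather than to the viewer, which is what allows off-screen sounds to draw attention toward points of interest.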

Keywords

Spatial sound · 360° video · Sonic interaction design

Acknowledgments

This project was developed as part of MITIExcell (M1420-01-0145-FEDER-000002). The author Paulo Bala wishes to acknowledge FCT for supporting his research through Ph.D. Grant PD/BD/128330/2017.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Paulo Bala (1, 2)
  • Raul Masu (1, 2)
  • Valentina Nisi (1, 3)
  • Nuno Nunes (1, 4)
  1. Madeira-ITI, Funchal, Portugal
  2. FCT/U. Nova de Lisboa, Lisbon, Portugal
  3. UMA, Funchal, Portugal
  4. IST/U. of Lisbon, Lisbon, Portugal
