Cue Control: Interactive Sound Spatialization for 360° Videos
In 360° videos, sound plays a crucial role: it not only contributes to the participant's sense of Presence (the feeling of being in the virtual environment) but can also give viewers a peripheral awareness of their surroundings; audio can therefore guide user attention toward desired points. In this sense, the sonic elements of a 360° video assume an interactive role, as sounds become notifying elements or icons. In this paper, we describe Cue Control, an audio editor that facilitates the creation of soundtracks for 360° videos. The user controls the location of the sonic elements by positioning sounds in the virtual 3D space along the desired timeline; Cue Control automatically builds a cue list of the spatial soundtrack events for playback. The software also supports different interactive playback modalities, adapting the cue list to the user's viewpoint. We conducted a small pilot study in which Cue Control was used to assemble the soundtracks of two 360° videos. Based on the data gathered, we present preliminary reflections on the use of sound to guide users' attention toward points of interest in 360° videos.
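To make the abstract's notion of a "cue list of spatial soundtrack events" concrete, the sketch below shows one plausible representation: each cue pairs a timeline position with a world-space 3D location, and a small helper re-expresses a cue's direction relative to the viewer's current head yaw (the basis of viewpoint-adaptive playback). This is a hypothetical illustration under assumed names (`Cue`, `relative_azimuth`, `due_cues`), not the actual Cue Control implementation.

```python
from dataclasses import dataclass

@dataclass
class Cue:
    """One spatial soundtrack event on the timeline (illustrative)."""
    start_s: float       # playback start time, seconds into the video
    sound: str           # sound file identifier
    azimuth_deg: float   # world-space horizontal angle of the source
    elevation_deg: float # vertical angle of the source
    distance_m: float    # distance from the viewer

def relative_azimuth(cue: Cue, viewer_yaw_deg: float) -> float:
    """Azimuth of the cue relative to the viewer's gaze, in (-180, 180].
    Feeding this to a spatializer makes the cue list viewpoint-adaptive."""
    rel = (cue.azimuth_deg - viewer_yaw_deg) % 360.0
    if rel > 180.0:
        rel -= 360.0
    return rel

def due_cues(cue_list: list[Cue], t_s: float) -> list[Cue]:
    """Cues scheduled at or before time t_s, in timeline order."""
    return sorted((c for c in cue_list if c.start_s <= t_s),
                  key=lambda c: c.start_s)
```

For example, a "door" cue placed at world azimuth 90° lies 30° to the left of a viewer looking toward 120°, so a renderer would pan it accordingly as the viewer turns.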
Keywords: Spatial sound · 360° video · Sonic interaction design
The project has been developed as part of MITIExcell (M1420-01-0145-FEDER-000002). The author Paulo Bala wishes to acknowledge FCT for supporting his research through the Ph.D. Grant PD/BD/128330/2017.