Abstract
In this paper we introduce a novel interface that combines spatial and continuous tangible interaction for creating and manipulating audio-visual effects. Our goal is to provide a ready-to-use, “hands-on” interface that requires no lengthy explanation yet offers rich possibilities for expression. To this end, our interface exploits the three-dimensional topology of physical sand distributed over a tabletop surface. We found that users of the system were engaged by the natural interaction and playful character of the installation, as it resembles play in a sandbox. We demonstrate an artistic setup that produces ambient soundscapes using a Lattice Boltzmann-based particle simulation flowing through a deformable landscape. Visual feedback is front-projected onto the sand as well as onto the user’s hand. Users can explore and reshape the landscape with their hands and use spatial gestures via on-body projection to control AR content and further settings. The focus of this work lies on the simultaneous interaction with sand and the user’s own body, and its contribution to audio-visual installations.
Our prototype system was tested with potential users in a small informal study and was overall well received. Users had fun exploring the different interaction techniques to control the particle simulation and soundscape, and were intrigued by the possibilities of on-body interaction. In the future we plan to evaluate our system in a formal study and compare its interaction and user experience to similar interfaces. The system was successfully deployed as an indoor room installation with its components reduced to a minimum; further deployments are planned.
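The abstract mentions a Lattice Boltzmann-based particle simulation running through the deformable sand landscape. As a rough illustration of that class of simulation (not the paper’s actual implementation), the following is a minimal sketch of a D2Q9 single-relaxation-time (BGK) lattice Boltzmann update with bounce-back at obstacle cells standing in for the sand heightfield. Grid size, relaxation time, initial flow speed, and the obstacle mask are all illustrative assumptions.

```python
import numpy as np

NX, NY = 64, 32           # lattice dimensions (assumed)
TAU = 0.6                 # BGK relaxation time (assumed)

# D2Q9 lattice velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4 / 9] + [1 / 9] * 4 + [1 / 36] * 4)
# index of the opposite direction for each lattice velocity
opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]

def equilibrium(rho, ux, uy):
    """Standard second-order BGK equilibrium for all 9 directions."""
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux ** 2 + uy ** 2)
    return rho * w[:, None, None] * (1.0 + cu + 0.5 * cu ** 2 - usq)

def lbm_step(f, obstacle):
    """One collide-and-stream update with bounce-back at obstacle cells."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / TAU      # BGK collision
    # streaming: shift each distribution along its lattice velocity
    for i, (cx, cy) in enumerate(c):
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    # bounce-back: reverse populations inside the "sand" obstacle
    f[:, obstacle] = f[opp][:, obstacle]
    return f

# initial state: uniform density with a small rightward flow
f = equilibrium(np.ones((NX, NY)),
                0.05 * np.ones((NX, NY)),
                np.zeros((NX, NY)))
sand = np.zeros((NX, NY), dtype=bool)
sand[20:30, :10] = True                               # a sand "dune" blocks the flow
for _ in range(100):
    f = lbm_step(f, sand)
```

In an installation like the one described, the resulting velocity field could advect visual particles around the deformed sand, while depth-camera updates of the heightfield would regenerate the obstacle mask; both couplings are sketched here only in outline.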
Acknowledgments
This work was partially supported by the Creative Europe EU Program (project The People’s Smart Sculpture).
Copyright information
© 2018 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
Cite this paper
Dewitz, B., Wiche, R., Geiger, C., Steinicke, F., Feitsch, J. (2018). AR Sound Sandbox: A Playful Interface for Musical and Artistic Expression. In: Chisik, Y., Holopainen, J., Khaled, R., Luis Silva, J., Alexandra Silva, P. (eds) Intelligent Technologies for Interactive Entertainment. INTETAIN 2017. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 215. Springer, Cham. https://doi.org/10.1007/978-3-319-73062-2_5