Structure of Sensory Signals: Icons and Messages

  • Péter Baranyi
  • Adam Csapo
  • Gyula Sallai


This chapter discusses the motivations behind CogInfoCom channels. As a first step towards their formal development, a unified view is provided of the structural elements that have historically been used in interfaces designed for the various human sensory systems. It is demonstrated not only that these structural elements are analogous to each other, and therefore amenable to conceptual unification, but also that their interpretation can be extended to the artificial modalities of any kind of cognitive entity.


Keywords: Dynamic Phenomenon, Cognitive Capability, Emotional Representation, Multimodal Interaction, Auditory Domain



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Péter Baranyi (1, 2)
  • Adam Csapo (2, 1)
  • Gyula Sallai (3, 4)
  1. Széchenyi István University, Győr, Hungary
  2. Institute for Computer Science and Control of the Hungarian Academy of Sciences, Budapest, Hungary
  3. Budapest University of Technology and Economics, Budapest, Hungary
  4. Future Internet Research Coordination Centre, University of Debrecen, Debrecen, Hungary
