Multimedia Tools and Applications, Volume 73, Issue 3, pp 1103–1127

STIMONT: a core ontology for multimedia stimuli description

  • Marko Horvat
  • Nikola Bogunović
  • Krešimir Ćosić


Affective multimedia documents such as images, sounds and videos elicit emotional responses in exposed human subjects. These stimuli are stored in affective multimedia databases and used successfully in a wide variety of psychology and neuroscience research related to attention and emotion processing. Despite their importance, all affective multimedia databases have numerous deficiencies that impair their applicability. These problems, which are brought forward in the paper, result in low recall and precision of multimedia stimuli retrieval, which makes creating emotion elicitation procedures difficult and labor-intensive. To address these issues a new core ontology, STIMONT, is introduced. STIMONT is written in the OWL-DL formalism and extends the W3C EmotionML format with an expressive and formal representation of affective concepts, high-level semantics, stimuli document metadata and the elicited physiology. The advantages of ontology-based description of affective multimedia stimuli are demonstrated in a document retrieval experiment and compared against contemporary keyword-based querying methods. Finally, a software tool, the Intelligent Stimulus Generator, for retrieval of affective multimedia and construction of stimuli sequences is presented.
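The retrieval gain described in the abstract can be illustrated with a minimal sketch: a literal keyword query misses documents tagged with more specific terms, while a concept-expanded query that follows a taxonomy of hypernyms recalls them, and an affective filter (here a SAM-style valence ceiling) narrows the result to unpleasant stimuli. The taxonomy, document IDs, tags and ratings below are hypothetical toy data, not STIMONT's actual classes or any database's actual annotations.

```python
from dataclasses import dataclass

# Hypothetical mini-taxonomy standing in for an ontology's concept
# hierarchy (child tag -> parent concept); names are illustrative only.
HYPERNYMS = {
    "snake": "reptile",
    "reptile": "animal",
    "spider": "arachnid",
    "arachnid": "animal",
    "dog": "animal",
}

@dataclass
class Stimulus:
    doc_id: str
    tags: list      # free-text keywords attached to the document
    valence: float  # 1 (unpleasant) .. 9 (pleasant), SAM-style scale
    arousal: float  # 1 (calm) .. 9 (excited)

def subsumed_by(tag, concept):
    """True if `concept` equals `tag` or appears in its hypernym chain."""
    while tag is not None:
        if tag == concept:
            return True
        tag = HYPERNYMS.get(tag)
    return False

def keyword_search(docs, term):
    """Plain keyword retrieval: literal tag match only."""
    return [d.doc_id for d in docs if term in d.tags]

def ontology_search(docs, concept, max_valence=None):
    """Concept-expanded retrieval with an optional affective filter."""
    hits = []
    for d in docs:
        if any(subsumed_by(t, concept) for t in d.tags):
            if max_valence is None or d.valence <= max_valence:
                hits.append(d.doc_id)
    return hits

docs = [
    Stimulus("img-001", ["snake"], valence=2.8, arousal=6.9),
    Stimulus("img-002", ["spider"], valence=3.1, arousal=6.4),
    Stimulus("img-003", ["dog"], valence=7.6, arousal=4.2),
]

print(keyword_search(docs, "animal"))                    # no literal matches
print(ontology_search(docs, "animal"))                   # all three documents
print(ontology_search(docs, "animal", max_valence=4.0))  # unpleasant ones only
```

The keyword query for "animal" returns nothing because no document carries that literal tag, whereas the concept-expanded query finds all three; combining subsumption with the valence ceiling retrieves only the two unpleasant stimuli, which is the kind of query an emotion elicitation procedure needs.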


Keywords: Ontology · Multimedia · OWL · Emotion · Stimulus



Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Marko Horvat 1
  • Nikola Bogunović 1
  • Krešimir Ćosić 1

  1. Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia
