
STIMONT: a core ontology for multimedia stimuli description

Published in: Multimedia Tools and Applications

Abstract

Affective multimedia documents such as images, sounds, or videos elicit emotional responses in exposed human subjects. These stimuli are stored in affective multimedia databases and are successfully used in a wide range of psychology and neuroscience research on attention and emotion processing. Although important, all affective multimedia databases have numerous deficiencies that impair their applicability. These problems, which are brought forward in the paper, result in low recall and precision of multimedia stimuli retrieval, which makes creating emotion elicitation procedures difficult and labor-intensive. To address these issues, a new core ontology, STIMONT, is introduced. STIMONT is written in the OWL-DL formalism and extends the W3C EmotionML format with an expressive and formal representation of affective concepts, high-level semantics, stimuli document metadata, and the elicited physiology. The advantages of an ontology-based description of affective multimedia stimuli are demonstrated in a document retrieval experiment and compared against contemporary keyword-based querying methods. Also, a software tool, the Intelligent Stimulus Generator, for retrieval of affective multimedia and construction of stimuli sequences is presented.
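The contrast the abstract draws between keyword-based querying and retrieval over formal affective annotations can be illustrated with a minimal sketch. This is not the STIMONT ontology or its API; the catalogue, identifiers, and valence/arousal values below are invented for illustration, following the IAPS-style convention of 1–9 dimensional ratings where low valence means unpleasant and high arousal means activating.

```python
# Illustrative sketch (hypothetical data, not the STIMONT API): why
# keyword-only retrieval of affective stimuli can have low recall, and
# how querying over formal dimensional annotations (valence/arousal)
# avoids the problem.

# A toy stimuli catalogue; all values are invented for illustration.
STIMULI = [
    {"id": "img_001", "keywords": {"snake", "grass"},  "valence": 2.1, "arousal": 6.8},
    {"id": "img_002", "keywords": {"spider", "web"},   "valence": 2.5, "arousal": 6.4},
    {"id": "img_003", "keywords": {"beach", "sunset"}, "valence": 7.9, "arousal": 3.2},
    {"id": "img_004", "keywords": {"attack", "dog"},   "valence": 2.0, "arousal": 7.1},
]

def keyword_search(query_terms):
    """Free-text tag matching: misses relevant stimuli tagged with other words."""
    return [s["id"] for s in STIMULI if s["keywords"] & query_terms]

def dimensional_search(max_valence, min_arousal):
    """Retrieval over affective annotations: finds every unpleasant,
    highly arousing stimulus regardless of how it was tagged."""
    return [s["id"] for s in STIMULI
            if s["valence"] <= max_valence and s["arousal"] >= min_arousal]

# A keyword query for threatening content retrieves only the one
# stimulus that happens to carry the matching tag ...
print(keyword_search({"snake"}))     # ['img_001']
# ... while a query over the affective dimensions retrieves all three
# unpleasant high-arousal stimuli, however they were tagged.
print(dimensional_search(3.0, 6.0))  # ['img_001', 'img_002', 'img_004']
```

The same idea carries over to the ontology setting: once emotional dimensions and high-level semantics are formal properties of a stimulus rather than free-text tags, a query can be expressed over those properties directly.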






Author information


Corresponding author

Correspondence to Marko Horvat.


About this article

Cite this article

Horvat, M., Bogunović, N. & Ćosić, K. STIMONT: a core ontology for multimedia stimuli description. Multimed Tools Appl 73, 1103–1127 (2014). https://doi.org/10.1007/s11042-013-1624-4
