
A group arousal analysis based on the movement synchronization of audiences

Published in Multimedia Tools and Applications

Abstract

Recent years have witnessed the rapid growth of the performing arts in Korea and worldwide. Advances in performing arts technologies have enabled a shift from one-way performances to interactive ones, in which the audience's arousal is one of the most important features. An audience is a group of people responding collectively to a stimulus; its arousal is a critical factor influencing audience satisfaction and can be measured from the audience's behavior. The total group arousal of an audience forms as members exchange emotional effects with their surroundings. Although various theoretical approaches to group arousal exist, empirical approaches remain insufficient. Previous studies have generally evaluated group arousal as the sum of members' emotions recognized from their faces, gestures, or voices. However, collecting such real-time data from individuals is impractical during live performances, and it is difficult to instrument audience members with sensors. This paper therefore proposes a method for empirically measuring group arousal based on the rapid movement synchronization of a group. The proposed method first measures each member's movement response using differential images and histograms, and then calculates group arousal from the degree to which these responses are synchronized. The method is evaluated in an experiment in which a threshold decides whether a response to a stimulus has occurred. The experimental results for 15 groups show that the proposed method achieves an accuracy of 82%.
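
The pipeline described in the abstract can be pictured with a minimal sketch: per-member motion is estimated from differential (inter-frame difference) images, and group arousal is taken as the degree to which the members' motion series are synchronized. Everything below is an illustrative assumption rather than the authors' exact formulation: the function names, the pixel-count reading of the difference histogram, Pearson correlation as the synchronization measure, and the 0.5 decision threshold are all placeholders.

    import numpy as np

    def motion_series(frames, diff_threshold=25):
        # Differential images: absolute difference between consecutive frames.
        # frames: (T, H, W) grayscale array; returns a length T-1 series
        # counting the pixels whose change exceeds diff_threshold
        # (a crude per-frame reading of the difference-image histogram).
        diffs = np.abs(np.diff(frames.astype(np.int16), axis=0))
        return (diffs > diff_threshold).sum(axis=(1, 2)).astype(float)

    def group_arousal(member_series):
        # Synchronization degree: mean pairwise Pearson correlation of the
        # members' motion series (an assumed stand-in for the paper's measure).
        n = len(member_series)
        corrs = [np.corrcoef(member_series[i], member_series[j])[0, 1]
                 for i in range(n) for j in range(i + 1, n)]
        return float(np.mean(corrs))

    # Synthetic demo: four members who all move during frames 60-70,
    # as if responding to the same stimulus.
    rng = np.random.default_rng(0)
    T, H, W = 120, 48, 48
    members = []
    for _ in range(4):
        frames = rng.integers(0, 30, size=(T, H, W))
        frames[60:70] += rng.integers(0, 120, size=(10, H, W))  # shared response
        members.append(motion_series(np.clip(frames, 0, 255)))

    arousal = group_arousal(members)
    print(f"group arousal = {arousal:.2f}, response: {arousal > 0.5}")

On real footage, each member's frames would be a cropped region of the audience video, and the threshold decision would be made per stimulus segment rather than over the whole clip.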


Acknowledgments

This work was supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) under the Culture Technology (CT) Research & Development Program 2011, and by a grant from Kyung Hee University in 2013 (KHU-20130525).

Author information


Corresponding authors

Correspondence to Seung-Bo Park or Jae Kyeong Kim.


About this article


Cite this article

Park, S.-B., Ryu, J.M. & Kim, J.K. A group arousal analysis based on the movement synchronization of audiences. Multimed Tools Appl 74, 6431–6442 (2015). https://doi.org/10.1007/s11042-014-2088-x

