Abstract
Recent years have witnessed the rapid growth of the performing arts in Korea and worldwide. Advances in performing-arts technology have enabled a shift from one-way performances to interactive ones, in which the audience's arousal is one of the most important features. An audience is a group of people responding collectively to a stimulus, and arousal is a critical factor in audience satisfaction that can be measured from the audience's behavior. The total group arousal of an audience forms as members exchange emotional effects with their surroundings. Compared with the variety of theoretical approaches to group arousal, however, empirical approaches remain scarce. Previous studies have generally evaluated group arousal as the sum of members' emotions recognized from their faces, gestures, or voices, but such individual-level data are hard to apply to performing arts in real time, and attaching sensors to audience members to collect physiological data is impractical. This paper therefore proposes a method for empirically measuring group arousal from the rapid synchronization of a group's movements. The proposed method first measures, using differential images and histograms, the extent to which each member's movement response is synchronized with the others, and then calculates group arousal from the degree of this synchronization. The method is evaluated in an experiment that sets a threshold for deciding whether a response to a stimulus has occurred; results for 15 groups show an accuracy of 82%.
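The pipeline the abstract describes — a differential-image motion signal per audience member, a synchronization score across members, and a threshold deciding whether the group responded to a stimulus — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper's histogram comparison is simplified here to a summed per-frame motion magnitude, the synchronization measure is assumed to be mean pairwise correlation, and `threshold=0.5` is a placeholder for the experimentally tuned value.

```python
import numpy as np


def motion_signal(frames: np.ndarray) -> np.ndarray:
    """Per-frame motion magnitude from absolute differential images.

    `frames` is a (T, H, W) grayscale stack for one audience member;
    the return value has length T-1.
    """
    diffs = np.abs(np.diff(frames.astype(np.int16), axis=0))
    return diffs.reshape(diffs.shape[0], -1).sum(axis=1)


def group_arousal(member_frame_stacks, threshold=0.5):
    """Score group arousal as the mean pairwise correlation of the
    members' motion signals, then threshold it to decide whether the
    group responded to a stimulus.  Returns (score, responded).
    """
    signals = np.vstack([motion_signal(f) for f in member_frame_stacks])
    corr = np.corrcoef(signals)           # pairwise correlation matrix
    n = corr.shape[0]
    score = (corr.sum() - n) / (n * (n - 1))  # mean off-diagonal entry
    return score, score >= threshold
```

When every member moves on the same schedule, the motion signals correlate strongly and the score approaches 1, flagging a collective response; uncorrelated fidgeting pulls the score toward 0.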
Acknowledgments
This work was supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) under the Culture Technology (CT) Research & Development Program 2011, and by a grant from Kyung Hee University in 2013 (KHU-20130525).
Park, S.B., Ryu, J.M. & Kim, J.K. A group arousal analysis based on the movement synchronization of audiences. Multimed Tools Appl 74, 6431–6442 (2015). https://doi.org/10.1007/s11042-014-2088-x