
An Experience Oriented Video Digesting Method Using Heart Activity and Its Applicable Video Types

  • Satoshi Toyosawa
  • Takashi Kawai
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6297)

Abstract

An experience-oriented, physiology-based video digesting method is proposed, and the video types it is applicable to are examined experimentally. The proposed method extracts the shots to which audiences were most attentive by analysing two attention measures derived from heart activity recorded while watching. To assess the applicable video types, three original videos conveying distinct emotional qualities were prepared, and three test digests (shots selected randomly, subjectively, and by the proposed method) were generated from each original. The proposed method was then evaluated not only by its precision against the subjective selection, but also by the digest viewing experience, assessed through subjective scores and a psychophysiological measure. The experiment showed that the proposed method is promising for videos with arousing, event-driven content. It also suggested that using multiple evaluation measures is important for demonstrating the applicability of a digesting method.
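To make the selection step concrete, the sketch below outlines one way such a heart-activity-based digest could be assembled. It is an illustration only, not the authors' implementation: the choice of heart-rate deceleration and a simple HRV index as the two attention measures, the z-score weighting, and all names (select_attentive_shots, precision_against_subjective, and so on) are assumptions made for this example, and the paper's actual measures and parameters may differ. A precision check against a subjectively selected shot set, as used in the evaluation, is included for completeness.

    # Illustrative sketch only (not the authors' implementation). Heart-rate
    # deceleration and a crude HRV index stand in for the paper's two attention
    # measures; weights, signs, and baselines here are assumptions.
    import numpy as np

    def heart_rate_bpm(ibi_ms: np.ndarray) -> float:
        """Mean heart rate (beats/min) from inter-beat intervals in milliseconds."""
        return 60_000.0 / ibi_ms.mean()

    def hrv_index(ibi_ms: np.ndarray) -> float:
        """Crude HRV index: IBI standard deviation normalised by the mean IBI."""
        return ibi_ms.std(ddof=1) / ibi_ms.mean()

    def zscore(x: np.ndarray) -> np.ndarray:
        return (x - x.mean()) / (x.std(ddof=1) + 1e-12)

    def select_attentive_shots(shot_ibis, baseline_ibi_ms, n_select=5, weights=(0.5, 0.5)):
        """Rank shots by a combined score from two heart-derived measures and
        return the indices of the top n_select shots (hypothetical weighting)."""
        base_hr = heart_rate_bpm(baseline_ibi_ms)
        base_hrv = hrv_index(baseline_ibi_ms)
        # Measure 1: heart-rate deceleration relative to a resting baseline.
        decel = np.array([base_hr - heart_rate_bpm(ibi) for ibi in shot_ibis])
        # Measure 2: change in the HRV index relative to the same baseline
        # (the sign and weight given to this term are assumptions).
        hrv_change = np.array([hrv_index(ibi) - base_hrv for ibi in shot_ibis])
        score = weights[0] * zscore(decel) + weights[1] * zscore(hrv_change)
        return np.argsort(score)[::-1][:n_select]

    def precision_against_subjective(selected, subjective):
        """Fraction of automatically selected shots that also appear in the
        subjectively selected shot set."""
        selected, subjective = set(selected), set(subjective)
        return len(selected & subjective) / len(selected) if selected else 0.0

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        baseline = rng.normal(850.0, 40.0, 120)  # synthetic resting IBIs (ms)
        shots = [rng.normal(850.0 - 15.0 * (i % 3), 40.0, 60) for i in range(12)]
        digest = select_attentive_shots(shots, baseline, n_select=4)
        print("digest shots:", digest)
        print("precision:", precision_against_subjective(digest, {2, 5, 8, 11}))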

Keywords

Video digestion · viewing experience · heart rate · heart rate variability (HRV) · evaluation



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Satoshi Toyosawa (1)
  • Takashi Kawai (1)
  1. Global Information and Telecommunication Institute, Waseda University, Saitama, Japan
