Attention Monitoring for Music Contents Based on Analysis of Signal-Behavior Structures

  • Masatoshi Ohara
  • Akira Utsumi
  • Hirotake Yamazoe
  • Shinji Abe
  • Noriaki Katayama
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4843)

Abstract

In this paper, we propose a method to estimate user attention to displayed content through temporal analysis of exhibited user behavior. Detecting user attention and controlling content accordingly are key issues in our “networked interaction therapy system,” which is designed to effectively attract the attention of memory-impaired people. In the proposed method, user behavior, including body motions (beat actions), is detected with auditory- and vision-based methods. This design is based on our observations of the behavior of memory-impaired people while they watched video. User attention to the displayed content is then estimated from body motions synchronized to the auditory signal. The estimated attention level can be used for content control that draws viewers more deeply into the display system. Experimental results suggest that the proposed method effectively extracts user attention to music signals.
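The core idea of the abstract, scoring attention by how well detected body motions align with audio beats, can be sketched as follows. This is a minimal illustration in Python; the function name, the tolerance-based matching scheme, and all parameter values are our own assumptions, not the method described in the paper.

```python
def synchrony_score(beat_times, motion_times, tolerance=0.15):
    """Return the fraction of motion events falling within `tolerance`
    seconds of some audio beat. A higher score suggests the viewer's
    movements are synchronized to the music, i.e. stronger attention.
    (Illustrative sketch only; not the paper's actual estimator.)"""
    if not motion_times:
        return 0.0
    matched = sum(
        1 for t in motion_times
        if any(abs(t - b) <= tolerance for b in beat_times)
    )
    return matched / len(motion_times)

# Example: beats every 0.5 s; three of four motion events land near a beat.
beats = [0.5 * i for i in range(1, 9)]   # 0.5, 1.0, ..., 4.0 s
motions = [0.52, 1.48, 2.05, 3.3]        # last event is off-beat
print(synchrony_score(beats, motions))   # 0.75
```

In practice the beat times would come from an audio beat tracker and the motion events from the vision-based body-motion detector, with the score thresholded or smoothed over time to produce an attention level.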

Keywords

User behavior · Video content · User motion · Music signal · User attention



Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Masatoshi Ohara (1, 2)
  • Akira Utsumi (1)
  • Hirotake Yamazoe (1)
  • Shinji Abe (1)
  • Noriaki Katayama (2)
  1. ATR Intelligent Robotics and Communication Laboratories, 2-2-2 Hikaridai, Seikacho, Sorakugun, Kyoto 619-0288, Japan
  2. Osaka Prefectural College of Technology, 26-12 Saiwaicho, Neyagawashi, Osaka 572-8572, Japan
