Mimicking adaptation processes in the human brain with neural network retraining

  • Lori Malatesta
  • Amaryllis Raouzaiou
  • George Caridakis
  • Kostas Karpouzis
Part of the IFIP – The International Federation for Information Processing book series (IFIPAICT, volume 247)

Abstract

Human brain processes undergo cycles of adaptation in order to meet the requirements of novel conditions. In affective state recognition, these processes adapt to new subjects as well as to environmental changes. Using adaptive neural network architectures, and by collecting and analysing data from specific environments, we present an effective approach to mimicking these processes: we model both how the need for adaptation is detected and how the adaptation itself is carried out. Video sequences of subjects displaying emotions serve as data for our classifier. Facial expressions and body gestures are used as system input, and the quality of the system output is monitored in order to identify when retraining is required. The resulting architecture can be used as an automatic analyser of human affective feedback in human-computer interaction applications.
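The sketch below illustrates the kind of adaptation loop the abstract outlines, assuming a softmax classifier over pre-extracted facial-expression and gesture features. It is a minimal illustration, not the authors' implementation: the class and function names, the confidence threshold, the sliding window, and the gradient-based retraining step are all assumptions made for the example. The paper's retraining decision is driven by monitoring the quality of the system output; here that is approximated by the classifier's own top-class confidence averaged over a sliding window.

```python
# A minimal sketch (not the authors' implementation) of the adaptation cycle:
# monitor classifier confidence on incoming samples and, when it drops below a
# threshold, retrain on data buffered from the new subject/environment.
# All names and thresholds here are illustrative assumptions.
import numpy as np


class AdaptiveClassifier:
    """Toy softmax classifier over feature vectors (standing in for facial
    expression and gesture features) that can be retrained on buffered data."""

    def __init__(self, n_features: int, n_classes: int, lr: float = 0.1):
        self.w = np.zeros((n_features, n_classes))
        self.lr = lr
        self.n_classes = n_classes

    def _softmax(self, scores: np.ndarray) -> np.ndarray:
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def predict_proba(self, x: np.ndarray) -> np.ndarray:
        return self._softmax(x @ self.w)

    def retrain(self, xs: np.ndarray, ys: np.ndarray, epochs: int = 50) -> None:
        # Plain gradient descent on cross-entropy; this stands in for the
        # weight-adaptation scheme of a retrainable network.
        onehot = np.eye(self.n_classes)[ys]
        for _ in range(epochs):
            probs = self.predict_proba(xs)
            self.w -= self.lr * xs.T @ (probs - onehot) / len(xs)


def monitor_and_adapt(clf, stream, confidence_threshold=0.6, window=20):
    """Detect the need for adaptation: if mean top-class confidence over a
    sliding window falls below the threshold, retrain on the (feature, label)
    pairs buffered from the new conditions."""
    buffer_x, buffer_y, confidences = [], [], []
    for features, label in stream:   # label: supervision, e.g. annotator feedback
        conf = clf.predict_proba(features[None, :]).max()
        confidences.append(conf)
        buffer_x.append(features)
        buffer_y.append(label)
        if len(confidences) >= window and np.mean(confidences[-window:]) < confidence_threshold:
            clf.retrain(np.array(buffer_x), np.array(buffer_y))
            buffer_x, buffer_y, confidences = [], [], []  # start a new adaptation cycle
```

In this toy version the retraining data are labelled by external feedback; in a deployed human-computer interaction setting such labels would come from user or annotator feedback gathered in the new environment.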

Keywords

Facial Expression · Emotion Recognition · Emotional Intelligence · Network Weight · Facial Expression Recognition

Copyright information

© International Federation for Information Processing 2007

Authors and Affiliations

  • Lori Malatesta (1)
  • Amaryllis Raouzaiou (1)
  • George Caridakis (1)
  • Kostas Karpouzis (1)
  1. Image, Video and Multimedia Systems Laboratory, National Technical University of Athens, Zografou, Greece
