A Survey on Databases of Facial Macro-expression and Micro-expression

  • Conference paper
Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2018)

Abstract

A crucial step in developing and testing a facial expression analysis system is choosing the database that best suits the targeted application context. In this paper, we propose a survey based on the review of 69 databases, taking into account both macro- and micro-expressions. To the best of our knowledge, no other survey covers as many databases. We review the existing facial expression databases according to 18 characteristics grouped into 6 categories (population, modalities, data acquisition hardware, experimental conditions, experimental protocol and annotations). These characteristics are meant to help researchers choose a database suited to their application context. We bring to light the trends among posed, spontaneous and in-the-wild databases, as well as micro-expression databases. We finish with future directions, including crowdsourcing and databases with groups of people.
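The selection task the abstract describes can be pictured as filtering a catalog of databases by the surveyed characteristics. The sketch below is purely illustrative: the attribute names are a simplified subset of the paper's 18 characteristics, and the catalog values are shorthand, not authoritative descriptions of the actual databases.

```python
from dataclasses import dataclass

@dataclass
class FacialExpressionDB:
    # A simplified subset of the surveyed characteristics (illustrative only)
    name: str
    elicitation: str      # experimental conditions: "posed" | "spontaneous" | "in-the-wild"
    expression_type: str  # "macro" | "micro"
    modalities: tuple     # e.g. ("2D",), ("2D", "audio"), ("3D",)
    annotations: tuple    # e.g. ("emotion",), ("AU", "emotion")

# Hypothetical, simplified entries for three databases named in the references
CATALOG = [
    FacialExpressionDB("CK+", "posed", "macro", ("2D",), ("AU", "emotion")),
    FacialExpressionDB("CASME II", "spontaneous", "micro", ("2D",), ("AU", "emotion")),
    FacialExpressionDB("AM-FED", "in-the-wild", "macro", ("2D",), ("AU",)),
]

def select(catalog, **criteria):
    """Return names of databases matching every given characteristic.

    Tuple-valued characteristics match if they contain the wanted value.
    """
    def matches(db):
        for key, wanted in criteria.items():
            value = getattr(db, key)
            if isinstance(value, tuple):
                if wanted not in value:
                    return False
            elif value != wanted:
                return False
        return True
    return [db.name for db in catalog if matches(db)]

print(select(CATALOG, expression_type="micro"))  # ['CASME II']
print(select(CATALOG, annotations="AU"))         # ['CK+', 'CASME II', 'AM-FED']
```

A researcher would, in effect, run such a query against the survey's comparison tables: fix the characteristics imposed by the application context and see which databases remain.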

Supported by China Scholarship Council and ANR French Reflet (AAP Generique 2017).

References

  1. http://mplab.ucsd.edu/grants/project1/research/rufacs1-dataset.html (2006)

  2. http://www.cse.oulu.fi/CMV/Downloads/Oulu-CASIA (2009)

  3. http://pics.stir.ac.uk (2013)

  4. Abrilian, S., Devillers, L., Buisine, S., Martin, J.C.: EmoTV1: annotation of real-life emotions for the specification of multimodal affective interfaces. In: HCI International (2005)

  5. Aifanti, N., Papachristou, C., Delopoulos, A.: The mug facial expression database. In: 2010 11th International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS), pp. 1–4. IEEE (2010)

  6. Anitha, C., Venkatesha, M., Adiga, B.S.: A survey on facial expression databases. Int. J. Eng. Sci. Technol. 2(10), 5158–5174 (2010)

  7. Bänziger, T., Pirker, H., Scherer, K.: GEMEP-GEneva multimodal emotion portrayals: a corpus for the study of multimodal emotional expressions. In: Proceedings of LREC, vol. 6, pp. 15–019 (2006)

  8. Black, M.J., Yacoob, Y.: Recognizing facial expressions in image sequences using local parameterized models of image motion. Int. J. Comput. Vision 25(1), 23–48 (1997)

  9. Busso, C., et al.: IEMOCAP: interactive emotional dyadic motion capture database. Lang. Resour. Eval. 42(4), 335–359 (2008)

  10. Cohn, J.F., Schmidt, K.L.: The timing of facial motion in posed and spontaneous smiles. Int. J. Wavelets Multiresolut. Inf. Process. 2(02), 121–132 (2004)

  11. Cosker, D., Krumhuber, E., Hilton, A.: A FACS valid 3D dynamic action unit database with applications to 3D dynamic Morphable facial modeling. In: 2011 IEEE International Conference on Computer Vision (ICCV), pp. 2296–2303. IEEE (2011)

  12. Cowie, R., Douglas-Cowie, E., Cox, C.: Beyond emotion archetypes: databases for emotion modelling using neural networks. Neural Netw. 18(4), 371–388 (2005)

  13. Davison, A.K., Lansley, C., Costen, N., Tan, K., Yap, M.H.: SAMM: a spontaneous micro-facial movement dataset. IEEE Trans. Affect. Comput. 9(1), 116–129 (2018)

  14. Davison, A.K., Merghani, W., Yap, M.H.: Objective classes for micro-facial expression recognition. arXiv preprint arXiv:1708.07549 (2017)

  15. Dhall, A., Goecke, R., Lucey, S., Gedeon, T.: Collecting large, richly annotated facial-expression databases from movies (2012)

  16. Dhall, A., Goecke, R., Gedeon, T.: Automatic group happiness intensity analysis. IEEE Trans. Affect. Comput. 6(1), 13–26 (2015)

  17. Dhall, A., Goecke, R., Lucey, S., Gedeon, T.: Static facial expression analysis in tough conditions: data, evaluation protocol and benchmark. In: 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), pp. 2106–2112. IEEE (2011)

  18. Douglas-Cowie, E., Cowie, R., Cox, C., Amier, N., Heylen, D.: The sensitive artificial listener: an induction technique for generating emotionally coloured conversation (2008)

  19. Douglas-Cowie, E., Cowie, R., Schröder, M.: A new emotion database: considerations, sources and scope. In: ISCA Tutorial and Research Workshop (ITRW) on Speech and Emotion (2000)

  20. Ebner, N.C., Riediger, M., Lindenberger, U.: FACES–a database of facial expressions in young, middle-aged, and older women and men: development and validation. Behav. Res. Methods 42(1), 351–362 (2010)

  21. Ekman, P.: Lie catching and microexpressions. In: The Philosophy of Deception, pp. 118–133 (2009)

  22. Ekman, P., Friesen, W.V.: Nonverbal leakage and clues to deception. Psychiatry 32(1), 88–106 (1969)

  23. Ekman, P., Friesen, W.V.: Constants across cultures in the face and emotion. J. Pers. Soc. Psychol. 17(2), 124 (1971)

  24. Ekman, P., Friesen, W.V.: Facial action coding system (1977)

  25. Fanelli, G., Gall, J., Romsdorfer, H., Weise, T., Van Gool, L.: A 3-D audio-visual corpus of affective communication. IEEE Trans. Multimedia 12(6), 591–598 (2010)

  26. Fu, S., Yang, G., Kuai, X., Zheng, R.: A parametric survey for facial expression database. In: Zhang, H., Hussain, A., Liu, D., Wang, Z. (eds.) BICS 2012. LNCS (LNAI), vol. 7366, pp. 373–381. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-31561-9_42

  27. Girard, J.M., Chu, W.S., Jeni, L.A., Cohn, J.F., De la Torre, F.: Sayette group formation task (GFT) spontaneous facial expression database (2017)

  28. Grimm, M., Kroschel, K., Narayanan, S.: The Vera am Mittag German audio-visual emotional speech database. In: 2008 IEEE International Conference on Multimedia and Expo, pp. 865–868. IEEE (2008)

  29. Gross, R., Matthews, I., Cohn, J., Kanade, T., Baker, S.: Multi-PIE. Image Vis. Comput. 28(5), 807–813 (2010)

  30. Gunes, H., Piccardi, M.: A bimodal face and body gesture database for automatic analysis of human nonverbal affective behavior. In: 18th International Conference on Pattern Recognition, ICPR 2006, vol. 1, pp. 1148–1153. IEEE (2006)

  31. Husák, P., C̆ech, J., Matas, J.: Spotting facial micro-expressions “in the wild”. In: Proceedings of the 22nd Computer Vision Winter Workshop, Pattern Recognition and Image Processing Group (PRIP) and PRIP Club (2017). http://cvww2017.prip.tuwien.ac.at/papers/CVWW2017_paper_17.pdf

  32. Kanade, T., Cohn, J.F., Tian, Y.: Comprehensive database for facial expression analysis. In: Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Proceedings, pp. 46–53. IEEE (2000)

  33. Kaulard, K., Cunningham, D.W., Bülthoff, H.H., Wallraven, C.: The MPI facial expression database – a validated database of emotional and conversational facial expressions. PLoS ONE 7(3), e32321 (2012)

  34. Kim, E., Vangala, S.: Vinereactor: crowdsourced spontaneous facial expression data. In: International Conference on Multimedia Retrieval (ICMR). IEEE (2016)

  35. Koelstra, S., et al.: DEAP: a database for emotion analysis; using physiological signals. IEEE Trans. Affect. Comput. 3(1), 18–31 (2012)

  36. Kossaifi, J., Tzimiropoulos, G., Todorovic, S., Pantic, M.: AFEW-VA database for valence and arousal estimation in-the-wild. Image Vis. Comput. 65, 23–36 (2017)

  37. Krumhuber, E.G., Skora, L., Küster, D., Fou, L.: A review of dynamic datasets for facial expression research. Emot. Rev. 9(3), 280–292 (2017)

  38. Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D.H., Hawk, S.T., van Knippenberg, A.: Presentation and validation of the Radboud Faces Database. Cogn. Emot. 24(8), 1377–1388 (2010)

  39. Li, X., et al.: Reading hidden emotions: spontaneous micro-expression spotting and recognition. arXiv preprint arXiv:1511.00423 (2015)

  40. Li, X., Pfister, T., Huang, X., Zhao, G., Pietikäinen, M.: A spontaneous micro-expression database: inducement, collection and baseline. In: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), pp. 1–6. IEEE (2013)

  41. Li, Y., Tao, J., Chao, L., Bao, W., Liu, Y.: CHEAVD: a Chinese natural emotional audio-visual database. J. Ambient Intell. Humaniz. Comput. 8, 1–12 (2016)

  42. Lucey, P., Cohn, J.F., Kanade, T., Saragih, J., Ambadar, Z., Matthews, I.: The extended Cohn-Kanade dataset (CK+): a complete dataset for action unit and emotion-specified expression. In: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 94–101. IEEE (2010)

  43. Lucey, P., Cohn, J.F., Prkachin, K.M., Solomon, P.E., Matthews, I.: Painful data: the UNBC-McMaster shoulder pain expression archive database. In: 2011 IEEE International Conference on Automatic Face & Gesture Recognition and Workshops (FG 2011), pp. 57–64. IEEE (2011)

  44. Lundqvist, D., Flykt, A., Öhman, A.: The Karolinska Directed Emotional Faces - KDEF, CD ROM from Department of Clinical Neuroscience, Psychology Section, Karolinska Institutet (1998)

  45. Lyons, M., Akamatsu, S., Kamachi, M., Gyoba, J.: Coding facial expressions with Gabor wavelets. In: Third IEEE International Conference on Automatic Face and Gesture Recognition, Proceedings, pp. 200–205. IEEE (1998)

  46. Mahmoud, M., Baltrušaitis, T., Robinson, P., Riek, L.D.: 3D corpus of spontaneous complex mental states. In: D’Mello, S., Graesser, A., Schuller, B., Martin, J.-C. (eds.) ACII 2011. LNCS, vol. 6974, pp. 205–214. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-24600-5_24

  47. Martinez, B., Valstar, M.F.: Advances, challenges, and opportunities in automatic facial expression recognition. In: Kawulok, M., Celebi, M.E., Smolka, B. (eds.) Advances in Face Detection and Facial Image Analysis, pp. 63–100. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-25958-1_4

  48. Mavadati, M., Sanger, P., Mahoor, M.H.: Extended DISFA dataset: investigating posed and spontaneous facial expressions. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 1–8 (2016)

  49. Mavadati, S.M., Mahoor, M.H., Bartlett, K., Trinh, P., Cohn, J.F.: DISFA: a spontaneous facial action intensity database. IEEE Trans. Affect. Comput. 4(2), 151–160 (2013)

  50. McDuff, D., Amr, M., El Kaliouby, R.: AM-FED+: an extended dataset of naturalistic facial expressions collected in everyday settings. IEEE Trans. Affect. Comput. 10, 7–17 (2018)

  51. McDuff, D., El Kaliouby, R., Senechal, T., Amr, M., Cohn, J.F., Picard, R.: Affectiva-MIT facial expression dataset (AM-FED): naturalistic and spontaneous facial expressions collected “in-the-wild”. In: 2013 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 881–888. IEEE (2013)

  52. McKeown, G., Valstar, M.F., Cowie, R., Pantic, M.: The semaine corpus of emotionally coloured character interactions. In: 2010 IEEE International Conference on Multimedia and Expo (ICME), pp. 1079–1084. IEEE (2010)

  53. Merghani, W., Davison, A.K., Yap, M.H.: A review on facial micro-expressions analysis: datasets, features and metrics. arXiv preprint arXiv:1805.02397 (2018)

  54. Mollahosseini, A., Hasani, B., Mahoor, M.H.: AffectNet: a database for facial expression, valence, and arousal computing in the wild. arXiv preprint arXiv:1708.03985 (2017)

  55. Pantic, M., Valstar, M., Rademaker, R., Maat, L.: Web-based database for facial expression analysis. In: IEEE International Conference on Multimedia and Expo, ICME 2005, p. 5. IEEE (2005)

  56. Polikovsky, S., Kameda, Y., Ohta, Y.: Facial micro-expressions recognition using high speed camera and 3D-gradient descriptor (2009)

  57. Qu, F., Wang, S.J., Yan, W.J., Li, H., Wu, S., Fu, X.: CAS(ME)2: a database for spontaneous macro-expression and micro-expression spotting and recognition. IEEE Trans. Affect. Comput. 9, 424–436 (2017)

  58. Ringeval, F., Sonderegger, A., Sauer, J., Lalanne, D.: Introducing the RECOLA multimodal corpus of remote collaborative and affective interactions. In: Proceedings of EmoSPACE 2013, Held in Conjunction with FG 2013. IEEE, Shanghai, April 2013

  59. Russell, J.A., Pratt, G.: A description of the affective quality attributed to environments. J. Pers. Soc. Psychol. 38(2), 311 (1980)

  60. Savran, A., et al.: Bosphorus database for 3D face analysis. In: Schouten, B., Juul, N.C., Drygajlo, A., Tistarelli, M. (eds.) BioID 2008. LNCS, vol. 5372, pp. 47–56. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-89991-4_6

  61. Savran, A., et al.: Emotion detection in the loop from brain signals and facial images (2006)

  62. Schmidt, K.L., Ambadar, Z., Cohn, J.F., Reed, L.I.: Movement differences between deliberate and spontaneous facial expressions: Zygomaticus major action in smiling. J. Nonverbal Behav. 30(1), 37–52 (2006)

  63. Schmidt, K.L., Cohn, J.F.: Dynamics of facial expression: normative characteristics and individual differences. In: ICME. Citeseer (2001)

  64. Shreve, M., Godavarthy, S., Goldgof, D., Sarkar, S.: Macro-and micro-expression spotting in long videos using spatio-temporal strain. In: 2011 IEEE International Conference on Automatic Face & Gesture Recognition and Workshops (FG 2011), pp. 51–56. IEEE (2011)

  65. Sneddon, I., McRorie, M., McKeown, G., Hanratty, J.: The Belfast induced natural emotion database. IEEE Trans. Affect. Comput. 3(1), 32–41 (2012)

  66. Soleymani, M., Lichtenauer, J., Pun, T., Pantic, M.: A multimodal database for affect recognition and implicit tagging. IEEE Trans. Affect. Comput. 3(1), 42–55 (2012)

  67. Stratou, G., Ghosh, A., Debevec, P., Morency, L.P.: Effect of illumination on automatic expression recognition: a novel 3D relightable facial database. In: 2011 IEEE International Conference on Automatic Face & Gesture Recognition and Workshops (FG 2011), pp. 611–618. IEEE (2011)

  68. Tcherkassof, A., Dupré, D., Meillon, B., Mandran, N., Dubois, M., Adam, J.M.: DynEmo: a video database of natural facial expressions of emotions. Int. J. Multimedia Appl. 5(5), 61–80 (2013)

  69. Toole, A.J., et al.: A video database of moving faces and people. IEEE Trans. Pattern Anal. Mach. Intell. 27(5), 812–816 (2005)

  70. Valstar, M., Pantic, M.: Induced disgust, happiness and surprise: an addition to the MMI facial expression database. In: Proceedings 3rd International Workshop on EMOTION (satellite of LREC): Corpora for Research on Emotion and Affect, p. 65 (2010)

  71. Valstar, M., et al.: AVEC 2013: the continuous audio/visual emotion and depression recognition challenge. In: Proceedings of the 3rd ACM International Workshop on Audio/Visual Emotion Challenge, pp. 3–10. ACM (2013)

  72. Valstar, M.F., Gunes, H., Pantic, M.: How to distinguish posed from spontaneous smiles using geometric features. In: Proceedings of the 9th International Conference on Multimodal Interfaces, pp. 38–45. ACM (2007)

  73. Van Der Schalk, J., Hawk, S.T., Fischer, A.H., Doosje, B.: Moving faces, looking places: validation of the Amsterdam Dynamic Facial Expression Set (ADFES). Emotion 11(4), 907 (2011)

  74. Vinciarelli, A., Dielmann, A., Favre, S., Salamin, H.: Canal9: a database of political debates for analysis of social interactions. In: 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009, pp. 1–4. IEEE (2009)

  75. Wang, S., et al.: A natural visible and infrared facial expression database for expression recognition and emotion inference. IEEE Trans. Multimedia 12(7), 682–691 (2010)

  76. Warren, G., Schertler, E., Bull, P.: Detecting deception from emotional and unemotional cues. J. Nonverbal Behav. 33(1), 59–69 (2009)

  77. Weber, R., Soladié, C., Séguier, R.: A survey on databases for facial expression analysis. In: Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2018), Volume 5, VISAPP, Funchal, Madeira, Portugal, 27–29 January 2018, pp. 73–84 (2018). https://doi.org/10.5220/0006553900730084

  78. Yan, W.J., et al.: CASME II: an improved spontaneous micro-expression database and the baseline evaluation. PLoS ONE 9(1), e86041 (2014)

  79. Yan, W.J., Wu, Q., Liu, Y.J., Wang, S.J., Fu, X.: CASME database: a dataset of spontaneous micro-expressions collected from neutralized faces. In: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), pp. 1–7. IEEE (2013)

  80. Yap, M.H., See, J., Hong, X., Wang, S.J.: Facial micro-expressions grand challenge 2018 summary. In: 2018 13th IEEE International Conference on Automatic Face and Gesture Recognition (FG), pp. 675–678. IEEE (2018)

  81. Yin, L., Chen, X., Sun, Y., Worm, T., Reale, M.: A high-resolution 3D dynamic facial expression database. In: 8th IEEE International Conference on Automatic Face & Gesture Recognition, FG 2008, pp. 1–6. IEEE (2008)

  82. Yin, L., Wei, X., Sun, Y., Wang, J., Rosato, M.J.: A 3D facial expression database for facial behavior research. In: 7th International Conference on Automatic Face and Gesture Recognition, FGR 2006, pp. 211–216. IEEE (2006)

  83. Zafeiriou, S., et al.: Facial affect “in-the-wild”: a survey and a new database. In: International Conference on Computer Vision (2016)

  84. Zara, A., Maffiolo, V., Martin, J.C., Devillers, L.: Collection and annotation of a corpus of human-human multimodal interactions: emotion and others anthropomorphic characteristics. In: Paiva, A.C.R., Prada, R., Picard, R.W. (eds.) ACII 2007. LNCS, vol. 4738, pp. 464–475. Springer, Heidelberg (2007). https://doi.org/10.1007/978-3-540-74889-2_41

  85. Zeng, Z., Pantic, M., Roisman, G., Huang, T.S., et al.: A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell. 31(1), 39–58 (2009)

  86. Zhalehpour, S., Onder, O., Akhtar, Z., Erdem, C.E.: BAUM-1: a spontaneous audio-visual face database of affective and mental states. IEEE Trans. Affect. Comput. 8, 300–313 (2016)

  87. Zhang, L., et al.: “BioVid Emo DB”: a multimodal database for emotion analyses validated by subjective ratings. In: 2016 IEEE Symposium Series on Computational Intelligence (SSCI), pp. 1–6. IEEE (2016)

  88. Zhang, X., et al.: BP4D-spontaneous: a high-resolution spontaneous 3D dynamic facial expression database. Image Vis. Comput. 32(10), 692–706 (2014)


Author information

Corresponding author

Correspondence to Raphaël Weber.


Appendix

For the sake of clarity, the references are not included in the tables of Sect. 2. We report here the corresponding references for all the databases we review (Tables 12 and 13).

Table 12. References of macro-expression databases.
Table 13. References of micro-expression databases.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Weber, R., Li, J., Soladié, C., Séguier, R. (2019). A Survey on Databases of Facial Macro-expression and Micro-expression. In: Bechmann, D., et al. Computer Vision, Imaging and Computer Graphics Theory and Applications. VISIGRAPP 2018. Communications in Computer and Information Science, vol 997. Springer, Cham. https://doi.org/10.1007/978-3-030-26756-8_15

  • DOI: https://doi.org/10.1007/978-3-030-26756-8_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-26755-1

  • Online ISBN: 978-3-030-26756-8
