Mining Facial Keypoint Data: The Quest Toward Personalized Engineering Applications

  • Christian Lopez
  • Conrad Tucker
Chapter

Abstract

Personalized applications have the potential to enhance the performance and motivation of individuals in a wide range of engineering tasks. Current methods focus on predicting the affective state (i.e., emotion) of individuals in order to provide personalized interventions. However, these methods may struggle to predict the affective state of an individual for whom they have not been trained. Furthermore, depending on the attributes of the task and the individual, the affective state that correlates with good performance may vary. In light of these limitations, the authors proposed in previous studies a machine learning method that predicts the performance of individuals based on their facial expressions, captured while they read the instructions of a task. This chapter presents the steps of the method and introduces a case study in an engineering laboratory environment. Furthermore, a benchmark analysis of multiple machine learning algorithms is presented. The findings support the use of neural networks, and of individual-specific models that consider task information and individuals' facial expressions, to predict their performance. This work could advance personalized applications in engineering environments and help provide real-time feedback to individuals.
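To make the benchmark described above concrete, the sketch below illustrates the general idea of an individual-specific performance model: classifiers are trained and cross-validated on a single individual's trials, with facial-keypoint features (plus task descriptors) as inputs and task performance as the label, and a feed-forward neural network is compared against a k-nearest-neighbors baseline. This is a minimal illustration in Python with scikit-learn over synthetic placeholder data; the feature layout (68 facial landmarks yielding 136 coordinate features), the three task descriptors, and the binary performance label are assumptions for demonstration, not the authors' actual dataset or implementation.

    # Minimal sketch only: synthetic data and hypothetical feature layout,
    # not the chapter's actual dataset or implementation.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Placeholder data: rows are one individual's task trials; columns are
    # facial-keypoint features (e.g., normalized landmark coordinates
    # captured while reading the task instructions) plus task descriptors.
    n_trials, n_keypoint_feats, n_task_feats = 60, 136, 3
    X = rng.normal(size=(n_trials, n_keypoint_feats + n_task_feats))
    y = rng.integers(0, 2, size=n_trials)  # 1 = good performance, 0 = poor

    # Benchmark two algorithm families: a feed-forward neural network
    # and a k-nearest-neighbors baseline.
    models = {
        "neural_net": make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                          random_state=0),
        ),
        "knn": make_pipeline(StandardScaler(),
                             KNeighborsClassifier(n_neighbors=5)),
    }

    # An individual-specific model is fit and cross-validated on that
    # individual's trials only, rather than pooling all participants.
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
        print(f"{name}: mean CV accuracy = {scores.mean():.2f}")

With real data, the same loop would be run once per participant, so that each model can learn the expression-to-performance mapping specific to that individual.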

Notes

Acknowledgements

This research is funded in part by NSF NRI #1527148. Any opinions, findings, or conclusions expressed in this chapter are those of the authors and do not necessarily reflect the views of the sponsors.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Industrial and Manufacturing Engineering, The Pennsylvania State University, State College, USA
  2. Department of Industrial and Manufacturing Engineering, School of Engineering Design, Technology and Professional Programs, The Pennsylvania State University, State College, USA