International Journal of Social Robotics

Volume 10, Issue 2, pp 179–198

Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism

  • Ali Ghorbandaei Pour
  • Alireza Taheri
  • Minoo Alemi
  • Ali Meghdari

Abstract

Reciprocal interaction and facial expression are among the most interesting topics in social and cognitive robotics. Children with autism show a particular interest in robots, and facial expression recognition can improve these children's social interaction abilities in real life. In this research, a robotic platform has been developed for reciprocal interaction consisting of two main phases, namely the Non-structured and Structured interaction modes. In the Non-structured interaction mode, a vision system recognizes the facial expressions of the user through a fuzzy clustering method. The interaction decision-making unit is combined with a fuzzy finite state machine to improve the quality of human–robot interaction by utilizing the results of the facial expression analysis. In the Structured interaction mode, a set of imitation scenarios with eight different posed facial behaviors was designed for the robot. The scenarios start with simple facial expressions and become more complicated as they continue. The same vision system and fuzzy clustering method of the Non-structured interaction mode are used for the automatic evaluation of a participant's gestures. As a pilot study, the effect and acceptability of the platform were investigated with children with autism between 3 and 7 years old, and a preliminary acceptance rate of approximately 78% was observed under our experimental conditions. Lastly, the automatic assessment of imitation quality was compared with manual video-coding results; the Pearson correlation between these equivalent grades was r = 0.89, which shows sufficient agreement between the automatic and manual scores.
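The fuzzy clustering step described above can be illustrated with a minimal fuzzy c-means sketch. This is a generic Bezdek-style implementation on toy feature vectors, not the authors' vision pipeline; the function name, parameters, and data are illustrative assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means sketch (illustrative, not the paper's code).

    X : (n_samples, n_features) feature matrix
    c : number of clusters, m : fuzzifier (> 1)
    Returns (centers, U), where U[i, k] is the membership of
    sample i in cluster k and each row of U sums to 1.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial memberships, normalized row-wise.
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # Cluster centers: membership-weighted means of the samples.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Euclidean distance from every sample to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)  # guard against division by zero
        # Standard FCM membership update:
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2 / (m - 1))
        exp = 2.0 / (m - 1.0)
        U_new = 1.0 / ((d[:, :, None] / d[:, None, :]) ** exp).sum(axis=2)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U
```

In a facial-expression setting, each row of `X` would be a vector of geometric features extracted by the vision system, and the soft memberships in `U` would feed the fuzzy finite state machine rather than a hard class label.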

Keywords

Human–robot interaction (HRI) · Reciprocal interaction · Facial expressions · Autism · Fuzzy finite state machine · Imitation

Notes

Acknowledgements

Our profound gratitude goes to the “Center for the Treatment of Autistic Disorders (CTAD)” and its psychologists for their contributions to the clinical trials with the children with autism. This research was funded by the “Cognitive Sciences and Technology Council” (CSTC) of Iran (http://www.cogc.ir/). We also appreciate the Iranian National Science Foundation (INSF) for their complementary support of the Social & Cognitive Robotics Laboratory (http://en.insf.org/).

Compliance with Ethical Standards

Funding

This study was funded by the “Cognitive Sciences and Technology Council” (CSTC) of Iran (Grant Number: 95p22).

Conflict of interest

Author Ali Meghdari has received research grants from the “Cognitive Sciences and Technology Council” (CSTC) of Iran. The authors Ali Ghorbandaei Pour, Alireza Taheri, and Minoo Alemi declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Ethical approval for the protocol of this study was provided by Iran University of Medical Sciences (No. IR.IUMS.REC.1395.95301469), and the certification for ABA and robot-assisted therapy with children with autism was received from the Center for the Treatment of Autistic Disorders (CTAD), Iran.


Copyright information

© Springer Science+Business Media B.V., part of Springer Nature 2018

Authors and Affiliations

  • Ali Ghorbandaei Pour¹
  • Alireza Taheri¹
  • Minoo Alemi¹﹐²
  • Ali Meghdari¹

  1. Social and Cognitive Robotics Laboratory, Center of Excellence in Design, Robotics and Automation (CEDRA), Sharif University of Technology, Tehran, Iran
  2. Faculty of Humanities, West Tehran Branch, Islamic Azad University, Tehran, Iran
