Socially Assistive Robots: The Specific Case of the NAO

Abstract

Numerous studies have examined the development of robotics, especially socially assistive robots (SAR), including the NAO robot. This small humanoid robot shows great potential for social assistance, and its features and capabilities, such as motricity, functionality, and affective capacities, have been studied in various contexts. The principal aim of this study is to gather the research conducted with this robot in order to see how the NAO can be used and what its potential as a SAR might be. Articles using the NAO in any situation were identified by searching the PsycINFO, Computer and Applied Sciences Complete, and ACM Digital Library databases. The main inclusion criterion was that studies had to use the NAO robot; studies comparing it with other robots or with intervention programs were also included. Articles about technical improvements were excluded, since they did not involve concrete utilisation of the NAO, as were duplicates and articles lacking essential information about their samples. A total of 51 publications (1895 participants) were included in the review. Six categories were defined: social interactions, affectivity, intervention, assisted teaching, mild cognitive impairment/dementia, and autism/intellectual disability. The great majority of the findings concerning the NAO robot are positive, and its multimodality makes it a promising SAR.


Fig. 1
Fig. 2

References

  1. Roshidul H, Shariff ARBM, Blackmore BS, Aris IB, Ramli ARB, Hossen J (2010) High adoption of behavior based robotics in the autonomous machines. J Inf Syst Technol Plan 3(6):30–41

  2. De Carolis B, Ferilli S, Palestra G (2016) Simulating empathic behavior in a social assistive robot. Multimed Tools Appl 76(4):5073–5094. https://doi.org/10.1007/s11042-016-3797-0

  3. De Carolis B, Ferilli S, Palestra G, Carofiglio V (2015) Modeling and simulating empathic behavior in social assistive robots. In: Proceedings of the 11th biannual conference on Italian SIGCHI chapter. ACM, pp 110–117

  4. Abdi J, Al-Hindawi A, Ng T, Vizcaychipi MP (2018) Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open 8:e018815. https://doi.org/10.1136/bmjopen-2017-018815

  5. Broekens J, Heerink M, Rosendal H (2009) Assistive social robots in elderly care: a review. Gerontechnology 8(2):94–103

  6. Graf B, Hans M, Schraft RD (2004) Care-O-bot II—development of a next generation robotic home assistant. Auton Robots 16(2):193–205

  7. Martinez-Martin E, del Pobil AP (2018) Personal robot assistants for elderly care: an overview. In: Personal assistants: emerging computational technologies. Springer, Cham, pp 77–91

  8. SoftBank Robotics. NAO. Available: https://www.softbankrobotics.com/emea/fr/nao

  9. Gouaillier D, Hugel V, Blazevic P, Kilner C, Monceaux J, Lafourcade P et al (2009) Mechatronic design of NAO humanoid. In: 2009 IEEE international conference on robotics and automation. IEEE, pp 769–774

  10. Pan Y, Okada H, Uchiyama T, Suzuki K (2015) On the reaction to robot’s speech in a hotel public space. Int J Soc Robot 7(5):911–920. https://doi.org/10.1007/s12369-015-0320-0

  11. Lopez A, Ccasane B, Paredes R, Cuellar F (2017) Effects of using indirect language by a robot to change human attitudes. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction. ACM, pp 193–194

  12. Lucas GM, Boberg J, Traum D, Artstein R, Gratch J, Gainer A et al (2018) Getting to know each other: the role of social dialogue in recovery from errors in social robots. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction. ACM, pp 344–351

  13. Kuchenbrandt D, Eyssel F, Bobinger S, Neufeld M (2013) When a robot’s group membership matters. Int J Soc Robot 5(3):409–417. https://doi.org/10.1007/s12369-013-0197-8

  14. Sandoval EB, Brandstetter J, Obaid M, Bartneck C (2016) Reciprocity in human–robot interaction: a quantitative approach through the prisoner’s dilemma and the ultimatum game. Int J Soc Robot 8(2):303–317

  15. Seo SH, Griffin K, Young JE, Bunt A, Prentice S, Loureiro-Rodríguez V (2018) Investigating people’s rapport building and hindering behaviors when working with a collaborative robot. Int J Soc Robot 10(1):147–161

  16. Wang B, Rau PLP (2019) Influence of embodiment and substrate of social robots on users’ decision-making and attitude. Int J Soc Robot 11(3):411–421

  17. Stanton CJ, Stevens CJ (2017) Don’t stare at me: the impact of a humanoid robot’s gaze upon trust during a cooperative human–robot visual task. Int J Soc Robot 9(5):745–753

  18. Sandygulova A, O’Hare GMP (2016) Investigating the impact of gender segregation within observational pretend play interaction. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 399–406

  19. Sandygulova A, O’Hare GM (2018) Age- and gender-based differences in children’s interactions with a gender-matching robot. Int J Soc Robot 10(5):687–700

  20. Tokmurzina D, Sagitzhan N, Nurgaliyev A, Sandygulova A (2018) Exploring child–robot proxemics. In: Companion of the 2018 ACM/IEEE international conference on human–robot interaction. ACM, pp 257–258

  21. Ahmad MI, Mubin O, Orlando J (2017) Adaptive social robot for sustaining social engagement during long-term children–robot interaction. Int J Hum Comput Interact 33(12):943–962. https://doi.org/10.1080/10447318.2017.1300750

  22. Ahmad MI, Mubin O, Shahid S, Orlando J (2019) Robot’s adaptive emotional feedback sustains children’s social engagement and promotes their vocabulary learning: a long-term child–robot interaction study. Adapt Behav 27(4):243–266. https://doi.org/10.1177/1059712319844182

  23. Shinohara Y, Mitsukuni K, Yoneda T, Ichikawa J, Nishizaki Y, Oka N (2018) A humanoid robot can use mimicry to increase likability and motivation for helping. In: Proceedings of the 6th international conference on human–agent interaction. ACM, pp 122–128

  24. Johnson DO, Cuijpers RH, Pollmann K, van de Ven AAJ (2016) Exploring the entertainment value of playing games with a humanoid robot. Int J Soc Robot 8(2):247–269. https://doi.org/10.1007/s12369-015-0331-x

  25. Kennedy J, Baxter P, Belpaeme T (2015) Comparing robot embodiments in a guided discovery learning interaction with children. Int J Soc Robot 7(2):293–308

  26. Ros R, Oleari E, Pozzi C, Sacchitelli F, Baranzini D, Bagherzadhalimi A et al (2016) A motivational approach to support healthy habits in long-term child–robot interaction. Int J Soc Robot 8(5):599–617

  27. Jochum E, Vlachos E, Christoffersen A, Nielsen SG, Hameed IA, Tan ZH (2016) Using theatre to study interaction with care robots. Int J Soc Robot 8(4):457–470

  28. Aly A, Tapus A (2015) Towards an intelligent system for generating an adapted verbal and nonverbal combined behavior in human–robot interaction. Auton Robots 40(2):193–209. https://doi.org/10.1007/s10514-015-9444-1

  29. Dang T-H-H, Tapus A (2014) Stress game: the role of motivational robotic assistance in reducing user’s task stress. Int J Soc Robot 7(2):227–240. https://doi.org/10.1007/s12369-014-0256-9

  30. Bechade L, Dubuisson Duplessis G, Sehili M, Devillers L (2015) Behavioral and emotional spoken cues related to mental states in human–robot social interaction. In: Proceedings of the 2015 ACM on international conference on multimodal interaction. ACM, pp 347–350

  31. Pelikan HRM, Broth M (2016) Why that Nao? How humans adapt to a conventional humanoid robot in taking turns-at-talk. In: Proceedings of the 2016 CHI conference on human factors in computing systems. ACM, pp 4921–4932

  32. Behrens SI, Egsvang AKK, Hansen M, Møllegård-Schroll AM (2018) Gendered robot voices and their influence on trust. In: Companion of the 2018 ACM/IEEE international conference on human–robot interaction. ACM, pp 63–64

  33. Tahir Y, Dauwels J, Thalmann D, Magnenat Thalmann N (2018) A user study of a humanoid robot as a social mediator for two-person conversations. Int J Soc Robot. https://doi.org/10.1007/s12369-018-0478-3

  34. Baddoura R, Venture G (2013) Social vs useful HRI: experiencing the familiar, perceiving the robot as a sociable partner and responding to its actions. Int J Soc Robot 5(4):529–547. https://doi.org/10.1007/s12369-013-0207-x

  35. van Dijk ET, Torta E, Cuijpers RH (2013) Effects of eye contact and iconic gestures on message retention in human–robot interaction. Int J Soc Robot 5(4):491–501

  36. Sherman SJ, Ahlm K, Berman L, Lynn S (1978) Contrast effects and their relationship to subsequent behavior. J Exp Soc Psychol 14:340–350

  37. Chartrand TL, Bargh JA (1999) The chameleon effect: the perception-behavior link and social interaction. J Pers Soc Psychol 76(6):893–910

  38. Cohen I, Looije R, Neerincx MA (2014) Child’s perception of robot’s emotions: effects of platform, context and experience. Int J Soc Robot 6(4):507–518. https://doi.org/10.1007/s12369-014-0230-6

  39. Read R, Belpaeme T (2016) People interpret robotic non-linguistic utterances categorically. Int J Soc Robot 8(1):31–50. https://doi.org/10.1007/s12369-015-0304-0

  40. Xu J, Broekens J, Hindriks K, Neerincx MA (2015) Mood contagion of robot body language in human robot interaction. Auton Agent Multi-Agent Syst 29(6):1216–1248. https://doi.org/10.1007/s10458-015-9307-3

  41. Andreasson R, Alenljung B, Billing E, Lowe R (2017) Affective touch in human–robot interaction: conveying emotion to the Nao robot. Int J Soc Robot 10(4):473–491. https://doi.org/10.1007/s12369-017-0446-3

  42. Beck A, Cañamero L, Hiolle A, Damiano L, Cosi P, Tesser F, Sommavilla G (2013) Interpretation of emotional body language displayed by a humanoid robot: a case study with children. Int J Soc Robot 5(3):325–334

  43. Rosenthal-von der Pütten AM, Krämer NC, Herrmann J (2018) The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. Int J Soc Robot 10(5):569–582

  44. Tielman M, Neerincx M, Meyer J-J, Looije R (2014) Adaptive emotional expression in robot-child interaction. In: Proceedings of the 2014 ACM/IEEE international conference on human–robot interaction. ACM, pp 407–414

  45. Ahmad MI, Mubin O, Patel H (2018) Exploring the potential of NAO robot as an interviewer. In: Proceedings of the 6th international conference on human–agent interaction. ACM, pp 324–326

  46. Brandstetter J, Liebman N, London K (2015) Fidgebot: working out while working. In: Proceedings of the tenth annual ACM/IEEE international conference on human–robot interaction extended abstracts. ACM, pp 149–150

  47. da Silva JGG, Kavanagh DJ, Belpaeme T, Taylor L, Beeson K, Andrade J (2018) Experiences of a motivational interview delivered by a robot: qualitative study. J Med Internet Res 20(5):e116

  48. Alemi M, Ghanbarzadeh A, Meghdari A, Moghadam LJ (2015) Clinical application of a humanoid robot in pediatric cancer interventions. Int J Soc Robot 8(5):743–759. https://doi.org/10.1007/s12369-015-0294-y

  49. Edwards A, Omilion-Hodges L, Edwards C (2017) How do patients in a medical interview perceive a robot versus human physician? In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human–robot interaction. ACM, pp 109–110

  50. Lee N, Kim J, Kim E, Kwon O (2017) The influence of politeness behavior on user compliance with social robots in a healthcare service setting. Int J Soc Robot 9(5):727–743. https://doi.org/10.1007/s12369-017-0420-0

  51. López Recio D, Márquez Segura E, Márquez Segura L, Waern A (2013) The NAO models for the elderly. In: Proceedings of the 8th ACM/IEEE international conference on human–robot interaction. IEEE Press, pp 187–188

  52. Carrillo FM, Butchart J, Knight S, Scheinberg A, Wise L, Sterling L, McCarthy C (2018) Adapting a general-purpose social robot for paediatric rehabilitation through in situ design. ACM Trans Hum Robot Interact 7(1):12. https://doi.org/10.1145/3203304

  53. Looije R, Neerincx MA, Peters JK, Henkemans OAB (2016) Integrating robot support functions into varied activities at returning hospital visits. Int J Soc Robot 8(4):483–497

  54. Pulido JC, González JC, Suárez-Mejías C, Bandera A, Bustos P, Fernández F (2017) Evaluating the child–robot interaction of the NAOTherapist platform in pediatric rehabilitation. Int J Soc Robot 9(3):343–358

  55. van den Heuvel RJ, Lexis MA, de Witte LP (2017) Robot ZORA in rehabilitation and special education for children with severe physical disabilities: a pilot study. Int J Rehabil Res 40(4):353

  56. van den Heuvel RJF, Lexis MAS, de Witte LP (2020) ZORA robot based interventions to achieve therapeutic and educational goals in children with severe physical disabilities. Int J Soc Robot 12:493–504. https://doi.org/10.1007/s12369-019-00578-z

  57. Niemelä M, Melkas H (2019) Robots as social and physical assistants in elderly care. In: Human-centered digitalization and services. Springer, Singapore, pp 177–197

  58. Majgaard G, Brogaard Bertel L (2014) Initial phases of design-based research into the educational potentials of NAO-robots. In: 2014 9th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 238–239

  59. Arias-Aguilar JA, Palacios-Villavicencio ML, Bretado-Gallegos R, Medina-Nieto MA, Ruiz AB, Rodríguez-López V, Estrada-Bautista J (2017) Analysis of children: humanoid robot interaction to support social skills development. In: Proceedings of the XVIII international conference on human computer interaction. ACM, p 10

  60. Ahmad MI, Mubin O, Orlando J (2016) Children views’ on social robot’s adaptations in education. In: Proceedings of the 28th Australian conference on computer–human interaction. ACM, pp 145–149

  61. Deshmukh A, Janarthanam S, Hastie H, Lim MY, Aylett R, Castellano G (2016) How expressiveness of a robotic tutor is perceived by children in a learning environment. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI). IEEE, pp 423–424

  62. Kazemi E, Stedman-Falls LM (2016) Can humanoid robots serve as patient simulators in behavior analytic research and practice? Behav Anal Res Pract 16(3):135–146. https://doi.org/10.1037/bar0000046

  63. Alemi M, Meghdari A, Ghazisaedy M (2015) The impact of social robotics on L2 learners’ anxiety and attitude in English vocabulary acquisition. Int J Soc Robot 7(4):523–535

  64. Chandra S, Dillenbourg P, Paiva A (2019) Children teach handwriting to a social robot with different learning competencies. Int J Soc Robot. https://doi.org/10.1007/s12369-019-00589-w

  65. Köse H, Uluer P, Akalın N, Yorgancı R, Özkul A, Ince G (2015) The effect of embodiment in sign language tutoring with assistive humanoid robots. Int J Soc Robot 7(4):537–548. https://doi.org/10.1007/s12369-015-0311-1

  66. Köse H, Yorganci R, Algan EH, Syrdal DS (2012) Evaluation of the robot assisted sign language tutoring using video-based studies. Int J Soc Robot 4(3):273–283. https://doi.org/10.1007/s12369-012-0142-2

  67. Ros R, Baroni I, Demiris Y (2014) Adaptive human–robot interaction in sensorimotor task instruction: from human to robot dance tutors. Robot Auton Syst 62(6):707–720. https://doi.org/10.1016/j.robot.2014.03.005

  68. Pino O, Palestra G, Trevino R, De Carolis B (2019) The humanoid robot NAO as trainer in a memory program for elderly people with mild cognitive impairment. Int J Soc Robot. https://doi.org/10.1007/s12369-019-00533-y

  69. Valentí Soler M, Agüera-Ortiz L, Olazarán Rodríguez J, Mendoza Rebolledo C, Pérez Muñoz A, Rodríguez Pérez I et al (2015) Social robots in advanced dementia. Front Aging Neurosci 7:133. https://doi.org/10.3389/fnagi.2015.00133

  70. Johnson DO, Cuijpers RH, Juola JF, Torta E, Simonov M, Frisiello A et al (2014) Socially assistive robots: a comprehensive approach to extending independent living. Int J Soc Robot 6(2):195–211

  71. Tsardoulias EG, Kintsakis AM, Panayiotou K, Thallas AG, Reppou SE, Karagiannis GG et al (2017) Towards an integrated robotics architecture for social inclusion—the RAPP paradigm. Cogn Syst Res 43:157–173. https://doi.org/10.1016/j.cogsys.2016.08.004

  72. Sarabia M, Young N, Canavan K, Edginton T, Demiris Y, Vizcaychipi MP (2018) Assistive robotic technology to combat social isolation in acute hospital settings. Int J Soc Robot 10(5):607–620. https://doi.org/10.1007/s12369-017-0421-z

  73. Wilson JR, Lee NY, Saechao A, Tickle-Degnen L, Scheutz M (2018) Supporting human autonomy in a robot-assisted medication sorting task. Int J Soc Robot 10(5):621–641

  74. Tapus A, Peca A, Aly A, Pop C, Jisa L, Pintea S et al (2012) Children with autism social engagement in interaction with Nao, an imitative robot: a series of single case experiments. Interact Stud 13(3):315–347. https://doi.org/10.1075/is.13.3.01tap

  75. Chevalier P, Tapus A, Martin J-C, Isableu B (2015) Social personalized human-machine interaction for people with autism: defining user profiles and first contact with a robot. In: Proceedings of the tenth annual ACM/IEEE international conference on human–robot interaction extended abstracts. ACM, pp 101–102

  76. Chung EYH (2018) Robotic intervention program for enhancement of social engagement among children with autism spectrum disorder. J Dev Phys Disabil 31(4):419–434. https://doi.org/10.1007/s10882-018-9651-8

  77. Beer JM, Boren M, Liles KR (2016) Robot assisted music therapy: a case study with children diagnosed with autism. In: The eleventh ACM/IEEE international conference on human robot interaction. IEEE Press, pp 419–420

  78. David DO, Costescu CA, Matu S, Szentagotai A, Dobrean A (2018) Developing joint attention for children with autism in robot-enhanced therapy. Int J Soc Robot 10(5):595–605

  79. Anzalone SM, Tilmont E, Boucenna S, Xavier J, Jouen A-L, Bodeau N et al (2014) How children with autism spectrum disorder behave and explore the 4-dimensional (spatial 3D+time) environment during a joint attention induction task with a robot. Res Autism Spectr Disord 8(7):814–826. https://doi.org/10.1016/j.rasd.2014.03.002

  80. Shukla J, Cristiano J, Oliver J, Puig D (2019) Robot assisted interventions for individuals with intellectual disabilities: impact on users and caregivers. Int J Soc Robot. https://doi.org/10.1007/s12369-019-00527-w

  81. Petric F, Kovacic Z (2020) Design and validation of MOMDP models for child–robot interaction within tasks of robot-assisted ASD diagnostic protocol. Int J Soc Robot 12:371–388. https://doi.org/10.1007/s12369-019-00577-0


Author information


Corresponding author

Correspondence to Sébastien Gaboury.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix: List of the Reviewed Articles


  • Aagela, H., Holmes, V., Dhimish, M., & Wilson, D. (2017, March). Impact of video streaming quality on bandwidth in humanoid robot NAO connected to the cloud. In Proceedings of the Second International Conference on Internet of things, Data and Cloud Computing (pp. 1–8).

  • Abdolmaleki, A., Lau, N., Reis, L. P., Peters, J., & Neumann, G. (2016). Contextual policy search for linear and nonlinear generalization of a humanoid walking controller. Journal of Intelligent & Robotic Systems, 83(3–4), 393–408.

  • Ahmad, M. I., Mubin, O., & Orlando, J. (2016a, November). Children views’ on social robot’s adaptations in education. In Proceedings of the 28th Australian Conference on Computer-Human Interaction (pp. 145–149). ACM.

  • Ahmad, M. I., Mubin, O., & Orlando, J. (2016b, November). Effect of Different Adaptations by a Robot on Children’s Long-term Engagement: An Exploratory Study. In Proceedings of the 13th International Conference on Advances in Computer Entertainment Technology (pp. 1–6).

  • Ahmad, M. I., Mubin, O., & Orlando, J. (2017). Adaptive Social Robot for Sustaining Social Engagement during Long-Term Children–Robot Interaction. International Journal of Human–Computer Interaction, 33(12), 943–962. doi: 10.1080/10447318.2017.1300750

  • Ahmad, M. I., Mubin, O., & Patel, H. (2018, December). Exploring the Potential of NAO Robot as an Interviewer. In Proceedings of the 6th International Conference on Human–Agent Interaction (pp. 324–326). ACM.

  • Ahmad, M. I., Mubin, O., Shahid, S., & Orlando, J. (2019). Robot’s adaptive emotional feedback sustains children’s social engagement and promotes their vocabulary learning: a long-term child–robot interaction study. Adaptive Behavior, 27(4), 243–266. doi: 10.1177/1059712319844182

  • Alameda-Pineda, X., & Horaud, R. (2015). Vision-guided robot hearing. The International Journal of Robotics Research, 34(4–5), 437–456.

  • Albo-Canals, J., Martelo, A. B., Relkin, E., Hannon, D., Heerink, M., Heinemann, M., … & Bers, M. U. (2018). A pilot study of the KIBO robot in children with severe ASD. International Journal of Social Robotics, 10(3), 371–383.

  • Alemi, M., Ghanbarzadeh, A., Meghdari, A., & Moghadam, L. J. (2015). Clinical Application of a Humanoid Robot in Pediatric Cancer Interventions. International Journal of Social Robotics, 8(5), 743–759. doi: 10.1007/s12369-015-0294-y

  • Alemi, M., Meghdari, A., & Ghazisaedy, M. (2015). The impact of social robotics on L2 learners’ anxiety and attitude in English vocabulary acquisition. International Journal of Social Robotics, 7(4), 523–535.

  • Alibeigi, M., Rabiee, S., & Ahmadabadi, M. N. (2017). Inverse kinematics based human mimicking system using skeletal tracking technology. Journal of Intelligent & Robotic Systems, 85(1), 27–45.

  • Aly, A., & Tapus, A. (2012, March). Prosody-driven robot arm gestures generation in human–robot interaction. In Proceedings of the seventh annual ACM/IEEE international conference on human–robot Interaction (pp. 257–258).

  • Aly, A., & Tapus, A. (2013, March). A model for synthesizing a combined verbal and nonverbal behavior based on personality traits in human–robot interaction. In 2013 8th ACM/IEEE International Conference on human–robot Interaction (HRI) (pp. 325–332). IEEE.

  • Aly, A., & Tapus, A. (2015). Towards an intelligent system for generating an adapted verbal and nonverbal combined behavior in human–robot interaction. Autonomous Robots, 40(2), 193–209. doi: 10.1007/s10514-015-9444-1

  • Ames, A. D., Cousineau, E. A., & Powell, M. J. (2012, April). Dynamically stable bipedal robotic walking with NAO via human-inspired hybrid zero dynamics. In Proceedings of the 15th ACM international conference on Hybrid Systems: Computation and Control (pp. 135–144).

  • Andreasson, R., Alenljung, B., Billing, E., & Lowe, R. (2017). Affective Touch in Human–Robot Interaction: Conveying Emotion to the Nao Robot. International Journal of Social Robotics, 10(4), 473–491. doi: 10.1007/s12369-017-0446-3

  • Antonietti, A., Martina, D., Casellato, C., D’Angelo, E., & Pedrocchi, A. (2019). Control of a humanoid nao robot by an adaptive bioinspired cerebellar module in 3d motion tasks. Computational intelligence and neuroscience, 2019.

  • Anzalone, S. M., Boucenna, S., Ivaldi, S., & Chetouani, M. (2015). Evaluating the engagement with social robots. International Journal of Social Robotics, 7(4), 465–478.

  • Anzalone, S. M., Tilmont, E., Boucenna, S., Xavier, J., Jouen, A.-L., Bodeau, N., … Cohen, D. (2014). How children with autism spectrum disorder behave and explore the 4-dimensional (spatial 3D+time) environment during a joint attention induction task with a robot. Research in Autism Spectrum Disorders, 8(7), 814–826. doi: 10.1016/j.rasd.2014.03.002

  • Arce, F., Zamora, E., Sossa, H., & Barrón, R. (2018). Differential evolution training algorithm for dendrite morphological neural networks. Applied Soft Computing, 68, 303–313.

  • Arias-Aguilar, J. A., Palacios-Villavicencio, M. L., Bretado-Gallegos, R., Medina-Nieto, M. A., Ruiz, A. B., Rodríguez-López, V., & Estrada-Bautista, J. (2017, September). Analysis of children: humanoid robot interaction to support social skills development. In Proceedings of the XVIII International Conference on Human Computer Interaction (p. 10). ACM.

  • Atzeni, M., & Recupero, D. R. (2018, June). Deep learning and sentiment analysis for human–robot interaction. In European Semantic Web Conference (pp. 14–18). Springer, Cham.

  • Augello, A., Infantino, I., Manfrè, A., Pilato, G., Vella, F., & Chella, A. (2016). Creation and cognition for humanoid live dancing. Robotics and Autonomous Systems, 86, 128–137.

  • Bacula, A., & LaViers, A. (2018, June). Character recognition on a humanoid robotic platform via a Laban movement analysis. In Proceedings of the 5th International Conference on Movement and Computing (pp. 1–8).

  • Baddoura, R., & Venture, G. (2013). Social vs. Useful HRI: Experiencing the Familiar, Perceiving the Robot as a Sociable Partner and Responding to Its Actions. International Journal of Social Robotics, 5(4), 529–547. doi: 10.1007/s12369-013-0207-x

  • Bao, Y., & Cuijpers, R. H. (2017). On the imitation of goal directed movements of a humanoid robot. International Journal of Social Robotics, 9(5), 691–703.

  • Baraka, K., & Veloso, M. M. (2018). Mobile service robot state revealing through expressive lights: Formalism, design, and evaluation. International Journal of Social Robotics, 10(1), 65–92.

  • Baranwal, N., Singh, A. K., & Nandi, G. C. (2017). Development of a Framework for Human–Robot Interactions with Indian Sign Language Using Possibility Theory. International Journal of Social Robotics, 9(4), 563–574.

  • Bechade, L., Dubuisson Duplessis, G., Sehili, M., & Devillers, L. (2015, November). Behavioral and Emotional Spoken Cues Related to Mental States in Human–Robot Social Interaction. In Proceedings of the 2015 ACM on International Conference on Multimodal Interaction (pp. 347–350). ACM.

  • Beck, A., Cañamero, L., Hiolle, A., Damiano, L., Cosi, P., Tesser, F., & Sommavilla, G. (2013). Interpretation of emotional body language displayed by a humanoid robot: A case study with children. International Journal of Social Robotics, 5(3), 325–334.

  • Beck, A., Stevens, B., Bard, K. A., & Cañamero, L. (2012). Emotional body language displayed by artificial agents. ACM Transactions on Interactive Intelligent Systems (TiiS), 2(1), 1–29.

  • Bertacchini, F., Bilotta, E., & Pantano, P. (2017). Shopping with a robotic companion. Computers in Human Behavior, 77, 382–395.

  • Beer, J. M., Boren, M., & Liles, K. R. (2016, March). Robot Assisted Music Therapy: A Case Study with Children Diagnosed with Autism. In The Eleventh ACM/IEEE International Conference on Human Robot Interaction (pp. 419–420). IEEE Press.

  • Behrens, S. I., Egsvang, A. K. K., Hansen, M., & Møllegård-Schroll, A. M. (2018, March). Gendered Robot Voices and Their Influence on Trust. In Companion of the 2018 ACM/IEEE International Conference on Human–Robot Interaction (pp. 63–64). ACM.

  • Belpaeme, T., Vogt, P., Van den Berghe, R., Bergmann, K., Göksun, T., De Haas, M., … & Papadopoulos, F. (2018). Guidelines for designing social robots as second language tutors. International Journal of Social Robotics, 10(3), 325–341.

  • Björling, E. A., Rose, E., Davidson, A., Ren, R., & Wong, D. (2019). Can we keep him forever? Teens’ engagement and desire for emotional connection with a social robot. International Journal of Social Robotics, 1–13.

  • B. KA, A. A., Mullapudi, A. K., Ebert, D., Phadnis, N., & Middha, R. (2016, June). HTKS Game for executive functions disorder using NAO Robot. In Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments (pp. 1–2).

  • Brandstetter, J., Liebman, N., & London, K. (2015, March). Fidgebot: Working Out while Working. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human–Robot Interaction Extended Abstracts (pp. 149–150). ACM.

  • Bremner, P., & Leonards, U. (2016). Iconic gestures for robot avatars, recognition and integration with speech. Frontiers in Psychology, 7, 183.

  • Broadbent, E., Feerst, D. A., Lee, S. H., Robinson, H., Albo-Canals, J., Ahn, H. S., & MacDonald, B. A. (2018). How could companion robots be useful in rural schools?. International Journal of Social Robotics, 10(3), 295–307.

  • Cañamero, L., & Lewis, M. (2016). Making new “New AI” friends: designing a social robot for diabetic children from an embodied AI perspective. International Journal of Social Robotics, 8(4), 523–537.

  • Cao, H. L., Esteban, P. G., Simut, R., Van de Perre, G., Lefeber, D., & Vanderborght, B. (2017). A collaborative homeostatic-based behavior controller for social robots in human–robot interaction experiments. International Journal of Social Robotics, 9(5), 675–690.


  • Cao, H. L., Jensen, L. C., Nghiem, X. N., Vu, H., De Beir, A., Esteban, P. G., … & Vanderborght, B. (2019). DualKeepon: a human–robot interaction testbed to study linguistic features of speech. Intelligent Service Robotics, 12(1), 45–54.

  • Chandra, S., Dillenbourg, P., & Paiva, A. (2019). Children Teach Handwriting to a Social Robot with Different Learning Competencies. International Journal of Social Robotics, 1–28.

  • Chevalier, P., Tapus, A., Martin, J.-C., & Isableu, B. (2015, March). Social Personalized Human–Machine Interaction for People with Autism: Defining User Profiles and First Contact with a Robot. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human–Robot Interaction Extended Abstracts (pp. 101–102). ACM.

  • Chung, E. Y. H. (2018). Robotic Intervention Program for Enhancement of Social Engagement among Children with Autism Spectrum Disorder. Journal of Developmental and Physical Disabilities, 31(4), 419–434. doi: 10.1007/s10882-018-9651-8

  • Cohen, I., Looije, R., & Neerincx, M. A. (2014). Child’s Perception of Robot’s Emotions: Effects of Platform, Context and Experience. International Journal of Social Robotics, 6(4), 507–518. doi: 10.1007/s12369-014-0230-6

  • Collander, C., Tompkins, J., Lioulemes, A., Theofanidis, M., Sharifara, A., & Makedon, F. (2017, June). An Interactive Robot-based Vocational Assessment Game using Lego Assembly. In Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments (pp. 346–353).

  • Conti, D., Commodari, E., & Buono, S. (2017). Personality factors and acceptability of socially assistive robotics in teachers with and without specialized training for children with disability. Life Span and Disability, 20(2), 251–272.

  • Conti, D., Di Nuovo, S., Buono, S., & Di Nuovo, A. (2017). Robots in education and care of children with developmental disabilities: a study on acceptance by experienced and future professionals. International Journal of Social Robotics, 9(1), 51–62.

  • Cuayáhuitl, H., & Kruijff-Korbayová, I. (2012, June). An interactive humanoid robot exhibiting flexible sub-dialogues. In Proceedings of the 2012 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Demonstration Session (pp. 17–20). Association for Computational Linguistics.

  • Dang, T.-H.-H., & Tapus, A. (2014). Stress Game: The Role of Motivational Robotic Assistance in Reducing User’s Task Stress. International Journal of Social Robotics, 7(2), 227–240. doi: 10.1007/s12369-014-0256-9

  • da Silva, J. G. G., Kavanagh, D. J., Belpaeme, T., Taylor, L., Beeson, K., & Andrade, J. (2018). Experiences of a Motivational Interview Delivered by a Robot: qualitative Study. Journal of medical Internet research, 20(5), e116.

  • David, D. O., Costescu, C. A., Matu, S., Szentagotai, A., & Dobrean, A. (2018). Developing joint attention for children with autism in robot-enhanced therapy. International Journal of Social Robotics, 10(5), 595–605.

  • De Beir, A., Cao, H. L., Esteban, P. G., Van de Perre, G., Lefeber, D., & Vanderborght, B. (2016). Enhancing emotional facial expressiveness on NAO. International Journal of Social Robotics, 8(4), 513–521.

  • De Carolis, B., Ferilli, S., & Palestra, G. (2016). Simulating empathic behavior in a social assistive robot. Multimedia Tools and Applications, 76(4), 5073–5094. doi: 10.1007/s11042-016-3797-0

  • De Carolis, B., Ferilli, S., Palestra, G., & Carofiglio, V. (2015, September). Modeling and Simulating Empathic Behavior in Social Assistive Robots. In Proceedings of the 11th Biannual Conference on Italian SIGCHI Chapter (pp. 110–117). ACM.

  • Delaborde, A., Tahon, M., Barras, C., & Devillers, L. (2009, November). A Wizard-of-Oz game for collecting emotional audio data in a children–robot interaction. In Proceedings of the International Workshop on Affective-Aware Virtual Agents and Social Robots (pp. 1–3).

  • Deshmukh, A., Janarthanam, S., Hastie, H., Lim, M. Y., Aylett, R., & Castellano, G. (2016, March). How expressiveness of a robotic tutor is perceived by children in a learning environment. In 2016 11th ACM/IEEE International Conference on Human–Robot Interaction (HRI) (pp. 423–424). IEEE.

  • Deshmukh, A., Jones, A., Janarthanam, S., Hastie, H., Ribeiro, T., Aylett, R., … & Papadopoulos, F. (2015, May). An empathic robotic tutor in a map application. In Proceedings of the 2015 International Conference on Autonomous Agents and Multiagent Systems (pp. 1923–1924).

  • Edwards, A., Omilion-Hodges, L., & Edwards, C. (2017, March). How do Patients in a Medical Interview Perceive a Robot versus Human Physician? In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human–Robot Interaction (pp. 109–110). ACM.

  • Elbasiony, R., & Gomaa, W. (2018). Humanoids skill learning based on real-time human motion imitation using Kinect. Intelligent Service Robotics11(2), 149–169.

  • Erden, M. S. (2013). Emotional postures for the humanoid-robot Nao. International Journal of Social Robotics, 5(4), 441–456.

  • Fernández-Llamas, C., Conde, M. Á., Rodríguez-Sedano, F. J., Rodríguez-Lera, F. J., & Matellán-Olivera, V. (2017). Analysing the computational competences acquired by K-12 students when lectured by robotic and human teachers. International Journal of Social Robotics, 1–11.

  • Flemisch, T., Viergutz, A., & Dachselt, R. (2014, March). Easy authoring of variable gestural expressions for a humanoid robot. In Proceedings of the 2014 ACM/IEEE International Conference on Human–Robot Interaction (pp. 328–328).

  • Garcia Becerro, F., Sousa, S., Slavkovik, M., & van der Torre, L. (2012). Selecting judgment aggregation rules for NAO robots: an experimental approach (pp. 1403–1404).

  • Garrell, A., Villamizar, M., Moreno-Noguer, F., & Sanfeliu, A. (2017). Teaching robot’s proactive behavior using human assistance. International Journal of Social Robotics, 9(2), 231–249.

  • Graf, B., Hans, M., & Schraft, R. D. (2004). Care-O-bot II—Development of a Next Generation Robotic Home Assistant. Autonomous Robots, 16(2), 193–205.

  • Grollman, D. H. (2018). Avoiding the Content Treadmill for Robot Personalities. International Journal of Social Robotics, 10(2), 225–234.

  • Hadfield, S. (2015). US Air Force Academy. ACM SIGCSE Bulletin, 47(1), 3–3.

  • Hadfield, S. M., Coulston, C. S., Hadfield, M. G., & Warner, L. B. (2016, February). Adventures in K-5 STEM Outreach Using the NAO Robot. In Proceedings of the 47th ACM Technical Symposium on Computing Science Education (pp. 697–697).

  • Hall, L., Hume, C., Tazzyman, S., Deshmukh, A., Janarthanam, S., Hastie, H., … & Corrigan, L. J. (2016, March). Map reading with an empathic robot tutor. In 2016 11th ACM/IEEE International Conference on Human–Robot Interaction (HRI) (pp. 567–567). IEEE.

  • Hu, Y., Sirlantzis, K., Howells, G., Ragot, N., & Rodríguez, P. (2016). An online background subtraction algorithm deployed on a NAO humanoid robot based monitoring system. Robotics and Autonomous Systems, 85, 37–47.

  • Huang, W., Xiao, X., & Xu, M. (2019). Design and implementation of domain-specific cognitive system based on question similarity algorithm. Cognitive Systems Research, 57, 20–24.

  • Ismail, L. I., Verhoeven, T., Dambre, J., & Wyffels, F. (2019). Leveraging robotics research for children with autism: a review. International Journal of Social Robotics, 11(3), 389–410.

  • Jochum, E., Vlachos, E., Christoffersen, A., Nielsen, S. G., Hameed, I. A., & Tan, Z. H. (2016). Using theatre to study interaction with care robots. International Journal of Social Robotics, 8(4), 457–470.

  • Johnson, D. O., & Cuijpers, R. H. (2019). Investigating the effect of a humanoid robot’s head position on imitating human emotions. International Journal of Social Robotics, 11(1), 65–74.

  • Johnson, D. O., Cuijpers, R. H., Juola, J. F., Torta, E., Simonov, M., Frisiello, A., … & Meins, N. (2014). Socially assistive robots: a comprehensive approach to extending independent living. International journal of social robotics, 6(2), 195–211.

  • Johnson, D. O., Cuijpers, R. H., Pollmann, K., & van de Ven, A. A. J. (2016). Exploring the Entertainment Value of Playing Games with a Humanoid Robot. International Journal of Social Robotics, 8(2), 247–269. doi: 10.1007/s12369-015-0331-x

  • Johnson, D. O., Cuijpers, R. H., & van der Pol, D. (2013). Imitating human emotions with artificial facial expressions. International Journal of Social Robotics, 5(4), 503–513.

  • Joubert, O. R. (2015). L’enfant autiste, le robot, et l’enseignant: une rencontre sociétale [The autistic child, the robot, and the teacher: a societal encounter]. Enfance, (1), 127–140.

  • Juang, L. H., & Zhang, J. S. (2018). Robust visual line-following navigation system for humanoid robots. Artificial Intelligence Review, 1–18.

  • Karayaneva, Y., & Hintea, D. (2018, February). Object recognition algorithms implemented on NAO robot for children’s visual learning enhancement. In Proceedings of the 2018 2nd International Conference on Mechatronics Systems and Control Engineering (pp. 86–92).

  • Kazemi, E., & Stedman-Falls, L. M. (2016). Can humanoid robots serve as patient simulators in behavior analytic research and practice? Behavior Analysis: Research and Practice, 16(3), 135–146. doi: 10.1037/bar0000046

  • Kennedy, J., Baxter, P., & Belpaeme, T. (2015). Comparing robot embodiments in a guided discovery learning interaction with children. International Journal of Social Robotics, 7(2), 293–308.

  • Kennedy, J., Baxter, P., & Belpaeme, T. (2017). Nonverbal immediacy as a characterisation of social behaviour for human–robot interaction. International Journal of Social Robotics, 9(1), 109–128.

  • Kofinas, N., Orfanoudakis, E., & Lagoudakis, M. G. (2015). Complete analytical forward and inverse kinematics for the NAO humanoid robot. Journal of Intelligent & Robotic Systems, 77(2), 251–264.

  • Köse, H., Uluer, P., Akalın, N., Yorgancı, R., Özkul, A., & Ince, G. (2015). The Effect of Embodiment in Sign Language Tutoring with Assistive Humanoid Robots. International Journal of Social Robotics, 7(4), 537–548. doi: 10.1007/s12369-015-0311-1

  • Köse, H., Yorganci, R., Algan, E. H., & Syrdal, D. S. (2012). Evaluation of the Robot Assisted Sign Language Tutoring Using Video-Based Studies. International Journal of Social Robotics, 4(3), 273–283. doi: 10.1007/s12369-012-0142-2

  • Kuchenbrandt, D., Eyssel, F., Bobinger, S., & Neufeld, M. (2013). When a Robot’s Group Membership Matters. International Journal of Social Robotics, 5(3), 409–417. doi: 10.1007/s12369-013-0197-8

  • Kumar, P. B., Rawat, H., & Parhi, D. R. (2019). Path planning of humanoids based on artificial potential field method in unknown environments. Expert Systems, 36(2), e12360.

  • Lathuilière, S., Massé, B., Mesejo, P., & Horaud, R. (2019). Neural network based reinforcement learning for audio–visual gaze control in human–robot interaction. Pattern Recognition Letters, 118, 61–71.

  • Lim, A., & Okuno, H. G. (2015). A recipe for empathy. International Journal of Social Robotics, 7(1), 35–49.

  • Lee, N., Kim, J., Kim, E., & Kwon, O. (2017). The Influence of Politeness Behavior on User Compliance with Social Robots in a Healthcare Service Setting. International Journal of Social Robotics, 9(5), 727–743. doi: 10.1007/s12369-017-0420-0

  • Liu, J., & Urbann, O. (2016). Bipedal walking with dynamic balance that involves three-dimensional upper body motion. Robotics and Autonomous Systems, 77, 39–54.

  • Looije, R., Neerincx, M. A., Peters, J. K., & Henkemans, O. A. B. (2016). Integrating robot support functions into varied activities at returning hospital visits. International Journal of Social Robotics, 8(4), 483–497.

  • Lopez, A., Ccasane, B., Paredes, R., & Cuellar, F. (2017, March). Effects of Using Indirect Language by a Robot to Change Human Attitudes. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human–Robot Interaction (pp. 193–194). ACM.

  • López Recio, D., Márquez Segura, E., Márquez Segura, L., & Waern, A. (2013, March). The NAO models for the elderly. In Proceedings of the 8th ACM/IEEE International Conference on Human–Robot Interaction (pp. 187–188). IEEE Press.

  • Lorenz, T., Weiss, A., & Hirche, S. (2016). Synchrony and reciprocity: Key mechanisms for social companion robots in therapy and care. International Journal of Social Robotics, 8(1), 125–143.

  • Lucas, G. M., Boberg, J., Traum, D., Artstein, R., Gratch, J., Gainer, A., … & Nakano, M. (2017, October). The role of social dialogue and errors in robots. In Proceedings of the 5th International Conference on Human Agent Interaction (pp. 431–433).

  • Lucas, G. M., Boberg, J., Traum, D., Artstein, R., Gratch, J., Gainer, A., … Nakano, M. (2018, February). Getting to Know Each Other: The role of social dialogue in recovery from errors in social robots. In Proceedings of the 2018 ACM/IEEE International Conference on Human–Robot Interaction (pp. 344–351). ACM.

  • Majgaard, G., & Brogaard Bertel, L. (2014, March). Initial phases of design-based research into the educational potentials of NAO-robots. In 2014 9th ACM/IEEE International Conference on Human–Robot Interaction (HRI) (pp. 238–239). IEEE.

  • Martí Carrillo, F., Butchart, J., Knight, S., Scheinberg, A., Wise, L., Sterling, L., & McCarthy, C. (2018). Adapting a general-purpose social robot for paediatric rehabilitation through in situ design. ACM Transactions on Human–Robot Interaction (THRI), 7(1), 12. doi: 10.1145/3203304

  • Matthieu, C., & Dominique, D. (2015, November). Artificial companions as personal coach for children: the interactive drums teacher. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology (pp. 1–4).

  • Mazzeo, G., & Staffa, M. (2019). TROS: Protecting Humanoids ROS from Privileged Attackers. International Journal of Social Robotics, 1–15.

  • McCarthy, C., Butchart, J., George, M., Kerr, D., Kingsley, H., Scheinberg, A. M., & Sterling, L. (2015, December). Robots in rehab: towards socially assistive robots for paediatric rehabilitation. In Proceedings of the Annual Meeting of the Australian Special Interest Group for Computer Human Interaction (pp. 39–43).

  • McColl, D., & Nejat, G. (2014). Recognizing emotional body language displayed by a human-like social robot. International Journal of Social Robotics, 6(2), 261–280.

  • Meriçli, C., Veloso, M., & Akın, H. L. (2012). Multi-resolution corrective demonstration for efficient task execution and refinement. International Journal of Social Robotics, 4(4), 423–435.

  • Michieletto, S., Tosello, E., Pagello, E., & Menegatti, E. (2016). Teaching humanoid robotics by means of human teleoperation through RGB-D sensors. Robotics and Autonomous Systems, 75, 671–678.

  • Min, H., Luo, R., & Zhu, J. (2016). Goal-directed affordance prediction at the subtask level. Industrial Robot: An International Journal.

  • Mirnig, N., Stollnberger, G., Giuliani, M., & Tscheligi, M. (2017, March). Elements of humor: How humans perceive verbal and non-verbal aspects of humorous robot behavior. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human–Robot Interaction (pp. 211–212).

  • Mohammad, Y., & Nishida, T. (2015). Why should we imitate robots? Effect of back imitation on judgment of imitative skill. International Journal of Social Robotics, 7(4), 497–512.

  • Moshkina, L., Park, S., Arkin, R. C., Lee, J. K., & Jung, H. (2011). TAME: Time-varying affective response for humanoid robots. International Journal of Social Robotics, 3(3), 207–221.

  • Mubin, O., Henderson, J., & Bartneck, C. (2013, December). Talk ROILA to your Robot. In Proceedings of the 15th ACM on International conference on multimodal interaction (pp. 317–318).

  • Mubin, O., Khan, A., & Obaid, M. (2016, November). #naorobot: exploring Nao discourse on Twitter. In Proceedings of the 28th Australian Conference on Computer–Human Interaction (pp. 155–159).

  • Mwangi, E., Barakova, E., Zhang, R., Diaz, M., Catala, A., & Rauterberg, M. (2016, October). See where I am looking at: Perceiving Gaze Cues with a NAO robot. In Proceedings of the Fourth International Conference on Human Agent Interaction (pp. 329–332).

  • Mwangi, E., Barakova, E. I., Díaz-Boladeras, M., Mallofré, A. C., & Rauterberg, M. (2018). Directing attention through gaze hints improves task solving in human–humanoid interaction. International journal of social robotics, 10(3), 343–355.

  • Nap, H. H., & Cornelisse, L. (2019). Zorgrobotica: geen science fiction meer [Care robotics: no longer science fiction]. TVZ-Verpleegkunde in praktijk en wetenschap, 129(1), 20–23.

  • Nassour, J., Hénaff, P., Benouezdou, F., & Cheng, G. (2014). Multi-layered multi-pattern CPG for adaptive locomotion of humanoid robots. Biological Cybernetics, 108(3), 291–303.

  • Nefti-Meziani, S., Manzoor, U., Davis, S., & Pupala, S. K. (2015). 3D perception from binocular vision for a low-cost humanoid robot NAO. Robotics and autonomous systems, 68, 129–139.

  • Niemelä, M., & Melkas, H. (2019). Robots as social and physical assistants in elderly care. In Human-centered Digitalization and Services (pp. 177–197). Springer, Singapore.

  • Nieuwenhuisen, M., & Behnke, S. (2013). Human-like interaction skills for the mobile communication robot Robotinho. International Journal of Social Robotics, 5(4), 549–561.

  • Novikova, J., & Watts, L. (2015). Towards artificial emotions to assist social coordination in HRI. International Journal of Social Robotics, 7(1), 77–88.

  • Obaid, M., Kistler, F., Häring, M., Bühling, R., & André, E. (2014). A framework for user-defined body gestures to control a humanoid robot. International Journal of Social Robotics, 6(3), 383–396.

  • Ou, Y., Hu, J., Wang, Z., Fu, Y., Wu, X., & Li, X. (2015). A real-time human imitation system using kinect. International Journal of Social Robotics, 7(5), 587–600.

  • Pan, Y., Okada, H., Uchiyama, T., & Suzuki, K. (2015). On the Reaction to Robot’s Speech in a Hotel Public Space. International Journal of Social Robotics, 7(5), 911–920. doi: 10.1007/s12369-015-0320-0

  • Pelikan, H. R. M., & Broth, M. (2016, May). Why That Nao?: How humans adapt to a conventional humanoid robot in taking turns-at-talk. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 4921–4932). ACM.

  • Peng, H., Hu, H., Chao, F., Zhou, C., & Li, J. (2016). Autonomous robotic choreography creation via semi-interactive evolutionary computation. International Journal of Social Robotics, 8(5), 649–661.

  • Petric, F., & Kovacic, Z. (2019). Design and Validation of MOMDP Models for Child–Robot Interaction Within Tasks of Robot-Assisted ASD Diagnostic Protocol. International Journal of Social Robotics, 1–18.

  • Pino, O., Palestra, G., Trevino, R., & De Carolis, B. (2019). The Humanoid Robot NAO as Trainer in a Memory Program for Elderly People with Mild Cognitive Impairment. International Journal of Social Robotics, 1–13. doi: 10.1007/s12369-019-00533-y

  • Piumsomboon, T., Clifford, R., & Bartneck, C. (2012, March). Demonstrating Maori Haka with kinect and nao robots. In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human–Robot Interaction (pp. 429–430).

  • Pulido, J. C., González, J. C., Suárez-Mejías, C., Bandera, A., Bustos, P., & Fernández, F. (2017). Evaluating the child–robot interaction of the NAOTherapist platform in pediatric rehabilitation. International Journal of Social Robotics, 9(3), 343–358.

  • Read, R., & Belpaeme, T. (2016). People Interpret Robotic Non-linguistic Utterances Categorically. International Journal of Social Robotics, 8(1), 31–50. doi: 10.1007/s12369-015-0304-0

  • Reppou, S. E., Tsardoulias, E. G., Kintsakis, A. M., Symeonidis, A. L., Mitkas, P. A., Psomopoulos, F. E., … & Gkiokas, A. (2016). RAPP: A Robotic-Oriented Ecosystem for Delivering Smart User Empowering Applications for Older People. International Journal of Social Robotics, 1–14. doi: 10.1007/s12369-016-0361-z

  • Ribeiro, T., Alves-Oliveira, P., Di Tullio, E., Petisca, S., Sequeira, P., Deshmukh, A., … & Papadopoulos, F. (2015, March). The Empathic Robotic Tutor: Featuring the NAO Robot. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human–Robot Interaction Extended Abstracts (pp. 285–285).

  • Rioux, A., & Suleiman, W. (2018). Autonomous SLAM based humanoid navigation in a cluttered environment while transporting a heavy load. Robotics and Autonomous Systems, 99, 50–62.

  • Ros, R., Baroni, I., & Demiris, Y. (2014). Adaptive human–robot interaction in sensorimotor task instruction: From human to robot dance tutors. Robotics and Autonomous Systems, 62(6), 707–720. doi: 10.1016/j.robot.2014.03.005

  • Ros, R., Oleari, E., Pozzi, C., Sacchitelli, F., Baranzini, D., Bagherzadhalimi, A., … & Demiris, Y. (2016). A motivational approach to support healthy habits in long-term child–robot interaction. International Journal of Social Robotics, 8(5), 599–617.

  • Rosenthal-von der Pütten, A. M., & Hoefinghoff, J. (2018). The more the merrier? effects of humanlike learning abilities on humans’ perception and evaluation of a robot. International Journal of Social Robotics, 10(4), 455–472.

  • Rosenthal-von der Pütten, A. M., Krämer, N. C., & Herrmann, J. (2018). The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. International Journal of Social Robotics, 10(5), 569–582.

  • Rudhru, O., Ser, Q. M., & Sandoval, E. (2016, March). Robot Maori Haka: Robots as cultural preservationists. In 2016 11th ACM/IEEE International Conference on Human–Robot Interaction (HRI) (pp. 569–569). IEEE.

  • Ruiz-del-Solar, J., Palma-Amestoy, R., Marchant, R., Parra-Tsunekawa, I., & Zegers, P. (2009). Learning to fall: Designing low damage fall sequences for humanoid soccer robots. Robotics and Autonomous Systems, 57(8), 796–807.

  • Saduanov, B., Tokmurzina, D., Alizadeh, T., & Abibullaev, B. (2018, March). Brain–Computer Interface Humanoid Pre-trained for Interaction with People. In Companion of the 2018 ACM/IEEE International Conference on Human–Robot Interaction (pp. 229–230).

  • Sandoval, E. B., Brandstetter, J., Obaid, M., & Bartneck, C. (2016). Reciprocity in human–robot interaction: a quantitative approach through the prisoner’s dilemma and the ultimatum game. International Journal of Social Robotics, 8(2), 303–317.

  • Sandygulova, A., & O’Hare, G. M. P. (2016, March). Investigating the Impact of Gender Segregation within Observational Pretend Play Interaction. In 2016 11th ACM/IEEE International Conference on Human–Robot Interaction (HRI) (pp. 399–406). IEEE.

  • Sandygulova, A., & O’Hare, G. M. (2018). Age-and gender-based differences in children’s interactions with a gender-matching robot. International Journal of Social Robotics, 10(5), 687–700.

  • Sarabia, M., Young, N., Canavan, K., Edginton, T., Demiris, Y., & Vizcaychipi, M. P. (2018). Assistive Robotic Technology to Combat Social Isolation in Acute Hospital Settings. International Journal of Social Robotics, 10(5), 607–620. doi: 10.1007/s12369-017-0421-z

  • Saunderson, S., & Nejat, G. (2019). How robots influence humans: A survey of nonverbal communication in social human–robot interaction. International Journal of Social Robotics, 11(4), 575–608.

  • Schillaci, G., Bodiroža, S., & Hafner, V. V. (2013). Evaluating the effect of saliency detection and attention manipulation in human–robot interaction. International Journal of Social Robotics, 5(1), 139–152.

  • Seo, S. H., Griffin, K., Young, J. E., Bunt, A., Prentice, S., & Loureiro-Rodríguez, V. (2018). Investigating people’s rapport building and hindering behaviors when working with a collaborative robot. International Journal of Social Robotics, 10(1), 147–161.

  • Ser, Q. M., Rudhru, O., & Sandoval, E. B. (2016, March). Robot Maori Haka. In 2016 11th ACM/IEEE International Conference on Human–Robot Interaction (HRI) (pp. 549–549). IEEE.

  • Shafii, N., Reis, L. P., & Rossetti, R. J. (2011, June). Two humanoid simulators: Comparison and synthesis. In 6th Iberian Conference on Information Systems and Technologies (CISTI 2011) (pp. 1–6). IEEE.

  • Shahbazi, H., Jamshidi, K., Monadjemi, A. H., & Eslami, H. (2014). Biologically inspired layered learning in humanoid robots. Knowledge-Based Systems, 57, 8–27.

  • Shahbazi, H., Parandeh, R., & Jamshidi, K. (2016). Implementation of imitation learning using natural learner central pattern generator neural networks. Neural Networks, 83, 94–108.

  • Sheikhi, S., Babu Jayagopi, D., Khalidov, V., & Odobez, J. M. (2013, December). Context aware addressee estimation for human robot interaction. In Proceedings of the 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction: Gaze in Multimodal Interaction (pp. 1–6).

  • Shinohara, Y., Mitsukuni, K., Yoneda, T., Ichikawa, J., Nishizaki, Y., & Oka, N. (2018, December). A Humanoid Robot Can Use Mimicry to Increase Likability and Motivation for Helping. In Proceedings of the 6th International Conference on Human–Agent Interaction (pp. 122–128). ACM.

  • Shukla, J., Cristiano, J., Oliver, J., & Puig, D. (2019). Robot Assisted Interventions for Individuals with Intellectual Disabilities: Impact on Users and Caregivers. International Journal of Social Robotics, 1–19. doi: 10.1007/s12369-019-00527-w

  • Singh, A. K., & Nandi, G. C. (2016). NAO humanoid robot: Analysis of calibration techniques for robot sketch drawing. Robotics and Autonomous Systems, 79, 108–121.

  • Spataro, R., Chella, A., Allison, B., Giardina, M., Sorbello, R., Tramonte, S., … & La Bella, V. (2017). Reaching and grasping a glass of water by locked-in ALS patients through a BCI-controlled humanoid robot. Frontiers in human neuroscience, 11, 68.

  • Stanton, C. J., & Stevens, C. J. (2017). Don’t stare at me: the impact of a humanoid robot’s gaze upon trust during a cooperative human–robot visual task. International Journal of Social Robotics, 9(5), 745–753.

  • Striepe, H., Donnermann, M., Lein, M., & Lugrin, B. (2019). Modeling and Evaluating Emotion, Contextual Head Movement and Voices for a Social Robot Storyteller. International Journal of Social Robotics, 1–17.

  • Sun, Z., & Roos, N. (2018). Dynamically stable walk control of biped humanoid on uneven and inclined terrain. Neurocomputing, 280, 111–122.

  • Suay, H. B., & Chernova, S. (2011, March). Humanoid robot control using depth camera. In Proceedings of the 6th International Conference on Human–Robot Interaction (pp. 401–402). ACM.

  • Tahir, Y., Dauwels, J., Thalmann, D., & Magnenat Thalmann, N. (2018). A User Study of a Humanoid Robot as a Social Mediator for Two-Person Conversations. International Journal of Social Robotics, 1–14. doi: 10.1007/s12369-018-0478-3

  • Tahir, Y., Rasheed, U., Dauwels, S., & Dauwels, J. (2014, March). Perception of humanoid social mediator in two-person dialogs. In Proceedings of the 2014 ACM/IEEE International Conference on Human–Robot Interaction (pp. 300–301).

  • Tapus, A., Peca, A., Aly, A., Pop, C., Jisa, L., Pintea, S., … David, D. O. (2012). Children with autism social engagement in interaction with Nao, an imitative robot: A series of single case experiments. Interaction Studies, 13(3), 315–347. doi: 10.1075/is.13.3.01tap

  • Testart, J., Del Solar, J. R., Schulz, R., Guerrero, P., & Palma-Amestoy, R. (2011). A real-time hybrid architecture for biped humanoids with active vision mechanisms. Journal of Intelligent & Robotic Systems, 63(2), 233–255.

  • Tielman, M., Neerincx, M., Meyer, J.-J., & Looije, R. (2014, March). Adaptive emotional expression in robot–child interaction. In Proceedings of the 2014 ACM/IEEE International Conference on Human–Robot Interaction (pp. 407–414). ACM.

  • Tokmurzina, D., Sagitzhan, N., Nurgaliyev, A., & Sandygulova, A. (2018, March). Exploring Child–Robot Proxemics. In Companion of the 2018 ACM/IEEE International Conference on Human–Robot Interaction (pp. 257–258). ACM.

  • Toprak, S., Navarro-Guerrero, N., & Wermter, S. (2018). Evaluating integration strategies for visuo-haptic object recognition. Cognitive computation, 10(3), 408–425.

  • Torta, E., Cuijpers, R. H., & Juola, J. F. (2013). Design of a parametric model of personal space for robotic social navigation. International Journal of Social Robotics, 5(3), 357–365.

  • Torta, E., van Heumen, J., Piunti, F., Romeo, L., & Cuijpers, R. (2015). Evaluation of unimodal and multimodal communication cues for attracting attention in human–robot interaction. International Journal of Social Robotics, 7(1), 89–96.

  • Tsardoulias, E. G., Symeonidis, A. L., & Mitkas, P. A. (2015). An automatic speech detection architecture for social robot oral interaction. In Proceedings of the Audio Mostly 2015 on Interaction With Sound (pp. 1–8).

  • Tuisku, O., Pekkarinen, S., Hennala, L., & Melkas, H. (2019). Robots do not replace a nurse with a beating heart. Information Technology & People.

  • Turp, M., González, J. C., Pulido, J. C., & Fernández, F. (2019). Developing a robot–guided interactive Simon game for physical and cognitive training. International Journal of Humanoid Robotics, 16(01), 1950003.

  • Tutsoy, O., Erol Barkana, D., & Colak, S. (2017). Learning to balance an NAO robot using reinforcement learning with symbolic inverse kinematic. Transactions of the Institute of Measurement and Control, 39(11), 1735–1748.

  • Uluer, P., Akalın, N., & Köse, H. (2015). A new robotic platform for sign language tutoring. International Journal of Social Robotics, 7(5), 571–585.

  • Valentí Soler, M., Agüera-Ortiz, L., Olazarán Rodríguez, J., Mendoza Rebolledo, C., Pérez Muñoz, A., Rodríguez Pérez, I., … & Felipe Ruiz, S. (2015). Social robots in advanced dementia. Frontiers in aging neuroscience, 7, 133. doi: 10.3389/fnagi.2015.00133

  • Valtazanos, A., & Ramamoorthy, S. (2013, March). Evaluating the effects of limited perception on interactive decisions in mixed robotic domains. In 2013 8th ACM/IEEE International Conference on Human–Robot Interaction (HRI) (pp. 9–16). IEEE.

  • Valtazanos, A., & Ramamoorthy, S. (2013, May). Bayesian interaction shaping: Learning to influence strategic interactions in mixed robotic domains. In Proceedings of the 2013 international conference on Autonomous agents and multi-agent systems (pp. 63–70).

  • Vanderelst, D., & Winfield, A. (2018). An architecture for ethical robots inspired by the simulation theory of cognition. Cognitive Systems Research, 48, 56–66.

  • van Dijk, E. T., Torta, E., & Cuijpers, R. H. (2013). Effects of eye contact and iconic gestures on message retention in human–robot interaction. International Journal of Social Robotics, 5(4), 491–501.

  • van den Heuvel, R. J., Lexis, M. A., & de Witte, L. P. (2017). Robot ZORA in rehabilitation and special education for children with severe physical disabilities: a pilot study. International Journal of Rehabilitation Research, 40(4), 353.

  • van den Heuvel, R. J., Lexis, M. A., & de Witte, L. P. (2019). ZORA Robot Based Interventions to Achieve Therapeutic and Educational Goals in Children with Severe Physical Disabilities. International Journal of Social Robotics, 1–12.

  • van der Woerdt, S., & Haselager, P. (2019). When robots appear to have a mind: The human perception of machine agency and responsibility. New Ideas in Psychology, 54, 93–100.

  • van Straten, C. L., Peter, J., & Kühne, R. (2019). Child–Robot Relationship Formation: A Narrative Review of Empirical Research. International Journal of Social Robotics, 1–20.

  • Vänni, K. J., & Salin, S. E. (2017, November). A need for service robots among health care professionals in hospitals and housing services. In International Conference on Social Robotics (pp. 178–187). Springer, Cham.

  • Vänni, K. J., & Salin, S. E. (2019). Attitudes of professionals toward the need for assistive and social robots in the healthcare sector. In Social Robots: Technological, Societal and Ethical Aspects of Human–Robot Interaction (pp. 205–236). Springer, Cham.

  • Xu, J., Broekens, J., Hindriks, K., & Neerincx, M. A. (2015). Mood contagion of robot body language in human robot interaction. Autonomous Agents and Multi-Agent Systems, 29(6), 1216–1248. doi: 10.1007/s10458-015-9307-3

  • Wallkötter, S., Joannou, M., Westlake, S., & Belpaeme, T. (2017, October). Continuous Multi-Modal Interaction Causes Human–Robot Alignment. In Proceedings of the 5th International Conference on Human Agent Interaction (pp. 375–379).

  • Wang, B., & Rau, P. L. P. (2019). Influence of Embodiment and Substrate of Social Robots on Users’ Decision-Making and Attitude. International Journal of Social Robotics, 11(3), 411–421.

  • Wen, S., Sheng, M., Ma, C., Li, Z., Lam, H. K., Zhao, Y., & Ma, J. (2018). Camera recognition and laser detection based on EKF-SLAM in the autonomous navigation of humanoid robot. Journal of Intelligent & Robotic Systems, 92(2), 265–277.

  • Wigdor, N., Fraaije, A., Solms, L., de Greeff, J., Janssen, J., & Blanson Henkemans, O. (2014, March). The NAO goes to camp. In Proceedings of the 2014 ACM/IEEE international conference on Humanrobot interaction (pp. 110–110).

  • Wilke, C., Götz, S., & Richly, S. (2013, March). JouleUnit: a generic framework for software energy profiling and testing. In Proceedings of the 2013 workshop on Green in/by software engineering (pp. 9–14).

  • Willemse, C. J., & van Erp, J. B. (2019). Social touch in Human–robot interaction: Robot-initiated touches can induce positive responses without extensive prior bonding. International journal of social robotics, 11(2), 285–304.

  • Wilson, J. R., Lee, N. Y., Saechao, A., Tickle-Degnen, L., & Scheutz, M. (2018). Supporting human autonomy in a robot-assisted medication sorting task. International Journal of Social Robotics, 10(5), 621–641.

  • Wykowska, A., Kajopoulos, J., Obando-Leiton, M., Chauhan, S. S., Cabibihan, J. J., & Cheng, G. (2015). Humans are well tuned to detecting agents among non-agents: examining the sensitivity of human perception to behavioral characteristics of intentional systems. International Journal of Social Robotics, 7(5), 767–781.

  • Xu, J. (2014, May). Body language of humanoid robots for mood expression. In Proceedings of the 2014 international conference on Autonomous agents and multi-agent systems (pp. 1711–1712).

  • Zanatto, D., Patacchiola, M., Cangelosi, A., & Goslin, J. (2019). Generalisation of Anthropomorphic Stereotype. International Journal of Social Robotics, 1–10.

  • Zeller, F., Smith, D. H., Duong, J. A., & Mager, A. (2019). Social Media in Human–Robot Interaction. International Journal of Social Robotics, 1–14.

  • Zhu, T., Zhao, Q., Wan, W., & Xia, Z. (2017). Robust regression-based motion perception for online imitation on humanoid robot. International Journal of Social Robotics, 9(5), 705–725.


Cite this article

Robaczewski, A., Bouchard, J., Bouchard, K. et al. Socially Assistive Robots: The Specific Case of the NAO. Int J of Soc Robotics (2020). https://doi.org/10.1007/s12369-020-00664-7


Keywords

  • Socially assistive robot
  • NAO
  • Social interactions
  • Affectivity
  • Intervention
  • Assisted teaching
  • Mild cognitive impairment
  • Dementia
  • Autism
  • Intellectual disability