
Construction of a Computer Vision Test Platform: VISART for Facial Recognition in Social Robotics

  • Edwin Rodríguez
  • Christian Gutiérrez
  • Cristian Ochoa
  • Freddy Trávez
  • Luis Escobar
  • David Loza
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1194)

Abstract

Robotics has undoubtedly found its way deeper into everyday human tasks, to the point where robots now share workspaces with people. In this context, social robotics has broadened its field of action, driven in large part by advances in the components that make up Human-Machine Interaction. This article reports the progress that a research team at Universidad de las Fuerzas Armadas ESPE has made in creating a test platform for Computer Vision algorithms. The approach consists of a 3-DOF articulated robotic head with anthropomorphic characteristics, designed according to the guidelines established by different authors for social robotics; in addition, it provides affordable, flexible, and scalable hardware resources that facilitate the implementation, testing, analysis, and development of different stereo artificial vision algorithms. The system, called Visart, is intended to work in more natural situations than stand-alone cameras and thus to support deeper work in Human-Robot Interaction. As the first report on the Visart prototype, some basic tests were performed with the platform: face detection, facial and expression recognition, object tracking, and object distance calculation. The results obtained were consistent, opening the door for further research that compares more computer vision algorithms, examines their performance in real scenarios, and evaluates them more naturally with our prototype for Human-Robot Interaction.
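The object distance calculation mentioned above typically rests on stereo triangulation from a calibrated, rectified camera pair. The following is a minimal sketch of that underlying relation, not the platform's actual implementation; the focal length, baseline, and disparity values are illustrative assumptions.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z (in metres) of a point seen by a rectified stereo pair:
    Z = f * B / d, where f is the focal length in pixels, B the camera
    baseline in metres, and d the horizontal disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point visible in both views)")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 700 px focal length, 6 cm baseline, 30 px disparity
print(round(stereo_depth(700.0, 0.06, 30.0), 3))  # 1.4 m
```

In practice the per-pixel disparity map would come from a stereo matcher, and depth follows from the same relation applied pixel-wise; nearer objects produce larger disparities and hence smaller depths.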

Keywords

HRI · Computer Vision · Human-Robot Interaction · Anthropomorphic robot · Social robotics


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Edwin Rodríguez (1)
  • Christian Gutiérrez (1)
  • Cristian Ochoa (1)
  • Freddy Trávez (1)
  • Luis Escobar (1)
  • David Loza (1)

  1. Department of Energy and Mechanics, Universidad de las Fuerzas Armadas - ESPE, Pichincha, Ecuador
