
Towards seamless human robot collaboration: integrating multimodal interaction

  • Stergios Papanastasiou
  • Niki Kousi
  • Panagiotis Karagiannis
  • Christos Gkournelos
  • Apostolis Papavasileiou
  • Konstantinos Dimoulas
  • Konstantinos Baris
  • Spyridon Koukas
  • George Michalos
  • Sotiris Makris (corresponding author)
ORIGINAL ARTICLE

Abstract

This paper discusses the challenges in the collaboration between human operators and industrial robots for assembly operations, focusing on safety and simplified interaction. A case study is presented, involving perception technologies for the robot in conjunction with wearable devices used by the operator. In terms of robot perception, a manual guidance module, an air-pressure contact sensor (namely a safety skin), and a vision system for the recognition and tracking of objects have been developed and integrated. Concerning the wearable devices, an advanced user interface including audio and haptic commands, accompanied by augmented reality technology, is used to support the operator and provide awareness by visualizing information related to production and safety aspects. In parallel, safety functionalities are implemented through collision detection technologies, such as the safety skin, and through safety-monitored regions delimiting the area of the robot's activities. The complete system is coordinated under a common integration platform and is validated in a case study from the white goods industry.
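To make the coordination idea concrete, the following is a minimal illustrative sketch (not taken from the paper) of how an integration platform might gate operator commands, whether issued by voice, haptic input, or a user interface, behind the safety signals described above (safety skin, monitored regions). All class, method, and signal names here are hypothetical.

```python
# Hypothetical sketch: a coordinator that executes operator commands only
# while every registered safety source (e.g. safety skin, monitored zone)
# reports "clear". Names and structure are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class StationCoordinator:
    safety_clear: Dict[str, bool] = field(default_factory=dict)  # source -> clear?
    handlers: Dict[str, Callable[[], None]] = field(default_factory=dict)
    log: List[str] = field(default_factory=list)

    def report_safety(self, source: str, clear: bool) -> None:
        """Update the latest state of one safety input."""
        self.safety_clear[source] = clear

    def register(self, command: str, handler: Callable[[], None]) -> None:
        """Map an operator command (voice/haptic/UI) to a robot action."""
        self.handlers[command] = handler

    def dispatch(self, command: str) -> bool:
        """Run the command only if all safety sources are clear."""
        if not all(self.safety_clear.values()):
            self.log.append(f"blocked: {command}")
            return False
        self.handlers[command]()
        self.log.append(f"executed: {command}")
        return True


coord = StationCoordinator()
coord.report_safety("safety_skin", True)
coord.report_safety("monitored_zone", True)
coord.register("start_assembly", lambda: None)  # placeholder robot action
coord.dispatch("start_assembly")                # runs: all sources clear
coord.report_safety("safety_skin", False)       # contact detected on the skin
coord.dispatch("start_assembly")                # blocked until skin clears
```

The point of the sketch is the design choice it encodes: safety inputs are evaluated at dispatch time, so a tripped sensor blocks every subsequent command regardless of which interaction modality issued it.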

Keywords

Human robot collaboration · Interaction · Augmented reality · Wearable devices · Safety · Integration

Acknowledgements

This research was supported by the EC research project “ROBO-PARTNER – Seamless Human-Robot Cooperation for Intelligent, Flexible and Safe Operations in the Assembly Factories of the Future” (Grant Agreement: 608855) (www.robo-partner.eu). The authors would like to specially thank Electrolux Italia S.P.A. (ELUX) for providing valuable input for the current status, the challenges, and the requirements of the refrigerator’s assembly line used as a case study in the present work.


Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  • Stergios Papanastasiou¹
  • Niki Kousi¹
  • Panagiotis Karagiannis¹
  • Christos Gkournelos¹
  • Apostolis Papavasileiou¹
  • Konstantinos Dimoulas¹
  • Konstantinos Baris¹
  • Spyridon Koukas¹
  • George Michalos¹
  • Sotiris Makris¹ (corresponding author)

  1. Department of Mechanical Engineering and Aeronautics, Laboratory for Manufacturing Systems and Automation, University of Patras, Patras, Greece
