Abstract
The working world is undergoing constant change. New technologies are emerging that enable new forms of human-system interaction. Autonomous robots in service industries and in manufacturing settings in particular create novel forms of human-robot interaction. Researchers, system integrators, and practitioners alike are confronted with the question of how to analyze, evaluate, and ultimately design these new working systems in a human-centered way. In this paper we present evaluation criteria, together with a toolkit of concrete measures, that enable a holistic evaluation of the cognitive aspects of human-robot interaction in work-related scenarios. The evaluation criteria comprise both technology-related and human-related parameters. The paper further presents a first empirical validation of the evaluation criteria and their measurements. The validation study uses a manual assembly task accomplished with a lightweight robot. The results indicate that the evaluation criteria can be used to describe the quality of human-robot interaction.
Acknowledgments
Parts of this work were developed within the research project Hybr-iT. The research project Hybr-iT is funded by the Federal Ministry of Education and Research and is administered by the DLR Project Management Agency (reference no. 01IS16026H).
© 2019 Springer Nature Switzerland AG
Cite this paper
Rosen, P.H., Sommer, S., Wischniwski, S. (2019). Evaluation of Human-Robot Interaction Quality: A Toolkit for Workplace Design. In: Bagnara, S., Tartaglia, R., Albolino, S., Alexander, T., Fujita, Y. (eds) Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018). IEA 2018. Advances in Intelligent Systems and Computing, vol 824. Springer, Cham. https://doi.org/10.1007/978-3-319-96071-5_169
Print ISBN: 978-3-319-96070-8
Online ISBN: 978-3-319-96071-5
eBook Packages: Intelligent Technologies and Robotics