The Role of Psychophysiological Measures as Implicit Communication Within Mixed-Initiative Teams

  • Kim Drnec
  • Greg Gremillion
  • Daniel Donavanik
  • Jonroy D. Canady
  • Corey Atwater
  • Evan Carter
  • Ben A. Haynes
  • Amar R. Marathe
  • Jason S. Metcalfe
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10909)

Abstract

There has been considerable effort, particularly in the military, toward integrating automated agents into human teams. Current automated agents lack the ability to adapt intelligently to a dynamic operational environment, which relegates them to acting as tools rather than teammates. Rapidly advancing technology, however, is enabling the development of autonomous agents that can actively make team-oriented decisions, meaning truly intelligent autonomous agents are on the horizon. This makes understanding what is important to team performance a critical goal. In human teams, mission success depends on the development of shared mental models and situation awareness, and developing these constructs requires good intra-team communication. Establishing effective intra-team communication in a mixed-initiative team, however, remains a bottleneck in achieving successful teams. Significant research has aimed at identifying modes of communication that can be used by both human and agent teammates, but it often neglects a source of information for the agent teammate that the human-robot interaction community has adopted to increase robot acceptance: psychophysiological features supplied to the agent, which can then apply algorithms to infer the cognitive state of its human teammate. The utility of psychophysiological features for communication within teams has not been widely explored, representing a knowledge gap in developing mixed-initiative teams. To address this gap, we designed an experimental paradigm that created an integrated human-automation team in which psychophysiological data were collected and analyzed in real time. We briefly present a general background on human-automation teaming before presenting our research and preliminary analysis.
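The inference step described above (mapping psychophysiological features to an estimate of the human teammate's cognitive state that the agent can act on) can be illustrated with a minimal sketch. This is not the authors' algorithm; the feature names, weights, and logistic form are illustrative assumptions standing in for a model that would, in practice, be trained on labeled psychophysiological data.

```python
import math

# Hypothetical feature weights; in a real system these would be learned
# from labeled psychophysiological recordings (all values illustrative).
WEIGHTS = {"heart_rate": 0.04, "eeg_engagement": 2.5, "pupil_diameter": 0.8}
BIAS = -6.0

def infer_workload(features):
    """Map a dict of psychophysiological features to P(high workload)
    via a logistic model -- a stand-in for the kind of real-time
    cognitive-state inference discussed in the paper."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Example: an agent teammate could poll this estimate each control cycle
# and adapt its level of autonomy when inferred workload is high.
sample = {"heart_rate": 95.0, "eeg_engagement": 0.7, "pupil_diameter": 4.2}
p = infer_workload(sample)
print(f"P(high workload) = {p:.2f}")
if p > 0.5:
    print("agent: increase automation support")
```

The point of the sketch is the implicit-communication loop: the human says nothing, yet the agent continuously receives state information it can use to adapt its behavior.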

Keywords

Automation · Autonomy · Robot · Mixed-initiative teams · Human automation interaction · Bi-directional communication · Psychophysiology

Notes

Acknowledgements

This research was supported by the U.S. Office of the Secretary of Defense through the Autonomy Research Pilot Initiative (MIPR DWAM31168), as well as by an appointment to the U.S. Army Research Laboratory Postdoctoral Fellowship program administered by the Oak Ridge Associated Universities through a cooperative agreement with the U.S. Army Research Laboratory. We would also like to acknowledge the contributions of the Air Force Research Laboratory.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Kim Drnec (1)
  • Greg Gremillion (2)
  • Daniel Donavanik (2)
  • Jonroy D. Canady (1)
  • Corey Atwater (3)
  • Evan Carter (1)
  • Ben A. Haynes (4)
  • Amar R. Marathe (1)
  • Jason S. Metcalfe (1)

  1. Army Research Laboratory, Aberdeen Proving Ground, Aberdeen, USA
  2. Adelphi Laboratory Center, Adelphi, USA
  3. DCS Corp, Warren, USA
  4. U.S. Army Tank Automotive Research Development and Engineering Center, Warren, USA
