Creativity in Measuring Trust in Human-Robot Interaction Using Interactive Dialogs
Measuring human trust in humanoid robots during human-robot interaction requires novel approaches that can predict trust effectively. We present a method that maps subjective measures (i.e., general trust and psychological measures) to objective, physiological measures to predict trust. We designed interactive dialogs representing real-world service scenarios in Business, Disaster, and Healthcare. The dialogs embedded fifteen trust attributes of Ability, Benevolence, and Integrity (ABI). The ABI measures were mapped to physiological measures of facial expressions, voiced speech, and camera-based heart rate. Forty-eight subjects, 24 males and 24 females aged 18 to 36 years, participated in the experiment; half were Malay and half were Chinese. Three humanoid robots represented full-bodied, partial-bodied, and virtual agents. The experiment used a within-subjects design: each subject was tested with all robots in all scenarios. Subjects scored trust on an online scale ranging from 0 to 7 points. The subjective data were analyzed using univariate analyses and one-way MANOVA. The results showed the humanoids to be trustworthy in different service tasks. The 'Integrity' and 'Ability' trust components were important in the Business and Disaster scenarios. Trust estimation was about 83% accurate using this creative approach. In conclusion, humanoid robots can interact with humans using dialogs that are representative of real-world communication.
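As a rough illustration of the within-subjects layout described above, the 0-7 trust ratings form a subjects × robots × scenarios array whose cell means can be inspected before multivariate testing. This is a minimal sketch with invented data; the robot and scenario labels follow the abstract, but the scores are not the study's actual ratings:

```python
import numpy as np

# Hypothetical within-subjects ratings: 48 subjects x 3 robots x 3 scenarios,
# each cell a trust score on the study's 0-7 scale (values invented here).
rng = np.random.default_rng(0)
scores = rng.integers(0, 8, size=(48, 3, 3)).astype(float)

robots = ["full-bodied", "partial-bodied", "virtual agent"]
scenarios = ["Business", "Disaster", "Healthcare"]

# Mean trust per robot x scenario cell, averaged over subjects.
cell_means = scores.mean(axis=0)  # shape (3, 3)

for i, robot in enumerate(robots):
    for j, scen in enumerate(scenarios):
        print(f"{robot:>14} / {scen:<10}: {cell_means[i, j]:.2f}")
```

In practice each cell mean would feed a one-way MANOVA (e.g., via a statistics package) with robot type or scenario as the factor, as the abstract's analysis describes.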
Keywords: Trust · Dialog design · Human-robot interaction
We gratefully acknowledge the financial support of the US Air Force Office of Scientific Research (AFOSR), Washington, D.C., and the Asian Office of Aerospace Research and Development (AOARD), Japan, under Grant No. FA2386-14-1-0016.