Do Smart Speakers Respond to Their Errors Properly? A Study on Human-Computer Dialogue Strategy

  • Xiang Ge (email author)
  • Dan Li
  • Daisong Guan
  • Shihui Xu
  • Yanyan Sun
  • Moli Zhou
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11584)

Abstract

As smart speakers with voice interaction capabilities continue to spread worldwide, more and more people will gradually become accustomed to voice as a new interaction medium. Although speech recognition and natural language processing (NLP) have improved greatly over the past few years, users may still encounter errors from time to time, such as "cannot understand" and "no requested audio resource (such as music)", which can frustrate them. Therefore, when an error occurs, it is vital that the smart speaker gives an effective and proper response. Currently, the response strategies adopted by leading smart speaker brands in China differ mainly along two dimensions: "apology or not" and "humorous or neutral". We explored users' preferences among response strategies under two error scenarios: "cannot understand" and "no requested audio resource". A 2 (apology: yes vs. no) × 2 (error message tone: humorous vs. neutral) within-subjects experiment was conducted, with two dependent variables: satisfaction and perceived sincerity of the response. The results showed that participants were more satisfied and perceived higher sincerity when the smart speaker apologized, in both error scenarios. In the "no requested audio resource" scenario, humor had no significant impact on satisfaction or perceived sincerity, but in the "cannot understand" scenario, humorous expression decreased perceived sincerity.
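The 2 × 2 within-subjects design described above can be illustrated with a small sketch: every participant rates all four apology-by-tone conditions, and comparing the marginal means of each factor gives its main effect. The ratings below are synthetic values invented for illustration, not data from the study.

```python
# Hypothetical sketch of a 2 (apology) x 2 (tone) within-subjects design.
# All ratings are synthetic illustrative values, not the study's data.
from statistics import mean

# Each participant rates satisfaction under all four conditions
# (within-subjects: every row of ratings covers the same 3 participants).
ratings = {
    ("apology", "humor"):      [4, 5, 4],
    ("apology", "neutral"):    [5, 4, 5],
    ("no_apology", "humor"):   [2, 3, 2],
    ("no_apology", "neutral"): [3, 3, 2],
}

def main_effect(factor_level):
    """Marginal mean: average rating over all cells at the given factor level."""
    vals = [r for cond, rs in ratings.items() if factor_level in cond for r in rs]
    return mean(vals)

print("apology marginal means:", main_effect("apology"), "vs", main_effect("no_apology"))
print("tone marginal means:   ", main_effect("humor"), "vs", main_effect("neutral"))
```

In an actual analysis, such marginal means would be compared with a repeated-measures ANOVA (accounting for the within-subjects correlation) rather than by inspection; the sketch only shows how the four cells and two main effects are laid out.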

Keywords

Voice interaction · Smart speaker · Error · Emotion · Human-computer interaction · Humor · Apology

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Xiang Ge (1) (email author)
  • Dan Li (1)
  • Daisong Guan (1)
  • Shihui Xu (1)
  • Yanyan Sun (1)
  • Moli Zhou (1)
  1. Baidu AI Interaction Design Lab, Beijing, China
