
Chatbot Personalities Matters

Improving the User Experience of Chatbot Interfaces

  • Conference paper

Internet Science (INSCI 2018)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11551)

Abstract

In this study, we investigated the impact of a match in personality between a chatbot and the user. Previous research has proposed that personality can offer a stable pattern for how chatbots are perceived and add consistency to the user experience. These assumptions about the effects of personality were investigated by measuring the effects of two chatbot agents, with two levels of personality, on the user experience. This study found that personality has a significant positive effect on the user experience of chatbot interfaces, but that this effect depends on the context, the job the chatbot performs, and its user group.



Author information

Corresponding author

Correspondence to Tuva Lunde Smestad.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Smestad, T.L., Volden, F. (2019). Chatbot Personalities Matters. In: Bodrunova, S., et al. Internet Science. INSCI 2018. Lecture Notes in Computer Science, vol 11551. Springer, Cham. https://doi.org/10.1007/978-3-030-17705-8_15

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-17705-8_15


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-17704-1

  • Online ISBN: 978-3-030-17705-8

  • eBook Packages: Computer Science, Computer Science (R0)
