Gender Bias in Chatbot Design

Conference paper in: Chatbot Research and Design (CONVERSATIONS 2019)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11970)

Abstract

A recent UNESCO report reveals that the most popular voice-based conversational agents are designed to be female, and outlines the potentially harmful effects this can have on society. However, the report focuses primarily on voice-based conversational agents; its analysis did not include chatbots (i.e., text-based conversational agents). Since chatbots can also be gendered in their design, we used an automated gender analysis approach to investigate three gender-specific cues in the design of 1,375 chatbots listed on the platform chatbots.org. We leveraged two gender APIs to identify the gender of each chatbot's name, a face recognition API to identify the gender of its avatar, and a text mining approach to analyze gender-specific pronouns in its description. Our results suggest that gender-specific cues are commonly used in the design of chatbots and that most chatbots are, explicitly or implicitly, designed to convey a specific gender. More specifically, most chatbots have female names, female-looking avatars, and are described as female. This is particularly evident in three application domains (i.e., branded conversations, customer service, and sales). We thus find evidence of a tendency to prefer one gender (i.e., female) over the other (i.e., male), and argue that there is a gender bias in the design of chatbots in the wild. Based on these findings, we formulate propositions as a starting point for future discussions and research to mitigate gender bias in the design of chatbots.
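The third cue check described above, mining gender-specific pronouns from a chatbot's description, can be illustrated with a minimal sketch. The pronoun lists and the majority-vote rule here are illustrative assumptions, not the authors' exact implementation; their full pipeline additionally queried two gender APIs for names and a face recognition API for avatars, which are not reproduced here.

```python
import re
from collections import Counter

# Illustrative pronoun lists (assumed; the paper does not publish its exact word lists).
FEMALE_PRONOUNS = {"she", "her", "hers", "herself"}
MALE_PRONOUNS = {"he", "him", "his", "himself"}

def pronoun_gender_cue(description: str) -> str:
    """Classify a chatbot description as 'female', 'male', or 'neutral'
    depending on which set of gender-specific pronouns dominates."""
    # Tokenize on lowercase word characters (simple whitespace/punctuation split).
    tokens = re.findall(r"[a-z']+", description.lower())
    counts = Counter(tokens)
    female = sum(counts[w] for w in FEMALE_PRONOUNS)
    male = sum(counts[w] for w in MALE_PRONOUNS)
    if female > male:
        return "female"
    if male > female:
        return "male"
    return "neutral"

# Example usage on a hypothetical chatbot description.
print(pronoun_gender_cue("Anna is a virtual assistant. She answers questions about her company."))
```

Applied over a corpus of descriptions, aggregating these labels (together with the name- and avatar-based cues) yields the kind of distribution the study reports.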


References

  1. ACM: Code of Ethics and Professional Conduct. https://www.acm.org/code-of-ethics (2019). Accessed 26 July 2019

  2. de Angeli, A., Brahnam, S.: Sex Stereotypes and Conversational Agents (2006)

  3. Araujo, T.: Living up to the chatbot hype: the influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Comput. Hum. Behav. 85, 183–189 (2018). https://doi.org/10.1016/j.chb.2018.03.051

  4. Artz, N., Munger, J., Purdy, W.: Gender issues in advertising language. Women Lang. 22(2), 20 (1999)

  5. Beldad, A., Hegner, S., Hoppen, J.: The effect of virtual sales agent (VSA) gender – product gender congruence on product advice credibility, trust in VSA and online vendor, and purchase intention. Comput. Hum. Behav. 60, 62–72 (2016). https://doi.org/10.1016/j.chb.2016.02.046

  6. Bhagyashree, R.: A chatbot toolkit for developers: design, develop, and manage conversational UI (2019). https://hub.packtpub.com/chatbot-toolkit-developers-design-develop-manage-conversational-ui/. Accessed 22 July 2019

  7. Bickmore, T.W., Picard, R.W.: Establishing and maintaining long-term human-computer relationships. ACM Trans. Comput.-Hum. Interact. 12(2), 293–327 (2005). https://doi.org/10.1145/1067860.1067867

  8. Bohnet, I.: What Works. Harvard University Press (2016)

  9. Brahnam, S., de Angeli, A.: Gender affordances of conversational agents. Interact. Comput. 24(3), 139–153 (2012). https://doi.org/10.1016/j.intcom.2012.05.001

  10. Brandtzaeg, P.B., Følstad, A.: Chatbots: changing user needs and motivations. Interactions 25(5), 38–43 (2018). https://doi.org/10.1145/3236669

  11. Burnett, M., et al.: GenderMag: a method for evaluating software’s gender inclusiveness. Interact. Comput. 28(6), 760–787 (2016). https://doi.org/10.1093/iwc/iwv046

  12. Council of Europe: Discrimination, artificial intelligence, and algorithmic decision-making (2018). https://rm.coe.int/discrimination-artificial-intelligence-and-algorithmic-decision-making/1680925d73

  13. Cowell, A.J., Stanney, K.M.: Manipulation of non-verbal interaction style and demographic embodiment to increase anthropomorphic computer character credibility. Int. J. Hum.-Comput. Stud. 62(2), 281–306 (2005). https://doi.org/10.1016/j.ijhcs.2004.11.008

  14. Dale, R.: The return of the chatbots. Nat. Lang. Eng. 22(5), 811–817 (2016). https://doi.org/10.1017/S1351324916000243

  15. EU: Ethics Guidelines for Trustworthy AI (2019). https://ec.europa.eu/futurium/en/ai-alliance-consultation. Accessed 30 July 2019

  16. Feine, J., Gnewuch, U., Morana, S., Maedche, A.: A taxonomy of social cues for conversational agents. Int. J. Hum.-Comput. Stud. 132, 138–161 (2019). https://doi.org/10.1016/j.ijhcs.2019.07.009

  17. Feine, J., Morana, S., Maedche, A.: Designing a chatbot social cue configuration system. In: Proceedings of the 40th International Conference on Information Systems (ICIS). AISel, Munich (2019)

  18. Feine, J., Morana, S., Maedche, A.: Leveraging machine-executable descriptive knowledge in design science research – the case of designing socially-adaptive chatbots. In: Tulu, B., Djamasbi, S., Leroy, G. (eds.) DESRIST 2019. LNCS, vol. 11491, pp. 76–91. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-19504-5_6

  19. Følstad, A., Brandtzæg, P.B.: Chatbots and the new world of HCI. Interactions 24(4), 38–42 (2017). https://doi.org/10.1145/3085558

  20. Følstad, A., Brandtzaeg, P.B., Feltwell, T., Law, E.L.-C., Tscheligi, M., Luger, E.A.: SIG: chatbots for social good. In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, SIG06:1‐SIG06:4. ACM, New York (2018). https://doi.org/10.1145/3170427.3185372

  21. Følstad, A., Skjuve, M., Brandtzaeg, P.: Different chatbots for different purposes: towards a typology of chatbots to understand interaction design, pp. 145–156 (2019)

  22. Gnewuch, U., Morana, S., Maedche, A.: Towards designing cooperative and social conversational agents for customer service. In: Proceedings of the 38th International Conference on Information Systems (ICIS). AISel, Seoul (2017)

  23. Hayashi, Y.: Lexical network analysis on an online explanation task. Effects of affect and embodiment of a pedagogical agent. IEICE Trans. Inf. Syst. 99(6), 1455–1461 (2016). https://doi.org/10.1587/transinf.2015CBP0005

  24. Hone, K.: Empathic agents to reduce user frustration. The effects of varying agent characteristics. Interact. Comput. 18(2), 227–245 (2006). https://doi.org/10.1016/j.intcom.2005.05.003

  25. Johannsen, F., Leist, S., Konadl, D., Basche, M., de Hesselle, B.: Comparison of commercial chatbot solutions for supporting customer interaction. In: Proceedings of the 26th European Conference on Information Systems (ECIS), Portsmouth, United Kingdom, 23–28 June 2018

  26. Kraemer, N.C., Karacora, B., Lucas, G., Dehghani, M., Ruether, G., Gratch, J.: Closing the gender gap in STEM with friendly male instructors? On the effects of rapport behavior and gender of a virtual agent in an instructional interaction. Comput. Educ. 99, 1–13 (2016). https://doi.org/10.1016/j.compedu.2016.04.002

  27. Louwerse, M.M., Graesser, A.C., Lu, S.L., Mitchell, H.H.: Social cues in animated conversational agents. Appl. Cogn. Psychol. 19(6), 693–704 (2005). https://doi.org/10.1002/acp.1117

  28. McDonnell, M., Baxter, D.: Chatbots and gender stereotyping. Interact. Comput. 31(2), 116–121 (2019). https://doi.org/10.1093/iwc/iwz007

  29. McTear, M.F.: The rise of the conversational interface: a new kid on the block? In: Quesada, J.F., Martín Mateos, F.J., López-Soto, T. (eds.) FETLT 2016. LNCS (LNAI), vol. 10341, pp. 38–49. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-69365-1_3

  30. Microsoft: Face recognition API (2019). https://azure.microsoft.com/en-us/services/cognitive-services/face/. Accessed 22 July 2019

  31. Myers, M.D., Venable, J.R.: A set of ethical principles for design science research in information systems. Inf. Manag. 51(6), 801–809 (2014). https://doi.org/10.1016/j.im.2014.01.002

  32. Nass, C., Moon, Y.: Machines and mindlessness social responses to computers. J. Soc. Issues 56(1), 81–103 (2000). https://doi.org/10.1111/0022-4537.00153

  33. Nass, C., Steuer, J., Tauber, E.R.: Computers are social actors. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 72–78. ACM, New York (1994). https://doi.org/10.1145/191666.191703

  34. Nass, C., Moon, Y., Green, N.: Are machines gender neutral? Gender-stereotypic responses to computers with voices. J. Appl. Soc. Psychol. 27(10), 864–876 (1997). https://doi.org/10.1111/j.1559-1816.1997.tb00275.x

  35. Niculescu, A., Hofs, D., van Dijk, B., Nijholt, A.: How the agent’s gender influence users’ evaluation of a QA system. In: International Conference on User Science and Engineering (i-USEr) (2010)

  36. npmjs: Gender-detection (2019). https://www.npmjs.com/package/gender-detection. Accessed 22 July 2019

  37. Nunamaker, J.E., Derrick, D.C., Elkins, A.C., Burgoon, J.K., Patton, M.W.: Embodied conversational agent-based kiosk for automated interviewing. J. Manag. Inf. Syst. 28(1), 17–48 (2011). https://doi.org/10.2753/mis0742-1222280102

  38. Rosenwald, M.S.: How millions of kids are being shaped by know-it-all voice assistants (2019). https://www.washingtonpost.com/local/how-millions-of-kids-are-being-shaped-by-know-it-all-voice-assistants/2017/03/01/c0a644c4-ef1c-11e6-b4ff-ac2cf509efe5_story.html?noredirect=on&utm_term=.7d67d631bd52. Accessed 16 July 2019

  39. United Nations: Sustainability development goals. Goal 5: gender equality (2015). https://www.sdgfund.org/goal-5-gender-equality. Accessed 30 Oct 2019

  40. Vala, M., Blanco, G., Paiva, A.: Providing gender to embodied conversational agents. In: Vilhjálmsson, H.H., Kopp, S., Marsella, S., Thórisson, Kristinn R. (eds.) IVA 2011. LNCS (LNAI), vol. 6895, pp. 148–154. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-23974-8_16

  41. Verhagen, T., van Nes, J., Feldberg, F., van Dolen, W.: Virtual customer service agents. Using social presence and personalization to shape online service encounters. J. Comput.-Mediat. Commun. 19(3), 529–545 (2014). https://doi.org/10.1111/jcc4.12066

  42. Weizenbaum, J.: ELIZA - a computer program for the study of natural language communication between man and machine. Commun. ACM 9(1), 36–45 (1966)

  43. West, M., Kraut, R., Chew, H.E.: I’d blush if I could: closing gender divides in digital skills through education (2019). https://unesdoc.unesco.org/ark:/48223/pf0000367416

Author information

Corresponding author: Jasper Feine.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Feine, J., Gnewuch, U., Morana, S., Maedche, A. (2020). Gender Bias in Chatbot Design. In: Følstad, A., et al. (eds.) Chatbot Research and Design. CONVERSATIONS 2019. Lecture Notes in Computer Science, vol 11970. Springer, Cham. https://doi.org/10.1007/978-3-030-39540-7_6

  • DOI: https://doi.org/10.1007/978-3-030-39540-7_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-39539-1

  • Online ISBN: 978-3-030-39540-7

  • eBook Packages: Computer Science, Computer Science (R0)
