
Culture & Environment

Katy Cook

Abstract

This chapter looks critically at the environment and culture of Silicon Valley, with particular attention to issues around gender and ethnic diversity, and the consequences of having a largely homogenous workforce. Chapter 3 explores how occupational inequality in tech is encoded into the industry’s products, the implications of such insularity, and the social and financial benefits of pluralism.

Our individual psychology is complemented and formed, to a large degree, by our environment, which we can think of as the surroundings and conditions of any place we live, work, or spend our time. We exist in multiple environments simultaneously, including our home and family, work life, social circle, and the geographic or physical spaces we inhabit, each of which has its own distinctive patterns, qualities, and behaviors that are considered the norm within that group. Two of the most salient features of any environment are the people who comprise it and the quality of the relationships and interactions that occur within it.

Just as we can work to better understand our individual psychology, we can also infer certain features and qualities about the psychological health of our environments. The disciplines of environmental and cultural psychology explore the relationships, social dynamics, norms, and customs of different cultures and groups. In order to understand a social environment, psychologists might study the makeup of the community: who comprises it? Is it a diverse or homogenous group? Is it inclusive, welcoming, and connected, or exclusive, inhospitable, and isolated? They would also observe the quality of the relationships: are the interactions between members of the group healthy or unhealthy? Do members of the group support each other and treat each other with respect? Finally, we might ask more about the cultural norms: What is permissible? Do all members of the community feel psychologically safe? Answering these questions would not only help us to better understand different environments, but also provide an indication of the relative health of a group’s culture.

At an individual level, the quality of our environment can profoundly impact both our physical and mental health.1,2 Collectively, the culture of an organization informs the broader psychosocial health of its workforce. At an even higher level, the psychological health of an industry—its behavioral norms, group dynamics, and in-group relationships—can have profound social consequences as these bleed beyond the confines of the industry. If the psychology and values of an industry are sound, or at least not grossly negligent or unhealthy, we may never notice or even think about an industry’s culture. When group psychological health is in some way compromised, however, its effects may be observable or experienced beyond the industry itself. A thorough study of pre-2008 Wall Street culture, for example, might have uncovered a male-dominated, risk-oriented, and profit-hungry industry focused on short-term returns. The financial typhoon that resulted from specific behaviors and priorities was, to a large degree, the result of the culture of the industry at that time.

Cupertino, We Have a Problem

My time in Silicon Valley can generally be divided into two categories: things I felt privileged to see and things I wished I could un-see. The former was almost always a result of products shown or described to me: apps that laid out non-partisan local voting information, impact investing platforms, phones designed to reduce specific absorption rates3 to protect users from radiation, early warning indicators for medical imaging, drones that tracked and predicted poaching patterns. The things I wanted to un-see were almost always social: the way people spoke to each other, venture capitalists (VCs) bullying young CEOs, women feeling unwelcome in their jobs, sexual harassment, a lack of awareness of others’ feelings, and a staggering amount of unconscious bias. As the months passed, I realized everything I wanted to un-see came back to a problem of culture, of what was permissible within the working relationships of the industry that elsewhere would not have been acceptable.

A number of unflattering realities, including skewed hiring practices, rampant bias, and a shocking degree of insularity have led to what engineer Erica Joy Baker calls a “catastrophic failure in [the] culture”4 of Silicon Valley. Perpetuated by what Tom Goodwin describes as a “tribe of people that have come together and reinforce questionable values with each other,”5 the cultural problems in Silicon Valley tend to come back to three primary issues, from which a variety of other complications arise. First, tech tends to be an uncommonly homogenous culture, marked by a lack of diversity and an unwillingness to embrace pluralism; second, it is rife with discrimination, including sexism, ageism, and racism, as well as harassment; and third, there is a disturbing level of immaturity that permeates many corporations, often emanating from the highest levels of the company. You can probably already see these issues are interrelated: a homogenous culture is more likely to exhibit discriminatory behaviors; discrimination is more likely to run rampant and unchallenged in an immature organization. Without the awareness necessary to recognize such behaviors as inappropriate, tendencies become patterns, which become increasingly embedded not only in the industry’s culture, but also in its products.

One dynamic that perpetuates the homogeny of the industry is what companies in Silicon Valley refer to as “culture fit,” which is the idea that to be a good addition to the organization, you must possess the same qualities as those already employed within it. Author and venture capitalist Brad Feld explains that culture fit is essentially the practice of “hiring people like everyone else in the company,” and has become the norm in many Silicon Valley companies.6 The result is an industry that has a great deal in common with itself and is comprised primarily of people with similar backgrounds, perspectives, and experiences. The idea of culture fit is so deeply embedded within the vocabulary of Silicon Valley that Google famously has its own word for it: Googley.7 There are two primary problems with Googleyness, aside from the cringe factor. The first is the lack of transparency about what the term encompasses. There is no list of qualities that spell out what would make someone Googley or un-Googley, and therefore there is little insight into whether the qualities Google prioritizes promote a fair and nondiscriminatory work environment. The main problem, however, is the suggestion that there is a single mold of the ideal Google employee, which encourages fitting in rather than standing out, prioritizes homogeny over diversity, and puts pressure on employees, according to former employee Justin Maxwell, to act in a “Googley way.”8

A focus on preserving its existing culture has led many to charge that Googleyness is a vehicle for discrimination. Norman Matloff, who studies age discrimination in tech, explains that unlike gender and racial discrimination, which are captured in annual diversity reports, “the magic word ‘diversity’ doesn’t seem to apply to age in Silicon Valley,”9 despite the fact that age discrimination lawsuits and investigations have plagued Google and other tech giants for years. In 2004, Google fired 52-year-old manager Brian Reid just over a week before the company went public. Reid filed a discrimination suit; according to reports of the complaint, “his supervisors, including the company’s vice president for engineering operations, allegedly called him a poor ‘cultural fit,’ an ‘old guy’ and a ‘fuddy-duddy’ with ideas ‘too old to matter.’”10 (The suit settled out of court for an undisclosed amount.) A more recent case charged that “Googleyness or culture fit are euphemisms for youth and Google interviewers use these to intentionally discriminate on the basis of age.”11 While Google continues to deny charges of ageism and discrimination, the Department of Labor found the company guilty of repeatedly engaging in “extreme” age discrimination.12

The problem of ageism, unfortunately, is not Google’s alone, but an industry-wide bias. Matloff explains that prioritizing younger workers began largely as a cost-cutting exercise, wherein older staff were increasingly replaced with younger and cheaper employees willing to do the same work for less money.13 Yiren Lu has suggested that if tech is “not ageist, then at least increasingly youth-fetishizing,” noting the average age at Facebook is 26 (at the more mature Hewlett-Packard, by contrast, the median age is 39).14 However we label it, the prioritization of youth has resulted not only in destructive patterns of age-related complaints and lawsuits, but the perpetuation of uniformity in an already highly uniform culture.

99 Problems and Diversity’s Just One

In addition to prioritizing youthful employees, tech has historically failed to welcome women and people of color into its ranks across a variety of roles. Year-on-year, diversity reports at tech companies reflect the abysmal demographics of Silicon Valley’s workforce, which remains largely white and predominantly male. While such reports may not capture the complex dynamics behind the industry’s failure of diversity, they remain a useful tool to understand the scale of the problem.

When it comes to gender, recent diversity reports at Google, Facebook, and Microsoft show men make up 70%,15 65%,16 and 74% of all staff,17 respectively, statistics which are broadly reflective of gender demographics across the industry. In technical roles, the numbers skew even higher: at all three companies, men make up approximately 80% of engineering roles. In leadership roles across tech firms in the Bay Area, over 72% of positions are held by men.18 A joint study conducted by Wired and Element AI, for example, found only 12 percent of leading machine learning researchers were women,19 a statistic that has profound implications for future bias embedded in systems that rely on AI. When it comes to ethnic diversity, the numbers are even worse. At Google, Facebook, and Microsoft, white and Asian staff make up 87%, 89%,20 and 90% of all roles, respectively,21 and in technical and leadership roles, the numbers again increase dramatically. For women and people of color who do make it into these roles, data suggests their pay is typically far less than that of their white and Asian male colleagues.22

The problem with focusing on diversity statistics alone is that numbers fail to offer insight into the attitudes, behaviors, and cultural norms of the industry that drive these dynamics. Diversity reports provide quantitative data—which the tech industry loves—but they do not provide qualitative information about why the numbers are the way they are or how to make them better. While some have suggested there are simply not enough women and people of color applying for engineering jobs, research shows that even when under-represented employees are appointed to technical or leadership roles, many tech companies have difficulty retaining them. In a survey of 716 women who had worked in tech, over a quarter cited an “overtly or implicitly discriminatory” environment as their primary reason for leaving the industry.23

Attrition rates in tech are indeed much higher for women and people of color, particularly black and Latin American employees,24 suggesting it is likely the industry’s culture, rather than its pipeline, that makes many tech corporations unwelcoming, unfair, and unhealthy environments for those not in the majority. The suggestion that there should be more women and people of color in tech is not wrong—there should be—but embarking on a hiring spree of non-white, non-male employees will not alone change the culture of the industry, which is deeply embedded in its social and organizational psychology. Social change can be a slow and often painful process and it may take years to effectively modify the norms of a large group or an entire industry. Thankfully, there are many people stepping up to the challenge.

In her book Reset: My Fight for Inclusion and Lasting Change, Ellen Pao describes her experience working in the white, male-dominated world of venture capital at Kleiner Perkins. Pao’s account of Silicon Valley portrays an industry that is not only unwelcoming, but “designed to keep people out who aren’t white men.”

You can’t always get ahead by working hard if you’re not part of the ‘in’ crowd. You will be ostracized no matter how smart you are, how bone-crushingly hard you work, how much money you make for the firm, or even how many times they insist they run a meritocracy. Year after year, we hear the same empty promises about inclusion, and year after year we see the same pitiful results.25

Reset chronicles years of discrimination against both Pao and her female colleagues, including pay disparities and promotions that were repeatedly reserved for male colleagues. Women were consistently driven out of the firm; few lasted more than two to three years. The world Pao portrays in Reset is one of homogeny perpetuated by bias and favoritism. She recalls her former boss speaking to the National Venture Capital Association, describing ideal tech founders as “white, male, nerds who’ve dropped out of Harvard or Stanford” and have absolutely “no social life,”26 perpetuating the false narrative of the consummate engineer: young, Caucasian, and socially skill-less.

Pao eventually filed a discrimination lawsuit against Kleiner Perkins, which she lost, but not before bearing an onslaught of abuse, harassment, and retaliation in her final weeks at the firm. Following her departure, Pao founded Project Include, an initiative that advocates for diversity and inclusion in tech. Both Project Include and Reset make strong cases for amending the psychosocial norms of the tech industry such that they are open to and inclusive of everyone, regardless of gender, race, ethnicity, disability, or age. “To make tech truly diverse,” Pao argues, “we need to make all sorts of people feel welcome and set them up to succeed.”27

Erica Joy Baker, a founding advisor at Project Include, is an engineer who has worked in tech for over a decade. She recounts a similar environment and dynamic within her Silicon Valley engineering teams, each of which Baker describes as comprised predominantly of young, white men. As an African-American woman, Baker recalls feeling that she stuck “out like a sore thumb” in what she soon realized were consistently homogenous surroundings, where she was often neither welcomed nor recognized as an engineer.28

I have been mistaken for an administrative assistant more than once. I have been asked if I was physical security (despite security wearing very distinctive uniforms). I’ve gotten passed over for roles I know I could not only perform in, but that I could excel in. Most recently, one such role was hired out to a contractor who needed to learn the language the project was in (which happened to be my strongest language).29

Baker describes her time in the Bay Area as great for her career but bad for her as a person, noting cultural dynamics that were, at best, inhospitable and, at worst, sexist, racist, and discriminatory.30 The psychological scars such treatment can inflict, particularly when it is sustained over a period of time, are what led many of the women I spoke to not only to leave the industry, but to do so with the knowledge they would never return.

The environment Pao and Baker describe is emblematic of a pattern in Silicon Valley that has been largely ignored and, in many cases, condoned. The homogeny, bias, and, at times, hostile culture towards those who don’t “fit” have forced Silicon Valley companies to acknowledge an industry-wide working environment that is fundamentally broken and unhealthy, and which no amount of free lunches or company perks can fix. It also illustrates an industry that fails to understand the distinction between diversity and pluralism. Where the former implies a culture or group that is mixed, pluralism is defined by a sense of inclusion, engagement, and power-sharing. Diversity is measured in numbers; pluralism is demonstrated in environments that value inclusion, equality, and respect. Facebook can hire as many women and people of color as their HR department will allow, but without engaging with the voices, talents, and experiences different people bring, diversity in itself remains a rather meaningless aim that ends with quotas and hiring targets. There are perfectly diverse populations where discrimination and harassment are still alive and well. While diversity should continue to be fought for—particularly as a first step towards a more inclusive environment—diversity on its own is not enough, and the complex problems in Silicon Valley’s environment will not be fixed without examining the culture that allows such homogeny to thrive.

The discrimination Pao and Baker depict quietly communicates the belief that women and people of color cannot perform engineering and leadership roles to the same standard as their young, white, male counterparts. In 2017, Google employee James Damore published an internal memo outlining his belief that the “abilities of men and women differ in part due to biological causes and that these differences may explain why we don’t see equal representation of women in tech and leadership.”31 Damore suggested companies should “stop assuming that gender gaps imply sexism,” and that women were simply more prone to choose different career paths.32 The memo received criticism both within and outside Google and Damore was soon fired for what CEO Sundar Pichai called “advancing harmful gender stereotypes” in the workplace by suggesting “a group of our colleagues have traits that make them less biologically suited to” their work.33 Damore’s memo is at once awful and illuminating; perhaps without intending to, Damore illustrated precisely the type of discrimination that runs rampant, unchecked, and unspoken within many Silicon Valley tech companies, and has thus pushed the problem of discrimination in tech to the fore.

The question of how men and women are different, and if these differences might affect their work, is actually an interesting one—though the research does not point in the direction people like Damore might like. A 2015 study from Iowa State University found that the psychological differences between men and women were far less pronounced than most people assume. In 75 percent of the psychological qualities that were measured, including morality, risk taking, and occupational stress, men and women’s responses overlapped approximately 80 percent of the time. The study’s researchers explain these results suggest that men and women are actually “more similar than most people think, and the majority of perceived differences can be attributed to gendered stereotypes.”34 A separate study found that where there are measurable psychometric differences between men and women, these tend to be constellated around characteristics such as empathy, compassion, problem-solving, psychological awareness, and social sensitivity, which women collectively are inclined to demonstrate more frequently.35 Another study on gender differences found men were more than twice as likely as women to engage in behaviors regarded as unethical.36 (Whether these are learned or innate qualities the studies do not say.) Other researchers have mirrored these results, and shown that qualities such as collaboration,37 empathy,38 open-mindedness and maturity,39 and social and emotional skills,40 tend to be more prevalent amongst women than men. When we consider the value that more gender diversity may bring to the tech industry, the very skills research suggests may be more common in female employees are precisely those that would benefit the industry as it enters the third year of its identity crisis.41

Yonatan Zunger, a former Google engineer, has argued that the skills women bring to tech are a welcome addition to the field. “It’s true that women are socialized to be better at paying attention to people’s emotional needs and so on—this is something that makes them better engineers, not worse ones.”42 Bob Wyman, who has worked in the industry for over forty years, has written that while men and women may differ in some respects, any purported differences “which are relevant to ‘software’ are culturally imposed.”43 Where women are often different, Wyman suggests, is in their refusal to celebrate and adhere to the distinctly “dysfunctional” male culture that encourages working “ridiculously hard for stupidly long hours… while exhibiting no cultural awareness or social skills.”44 Zunger and Wyman’s accounts are the exact opposite of Damore’s: where Damore believes women are biologically less equipped to work as engineers, Zunger and Wyman recognize not only that such beliefs are unfounded, but also that the qualities women do bring to the industry are exactly those it needs most.

There is strong evidence that increasing diversity in the industry would not only elevate the psychological and emotional skillsets that are lacking in tech, but also increase profitability. Studies show companies perform better when they have at least one female executive on the board45,46 and companies with a more diverse workforce across all demographic measures tend to have higher profits and earnings. Both racial and gender diversity are associated with “increased sales revenue, more customers, greater market share, and greater relative profits.”47 A report by McKinsey similarly found that gender diversity positively impacted profitability and value creation and that the most ethnically diverse executive teams were on average 33 percent more profitable.48 The inclusion of more women and people of color, then, could not only help bring about the shift in mindset and social priorities that Silicon Valley so desperately needs, but also increase financial returns in the process.

This is not only true of gender and ethnicity, of course; increasing diversity across the industry is important in less obvious ways as well. People with different backgrounds and experiences have the capacity to consider Silicon Valley’s issues from a different perspective, which may encourage greater empathy for those using the industry’s products and more effective consideration of the long-term impacts of those products on society. In a 2017 TED talk, Harvard psychologist Susan David reminds her listeners that “diversity isn’t just people, it’s also what’s inside people.” David persuasively contends that this includes thinking of diversity in terms of how we experience emotion. A greater capacity for emotional intelligence, according to David, will result in organizational dynamics that are more agile and resilient across the board. Asking questions such as “What is my emotion telling me?” “Which action will bring me towards my values?” “Which will take me away from my values?” encourages greater self-awareness and emotional agility, which tend to lead to what David describes as more “values-connected” behaviors.49

Attitudes that have allowed beliefs such as Damore’s to proliferate in the tech community are largely the result of unconscious bias, rather than conscious malicious intent. Howard J. Ross, author of Everyday Bias, compares the unconscious assumptions we accumulate throughout our lives to a “polluted river”50 that runs through our conscious mind, silently informing what we believe about ourselves or others, often based on false information and mistaken ideas. Ross explains that no one is exempt: we all draw on conscious bias, unconscious bias, “and stereotypes, all of the time… without realizing we are doing it.”51 The process of stereotyping is actually a result of evolution. Stereotypes, Ross explains, “provide a shortcut that helps us navigate through our world more quickly, more efficiently, and, our minds believe, more safely,”52 and keep us from having to reassess each situation from scratch every time we encounter something or someone new. The downside, of course, is that the same beliefs that ease our decision-making also cause a proliferation of biases, particularly in relation to people who we consider to be in some way different from us.

When left unchecked, discrimination is the inevitable precursor to a host of other issues. Not only does bias influence hiring, interviews, job assignments, and promotions,53 it can also drive harassment, bullying, and dysfunctional cultures. Combined with the imbalance of power in Silicon Valley, which generally sits in the hands of white male executives, discrimination has led to intimidation, gross abuses of power, and inappropriate behavior throughout the industry and has given birth to what Caroline McCarthy, a former Google engineer, calls the “rampant and well-documented sexism and sexual harassment”54 endemic in Silicon Valley.

One of the first and most famous examples of discrimination and harassment in Silicon Valley is Susan Fowler’s account of her time working at Uber. On her first day at the company, Fowler was sexually propositioned by her manager on Uber’s internal chat system. She took screenshots and brought them to HR, but was told it was the man’s first offense and, given his status as a “high performer,” the company was unwilling to punish him. Instead, he received a warning. Soon after, the same high performer was reported again; HR reiterated to his new accuser that it was his first offense. The situation was escalated to upper management, but no action was taken. When Fowler attempted to transfer to a different team, despite her excellent performance reviews, a book contract with O’Reilly, and multiple speaking engagements, she was blocked from moving within the company. When she attempted to transfer again, she was told her performance reviews had been changed; it was now noted that she showed no signs of “an upward career trajectory.” The widespread sexist attitudes at Uber ran deep throughout the organization, resulting in an exodus of female employees, including Fowler, who left after a year, calling it “an organization in complete, unrelenting chaos.”55 When she joined the company, Fowler’s department was over 25% female; when she attempted to transfer, that number had dropped to 6%; by the time she left, only 3% of the SRE engineers in the company were women.
Before her departure, Fowler attended an all-hands meeting, where she asked a director what was being done to address the depleted numbers of women in the organization: “his reply was, in a nutshell, that the women of Uber just needed to step up and be better engineers.”56 At the time of this writing, the company is being investigated by the EEOC over charges of gender inequity,57 and has also been accused of attempting to silence not only its own employees, but female riders who have reported harassment and rape by the company’s driver partners.58

Fowler’s case may be one of the most notable, but her experience is hardly an anomaly. For every Ellen Pao, Erica Joy Baker, and Susan Fowler, there are countless cases of discrimination, bullying, and harassment that go unreported or which tech companies keep out of the public eye. A 2017 survey found that 60 percent of female employees working in tech in Silicon Valley had experienced unwanted sexual advances.59 Thanks to women like Pao, Baker, and Fowler, as well as the #metoo movement, more cases than ever have been reported in the past several years. Some of the most prominent inquiries and investigations of gender and racial harassment and discrimination include:
  • Justin Caldbeck, a venture capitalist and founder of Binary Capital, was accused of multiple counts of sexual harassment in a suit brought against him by six women. While he immediately denied the claims, he soon took a leave of absence and later resigned.60

  • Mike Cagney, CEO of SoFi, stepped down following accusations of harassment and a lawsuit by former employee Brandon Charles, who was fired after reporting the harassment of female co-workers and “alleging a toxic culture of gender-related discrimination and harassment.”61

  • Elizabeth Scott filed a suit against VR company Upload in 2017, after she was fired for issuing a complaint about the inappropriate and “hostile atmosphere” of the company, which Scott alleged included a room in the office with a bed “to encourage sexual intercourse at the workplace,” colloquially known as the “kink room.”62

  • A number of charges have been leveled against Tesla, including lawsuits filed on the basis of harassment, racism, discrimination, and homophobia.63 One example includes an 11-count suit filed by California Civil Rights Law Group on behalf of DeWitt Lambert that alleges instances of “Race Harassment, Race Discrimination, Sexual Harassment, Retaliation, Failure to Prevent Harassment, Discrimination and Retaliation, Threats of Violence in Violation of the Ralph Act, Violation of the Bane Act, Failure to Accommodate, Failure to Engage in the Interactive Process, and Assault and Battery.”64 In addition to refuting the claims, Tesla has criticized those who bring charges or complaints against the company, including engineer AJ Vandermeyden, who sued Tesla for harassment and discrimination,65 and Tesla factory workers who have complained about working conditions and safety concerns.66

  • Software engineer Kelly Ellis accused her senior male colleagues at Google of harassment in 2015, including one manager telling her during a company trip to Hawaii that it was “taking all of [his] self control not to grab” her.67

  • Whitney Wolfe sued Tinder in 2014, after she alleged the company’s Chief Marketing Officer, Justin Mateen, referred to her as a “slut” and “whore.” Wolfe also alleged she was not given the co-founder title she deserved because she was female.68

  • Tom Preston-Werner, founder of GitHub, resigned in 2014 following sexual harassment charges and an investigation into his behavior toward female colleagues. The company found there to be no “legal wrongdoing” on Preston-Werner’s part, but “evidence of mistakes and errors of judgement.”69

In some of the above cases, the accused were found guilty, in others they were not; some left their companies voluntarily, while others were forced out; some cases were found not to have sufficient evidence, while many settled with the plaintiffs out of court. The sheer volume of harassment lawsuits in tech has thrown light onto a culture one case described as “male bravado” combined with “unchecked arrogance” and “a laser focus on growth and financial success while ignoring workplace regulations.” The lawsuit explained how the attitudes of the organization had “filter[ed] down from the leadership team… throughout the company, empowering other managers to engage in sexual conduct in the workplace.” The result was an environment in which sexual harassment was not only condoned, but those who spoke out against it were punished.70 Even the most forgiving employees of one tech organization under investigation described it as “a company run by young, immature men who were flush with cash and did not know how to handle their power.”71

Research has demonstrated, somewhat unsurprisingly, that bullying and harassment lead to a low-quality work environment, not only for those who are victimized, but also for those who witness inappropriate behaviors, which may take the form of “insulting remarks and ridicule, verbal abuse, offensive teasing, isolation, and social exclusion, or the constant degrading of one’s work and efforts.”72 Decreased job satisfaction, decreased productivity, and high turnover are among the most common organizational consequences, to say nothing of the psychological effects on those involved, which can include depression, anxiety, and post-traumatic stress.

How Bias Is Encoded

When sexism, racism, and ageism are written into the cultural norms of an industry, it is naïve to think these would somehow not be coded into the products and services that industry produces. Given the homogeny and dysfunctional behavior of an appreciable cohort of Silicon Valley, we shouldn’t be surprised when Google’s photo service tags black people as gorillas73; when predatory loans are targeted at racial minorities74; when research photo collections supported by Facebook and Microsoft associate women with cooking and men with sports75; when parole decisions and risk scores used by courts are grossly biased against black people76; when hostile online communities target, harass, and threaten women and minorities77; when video games called RapeLay go viral78; or when algorithms automatically produce and sell t-shirts with the words “Keep Calm and Rape A Lot,” “Keep Calm and Grope A Lot,” and “Keep Calm and Knife Her.”79 We should be outraged, but we shouldn’t be surprised.

Though they can be incredibly complex and inexplicable even to their creators, algorithms are, at their core, machines that employ a “set of steps that can be used to make calculations, resolve problems and reach decisions.”80 And because humans program algorithms, algorithms are encoded with human biases. Cathy O’Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, explains that the danger of training computers using existing data is that our existing data is littered with our own biases. “Algorithms are opinions embedded in code. It’s really different from what most people think of algorithms. They think algorithms are objective and true and scientific. That’s a marketing trick.”81 O’Neil also points out that once bias is built into a system, it is incredibly difficult to remove, making it harder to correct our previous stereotypes and assumptions down the road. Instead of making things more fair, as we assume they should, O’Neil argues algorithms “automate the status quo” and encourage us to “repeat our past practices, [and] our patterns.”82 Imagine yourself in high school. Would you act the same way, hold the same beliefs, or even use the same phrases you did back then? There’s every chance you’ve changed and matured quite a bit since your teens, given the opportunity to grow and expand your understanding of the world around you. When we take a snapshot of our values and beliefs and freeze them in time, we limit our ability to progress beyond them. When this happens individually, it’s a shame; when we freeze our prejudices, beliefs, and biases in time collectively, it may limit our capacity to grow and advance as a species.
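O’Neil’s point that algorithms “automate the status quo” can be made concrete with a toy sketch. The data, model, and names below are entirely invented for illustration (they are not from the book or from any real system): a minimal “hiring model” trained on biased historical decisions simply learns to reproduce them for new, equally qualified candidates.

```python
# Toy illustration (hypothetical data): a model trained on biased
# historical hiring decisions encodes and repeats that bias.
from collections import defaultdict

# Invented historical records: (qualification_score, group, hired)
history = [
    (8, "A", True), (7, "A", True), (6, "A", True),
    (8, "B", False), (7, "B", False), (9, "B", True),
]

def train(records):
    """'Learn' the historical hire rate per group — i.e., the status quo."""
    stats = defaultdict(lambda: [0, 0])  # group -> [hires, total]
    for score, group, hired in records:
        stats[group][0] += hired
        stats[group][1] += 1
    return {g: hires / total for g, (hires, total) in stats.items()}

def predict(model, group, threshold=0.5):
    """Recommend hiring when the group's past hire rate clears the threshold."""
    return model[group] >= threshold

model = train(history)
# Equally qualified candidates receive different recommendations,
# purely because past decisions differed by group.
print(predict(model, "A"))  # True
print(predict(model, "B"))  # False
```

Nothing in the code mentions prejudice; the bias arrives entirely through the training data, which is why it is so hard to detect and, as O’Neil notes, so hard to remove once built in.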

A 2018 study in the journal Nature explains how using large data sets to program algorithms—whether they are social, research, or legal systems—will naturally perpetuate the biases, both conscious and unconscious, that we hold as a society.

A major driver of bias in AI is the training data. Most machine-learning tasks are trained on large, annotated data sets. Deep neural networks for image classification, for instance, are often trained on ImageNet, a set of more than 14 million labelled images. In natural-language processing, standard algorithms are trained on corpora consisting of billions of words. Researchers typically construct such data sets by scraping websites, such as Google Images and Google News, using specific query terms, or by aggregating easy-to-access information from sources such as Wikipedia. These data sets are then annotated, often by graduate students or through crowdsourcing platforms such as Amazon Mechanical Turk.83

When a small subset of individuals is responsible for programming algorithms that are used throughout the world, there are bound to be disparities between the world represented in such systems and the world as it actually is. The researchers explain that in the majority of data sets used to program systems and inform research, certain groups are over-represented, while others are under-represented.

More than 45% of ImageNet data, which fuels research in computer vision, comes from the United States, home to only 4% of the world’s population. By contrast, China and India together contribute just 3% of ImageNet data, even though these countries represent 36% of the world’s population. This lack of geodiversity partly explains why computer vision algorithms label a photograph of a traditional US bride dressed in white as ‘bride’, ‘dress’, ‘woman’, ‘wedding’, but a photograph of a North Indian bride as ‘performance art’ and ‘costume.’84

The result of having predominantly Western, white, male input into systems such as ImageNet, Google Images, and Mechanical Turk is the assumption of white, male dominance and the proliferation of racial and gendered stereotypes. When converting Spanish articles written by women into English, for example, Google Translate often defaults to “he said/wrote,” assuming the writer is male.85 Software developed for Nikon cameras, meant to warn when subjects are blinking, routinely tags Asian subjects as blinking. Algorithms designed to process naming data tend to classify Caucasian names as “pleasant” and African American names as “unpleasant.”86 A 2013 study by Harvard researcher Latanya Sweeney found that a greater number of ads on Google and Reuters mentioning “arrest” appeared beside searches for black-identifying names than white-identifying names.87 A 2016 study by Boston University and Microsoft found that software “trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent,” yielding responses such as “Man is to computer programmer as woman is to homemaker.”88
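The “man is to computer programmer as woman is to homemaker” result comes from vector arithmetic on word embeddings: an analogy a : b :: c : ? is answered by finding the word closest to b − a + c. The sketch below uses tiny invented 2-D vectors purely to show the mechanism; the actual study used embeddings with hundreds of dimensions learned from Google News text.

```python
# Illustrative embedding-analogy arithmetic with made-up 2-D vectors,
# where the first axis loosely tracks a learned "gender" direction.
import math

vectors = {
    "man":        (1.0, 0.2),
    "woman":      (-1.0, 0.2),
    "programmer": (0.9, 0.8),
    "homemaker":  (-0.9, 0.8),
    "bridge":     (0.1, -0.9),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def analogy(a, b, c):
    """Solve a : b :: c : ? by finding the word closest to b - a + c."""
    target = tuple(vectors[b][i] - vectors[a][i] + vectors[c][i] for i in range(2))
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "programmer", "woman"))  # homemaker
```

Because “programmer” sits near the “man” side of the learned gender axis, subtracting “man” and adding “woman” lands nearest “homemaker”: the arithmetic is faithfully reporting an association the training corpus taught it.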

As algorithms take on ever more significant jobs, they will not only perpetuate grossly racist and sexist stereotypes, but will also have profound, tangible effects on people’s lives. This is particularly problematic in cases where automated systems are used to assist judges in parole decisions, predict areas of future crime, help employers find job candidates, and negotiate contracts.89 Because bias is hardwired into the data set, the decisions algorithms hand down are unlikely to be fair or just, as multiple investigations have already demonstrated. In 2014, former U.S. Attorney General Eric Holder asked the U.S. Sentencing Commission to review its use of risk scores, fearing they may be furthering prejudicial behavior in the court system.

Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice… [and] may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.90

When the Sentencing Commission did not act on Holder’s suggestion, ProPublica launched an investigation. After analyzing over 7,000 risk scores, ProPublica’s findings corroborated Holder’s concerns: algorithmic risk scores were extremely unreliable in their ability to forecast crime (only 20 percent of those predicted to commit violent crimes actually did so). ProPublica also demonstrated that the algorithm was more likely to label white defendants as low-risk and “falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.”91
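The “almost twice the rate” comparison rests on a simple metric: among defendants who did not go on to reoffend, what fraction were nonetheless labeled high-risk? The sketch below uses invented toy counts (not ProPublica’s actual figures) chosen only to mirror the roughly two-to-one disparity the investigation reported.

```python
# Sketch of a group-wise false positive rate comparison.
# Counts are hypothetical, for illustration only.

def false_positive_rate(high_risk_no_reoffend, total_no_reoffend):
    """Fraction of non-reoffenders who were wrongly flagged high-risk."""
    return high_risk_no_reoffend / total_no_reoffend

# Per group: (labeled high-risk but did not reoffend, all who did not reoffend)
groups = {"white": (23, 100), "black": (45, 100)}

for group, (fp, n) in groups.items():
    print(group, round(false_positive_rate(fp, n), 2))
```

The point of the metric is that a tool can look accurate overall while distributing its errors very unevenly: here the invented counts give one group nearly double the other’s rate of being wrongly flagged.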

Em‘bot’iments of Bias

We are only starting to witness the impacts of our prejudices embedded in the machines and systems we create. If we take a moment to extrapolate these trends to future creations, we can easily imagine a world in which our most appalling impulses and reprehensible ideas are built into the fabric of everyday technologies. As they become more widespread, there is particular concern that physical robots, automated bots, and other anthropomorphized tools will continue to be programmed or designed without appropriate oversight and ethical considerations. Without meaningful civic discussion and appropriate governmental regulation, the machines we employ to do the work we delegate to them may amplify rather than alleviate current social problems, such as the inequality and discrimination uncovered by ProPublica and others.

Some of the most obvious examples of technological bias are the personas of bots and digital assistants, which have been fashioned almost exclusively to mimic women. In an article titled “We Don’t Need Robots That Resemble Humans,” Professor Evan Selinger points out that the names bestowed upon most bots “ring gendered bells” and the services they perform are “historically associated with stereotypes of women’s work and women’s emotional labor.”92 By assigning female rather than male voices and personas to popular digital assistants such as Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana, Google’s OK Google, and Facebook’s now-defunct M, the industry implicitly associates women with helping and administrative roles. As Adrienne LaFrance points out, “the whole point of having a digital assistant is to have it do stuff for you. You’re supposed to boss it around.”93 There are conflicting opinions about and rationales for using a female versus a male or gender-neutral bot as an assistant, but there is little conclusive evidence to suggest a reason for their prevalence that does not involve prejudice, objectification, and uneven power dynamics.94,95

In the same way assigning a gender to machines “risks amplifying social prejudices and incentivizing objectification,”96 the decision to anthropomorphize robots can take on a racial dimension as well. A quick online image search will confirm that the majority of domestic robots are white, while more menacing, Terminator-like robots, such as those developed by Boston Dynamics, tend to be darker. A 2018 study found “people perceive robots with anthropomorphic features to have race, and as a result, the same race-related prejudices that humans experience extend to robots.”97 One group of researchers found that when a robot “has the physical appearance of a member of another race [it] is treated as a member of an outgroup and perceived as less human-like by people with racial prejudices.”98 A second study found that when robots were perceived to be of the same group, participants were more likely to interact with and evaluate them positively.99 The phenomenon illustrated by the research above is known as ingroup-outgroup bias, which is a form of social classification in which people identify with and favor those they perceive as similar. In the same way developers should be aware of the implications of humanizing or assigning gender to robots, care must also be taken to avoid perpetuating racial or ethnic bias as the field of robotics becomes more prevalent in our everyday lives. While it may seem inconsequential to some that Alexa the digital assistant is a woman and Pepper the mall robot is white, it is useful to question why these are the default options engineers and roboticists have collectively deemed most appropriate. The problem is not that the individual developers and entrepreneurs in Silicon Valley are horribly racist, sexist people, but that we all exhibit subtle biases of which we are unaware. “We are all biased. 
We’re all racist and bigoted in ways that we wish we weren’t, in ways that we don’t even know,” explains O’Neil, “and we are injecting those biases into the algorithms.”100

When we consider the environment of Silicon Valley, we can safely observe that there remains work to be done. Beyond the effects of exclusion, discrimination, and algorithmic bias, the tech industry as a whole suffers as a result of the attitudes and prejudices it condones. The lack of women and people of color in engineering and leadership roles raises—or should raise—the question of what is lost because of their absence and what kind of environment and culture the industry would like to prioritize moving forward.

Footnotes

  1. Stansfeld, S., & Candy, B. (2006). Psychosocial Work Environment and Mental Health—A Meta-Analytic Review. Scandinavian Journal of Work, Environment & Health, 32(6), 443–462.
  2. Taylor, S. E., Repetti, R. L., & Seeman, T. (1997). Health Psychology: What is an Unhealthy Environment and How Does It Get Under the Skin? Annual Review of Psychology, 48(1), 411–447. https://doi.org/10.1146/annurev.psych.48.1.411
  3. Specific absorption rate, or SAR, is a measurement of the energy absorbed by the human body when it is exposed to radio frequency electromagnetic fields. It is used as a measurement of the radio wave power emitted by mobile phones and MRI machines. The U.S. Federal Communications Commission (FCC) and the E.U.’s European Committee for Electrotechnical Standardization (CENELEC) limit the SAR level of mobile phones to 1.6 W/kg and 2.0 W/kg, respectively.
  4. Baker, E. J. (2017, August 5). I Am Disappointed but Unsurprised by the News that an Anti-diversity, Sexist, Manifesto is Making…. Retrieved August 20, 2018, from Medium website: https://medium.com/projectinclude/i-am-disappointed-but-unsurprised-by-the-news-that-an-anti-diversity-sexist-racist-manifesto-is-5fdafbe19352
  5. Goodwin, T. (2017, August 3). Interview with Tom Goodwin (K. Cook, Interviewer).
  6. Feld, B. (2017, June 12). Go for Culture Add, Not Culture Fit. Retrieved August 21, 2018, from Feld Thoughts website: https://www.feld.com/archives/2017/06/go-culture-add-not-culture-fit.html
  7. Googleyness is assessed both during a job candidate’s interview process and in quarterly reviews, where employees receive a “Googleyness score.”
  8. Solon, O. (2018, March 16). ‘They’ll Squash You like a Bug’: How Silicon Valley Keeps a Lid on Leakers. The Guardian. Retrieved from https://www.theguardian.com/technology/2018/mar/16/silicon-valley-internal-work-spying-surveillance-leakers
  9. Baron, E. (2016, July 5). Federal Investigators Probe Google over Age-Discrimination Complaints. Retrieved October 26, 2018, from The Mercury News website: https://www.mercurynews.com/2016/07/05/federal-investigators-probe-google-over-age-discrimination-complaints/
  10. Glantz, A. (2012, January 28). Bay Area Technology Professionals Can’t Get Hired as Industry Moves On. The New York Times. Retrieved from https://www.nytimes.com/2012/01/29/us/bay-area-technology-professionals-cant-get-hired-as-industry-moves-on.html
  11. Baron, E. (2018, September 12). ‘Googley’ Does Not Mean ‘Young,’ Google Claims in Age-Bias Lawsuit. Retrieved October 26, 2018, from The Mercury News website: https://www.mercurynews.com/2018/09/12/googley-does-not-mean-young-google-claims-in-age-bias-lawsuit/
  12. Ibid.
  13. Ibid.
  14. Lu, Y. (2018, January 19). Silicon Valley’s Youth Problem. The New York Times. Retrieved from https://www.nytimes.com/2014/03/16/magazine/silicon-valleys-youth-problem.html
  15. Google. (2018). Google Diversity Annual Report 2018. Retrieved from https://diversity.google/annual-report/
  16. Facebook. (2017). Facebook Diversity Update: Building a More Diverse, Inclusive Workforce. Retrieved from https://fbnewsroomus.files.wordpress.com/2017/08/fb_diversity_2017_final.pdf
  17. Microsoft. (2018). Inside Microsoft. Retrieved August 21, 2018, from http://www.microsoft.com/en-us/diversity/inside-microsoft/default.aspx
  18. U.S. Equal Employment Opportunity Commission. (2014). Diversity in High Tech. Retrieved from https://www1.eeoc.gov/eeoc/statistics/reports/hightech/index.cfm?renderforprint=1
  19. Williams, M. (2018). Facebook 2018 Diversity Report: Reflecting on Our Journey [Facebook Newsroom]. Retrieved from Facebook website: https://newsroom.fb.com/news/2018/07/diversity-report/
  20. Microsoft. (2018). Inside Microsoft. Retrieved August 21, 2018, from http://www.microsoft.com/en-us/diversity/inside-microsoft/default.aspx
  21. Google. (2018). Google Diversity Annual Report 2018. Retrieved from https://diversity.google/annual-report/
  22. Tiku, N. (2019, January 18). Oracle Paid Women $13,000 Less Than Men, Analysis Finds. Wired. Retrieved from https://www.wired.com/story/analysis-finds-oracle-paid-women-13000-less-than-men/
  23. Snyder, K. (2014, October 2). Why Women Leave Tech: It’s the Culture, Not because “Math is Hard.” Retrieved August 20, 2018, from Fortune website: http://fortune.com/2014/10/02/women-leave-tech-culture/
  24. Google. (2018). Google Diversity Annual Report 2018. Retrieved from https://diversity.google/annual-report/
  25. Pao, E. K. (2017). Reset: My Fight for Inclusion and Lasting Change (p. 8). New York: Spiegel & Grau.
  26. Ibid.
  27. Ibid., p. 254.
  28. Baker, E. J. (2014, November 4). The Other Side of Diversity. Retrieved August 20, 2018, from This is Hard website: https://medium.com/this-is-hard/the-other-side-of-diversity-1bb3de2f053e
  29. Ibid.
  30. Ibid.
  31. Lee, D. (2017, August 8). Google Fires Diversity Memo Author. BBC News. Retrieved from https://www.bbc.com/news/technology-40859004
  32. Ibid.
  33. Ibid.
  34. Zell, E., Krizan, Z., & Teeter, S. R. (2015). Evaluating Gender Similarities and Differences Using Metasynthesis. American Psychologist, 70(1), 10–20. https://doi.org/10.1037/a0038208
  35. Schulte-Rüther, M., Markowitsch, H. J., Shah, N. J., Fink, G. R., & Piefke, M. (2008). Gender Differences in Brain Networks Supporting Empathy. NeuroImage, 42(1), 393–403. https://doi.org/10.1016/j.neuroimage.2008.04.180
  36. Betz, M., O’Connell, L., & Shepard, J. M. (1989). Gender Differences in Proclivity for Unethical Behavior. Journal of Business Ethics, 8(5), 321–324. https://doi.org/10.1007/BF00381722
  37. Bear, J. B., & Woolley, A. W. (2011). The Role of Gender in Team Collaboration and Performance. Interdisciplinary Science Reviews, 36(2), 146–153. https://doi.org/10.1179/030801811X13013181961473
  38. Farrell, W., Seager, M. J., & Barry, J. A. (2016). The Male Gender Empathy Gap: Time for Psychology to Take Action. New Male Studies, 5, 6–16.
  39. Walsh, C. M., & Hardy, R. C. (1999). Dispositional Differences in Critical Thinking Related to Gender and Academic Major. Journal of Nursing Education, 38(4), 149–155. https://doi.org/10.3928/0148-4834-19990401-04
  40. Groves, K. S. (2005). Gender Differences in Social and Emotional Skills and Charismatic Leadership. Journal of Leadership & Organizational Studies, 11(3), 30–46. https://doi.org/10.1177/107179190501100303
  41. Unfortunately, a report by Davos has found the opposite to be the case: progress for women in fields such as IT and biotech has stagnated, reversing a decades-long trend towards greater equality. Just as they are needed most, women—along with their ideas, perspectives, and skills—are being lost. Based on diversity reports and research on underrepresented minorities in tech, this appears to be the case for other minorities as well. In all cases, unless the homogeny in tech is reversed—and quickly—the trends of inequality already associated with tech will continue to increase.
  42. Zunger, Y. (2017, August 6). So, about this Googler’s Manifesto. Retrieved August 20, 2018, from Yonatan Zunger website: https://medium.com/@yonatanzunger/so-about-this-googlers-manifesto-1e3773ed1788
  43. Wyman, B. (2017, August 6). Back in the 1970’s, When I First Got in the Software Business, I Remember there being a Much Higher…. Retrieved August 21, 2018, from Medium website: https://medium.com/@bobwyman/back-in-the-1970s-when-i-first-got-in-the-software-business-i-remember-there-being-a-much-higher-f70e8197fbd9
  44. Ibid.
  45. Lagerberg, F. (2015). Women in Business: The Value of Diversity (pp. 1–4). Retrieved from Grant Thornton website: https://www.grantthornton.global/globalassets/wib_value_of_diversity.pdf
  46. Campbell, K., & Mínguez-Vera, A. (2008). Gender Diversity in the Boardroom and Firm Financial Performance. Journal of Business Ethics, 83(3), 435–451. https://doi.org/10.1007/s10551-007-9630-y
  47. Herring, C. (2009). Does Diversity Pay?: Race, Gender, and the Business Case for Diversity. American Sociological Review, 74(2), 208–224. https://doi.org/10.1177/000312240907400203
  48. Hunt, V., Yee, L., Prince, S., & Dixon-Fyle, S. (2018). Delivering Growth Through Diversity in the Workplace. Retrieved August 21, 2018, from McKinsey website: https://www.mckinsey.com/business-functions/organization/our-insights/delivering-through-diversity
  49. David, S. (2017). The Gift and Power of Emotional Courage. Retrieved from https://www.ted.com/talks/susan_david_the_gift_and_power_of_emotional_courage/
  50. Ross, H. J. (2014). Everyday Bias: Identifying and Navigating Unconscious Judgments in Our Daily Lives (p. 3). Rowman & Littlefield.
  51. Ibid., p. 4.
  52. Ibid., p. 6.
  53. Ibid., p. xxi.
  54. McCarthy, C. (2017, September 25). Confessions of a Failed Female Coder. Retrieved August 20, 2018, from Hacker Noon website: https://hackernoon.com/confessions-of-a-failed-female-coder-956cbe138c69
  55. Fowler, S. (2017, February 19). Reflecting on One Very, Very Strange Year at Uber. Retrieved August 20, 2018, from Susan Fowler website: https://www.susanjfowler.com/blog/2017/2/19/reflecting-on-one-very-strange-year-at-uber
  56. Ibid.
  57. Bensinger, G. (2018, July 16). Uber Faces Federal Investigation Over Alleged Gender Discrimination. The Wall Street Journal. Retrieved from https://www.marketwatch.com/story/uber-faces-federal-investigation-over-alleged-gender-discrimination-2018-07-16-171034149
  58. Levin, S. (2018, March 16). Uber Accused of Silencing Women Who Claim Sexual Assault by Drivers. The Guardian. Retrieved from https://www.theguardian.com/technology/2018/mar/15/uber-class-action-lawsuit-sexual-assault-rape-arbitration
  59. Vassallo, T., Levy, E., Madansky, M., Mickell, H., Porter, B., Leas, M., & Oberweis, J. (2017). The Elephant in the Valley. Retrieved from Women in Tech website: https://www.elephantinthevalley.com/
  60. Lee, D. (2017, July 1). Silicon Valley’s Women have Spoken. Now What? BBC News. Retrieved from https://www.bbc.com/news/technology-40465519
  61. O’Connor, C. (2017, September 15). SoFi CEO Mike Cagney Out Immediately Amid Sexual Harassment Investigation. Retrieved August 21, 2018, from Forbes website: https://www.forbes.com/sites/clareoconnor/2017/09/15/sofi-ceo-mike-cagney-out-immediately-amid-sexual-harassment-investigation/
  62. Streitfeld, D. (2017, September 15). Lurid Lawsuit’s Quiet End Leaves Silicon Valley Start-Up Barely Dented. The New York Times. Retrieved from https://www.nytimes.com/2017/09/15/technology/lurid-lawsuits-quiet-end-leaves-silicon-valley-start-up-barely-dented.html
  63. Wong, J. C. (2018, June 14). Tesla Workers Say they Pay the Price for Elon Musk’s Big Promises. The Guardian. Retrieved from https://www.theguardian.com/technology/2018/jun/13/tesla-workers-pay-price-elon-musk-failed-promises
  64. California Civil Rights Law Group files lawsuit against Tesla Motors Inc. after Oakland man endured months of racial discrimination, sexual harassment and violent threats from co-workers. (2017, March 27). Retrieved August 28, 2018, from PR Newswire website: https://www.prnewswire.com/news-releases/california-civil-rights-law-group-files-lawsuit-against-tesla-motors-inc-after-oakland-man-endured-months-of-racial-discrimination-sexual-harassment-and-violent-threats-from-co-workers-300430149.html
  65. Levin, S. (2017, February 28). Female Engineer Sues Tesla, Describing a Culture of “Pervasive Harassment.” The Guardian. Retrieved from https://www.theguardian.com/technology/2017/feb/28/tesla-female-engineer-lawsuit-harassment-discrimination
  66. Wong, J. C. (2017, May 18). Tesla Factory Workers Reveal Pain, Injury and Stress: “Everything Feels like the Future But Us.” The Guardian. Retrieved from https://www.theguardian.com/technology/2017/may/18/tesla-workers-factory-conditions-elon-musk
  67. Does Silicon Valley have a sexism problem? (2017, February 21). BBC News. Retrieved from https://www.bbc.com/news/world-us-canada-39025288
  68. Ibid.
  69. Ibid.
  70. O’Connor, C. (2017, September 15). SoFi CEO Mike Cagney Out Immediately Amid Sexual Harassment Investigation. Retrieved August 21, 2018, from Forbes website: https://www.forbes.com/sites/clareoconnor/2017/09/15/sofi-ceo-mike-cagney-out-immediately-amid-sexual-harassment-investigation/
  71. Streitfeld, D. (2017, September 15). Lurid Lawsuit’s Quiet End Leaves Silicon Valley Start-Up Barely Dented. The New York Times. Retrieved from https://www.nytimes.com/2017/09/15/technology/lurid-lawsuits-quiet-end-leaves-silicon-valley-start-up-barely-dented.html
  72. Einarsen, S., Raknes, B. I., & Matthiesen, S. B. (1994). Bullying and Harassment at Work and their Relationships to Work Environment Quality: An Exploratory Study. European Work and Organizational Psychologist, 4(4), 381–401. https://doi.org/10.1080/13594329408410497
  73. Simonite, T. (2017, August 21). Machines Taught by Photos Learn a Sexist View of Women. Wired. Retrieved from https://www.wired.com/story/machines-taught-by-photos-learn-a-sexist-view-of-women/
  74. Kun, J. (2015, July 13). What does It Mean for an Algorithm to be Fair? Retrieved September 12, 2018, from Math ∩ Programming website: https://jeremykun.com/2015/07/13/what-does-it-mean-for-an-algorithm-to-be-fair/
  75. Simonite, T. (2017, August 21). Machines Taught by Photos Learn a Sexist View of Women. Wired. Retrieved from https://www.wired.com/story/machines-taught-by-photos-learn-a-sexist-view-of-women/
  76. Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016, May 23). Machine Bias. Retrieved August 31, 2018, from ProPublica website: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
  77. United Nations Broadband Commission. (2015). Cyber Violence Report Press Release. Retrieved from http://www.unwomen.org/en/news/stories/2015/9/cyber-violence-report-press-release
  78. Lah, K. (2010, March 31). “RapeLay” Video Game Goes Viral Amid Outrage. Retrieved June 18, 2019, from CNN website: http://www.cnn.com/2010/WORLD/asiapcf/03/30/japan.video.game.rape/index.html
  79. Row over Amazon “rape” T-shirt. (2013, March 2). BBC News. Retrieved from https://www.bbc.com/news/business-21640347
  80. Harari, Y. N. (2017). Homo Deus: A Brief History of Tomorrow (p. 97). London: Vintage.
  81. O’Neil, C. (2017). The Era of Blind Faith in Big Data Must End. Retrieved from https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end/transcript
  82. Ibid.
  83. Zou, J., & Schiebinger, L. (2018). AI can be Sexist and Racist—It’s Time to Make It Fair. Nature, 559(7714), 324. https://doi.org/10.1038/d41586-018-05707-8
  84. Ibid.
  85. Ibid.
  86. Ibid.
  87. Sweeney, L. (2013). Discrimination in Online Ad Delivery. Queue, 11(3), 10:10–10:29. https://doi.org/10.1145/2460276.2460278
  88. Bolukbasi, T., Chang, K.-W., Zou, J., Saligrama, V., & Kalai, A. (2016). Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. Advances in Neural Information Processing Systems (pp. 4349–4357).
  89. Miller, C. (2018, August 21). The Terrifying, Hidden Reality of Ridiculously Complicated Algorithms. Times Literary Supplement. Retrieved from https://www.the-tls.co.uk/articles/public/ridiculously-complicated-algorithms/
  90. Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016, May 23). Machine Bias. Retrieved August 31, 2018, from ProPublica website: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
  91. Ibid.
  92. Selinger, E. (2018, March 1). We Don’t Need Robots that Resemble Humans. Medium. Retrieved from https://medium.com/s/when-robots-rule-the-world/we-dont-need-robots-that-resemble-humans-37bc79484f18
  93. LaFrance, A. (2016, March 30). Why Do so Many Digital Assistants have Feminine Names? The Atlantic. Retrieved from https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/
  94. Selinger, E. (2018, March 1). We Don’t Need Robots that Resemble Humans. Medium. Retrieved from https://medium.com/s/when-robots-rule-the-world/we-dont-need-robots-that-resemble-humans-37bc79484f18
  95. LaFrance, A. (2016, March 30). Why Do so Many Digital Assistants have Feminine Names? The Atlantic. Retrieved from https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/
  96. Selinger, E. (2018, March 1). We Don’t Need Robots that Resemble Humans. Medium. Retrieved from https://medium.com/s/when-robots-rule-the-world/we-dont-need-robots-that-resemble-humans-37bc79484f18
  97. Ackerman, E. (2018, July 18). Humans Show Racial Bias Towards Robots of Different Colors: Study. IEEE Spectrum. Retrieved from https://spectrum.ieee.org/automaton/robotics/humanoids/robots-and-racism
  98. Złotowski, J., Proudfoot, D., Yogeeswaran, K., & Bartneck, C. (2015). Anthropomorphism: Opportunities and Challenges in Human–Robot Interaction. International Journal of Social Robotics, 7(3), 347–360. https://doi.org/10.1007/s12369-014-0267-6
  99. Kuchenbrandt, D., Eyssel, F., Bobinger, S., & Neufeld, M. (2013). When a Robot’s Group Membership Matters. International Journal of Social Robotics, 5(3), 409–417. https://doi.org/10.1007/s12369-013-0197-8
  100. O’Neil, C. (2017). The Era of Blind Faith in Big Data Must End. Retrieved from https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end/transcript

Copyright information

© The Author(s) 2020

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  • Katy Cook, Centre for Technology Awareness, London, UK
