Our individual psychology is complemented and formed, to a large degree, by our environment, which we can think of as the surroundings and conditions of any place we live, work, or spend our time. We exist in multiple environments simultaneously, including our home and family, work life, social circle, and the geographic or physical spaces we inhabit, each of which has its own distinctive patterns, qualities, and behaviors that are considered the norm within that group. Two of the most salient features of any environment are the people who comprise it and the quality of the relationships and interactions that occur within it.

Just as we can work to better understand our individual psychology, we can also infer certain features and qualities about the psychological health of our environments. The disciplines of environmental and cultural psychology explore the relationships, social dynamics, norms, and customs of different cultures and groups. To understand a social environment, psychologists might study the makeup of the community: Who comprises it? Is it a diverse or homogenous group? Is it inclusive, welcoming, and connected, or exclusive, inhospitable, and isolated? They would also observe the quality of the relationships: Are the interactions between members of the group healthy or unhealthy? Do members of the group support each other and treat each other with respect? Finally, we might ask about the cultural norms: What is permissible? Do all members of the community feel psychologically safe? Answering these questions would not only help us to better understand different environments, but also provide an indication of the relative health of a group’s culture.

At an individual level, the quality of our environment can profoundly impact both our physical and mental health.Footnote 1,Footnote 2 Collectively, the culture of an organization informs the broader psychosocial health of its workforce. At an even higher level, the psychological health of an industry—its behavioral norms, group dynamics, and in-group relationships—can have profound social consequences as these bleed beyond the confines of the industry. If the psychology and values of an industry are sound, or at least not grossly negligent or unhealthy, we may never notice or even think about an industry’s culture. When group psychological health is in some way compromised, however, its effects may be observable or experienced beyond the industry itself. A thorough study of pre-2008 Wall Street culture, for example, might have uncovered a male-dominated, risk-oriented, and profit-hungry industry focused on short-term returns. The financial typhoon that followed from specific behaviors and priorities was, to a large degree, a product of the culture of the industry at that time.

Cupertino, We Have a Problem

My time in Silicon Valley can generally be divided into two categories: things I felt privileged to see and things I wished I could un-see. The former was almost always a result of products shown or described to me: apps that laid out non-partisan local voting information, impact investing platforms, phones designed to reduce specific absorption ratesFootnote 3 to protect users from radiation, early warning indicators for medical imaging, drones that tracked and predicted poaching patterns. The things I wanted to un-see were almost always social: the way people spoke to each other, venture capitalists (VCs) bullying young CEOs, women feeling unwelcome in their jobs, sexual harassment, a lack of awareness of others’ feelings, and a staggering amount of unconscious bias. As the months passed, I realized everything I wanted to un-see came back to a problem of culture, of what was permissible within the working relationships of the industry that elsewhere would not have been acceptable.

A number of unflattering realities, including skewed hiring practices, rampant bias, and a shocking degree of insularity, have led to what engineer Erica Joy Baker calls a “catastrophic failure in [the] culture”Footnote 4 of Silicon Valley. Perpetuated by what Tom Goodwin describes as a “tribe of people that have come together and reinforce questionable values with each other,”Footnote 5 the cultural problems in Silicon Valley tend to come back to three primary issues, from which a variety of other complications arise. First, tech tends to be an uncommonly homogenous culture, marked by a lack of diversity and an unwillingness to embrace pluralism; second, it is rife with discrimination, including sexism, ageism, and racism, as well as harassment; and third, there is a disturbing level of immaturity that permeates many corporations, often emanating from the highest levels of the company. You can probably already see that these issues are interrelated: a homogenous culture is more likely to exhibit discriminatory behaviors; discrimination is more likely to run rampant and unchallenged in an immature organization. Without the awareness necessary to recognize such behaviors as inappropriate, tendencies become patterns, which become increasingly embedded not only in the industry’s culture, but also in its products.

One dynamic that perpetuates the homogeny of the industry is what companies in Silicon Valley refer to as “culture fit,” which is the idea that to be a good addition to the organization, you must possess the same qualities as those already employed within it. Author and venture capitalist Brad Feld explains that culture fit is essentially the practice of “hiring people like everyone else in the company,” and has become the norm in many Silicon Valley companies.Footnote 6 The result is an industry that has a great deal in common with itself and is comprised primarily of people with similar backgrounds, perspectives, and experiences. The idea of culture fit is so deeply embedded within the vocabulary of Silicon Valley that Google famously has its own word for it: Googley.Footnote 7 There are two primary problems with Googleyness, aside from the cringe factor. The first is the lack of transparency about what the term encompasses. There is no list of qualities that spell out what would make someone Googley or un-Googley, and therefore there is little insight into whether the qualities Google prioritizes promote a fair and nondiscriminatory work environment. The main problem, however, is the suggestion that there is a single mold of the ideal Google employee, which encourages fitting in rather than standing out, prioritizes homogeny over diversity, and puts pressure on employees, according to former employee Justin Maxwell, to act in a “Googley way.”Footnote 8

A focus on preserving its existing culture has led many to charge that Googleyness is a vehicle for discrimination. Norman Matloff, who studies age discrimination in tech, explains that unlike gender and racial discrimination, which are captured in annual diversity reports, “the magic word ‘diversity’ doesn’t seem to apply to age in Silicon Valley,”Footnote 9 despite the fact that age discrimination lawsuits and investigations have plagued Google and other tech giants for years. In 2004, Google fired 52-year-old manager Brian Reid just over a week before the company went public. Reid filed a discrimination suit; “his supervisors, including the company’s vice president for engineering operations, allegedly called him a poor ‘cultural fit,’ an ‘old guy’ and a ‘fuddy-duddy’ with ideas ‘too old to matter.’”Footnote 10 (The suit settled out of court for an undisclosed amount.) A more recent case charged that “Googleyness or culture fit are euphemisms for youth and Google interviewers use these to intentionally discriminate on the basis of age.”Footnote 11 While Google continues to deny charges of ageism and discrimination, the Department of Labor found the company guilty of repeatedly engaging in “extreme” age discrimination.Footnote 12

The problem of ageism, unfortunately, is not Google’s alone, but an industry-wide bias. Matloff explains that prioritizing younger workers began largely as a cost-cutting exercise, wherein older staff were increasingly replaced with younger and cheaper employees willing to do the same work for less money.Footnote 13 Yiren Lu has suggested that tech is “if not ageist, then at least increasingly youth-fetishizing,” noting the average age at Facebook is 26 (at the more mature Hewlett-Packard, by contrast, the median age is 39).Footnote 14 However we label it, the prioritization of youth has resulted not only in destructive patterns of age-related complaints and lawsuits, but also in the perpetuation of uniformity in an already highly uniform culture.

99 Problems and Diversity’s Just One

In addition to prioritizing youthful employees, tech has historically failed to welcome women and people of color into its ranks across a variety of roles. Year-on-year, diversity reports at tech companies reflect the abysmal demographics of Silicon Valley’s workforce, which remains largely white and predominantly male. While such reports may not capture the complex dynamics behind the industry’s failure of diversity, they remain a useful tool to understand the scale of the problem.

When it comes to gender, recent diversity reports at Google, Facebook, and Microsoft show men make up 70%,Footnote 15 65%,Footnote 16 and 74% of all staff,Footnote 17 respectively, statistics which are broadly reflective of gender demographics across the industry. In technical roles, the numbers skew even higher: at all three companies, men make up approximately 80% of engineering roles. In leadership roles across tech firms in the Bay Area, over 72% of positions are held by men.Footnote 18 A joint study conducted by Wired and Element AI, for example, found only 12 percent of leading machine learning researchers were women,Footnote 19 a statistic that has profound implications for future bias embedded in systems that rely on AI. When it comes to ethnic diversity, the numbers are even worse. At Google, Facebook, and Microsoft, white and Asian staff make up 87%, 89%,Footnote 20 and 90% of all roles, respectively,Footnote 21 and in technical and leadership roles, the numbers again increase dramatically. For women and people of color who do make it into these roles, data suggests their pay is typically far less than that of their white and Asian male colleagues.Footnote 22

The problem with focusing on diversity statistics alone is that numbers fail to offer insight into the attitudes, behaviors, and cultural norms of the industry that drive these dynamics. Diversity reports provide quantitative data—which the tech industry loves—but they do not provide qualitative information about why the numbers are the way they are or how to make them better. While some have suggested there are simply not enough women and people of color applying for engineering jobs, research shows that even when under-represented employees are appointed to technical or leadership roles, many tech companies have difficulty retaining them. In a survey of 716 women who had worked in tech, over a quarter cited an “overtly or implicitly discriminatory” environment as their primary reason for leaving the industry.Footnote 23

Attrition rates in tech are indeed much higher for women and people of color, particularly black and Latin American employees,Footnote 24 suggesting it is likely the industry’s culture, rather than its pipeline, that makes many tech corporations unwelcoming, unfair, and unhealthy environments for those not in the majority. The suggestion that there should be more women and people of color in tech is not wrong—there should be—but embarking on a hiring spree of non-white, non-male employees will not alone change the culture of the industry, which is deeply embedded in its social and organizational psychology. Social change can be a slow and often painful process and it may take years to effectively modify the norms of a large group or an entire industry. Thankfully, there are many people stepping up to the challenge.

In her book Reset: My Fight for Inclusion and Lasting Change, Ellen Pao describes her experience working in the white, male-dominated world of venture capital at Kleiner Perkins. Pao’s account of Silicon Valley portrays an industry that is not only unwelcoming, but “designed to keep people out who aren’t white men.”

You can’t always get ahead by working hard if you’re not part of the ‘in’ crowd. You will be ostracized no matter how smart you are, how bone-crushingly hard you work, how much money you make for the firm, or even how many times they insist they run a meritocracy. Year after year, we hear the same empty promises about inclusion, and year after year we see the same pitiful results.Footnote 25

Reset chronicles years of discrimination against both Pao and her female colleagues, including pay disparities and promotions that were repeatedly reserved for male colleagues. Women were consistently driven out of the firm; few lasted more than two to three years. The world Pao portrays in Reset is one of homogeny perpetuated by bias and favoritism. She recalls her former boss speaking to the National Venture Capital Association, describing ideal tech founders as “white, male, nerds who’ve dropped out of Harvard or Stanford” and have absolutely “no social life,”Footnote 26 perpetuating the false narrative of the consummate engineer: young, Caucasian, and socially skill-less.

Pao eventually filed a discrimination lawsuit against Kleiner Perkins, which she lost, but not before bearing an onslaught of abuse, harassment, and retaliation in her final weeks at the firm. Following her departure, Pao founded Project Include, an initiative that advocates for diversity and inclusion in tech. Both Project Include and Reset make strong cases for amending the psychosocial norms of the tech industry such that they are open to and inclusive of everyone, regardless of gender, race, ethnicity, disability, or age. “To make tech truly diverse,” Pao argues, “we need to make all sorts of people feel welcome and set them up to succeed.”Footnote 27

Erica Joy Baker, a founding advisor at Project Include, is an engineer who has worked in tech for over a decade. She recounts a similar environment and dynamic within her Silicon Valley engineering teams, each of which Baker describes as composed predominantly of young, white men. As an African-American woman, Baker recalls feeling that she stuck “out like a sore thumb” in what she soon realized were consistently homogenous surroundings, where she was often neither welcomed nor recognized as an engineer.Footnote 28

I have been mistaken for an administrative assistant more than once. I have been asked if I was physical security (despite security wearing very distinctive uniforms). I’ve gotten passed over for roles I know I could not only perform in, but that I could excel in. Most recently, one such role was hired out to a contractor who needed to learn the language the project was in (which happened to be my strongest language).Footnote 29

Baker describes her time in the Bay Area as great for her career but bad for her as a person, noting cultural dynamics that were, at best, inhospitable and, at worst, sexist, racist, and discriminatory.Footnote 30 The psychological scars such treatment can inflict, particularly when sustained over a period of time, are what led many of the women I spoke to not only to leave the industry, but to do so with the knowledge they would never return.

The environment Pao and Baker describe is emblematic of a pattern in Silicon Valley that has been largely ignored and, in many cases, condoned. The homogeny, bias, and, at times, hostile culture towards those who don’t “fit” have forced Silicon Valley companies to acknowledge an industry-wide working environment that is fundamentally broken and unhealthy, and which no amount of free lunches or company perks can fix. Their accounts also illustrate an industry that fails to understand the distinction between diversity and pluralism. Where the former implies a culture or group that is mixed, pluralism is defined by a sense of inclusion, engagement, and power-sharing. Diversity is measured in numbers; pluralism is demonstrated in environments that value inclusion, equality, and respect. Facebook can hire as many women and people of color as its HR department will allow, but without engaging the voices, talents, and experiences different people bring, diversity in itself remains a rather meaningless aim that ends with quotas and hiring targets. There are perfectly diverse populations where discrimination and harassment are still alive and well. While diversity should continue to be fought for—particularly as a first step towards a more inclusive environment—diversity on its own is not enough, and the complex problems in Silicon Valley’s environment will not be fixed without examining the culture that allows such homogeny to thrive.

The discrimination Pao and Baker depict quietly communicates the belief that women and people of color cannot perform engineering and leadership roles to the same standard as their young, white, male counterparts. In 2017, Google employee James Damore published an internal memo outlining his belief that the “abilities of men and women differ in part due to biological causes and that these differences may explain why we don’t see equal representation of women in tech and leadership.”Footnote 31 Damore suggested companies should “stop assuming that gender gaps imply sexism,” and that women were simply more prone to choose different career paths.Footnote 32 The memo received criticism both within and outside Google and Damore was soon fired for what CEO Sundar Pichai called “advancing harmful gender stereotypes” in the workplace by suggesting “a group of our colleagues have traits that make them less biologically suited to” their work.Footnote 33 Damore’s memo is at once awful and illuminating; perhaps without intending to, Damore illustrated precisely the type of discrimination that runs rampant, unchecked, and unspoken within many Silicon Valley tech companies, and has thus pushed the problem of discrimination in tech to the fore.

The question of how men and women are different, and whether these differences might affect their work, is actually an interesting one—though the research does not point in the direction people like Damore might like. A 2015 study from Iowa State University found that the psychological differences between men and women were far less pronounced than most people assume. In 75 percent of the psychological qualities that were measured, including morality, risk taking, and occupational stress, men and women’s responses overlapped approximately 80 percent of the time. The study’s researchers explain these results suggest that men and women are actually “more similar than most people think, and the majority of perceived differences can be attributed to gendered stereotypes.”Footnote 34 A separate study found that where there are measurable psychometric differences between men and women, these tend to cluster around characteristics such as empathy, compassion, problem-solving, psychological awareness, and social sensitivity, which women collectively are inclined to demonstrate more frequently.Footnote 35 Another study on gender differences found men were more than twice as likely as women to engage in behaviors regarded as unethical.Footnote 36 (Whether these are learned or innate qualities the studies do not say.) Other researchers have mirrored these results, showing that qualities such as collaboration,Footnote 37 empathy,Footnote 38 open-mindedness and maturity,Footnote 39 and social and emotional skills,Footnote 40 tend to be more prevalent amongst women than men. When we consider the value that more gender diversity may bring to the tech industry, the very skills research suggests may be more common in female employees are precisely those that would benefit the industry as it enters the third year of its identity crisis.Footnote 41

Yonatan Zunger, a former Google engineer, has argued that the skills women bring to tech are a welcome addition to the field. “It’s true that women are socialized to be better at paying attention to people’s emotional needs and so on—this is something that makes them better engineers, not worse ones.”Footnote 42 Bob Wyman, who has worked in the industry for over forty years, has written that while men and women may differ in some respects, any purported differences “which are relevant to ‘software’ are culturally imposed.”Footnote 43 Where women often do differ, Wyman suggests, is in their refusal to celebrate and adhere to the distinctly “dysfunctional” male culture that encourages working “ridiculously hard for stupidly long hours… while exhibiting no cultural awareness or social skills.”Footnote 44 Zunger and Wyman’s accounts are the exact opposite of Damore’s: where Damore believes women are biologically less equipped to work as engineers, Zunger and Wyman recognize not only that such beliefs are unfounded, but also that the qualities women do bring to the industry are exactly those it needs most.

There is strong evidence that increasing diversity in the industry would not only elevate the psychological and emotional skillsets that are lacking in tech, but also increase profitability. Studies show companies perform better when they have at least one female executive on the boardFootnote 45,Footnote 46 and companies with a more diverse workforce across all demographic measures tend to have higher profits and earnings. Both racial and gender diversity are associated with “increased sales revenue, more customers, greater market share, and greater relative profits.”Footnote 47 A report by McKinsey similarly found that gender diversity positively impacted profitability and value creation, and that the most ethnically diverse executive teams were on average 33 percent more profitable.Footnote 48 Including more women and people of color in tech, then, could not only help bring about the shift in mindset and social priorities that Silicon Valley so desperately needs, but could also increase financial returns in the process.

This is not only true of gender and ethnicity, of course; increasing diversity across the industry is important in less obvious ways as well. People with different backgrounds and experiences have the capacity to consider Silicon Valley’s issues from a different perspective, which may encourage greater empathy for those using the industry’s products and more effective consideration of those products’ long-term impacts on society. In a 2017 TED talk, Harvard psychologist Susan David reminds her listeners that “diversity isn’t just people, it’s also what’s inside people.” David persuasively contends that this includes thinking of diversity in terms of how we experience emotion. A greater capacity for emotional intelligence, according to David, will result in organizational dynamics that are more agile and resilient across the board. Asking questions such as “What is my emotion telling me?” “Which action will bring me towards my values?” and “Which will take me away from my values?” encourages greater self-awareness and emotional agility, which tend to lead to what David describes as more “values-connected” behaviors.Footnote 49

Attitudes that have allowed beliefs such as Damore’s to proliferate in the tech community are largely the result of unconscious bias, rather than conscious malicious intent. Howard J. Ross, author of Everyday Bias, compares the unconscious assumptions we accumulate throughout our lives to a “polluted river”Footnote 50 that runs through our conscious mind, silently informing what we believe about ourselves or others, often based on false information and mistaken ideas. Ross explains that no one is exempt: we all draw on conscious bias, unconscious bias, “and stereotypes, all of the time… without realizing we are doing it.”Footnote 51 The process of stereotyping is actually a result of evolution. Stereotypes, Ross explains, “provide a shortcut that helps us navigate through our world more quickly, more efficiently, and, our minds believe, more safely,”Footnote 52 and keep us from having to reassess each situation from scratch every time we encounter something or someone new. The downside, of course, is that the same beliefs that ease our decision-making also cause a proliferation of biases, particularly in relation to people who we consider to be in some way different from us.

When left unchecked, discrimination is the inevitable precursor to a host of other issues. Not only does bias influence hiring, interviews, job assignments, and promotions,Footnote 53 it can also drive harassment, bullying, and dysfunctional cultures. Combined with the imbalance of power in Silicon Valley, which generally sits in the hands of white male executives, discrimination has led to intimidation, gross abuses of power, and inappropriate behavior throughout the industry and has given birth to what Caroline McCarthy, a former Google engineer, calls the “rampant and well-documented sexism and sexual harassment”Footnote 54 endemic in Silicon Valley.

One of the first and most famous examples of discrimination and harassment in Silicon Valley is Susan Fowler’s account of her time working at Uber. On her first day at the company, Fowler was sexually propositioned by her manager on Uber’s internal chat system. She took screenshots and brought them to HR, but was told it was the man’s first offense and, given his status as a “high performer,” the company was unwilling to punish him. Instead, he received a warning. Soon after, the same high performer was reported again; HR reiterated to his new accuser that it was his first offense. The situation was escalated to upper management, but no action was taken. When Fowler attempted to transfer to a different team, despite her excellent performance reviews, a book contract with O’Reilly publishing, and multiple speaking engagements, she was blocked from moving within the company. When she attempted to transfer again, she was told her performance reviews had been changed; it was now noted that she showed no signs of “an upward career trajectory.” Sexist attitudes ran deep throughout the organization, resulting in an exodus of female employees, including Fowler, who left after a year, calling Uber “an organization in complete, unrelenting chaos.”Footnote 55 When she joined the company, Fowler’s department was over 25% female; when she attempted to transfer, that number had dropped to 6%; by the time she left, only 3% of the site reliability engineers in the company were women. Before her departure, Fowler attended an all-hands meeting, where she asked a director what was being done to address the depleted numbers of women in the organization: “his reply was, in a nutshell, that the women of Uber just needed to step up and be better engineers.”Footnote 56 At the time of this writing, the company is being investigated by the EEOC over charges of gender inequity,Footnote 57 and has also been accused of attempting to silence not only its own employees, but female riders who have reported harassment and rape by the company’s driver partners.Footnote 58

Fowler’s case may be one of the most notable, but her experience is hardly an anomaly. For every Ellen Pao, Erica Joy Baker, and Susan Fowler, there are countless cases of discrimination, bullying, and harassment that go unreported or which tech companies keep out of the public eye. A 2017 survey found that 60 percent of female employees working in tech in Silicon Valley had experienced unwanted sexual advances.Footnote 59 Thanks to women like Pao, Baker, and Fowler, as well as the #metoo movement, more cases than ever have been reported in the past several years. Some of the most prominent inquiries and investigations of gender and racial harassment and discrimination include:

  • Justin Caldbeck, a venture capitalist and founder of Binary Capital, faced multiple charges of sexual harassment in a suit brought against him by six women. While he immediately denied the claims, he soon took a leave of absence and later resigned.Footnote 60

  • Mike Cagney, CEO of SoFi, stepped down following accusations of harassment and a lawsuit by former employee Brandon Charles, fired after reporting the harassment of female co-workers, “alleging a toxic culture of gender-related discrimination and harassment.”Footnote 61

  • Elizabeth Scott filed a suit against VR company Upload in 2017, after she was fired for issuing a complaint about the inappropriate and “hostile atmosphere” of the company, which Scott alleged included a room in the office with a bed “to encourage sexual intercourse at the workplace,” colloquially known as the “kink room.”Footnote 62

  • A number of charges have been leveled against Tesla, including lawsuits filed on the basis of harassment, racism, discrimination, and homophobia.Footnote 63 One example is an 11-count suit filed by the California Civil Rights Law Group on behalf of DeWitt Lambert that alleges instances of “Race Harassment, Race Discrimination, Sexual Harassment, Retaliation, Failure to Prevent Harassment, Discrimination and Retaliation, Threats of Violence in Violation of the Ralph Act, Violation of the Bane Act, Failure to Accommodate, Failure to Engage in the Interactive Process, and Assault and Battery.”Footnote 64 In addition to denying the claims, Tesla has criticized those who bring charges or complaints against the company, including engineer AJ Vandermeyden, who sued Tesla for harassment and discrimination,Footnote 65 and Tesla factory workers who have complained about working conditions and safety concerns.Footnote 66

  • Software engineer Kelly Ellis accused her senior male colleagues at Google of harassment in 2015, including one manager telling her during a company trip to Hawaii that it was “taking all of [his] self control not to grab” her.Footnote 67

  • Whitney Wolfe sued Tinder in 2014, after she alleged the company’s Chief Marketing Officer, Justin Mateen, referred to her as a “slut” and “whore.” Wolfe also alleged she was not given the co-founder title she deserved because she was female.Footnote 68

  • Tom Preston-Werner, founder of GitHub, resigned in 2014 following sexual harassment charges and an investigation into his behavior toward female colleagues. The company found there to be no “legal wrongdoing” on Preston-Werner’s part, but “evidence of mistakes and errors of judgement.”Footnote 69

In some of the above cases, the accused were found guilty, in others they were not; some left their companies voluntarily, while others were forced out; some cases were found not to have sufficient evidence, while many settled with the plaintiffs out of court. The sheer volume of harassment lawsuits in tech has thrown light onto a culture one case described as “male bravado” combined with “unchecked arrogance” and “a laser focus on growth and financial success while ignoring workplace regulations.” The lawsuit explained how the attitudes of the organization had “filter[ed] down from the leadership team… throughout the company, empowering other managers to engage in sexual conduct in the workplace.” The result was an environment in which sexual harassment was not only condoned, but in which those who spoke out against it were punished.Footnote 70 Even the most forgiving employees of one tech organization under investigation described it as “a company run by young, immature men who were flush with cash and did not know how to handle their power.”Footnote 71

Research has demonstrated, somewhat unsurprisingly, that bullying and harassment lead to a low-quality work environment, not only for those who are victimized, but also for those who witness inappropriate behaviors, which may take the form of “insulting remarks and ridicule, verbal abuse, offensive teasing, isolation, and social exclusion, or the constant degrading of one’s work and efforts.”Footnote 72 Decreased job satisfaction, decreased productivity, and high turnover are among the most common organizational consequences, to say nothing of the psychological effects on those involved, which can include depression, anxiety, and post-traumatic stress.

How Bias Is Encoded

When sexism, racism, and ageism are written into the cultural norms of an industry, it is naïve to think these would somehow not be coded into the products and services that industry produces. Given the homogeny and dysfunctional behavior of an appreciable cohort of Silicon Valley, we shouldn’t be surprised when Google’s photo service tags black people as gorillasFootnote 73; when predatory loans are targeted at racial minoritiesFootnote 74; when research photo collections supported by Facebook and Microsoft associate women with cooking and men with sportsFootnote 75; when parole decisions and risk scores used by courts are grossly biased against black peopleFootnote 76; when hostile online communities target, harass, and threaten women and minoritiesFootnote 77; when video games called RapeLay go viralFootnote 78; or when algorithms automatically produce and sell t-shirts with the words “Keep Calm and Rape A Lot,” “Keep Calm and Grope A Lot,” and “Keep Calm and Knife Her.”Footnote 79 We should be outraged, but we shouldn’t be surprised.

Though they can be incredibly complex and inexplicable even to their creators, algorithms are, at their core, machines that employ a “set of steps that can be used to make calculations, resolve problems and reach decisions.”Footnote 80 And because humans program algorithms, algorithms are encoded with human biases. Cathy O’Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, explains that the danger of training computers using existing data is that our existing data is littered with our own biases. “Algorithms are opinions embedded in code. It’s really different from what most people think of algorithms. They think algorithms are objective and true and scientific. That’s a marketing trick.”Footnote 81 O’Neil also points out that once bias is built into a system, it is incredibly difficult to remove, making it harder to correct our previous stereotypes and assumptions down the road. Instead of making things more fair, as we assume they should, O’Neil argues algorithms “automate the status quo” and encourage us to “repeat our past practices, [and] our patterns.”Footnote 82 Imagine yourself in high school. Would you act the same way, hold the same beliefs, or even use the same phrases you did back then? There’s every chance you’ve changed and matured quite a bit since your teens, given the opportunity to grow and expand your understanding of the world around you. When we take a snapshot of our values and beliefs and freeze them in time, we limit our ability to progress beyond them. When this happens individually, it’s a shame; when we freeze our prejudices, beliefs, and biases in time collectively, it may limit our capacity to grow and advance as a species.
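
To make O’Neil’s point concrete, the following is a minimal sketch in Python, using entirely synthetic, hypothetical data rather than any company’s real pipeline, of how a model trained on biased historical decisions learns to repeat them:

```python
# A minimal sketch of "automating the status quo": train a model on
# synthetic, hypothetical hiring records in which one group was held
# to a higher bar, and watch the model learn the same double standard.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

skill = rng.normal(size=n)            # what should matter
group = rng.integers(0, 2, size=n)    # a protected attribute that shouldn't

# Biased history: members of group 1 needed noticeably more skill to be hired.
hired = (skill - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

model = LogisticRegression().fit(np.column_stack([skill, group]), hired)

# Two equally skilled candidates, one from each group:
for g in (0, 1):
    p = model.predict_proba([[0.5, g]])[0, 1]
    print(f"group {g}: predicted P(hired) = {p:.2f}")
# The model ranks the group-1 candidate lower at identical skill: yesterday's
# bias, faithfully reproduced as tomorrow's "objective" score.
```

Nothing in the sketch is malicious; the model simply learns that group membership predicted past outcomes, which is precisely the pattern-repeating behavior O’Neil describes.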

A 2018 study in the journal Nature explains how using large data sets to program algorithms—whether for social, research, or legal systems—will naturally perpetuate the biases, both conscious and unconscious, that we hold as a society.

A major driver of bias in AI is the training data. Most machine-learning tasks are trained on large, annotated data sets. Deep neural networks for image classification, for instance, are often trained on ImageNet, a set of more than 14 million labelled images. In natural-language processing, standard algorithms are trained on corpora consisting of billions of words. Researchers typically construct such data sets by scraping websites, such as Google Images and Google News, using specific query terms, or by aggregating easy-to-access information from sources such as Wikipedia. These data sets are then annotated, often by graduate students or through crowdsourcing platforms such as Amazon Mechanical Turk.Footnote 83

When a small subset of individuals is responsible for programming algorithms that are used throughout the world, there are bound to be disparities between the world represented in such systems and the world as it actually is. The researchers explain that in the majority of data sets used to program systems and inform research, certain groups are over-represented, while others are under-represented.

More than 45% of ImageNet data, which fuels research in computer vision, comes from the United States, home to only 4% of the world’s population. By contrast, China and India together contribute just 3% of ImageNet data, even though these countries represent 36% of the world’s population. This lack of geodiversity partly explains why computer vision algorithms label a photograph of a traditional US bride dressed in white as ‘bride’, ‘dress’, ‘woman’, ‘wedding’, but a photograph of a North Indian bride as ‘performance art’ and ‘costume.’Footnote 84

The result of having predominantly Western, white, male input into systems such as ImageNet, Google Images, and Mechanical Turk is the assumption of white, male dominance and the proliferation of racial and gendered stereotypes. When converting Spanish articles written by women into English, for example, Google Translate often defaults to “he said/wrote,” assuming the writer is male.Footnote 85 Software developed for Nikon cameras, meant to warn when subjects are blinking, routinely tags Asian subjects as blinking. Algorithms designed to process naming data tend to classify Caucasian names as “pleasant” and African American names as “unpleasant.”Footnote 86 A 2013 study by Harvard researcher Latanya Sweeney found that a greater number of ads on Google and Reuters mentioning “arrest” appeared beside searches for black-identifying names than white-identifying names.Footnote 87 A 2016 study by Boston University and Microsoft found that software “trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent,” yielding responses such as “Man is to computer programmer as woman is to homemaker.”Footnote 88
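
The analogy result from that study is straightforward to reproduce. Below is a sketch, assuming the gensim library and its downloadable copy of the word2vec vectors trained on Google News, the same kind of model the researchers examined:

```python
# A sketch of probing a word-embedding model for gender stereotypes.
# Assumes gensim is installed; the vector file is a large (~1.6 GB) download.
import gensim.downloader as api

vectors = api.load("word2vec-google-news-300")

# The classic analogy query: man is to computer_programmer as woman is to ... ?
print(vectors.most_similar(positive=["woman", "computer_programmer"],
                           negative=["man"], topn=3))
# The Boston University and Microsoft researchers report "homemaker" among
# the top completions, a stereotype the model absorbed from the news corpus.
```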

As algorithms take on ever more significant jobs, they will not only perpetuate grossly racist and sexist stereotypes, but will also have profound, tangible effects on people’s lives. This is particularly problematic in cases where automated systems are used to assist judges in parole decisions, predict areas of future crime, help employers find job candidates, and negotiate contracts.Footnote 89 Because bias is hardwired into the data set, the decisions algorithms hand down are unlikely to be fair or just, as multiple investigations have already demonstrated. In 2014, former U.S. Attorney General Eric Holder asked the U.S. Sentencing Commission to review its use of risk scores, fearing they may be furthering prejudicial behavior in the court system.

Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice… [and] may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.Footnote 90

When the Sentencing Commission did not act on Holder’s suggestion, ProPublica launched an investigation. After analyzing over 7,000 risk scores, ProPublica’s findings corroborated Holder’s concerns: algorithmic risk scores were extremely unreliable in their ability to forecast crime (only 20 percent of those predicted to commit violent crimes actually did so). ProPublica also demonstrated that the algorithm was more likely to label white defendants as low-risk and to “falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.”Footnote 91
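
The headline disparity is a comparison of false-positive rates: among defendants who never reoffended, how often was each group flagged as high risk? A minimal sketch with made-up toy numbers, not ProPublica’s actual data, shows the calculation:

```python
# A minimal sketch of the fairness check ProPublica performed: compare
# false-positive rates across groups. The records below are made up.
import pandas as pd

df = pd.DataFrame({
    "group":      ["black"] * 4 + ["white"] * 4,
    "flagged":    [1, 1, 1, 0,  1, 0, 0, 0],  # algorithm said "high risk"
    "reoffended": [1, 0, 0, 0,  1, 0, 0, 0],  # what actually happened
})

# False-positive rate: of those who did NOT reoffend, how many were flagged?
for group, g in df[df["reoffended"] == 0].groupby("group"):
    print(f"{group}: false-positive rate = {g['flagged'].mean():.0%}")
# Toy output: black 67%, white 0%. In the real data, ProPublica reported
# roughly 45% versus 23%, the "almost twice the rate" cited above.
```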

Em‘bot’iments of Bias

We are only starting to witness the impacts of our prejudices embedded in the machines and systems we create. If we take a moment to extrapolate these trends to future creations, we can easily imagine a world in which our most appalling impulses and reprehensible ideas are built into the fabric of everyday technologies. As they become more widespread, there is particular concern that physical robots, automated bots, and other anthropomorphized tools will continue to be programmed or designed without appropriate oversight and ethical considerations. Without meaningful civic discussion and appropriate governmental regulation, the machines to which we relegate our work may amplify rather than alleviate current social problems, such as the inequality and discrimination uncovered by ProPublica and others.

Some of the most obvious examples of technological bias are the personas of bots and digital assistants, which have been fashioned almost exclusively to mimic women. In an article titled “We Don’t Need Robots That Resemble Humans,” Professor Evan Selinger points out that the names bestowed upon most bots “ring gendered bells” and the services they perform are “historically associated with stereotypes of women’s work and women’s emotional labor.”Footnote 92 By assigning female rather than male voices and personas to popular digital assistants such as Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana, Google’s OK Google, and Facebook’s now-defunct M, there is an implicit correlation between women and helping and administrative positions. As Adrienne LaFrance points out, “the whole point of having a digital assistant is to have it do stuff for you. You’re supposed to boss it around.”Footnote 93 There are conflicting opinions about and rationales for using a female versus a male or gender-neutral bot as an assistant, but there is little conclusive evidence to suggest a reason for their prevalence that does not involve prejudice, objectification, and uneven power dynamics.Footnote 94,Footnote 95

In the same way assigning a gender to machines “risks amplifying social prejudices and incentivizing objectification,”Footnote 96 the decision to anthropomorphize robots can take on a racial dimension as well. A quick online image search will confirm that the majority of domestic robots are white, while more menacing, Terminator-like robots, such as those developed by Boston Dynamics, tend to be darker. A 2018 study found “people perceive robots with anthropomorphic features to have race, and as a result, the same race-related prejudices that humans experience extend to robots.”Footnote 97 One group of researchers found that when a robot “has the physical appearance of a member of another race [it] is treated as a member of an outgroup and perceived as less human-like by people with racial prejudices.”Footnote 98 A second study found that when robots were perceived to be of the same group, participants were more likely to interact with and evaluate them positively.Footnote 99 The phenomenon illustrated by this research is known as ingroup-outgroup bias, a form of social classification in which people identify with and favor those they perceive as similar. In the same way developers should be aware of the implications of humanizing or assigning gender to robots, care must also be taken to avoid perpetuating racial or ethnic bias as robots become more prevalent in our everyday lives. While it may seem inconsequential to some that Alexa the digital assistant is a woman and Pepper the mall robot is white, it is useful to question why these are the default options engineers and roboticists have collectively deemed most appropriate. The problem is not that the individual developers and entrepreneurs in Silicon Valley are horribly racist, sexist people, but that we all exhibit subtle biases of which we are unaware. “We are all biased. We’re all racist and bigoted in ways that we wish we weren’t, in ways that we don’t even know,” explains O’Neil, “and we are injecting those biases into the algorithms.”Footnote 100

When we consider the environment of Silicon Valley, we can safely observe that there remains work to be done. Beyond the effects of exclusion, discrimination, and algorithmic bias, the tech industry as a whole suffers as a result of the attitudes and prejudices it condones. The lack of women and people of color in engineering and leadership roles raises—or should raise—the question of what is lost because of their absence and what kind of environment and culture the industry would like to prioritize moving forward.