Algorithms of Oppression: How Search Engines Reinforce Racism

by Safiya Umoja Noble
New York University Press


A revealing look at how negative biases against women of color are embedded in search engine results and algorithms

Run a Google search for “Black girls”—what will you find? “Big Booty” and other sexually explicit terms are likely to come up as top search terms. But if you type in “white girls,” the results are radically different. The suggested porn sites and un-moderated discussions about “why Black women are so sassy” or “why Black women are so angry” present a disturbing portrait of Black womanhood in modern society.

In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color.

Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance—operating as a source for email, a major vehicle for primary and secondary school learning, and beyond—understanding and reversing these disquieting trends and discriminatory practices is of utmost importance.

An original, surprising and, at times, disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.

Product Details

ISBN-13: 9781479837243
Publisher: New York University Press
Publication date: 02/20/2018
Edition description: New Edition
Pages: 248
Sales rank: 115,928
Product dimensions: 6.00(w) x 8.90(h) x 1.00(d)

About the Author

Safiya Umoja Noble is Professor of Gender Studies and African American Studies at the University of California, Los Angeles (UCLA). She is the co-founder and faculty director of the UCLA Center for Critical Internet Inquiry (C2i2). In 2021, she was awarded a MacArthur Fellowship for her ground-breaking work in critical information and algorithm studies. She is also the recipient of the 2023 Miles Conrad Award, a lifetime achievement award for those working in the information community.

Read an Excerpt


A Society, Searching

On October 21, 2013, the United Nations launched a campaign directed by the advertising agency Memac Ogilvy & Mather Dubai using "genuine Google searches" to bring attention to the sexist and discriminatory ways in which women are regarded and denied human rights. Christopher Hunt, art director of the campaign, said, "When we came across these searches, we were shocked by how negative they were and decided we had to do something with them." Kareem Shuhaibar, a copywriter for the campaign, described on the United Nations website what the campaign was determined to show: "The ads are shocking because they show just how far we still have to go to achieve gender equality. They are a wake up call, and we hope that the message will travel far." Over the mouths of various women of color were the autosuggestions that reflected the most popular searches that take place on Google Search. The Google Search autosuggestions featured a range of sexist ideas such as the following:

• Women cannot: drive, be bishops, be trusted, speak in church

• Women should not: have rights, vote, work, box

• Women should: stay at home, be slaves, be in the kitchen, not speak in church

• Women need to: be put in their places, know their place, be controlled, be disciplined

While the campaign employed Google Search results to make a larger point about the status of public opinion toward women, it also served, perhaps unwittingly, to underscore the incredibly powerful nature of search engine results. The campaign suggests that search is a mirror of users' beliefs and that society still holds a variety of sexist ideas about women. What I find troubling is that the campaign also reinforces the idea that it is not the search engine that is the problem but, rather, the users of search engines who are. It suggests that what is most popular is simply what rises to the top of the search pile. While serving as an important and disturbing critique of sexist attitudes, the campaign fails to implicate the algorithms or search engines that drive certain results to the top. This chapter moves the lens onto the search architecture itself in order to shed light on the many factors that keep sexist and racist ideas on the first page.

One limitation of looking at the implications of search is that it is constantly evolving and shifting over time. This chapter captures aspects of commercial search at a particular moment — from 2009 to 2015 — but surely by the time readers engage with it, it will be a historical rather than contemporary study. Nevertheless, the goal of such an exploration of why we get troublesome search results is to help us think about whether it truly makes sense to outsource all of our knowledge needs to commercial search engines, particularly at a time when the public is increasingly reliant on search engines in lieu of libraries, librarians, teachers, researchers, and other knowledge keepers and resources.

What is even more crucial is an exploration of how people living as minority groups under the influence of a majority culture, such as people of color and sexual minorities in the United States, are often subject to the whims of the majority and other commercial influences such as advertising when trying to affect the kinds of results that search engines offer about them and their identities. If the majority rules in search engine results, then how might those who are in the minority ever be able to influence or control the way they are represented in a search engine? The same might be true of how men's desires and usage of search is able to influence the values that surround women's identities in search engines, as the Ogilvy campaign might suggest. For these reasons, a deeper exploration into the historical and social conditions that give rise to problematic search results is in order, since rarely are they questioned and most Internet users have no idea how these ideas come to dominate search results on the first page of results in the first place.

Google Search: Racism and Sexism at the Forefront

My own encounter with racism in search came to me through an experience that pushed me, as a researcher, to explore the mechanisms — both technological and social — that could render the pornification of Black women a top search result, naturalizing Black women as sexual objects so effortlessly. My first encounter was in 2009 when I was talking to a friend, André Brock at the University of Michigan, who casually mentioned one day, "You should see what happens when you Google 'black girls.'" I did and was stunned. I assumed it to be an aberration that could potentially shift over time. I kept thinking about it. The second time came one spring morning in 2011, when I searched for activities to entertain my preteen stepdaughter and her cousins of similar age, all of whom had made a weekend visit to my home, ready for a day of hanging out that would inevitably include time on our laptops. In order to break them away from mindless TV watching and cellphone gazing, I wanted to engage them in conversations about what was important to them and on their mind, from their perspective as young women growing up in downstate Illinois, a predominantly conservative part of Middle America. I felt that there had to be some great resources for young people of color their age, if only I could locate them. I quickly turned to the computer I used for my research (I was pursuing doctoral studies at the time), but I did not let the group of girls gather around me just yet. I opened up Google to enter in search terms that would reflect their interests, demographics, and information needs, but I liked to prescreen and anticipate what could be found on the web, in order to prepare for what might be in store.
What came back from that simple, seemingly innocuous search was again nothing short of shocking: with the girls just a few feet away giggling and snorting at their own jokes, I again retrieved a Google Search results page filled with porn when I looked for "black girls." By then, I thought that my own search history and engagement with a lot of Black feminist texts, videos, and books on my laptop would have shifted the kinds of results I would get. It had not. In intending to help the girls search for information about themselves, I had almost inadvertently exposed them to one of the most graphic and overt illustrations of what the advertisers already thought about them: Black girls were still the fodder of porn sites, dehumanizing them as commodities, as products and as objects of sexual gratification. I closed the laptop and redirected our attention to fun things we might do, such as see a movie down the street. This best information, as listed by rank in the search results, was certainly not the best information for me or for the children I love. For whom, then, was this the best information, and who decides? What were the profit and other motives driving this information to the top of the results? How had the notion of neutrality in information ranking and retrieval gone so sideways as to be perhaps one of the worst examples of racist and sexist classification of Black women in the digital age yet remain so unexamined and without public critique? In that moment, I began in earnest a series of research inquiries that are critical in my work now.

Of course, upon reflection, I realized that I had been using the web and search tools long before the encounters I experienced just out of view of my young family members. It was just as troubling to realize that I had undoubtedly been confronted with the same type of results before but had learned, or been trained, to somehow become inured to it, to take it as a given that any search I might perform using keywords connected to my physical self and identity could return pornographic and otherwise disturbing results. Why was this the bargain into which I had tacitly entered with digital information tools? And who among us did not have to bargain in this way? As a Black woman growing up in the late twentieth century, I also knew that the presentation of Black women and girls that I discovered in my search results was not a new development of the digital age. I could see the connection between search results and tropes of African Americans that are as old and endemic to the United States as the history of the country itself. My background as a student and scholar of Black studies and Black history, combined with my doctoral studies in the political economy of digital information, aligned with my righteous indignation for Black girls everywhere. I searched on.

What each of these searches represents are Google's algorithmic conceptualizations of a variety of people and ideas. Whether looking for autosuggestions or "answers" to various questions or looking for notions about what is beautiful or what a professor may look like (which does not account for people who look like me who are part of the professoriate — so much for "personalization"), Google's dominant narratives reflect the kinds of hegemonic frameworks and notions that are often resisted by women and people of color. Interrogating what advertising companies serve up as credible information must happen, rather than have a public instantly gratified with stereotypes in three-hundredths of a second or less.

In reality, information monopolies such as Google have the ability to prioritize web search results on the basis of a variety of topics, such as promoting their own business interests over those of competitors or smaller companies that are less profitable advertising clients than larger multinational corporations are. In this case, the clicks of users, coupled with the commercial processes that allow paid advertising to be prioritized in search results, mean that representations of women are ranked on a search engine page in ways that underscore women's historical and contemporary lack of status in society — a direct mapping of old media traditions into new media architecture. Problematic representations and biases in classifications are not new. Critical library and information science scholars have well documented the ways in which some groups are more vulnerable than others to misrepresentation and misclassification. They have conducted extensive and important critiques of library cataloging systems and information organization patterns that demonstrate how women, Black people, Asian Americans, Jewish people, or the Roma, as "the other," have all suffered from the insults of misrepresentation and derision in the Library of Congress Subject Headings (LCSH) or through the Dewey Decimal System. At the same time, other scholars underscore the myriad ways that social values around race and gender are directly reflected in technology design. Their contributions have made it possible for me to think about the ways that race and gender are embedded in Google's search engine and to have the courage to raise critiques of one of the most beloved and revered contemporary brands.

Search happens in a highly commercial environment, and a variety of processes shape what can be found; these results are then normalized as believable and often presented as factual. The associate professor of sociology at Arizona State University and former president of the Association of Internet Researchers Alex Halavais points to the way that heavily used technological artifacts such as the search engine have become such a normative part of our experience with digital technology and computers that they socialize us into believing that these artifacts must therefore also provide access to credible, accurate information that is depoliticized and neutral:

Those assumptions are dangerously flawed; ... unpacking the black box of the search engine is something of interest not only to technologists and marketers, but to anyone who wants to understand how we make sense of a newly networked world. Search engines have come to play a central role in corralling and controlling the ever-growing sea of information that is available to us, and yet they are trusted more readily than they ought to be. They freely provide, it seems, a sorting of the wheat from the chaff, and answer our most profound and most trivial questions. They have become an object of faith.

Unlike the human-labor curation processes of the early Internet that led to the creation of online directories such as Lycos and Yahoo!, in the current Internet environment, information access has been left to the complex algorithms of machines to make selections and prioritize results for users. I agree with Halavais, and his is an important critique of search engines as a window into our own desires, which can have an impact on the values of society. Search is a symbiotic process that both informs and is informed in part by users. Halavais suggests that every user of a search engine should know how the system works, how information is collected, aggregated, and accessed. On one level, to achieve this vision, the public would have to have a high degree of computer programming literacy to engage deeply in the design and output of search. Alternatively, I draw an analogy that one need not know the mechanism of radio transmission or television spectrum or how to build a cathode ray tube in order to critique racist or sexist depictions in song lyrics played on the radio or shown in a film or television show. Without a doubt, the public is unaware and must have significantly more algorithmic literacy. Since all of the platforms I interrogate in this book are proprietary, even if we had algorithmic literacy, we still could not intervene in these private, corporate platforms.

To be specific, knowledge of the technical aspects of search and retrieval, in terms of critiquing the computer programming code that underlies the systems, is absolutely necessary to have a profound impact on these systems. Interventions such as Black Girls Code, an organization focused on teaching young African American girls to program, are the kind of intervention we see building in response to the ways Black women have been locked out of Silicon Valley venture capital and broader participation. Simultaneously, it is important for the public, particularly people who are marginalized — such as women and girls and people of color — to be critical of the results that purport to represent them in the first ten to twenty results in a commercial search engine. They do not have the economic, political, and social capital to withstand the consequences of misrepresentation. If one holds a lot of power, one can withstand or buffer misrepresentation at a group level and often at the individual level. Marginalized and oppressed people are linked to the status of their group and are less likely to be afforded individual status and insulation from the experiences of the groups with which they are identified. The political nature of search demonstrates how algorithms are a fundamental invention of computer scientists who are human beings — and code is a language full of meaning and applied in varying ways to different types of information. Certainly, women and people of color could benefit tremendously from becoming programmers and building alternative search engines that are less disturbing and that reflect and prioritize a wider range of informational needs and perspectives.

There is an important and growing movement of scholars raising concerns. Helen Nissenbaum, a professor of media, culture, and communication and computer science at New York University, has written with Lucas Introna, a professor of organization, technology, and ethics at the Lancaster University Management School, about how search engines bias information toward the most powerful online. Their work was corroborated by Alejandro Diaz, who wrote his dissertation at Stanford on sociopolitical bias in Google's products. Kate Crawford and Tarleton Gillespie, two researchers at Microsoft Research New England, have written extensively about algorithmic bias, and Crawford recently co-organized a summit with the White House and New York University for academics, industry, and activists concerned with the social impact of artificial intelligence in society. At that meeting, I participated in a working group on artificial-intelligence social inequality, where tremendous concern was raised about deep-machine-learning projects and software applications, including concern about furthering social injustice and structural racism. In attendance was the journalist Julia Angwin, one of the investigators of the breaking story about courtroom sentencing software Northpointe, used for risk assessment by judges to determine the alleged future criminality of defendants. She and her colleagues determined that this type of artificial intelligence miserably mispredicted future criminal activity and led to the overincarceration of Black defendants. Conversely, the reporters found it was much more likely to predict that White criminals would not offend again, despite the data showing that this was not at all accurate.
Sitting next to me was Cathy O'Neil, a data scientist and the author of the book Weapons of Math Destruction, who has an insider's view of the way that math and big data are directly implicated in the financial and housing crisis of 2008 (which, incidentally, destroyed more African American wealth than any other event in the United States, save for not compensating African Americans for three hundred years of forced enslavement). Her view from Wall Street was telling:

The math-powered applications powering the data economy were based on choices made by fallible human beings. Some of these choices were no doubt made with the best intentions. Nevertheless, many of these models encoded human prejudice, misunderstanding, and bias into the software systems that increasingly managed our lives. Like gods, these mathematical models were opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, were beyond dispute or appeal. And they tended to punish the poor and the oppressed in our society, while making the rich richer.


Excerpted from "Algorithms of Oppression"
by Safiya Umoja Noble.
Copyright © 2018 New York University.
Excerpted by permission of New York University Press.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.

Table of Contents

Acknowledgments ix

Introduction: The Power of Algorithms 1

1 A Society, Searching 15

2 Searching for Black Girls 64

3 Searching for People and Communities 110

4 Searching for Protections from Search Engines 119

5 The Future of Knowledge in the Public 134

6 The Future of Information Culture 153

Conclusion: Algorithms of Oppression 171

Epilogue 183

Notes 187

References 201

Index 219

About the Author 229
