Most conversations about racism in America focus on individual acts and people. Was LuAnn wearing blackface as Diana Ross racist? When white people use the n-word ironically, is that racist? Is Steve King of Iowa racist? (Yes, yes, and yes.) Unfortunately, those conversations consume so much of the air in the room that there is little energy left to focus on the structures that reinforce and sustain racism, systems that guarantee racist oppression and disparities would continue even if not a single racist epithet were ever uttered again anywhere in the world.

The problem is that structural or systemic racism is usually invisible, which makes books like Algorithms of Oppression by Safiya Umoja Noble so critical to advancing justice. Noble happened to google “black girls,” and the autocomplete prediction that came up was appallingly pornographic. She wondered what kind of message that sends, how the imprimatur of search, which many people mistake for research, implicitly endorses the results no matter how problematic they may be. That moment motivated years of research into the algorithms that have so much power to define and shape our world. These are instructions, code that is private and proprietary, code that operates with a kind of impunity. It’s code; how can code be racist?

But these sorts of errors have been part of our information systems all along. Consider the ubiquitous Dewey Decimal System or the Library of Congress classifications. When activists lobbied successfully to get the Library of Congress to stop classifying immigrants without documents as illegal aliens, Congress hurriedly introduced a law to prohibit political influence in the classifications, as though the original classifications were not political already. In another example, although most people in the world belong to other faiths and most of the texts in the world are from other faiths, eighty percent of the classification labels in the system are related to Christianity. Even the geography of Asia and Africa gets the short end of the classification stick. No one sat down to create a racist classification system; it was simply a matter of attention to and awareness of white culture, alongside attention withheld from and ignorance of non-white cultures and sources.

White supremacy is often a matter of simply being the default. By being the default, nothing needs to be said; it just is. Noble makes the critical point that bad search results are the product of decisions made in creating the algorithms that govern what search produces. That those algorithms can be modified for better results is evident from Google’s decision to downgrade the rank of porn results. That result ranking can be manipulated is evidenced by the large search engine optimization industry, by Dan Savage’s cyber-prank “santorum,” and by Google’s own actions.

Noble does an excellent job making her point. She builds her case carefully and expands beyond the readily obvious, looking deeper at card catalogs and at the Library of Congress, and arguing that if information retrieval is biased, the basic facts needed to redress inequity are not even available. When examining information classification, archival practices, and search algorithms through the lens of critical race theory, Noble strives to be dispassionate and analytical. That makes those chapters a bit of a struggle. Some might say she becomes “overly academic,” but that reminds me of this tweet.

It’s actually not that there is too much academic jargon. There isn’t. It is that she sometimes writes dispassionately about something we know she cares about passionately. If we care about justice, we must be passionate about it, too. The thing is, when people write about race and gender and challenge the power structure, they will be attacked. Writing dispassionately, analytically, dare we say academically, is a necessary bulwark against those attacks.

Her passion comes through loud and clear in the final chapter. There is a riff on how technology will not save us that would make Gil Scott-Heron proud. She takes down many popular shibboleths of the technocrats: the idea that colorblindness is good, that racism has been solved, that the solution to racism is an individual practice, that we are postracial, that all we need to do to fix racism is fix the feeder system into education and industry. She presents the evidence and has the receipts. This is a myth-busting book, busting the comfortable myth that technology and computers can’t be racist. They are made by people; of course they can be.

I received an e-galley of Algorithms of Oppression from the publisher through NetGalley.
