
Latest Read: Algorithms of Oppression

Algorithms of Oppression: How Search Engines Reinforce Racism
by Safiya Umoja Noble. Noble is an associate professor at UCLA and the Co-Founder and Co-Director of the UCLA Center for Critical Internet Inquiry. Her research examines how bias becomes embedded in search engines.


Search engine algorithms are clearly not neutral. This was already a disturbing issue at the time of publication in 2018.

So how did this happen in the first place? It is shocking that a seemingly simple search for “black girls” once returned such degrading results.

Noble exposes this unforgiving gap, and Google has since made efforts to fix its errors. Her work popularized the term algorithmic oppression.

Noble explores how racism, especially anti-Blackness, is generated and maintained across the internet, though her analysis focuses squarely on Google.

Noble also reveals the impact of AdWords, Google’s advertising tool. I found it interesting that, since search results are shaped by paid advertising, Google is arguably more of an advertising company than a search engine company.

Is Google doing evil?

Was Google unknowingly writing algorithms that produced racist outcomes? Maybe. Yet when the American Jewish community raised concerns about search results influenced by far-right organizations, it appears the results were filtered.

Google’s assertion that its search results, though problematic, were computer generated (and thus not the company’s fault) was apparently a good-enough answer for the Anti-Defamation League (ADL), which declared, “We are extremely pleased that Google has heard our concerns and those of its users about the offensive nature of some search results and the unusually high ranking of peddlers of bigotry and anti-Semitism.” The ADL does acknowledge on its website its gratitude to Sergey Brin, cofounder of Google and son of Russian Jewish immigrants, for his personal letter to the organization and his mea culpa for the “Jew” search-term debacle.
p. 104

We remain an internet-driven society. Noble has well documented the impacts on African American communities, and to a lesser extent on Hispanic and Asian American communities as well.

https://www.wired.com/story/new-formula-help-black-patients-access-kidney-care/

Noble also provides data on Yelp, revealing marketing practices that she likens to a criminal enterprise. This too may surprise the reader.

Who is writing Google’s search algorithms?

So how did we get to this place? How could such racist search results come from the darling Silicon Valley company that promoted the slogan “Don’t be evil”? Google’s initial response was weak, indicating the company was not responsible for search outcomes:

If Google software engineers are not responsible for the design of their algorithms, then who is?
p. 149

Be the difference you wish to see in the world

This is one of the most striking examples of technology mimicking society by reproducing the marginalization of minorities. Google eventually understood the error of its ways: searching for “black girls” no longer returns the legacy results, including “Big Booty” and sexually explicit photos. Google has clearly applied filtering to these searches.

The problem extends far beyond search results. ProPublica has revealed how bias is written into algorithms used by courts in criminal sentencing, and we continue to learn how opaque algorithms affect society today. This echoes Cathy O’Neil’s Weapons of Math Destruction, which describes how hidden algorithms led to the firing of award-winning teachers.

In conclusion, Algorithms of Oppression is a well-researched book that deserves a wider audience. Humans write algorithms, and we must ensure equity within those algorithms.


Harvard Carr Center for Human Rights Policy | Algorithms of Oppression: A Conversation with Dr. Safiya Umoja Noble

The University of British Columbia | Safiya Umoja Noble – “Just Google It”: Algorithms of Oppression

ACLU | Google This: Algorithmic Oppression

Above The Noise | YouTube Algorithms: How To Avoid the Rabbit Hole

UCIBrenICS | Algorithms of Oppression

re:publica | Safiya Umoja Noble: Algorithms of Oppression

PdF YouTube | Challenging the Algorithms of Oppression

TEDxUIUC | How biased are our algorithms?

Reuters | What is data bias, and why should journalists pay attention to it?