In this week’s reading, we focused on algorithms. Algorithms of Oppression by Safiya Umoja Noble offers many examples of the ways that algorithms can fail, with a specific focus on marginalized groups. In chapter 1, Noble draws on personal experience to show how Google’s search bar has often returned sexualized results for simple searches about underrepresented groups. Chapter 2 looks specifically at how, and more importantly why, Black people and the Black Lives Matter movement became monetized as a result of Google’s lack of diversity in its workforce. Chapter 3 then examines the motive behind the terrorist attack in Charleston, South Carolina, and how Dylann Roof’s searches about racial crime fed him a false narrative that helped incite his massacre. Chapter 4 looks at data retention and how it can become harmful and impinge on public rights. Chapter 5 gives many examples, such as racial slurs, of how information gets stored in a database and is then treated as fact in particular information systems, like libraries. And finally, chapter 6 reflects on why the book as a whole matters for our understanding of algorithms and why lawful regulation is needed.
I found this book incredibly interesting, as Noble connects many of her ideas to everyday civic experiences, which made the text easier to navigate. I was specifically intrigued by the chapters that tackled the false Google results that led to false “racial attitudes,” and data retention (p. 110 & 119).
Noble goes into detail on the events that transpired before the deadly shooting in South Carolina, and on the risk that went unaccounted for in Google’s results. I immediately made the connection to the way the current political climate has produced the phrase “fake news.” In both instances, we see very powerful entities spreading false information that essentially incites violence, and that should therefore be regulated. In the case of the president, his tweets were (finally) flagged for his nonsense. In a similar sense, Noble is only asking for accountability on Google’s behalf: to fact-check, and to change its algorithm’s past practices. Noble states that “search results can reframe our thinking and deny us the ability to engage deeply with essential information” (p. 116). The comparison of Roof to any other individual looking for information shows the authority Google holds as a supplier of seemingly reliable, definitive data. We rarely question Google; we simply search, find what we’re looking for, and move on. Other than a college student or two, I’ve never seen anyone ask for sources when they look up any sort of significant information. And as of now, Google will bold the most relevant result and push it to the top of the page (was the algorithm doing that in 2018? I’m not sure).
The section on data retention also caught my attention, because I had never thought about how significant it is for data to be erased or forgotten. I understand why certain things should be off-limits when searching through public data, and the stories, although compelling, had me wondering whether this privacy issue is of Google’s doing, or more the surrounding community’s discomfort with certain hot topics (i.e., sex work or employing past criminal offenders). I understand the problems that arise when algorithms attach themselves to minority groups, and I’d like to clarify that I am not at all arguing that Google is responsible simply for its failure to forget and erase data after it falsely promised to do so.
Questions: This book was published not too long ago. With that said, in what ways have platforms changed to help combat the racism produced by search engines, or by platforms in general?