Racism & Technology – Kirstin Yip

For 18th November’s discussion, Katie and I led a discussion on the first three chapters of “Algorithms of Oppression” by Safiya Umoja Noble. As a class, we discussed the ways in which we individually interact with Google and the Internet and how they shaped (or did not shape) our upbringing. We shared how Google has changed racist or sexist search results over the years since the book first started collecting data, and saw how algorithms warped our ideas of both race and gender. We delved into the topics of censorship, corporate interests, and finding agency and familiarity within technology (in the creation of safe spaces on the Internet).

We linked Noble’s book to several theories and past readings: Google and Silicon Valley are seen as white spaces, per Anderson’s theory, where people of color (or even people who are not men) are systematically denied access or treated as outsiders when they are present. Linking to our previous discussions around DiAngelo’s White Fragility, we discussed who the onus for representation in algorithms should fall on: the dominant group already present in the industry, or marginalized people of color, through programs such as Black Girls Code. Noble also brought up Omi and Winant’s theory of racial formation on page 79, but we did not discuss it at length.

As a class, we were introduced to new concepts to further our understanding of capitalism vis-a-vis our discussion on Google and algorithms. We learned about the idea of soft power and hard power and Max Weber’s iron cage of rationality. Weber’s idea of the iron cage becomes relevant to analyze technology’s place in our lives as it grows in dominance. Are we okay with giving up our privacy and data and conforming to using racist and sexist software if it’s the most convenient? Is it even possible to avoid using Google and all its algorithms?

I was surprised to learn how many people still use Google as their main search engine; I had expected more pushback, since more people across the globe are beginning to understand its more nefarious, profit-driven intentions. Upon reflection with Prof. Greene, I also understood how connected I still was to Google, through Gmail, Google Docs, Google Slides, Duo, and the use of my Gmail account to link to accounts on other websites. Even if I wasn’t using the main search engine, I was still stuck in the iron cage that is Google’s services.

We also analyzed how soft and hard power could explain whether Google was a form of American or Western imperialism. Noble raised a point about how Google is seen as an international product but is predicated on U.S.-based norms, which several classmates refuted, or attributed to more ‘Western’ ideals rather than American ideals.

I found this particularly interesting. As someone who grew up outside of the United States, I knew Google was from the U.S., but I never doubted its knowledge of regional contexts given that it had country-specific domains like .sg for Singapore or .uk for Britain. I couldn’t see how something like Google – which to me was more like a sandbox – could impose any norms on its users. It’s still difficult for me to see this in my own life, but I now have a new lens to view it with as I go about my further interactions with Google.

When I presented the article about using your forearm as a ‘pass’ to get into Black People Twitter, I was surprised at the responses from the class and how many people thought it was unacceptable. I disagree and think that people are entitled to safe spaces, even digitally, especially people who have been denied and degraded for so long in these same spaces. I don’t think using your forearm is the best way to go about it, for efficiency’s sake, but I understand the intention behind it.

We could have spent more time on technological redlining and algorithmic oppression, the terms Noble uses to frame the discussion. Noble also suggests legal protections and education as remedies, which we could have touched on, along with more general deliberation on what we can do to counter this technological iron cage.

Abby raised an interesting question about which advertisers pay Google the most. I wasn’t able to find any information for recent years, but a chart revealed that Amazon was the top spender in 2013, followed by Priceline, AT&T, and Expedia. (Source) More recent data from 2018 shows the industries that spend the most on Google advertising: retail, automotive, and telecom. These are interesting things to consider about how deeply capitalism and technology have become intertwined.

5 thoughts on “Racism & Technology – Kirstin Yip”

  1. cbritotr

    After a great presentation by Kirstin and Katie on the first three chapters of Algorithms of Oppression, Connor and I followed up with our own discussion of some topics brought up in the latter half of the same book. We spoke of a variety of things in our presentation, highlighting some everyday encounters with racism in technology. Included in this were the soap dispensers that could only function on lighter skin tones, and the Kodak “Shirley card” reference films that calibrated color to make people with lighter skin look better.

    We then looked in depth at the most recent diversity report released by Google. We highlighted the ways in which these diversity reports by tech companies like Google are often very misleading and do not tell the full story of what is truly going on within the company. I am very curious to learn what people of color within these companies believe about the job hierarchy there. I wonder if they truly believe they can be promoted to higher positions, or maybe they are simply happy to be a person of color within the company in the first place. My gut tells me that a lot of these people go into these companies with tons of ambition and expectation to succeed, but are then hit with a sad reality: the job hierarchy within big corporations like Google almost always favors whiteness, whether intentionally or not. This brings me to the notion of colorblind racism within these companies, and how much it baffles me that people believe colorblind racism does not exist. If you can look further into the numbers these companies produce, I really suggest you do so. It’s amazing how these companies project a diverse image when in reality it’s all fake.

    We also looked at the huge trend of revenge porn, a very awful trend that is not only allowed through technology, but almost encouraged. It baffles me that there are whole websites dedicated to revenge porn. These sites create a safe space for you to potentially ruin another person’s life. I just don’t understand how this can be allowed, and I’m even more frustrated with how long it takes to remove something from a site. It all comes down to the right to privacy, and a lack of privacy that is extremely overlooked and often pushed under the rug. These companies give you giant lists of terms and agreements that almost everyone skips through and never reads. These lists act as barriers for the customer, barriers with an easy fix: a quick click agreeing to a set of terms. This allows millions of users to opt into things they have no idea they are opting into. It’s wild to think that so many people, including myself, are being taken advantage of, losing the privacy we perceive we have. The scariest part is that no one seems to mind this loss of privacy. We are very aware of it, but it just doesn’t seem to affect us directly – until we each face our own personal problem and realize that loss of privacy.

  2. kcarter2

    I think what shocked me the most about this unit was the revenge porn. First of all, I didn’t know that was a thing, and I felt it was an odd thing for our society to contribute to. I also agree with Kirstin when she mentions that it’s frustrating how these sites create spaces for revenge porn. I think this is absurd, because sites should be monitoring things like this, and when they don’t, it opens the door for things like revenge porn to seem acceptable.

  3. jscotlan

    I thought it was a very interesting discussion about Google and its algorithms. Our talk about algorithms being inherently biased resonated with me because we discussed how code is typically as biased as its programmer. Thinking of the instances you brought up during your presentation, when Google has come under fire for offensive images and results and used algorithmic bias as an excuse, makes me wonder who in the Google headquarters is making the decisions about algorithms. Is it even possible to completely control what an algorithm spews out? This concern is exacerbated by our discussion of Google and Silicon Valley being white spaces. If the technological workforce is filled with white people, then there is not a balanced system doing the work to decrease algorithmic bias. It is important that more people of color are given that educational push into computer science and coding at a young age, which, as we discussed, is not our current reality.

  4. slisle

    I thought it was really interesting how the class agreed that even though we’re freaked out by how much Google tracks us and stores our personal information, we’re unwilling to give up the luxury of these modern technologies. It makes me think about how big an influence Google has on our lives and how powerful the corporation is. I also thought it was interesting how a lot of the software is designed for white people, like how facial recognition software easily recognizes white faces but doesn’t detect Black people as well. I think that in order to change the way these technologies are designed in the future, Google needs to hire more people of color for tech positions. They claim that they’re a diverse company, but in reality, a lot of the women and people of color are hired to work jobs that have nothing to do with the software.

  5. scuevasl

    This conversation was very interesting throughout because it examines so many of the implicit dynamics of our data-driven society and how powerful it is. In a way, we are being monitored constantly through our data and what we feed our computers. The conversation revolving around convenience and privacy is important for considering what we value now, and it is absolutely insane. Considering our class doesn’t really know much about living in a world without computers, this highlights other issues about what we value. Furthermore, the way this data is used to push a company’s agenda is scary because it also pushes other power dynamics and prioritizes certain races over others. Individual users have this same power, as we saw with the pictures-of-forearms conversation, and the subjectivity of the issues makes it difficult to consider different perspectives.
