Module 3: Screen Media Cultures

3.9 Algorithms of Oppression

The final segment of today’s module turns away from representation and toward algorithmic culture. Our everyday lives are mediated by algorithms: targeted ads on Facebook, Google search results, and social media feeds all result from murky, invisible, machinic processes. For example, many online activists, particularly queer people and women of colour, find that their content is wrongly flagged as offensive or ‘shadowbanned’ on Instagram and Facebook (see this report by Salty, a digital newsletter: https://saltyworld.net/algorithmicbiasreport-2/). Shadowbanning occurs when social media algorithms show certain accounts, products, posts, and stories to very few people, typically without the account holder’s knowledge.

It is difficult for many of us to discover or understand which algorithms shape our digital experiences. Often, we do not know who created them or what social and political frameworks or financial incentives affect the way they function. While algorithms are often misrepresented or misunderstood as neutral and unbiased (unlike humans!), algorithms are produced by people within cultures and are anything but neutral.

The guiding text for this section of the class is Safiya Noble’s book Algorithms of Oppression (2018). Noble articulately and thoughtfully dissects the mainstream attitude that search results are democratic, reminding us that search results on Google are often paid-for ads masquerading as popular results, and that companies have found ways to use keywords to push their pages higher in search rankings.

Oftentimes, companies and organizations that are more homogenous online (that is, more like one another) are more successful in search algorithms (Caplan and Boyd, 2018). The most widely shown social media accounts and top results on search engines are not necessarily the most ‘useful’ or popular; rather, they belong to companies and accounts that conform to normative standards of what counts as popular.

Noble discusses the white masculine culture of Silicon Valley, where so many of the algorithms that govern our apps and social media platforms are crafted, which often results in gender and racial biases in our technologies. Noble insists that “where men shape technology, they shape it to the exclusion of women, especially Black women” (p. 107). Furthermore, “popular” search results that emerge in a culture shaped by white supremacy and patriarchy lead to the reinforcement of racist and sexist stereotypes online. For example, Noble writes that Google image searches for “professional hairstyles” typically return pictures of white women, while image searches for “unprofessional hairstyles” are more likely to show natural Black hair (p. 83).

Noble is not the only scholar concerned with the undemocratic internet and with challenging the myth that computer processes are neutral. In The People’s Platform (2014), Astra Taylor writes that

“the algorithms being created are likely to reflect the dominant social norms of our day and, perhaps, be even more discriminatory than the people who devised them” (p. 132).

Furthermore, Pamela Graham, in her article “An Encyclopedia, Not an Experiment in Democracy: Wikipedia Biographies, Authorship, and the Wikipedia Subject,” explains that the key demographic of Wikipedia contributors is college-educated, middle-class white men, typically from North America or Europe (p. 230). Consequently, an authorship bias has been demonstrated in the creation and revision of Wikipedia pages: Western white men are disproportionately deemed worthy of having their own pages, and whiteness and masculinity are overrepresented in the encyclopaedia. Ultimately, Graham concludes that,

“the power dynamics that exist in traditional print, television, and filmic biographies— where some subjects are deemed more worthy than others—do not simply disappear because biography is constructed in a “new” digital space. The hierarchical structure of the site, the cultural conditions and discourse, combined with the authorship bias, create a space that is not as open and egalitarian as the Wikipedia brand initially suggests” (p. 231).

[Image: a white box containing black text that reads, “This Photo is Currently Unavailable,” representing a photo that has been deleted. Source: Thomas Hawk on Flickr, licensed under CC BY-NC 2.0.]

It’s also important to keep in mind that content filters (algorithms that filter out certain search results, or that flag content, such as a nipple on a feminine body, as ‘in violation of terms of use’) are likewise created by corporations that reinforce particular beliefs, boosting some search results and burying others. Meanwhile, content moderation is often constrained by profit margins. For example, when scholar Sarah T. Roberts interviewed commercial content moderators, she was told that when they flag content that violates a company’s terms of use (one example was blackface), they are sometimes instructed to leave it up if the post has gone viral and is popular.

Algorithms operate according to profit motives and an adjacent desire to increase or prolong usership; the values, ideologies, and world they construct in the process are something of an afterthought. It’s important, however, not to fall into ‘technological determinism’: the view that, as individuals or as a society, we must passively receive the world that tech giants are constructing for us. The internet is ultimately just a technology for communicating information; whether that technology is used for profit and social control, for creativity, for joy, or for war is determined by human agency. By remaining observant and critical of the ways that technology is currently being used, we can also encourage more promising uses of digital technology, aligned with a more just vision of the present and the future. The internet, like everything else, can be cripped.

License


Digital Methods for Disability Studies Copyright © 2022 by Esther Ignagni is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
