Using Your Programming Mindset to Reverse Engineer Algorithms
Algorithms are increasingly a mundane part of everyday life (cf. Neyland, 2019) and are embedded in all sorts of social interactions. For instance, selective algorithms govern what news we see first on news websites, which of our friends’ social media posts we look at, which YouTube videos we watch next, what TV programmes and music we might like, what products we might want to buy, and so on. However, the social role of these algorithms is often poorly understood, inasmuch as it’s not always clear how such algorithms inform our digital experiences and shape what we do online. Worse, when analysed, these algorithms turn out not to be objective, neutral or benign tools, but can in fact reproduce and amplify existing social injustice. For instance, before it was corrected, an earlier version of Google’s Translate algorithm was known to translate text from Turkish (a gender-neutral language that refers to people with the single pronoun ‘o’) to English (a language featuring separate words for ‘he’, ‘she’ and so on) in such a way that ungendered sentences became laden with gender biases (e.g. ‘he is a doctor’, but ‘she is a nurse’). In this way, we might argue that rather than simply translating one language into another, this earlier version of Google’s algorithm served to reinforce dominant ideological biases in discriminatory ways.
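To make the translation example concrete, here is a deliberately simplified sketch of how a corpus-driven translator might acquire this kind of bias. This is emphatically *not* Google’s actual system: the function name, the occupations, and the co-occurrence counts below are all invented for illustration. The point is only that a rule like ‘pick the pronoun that appears most often with this word in the training data’ can turn a statistical regularity in past text into a biased output.

```python
# Toy sketch of how a corpus-driven translator can acquire gender bias.
# NOT Google's actual algorithm: these co-occurrence counts are invented
# to illustrate how training data can shape a 'neutral-looking' rule.
corpus_counts = {
    "doctor": {"he": 900, "she": 100},
    "nurse": {"he": 80, "she": 920},
}

def translate_pronoun(occupation):
    """Render the gender-neutral Turkish pronoun 'o' by choosing the
    English pronoun that co-occurs most often with the occupation."""
    counts = corpus_counts[occupation]
    return max(counts, key=counts.get)  # most frequent pronoun wins

print(f"'o bir doktor'  -> '{translate_pronoun('doctor')} is a doctor'")
print(f"'o bir hemşire' -> '{translate_pronoun('nurse')} is a nurse'")
```

Running this prints ‘he is a doctor’ but ‘she is a nurse’: nothing in the code mentions gender explicitly, yet the bias in the (here invented) corpus flows straight through to the output.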
However, with your knowledge of Python (a tool through which such algorithms are built and executed), you now have a much better angle on how those systems might work in practice and what might be going on ‘under the hood’ of the algorithms that are integral to how we live our lives. Several reference points are helpful for thinking about algorithms as social objects worth exploring; the following readings address what algorithms are, how they work, and what we can say and do about (and with) them:
Beer, D. (2016) The social power of algorithms, Information, Communication & Society 20(1): 1–13.
Kitchin, R. (2016) Thinking critically about and researching algorithms, Information, Communication & Society 20(1): 14–29.
Neyland, D. (2019) The Everyday Life of an Algorithm. Palgrave Pivot.
Ziewitz, M. (2017) A not quite random walk: Experimenting with the ethnomethods of the algorithm, Big Data & Society 4(2): 1–13.
So, given what we know about how programming and algorithms work:
Can you identify an algorithm that performs some role in social interaction – some examples are listed above – and ‘reverse engineer’ that algorithm to better understand how it works and whether social injustices might be built into it at the level of code?
This exercise is a little different from the others inasmuch as there is no code to write, but it nonetheless provides a good opportunity to flex your skills at thinking in code. What we can do is apply our knowledge of how code is built to think through what might be happening in order for an algorithm to produce specific outputs that we can see, on the basis of specific inputs that we can see (or at least infer). Using your programming mindset in this way can help unpack those ‘black boxes’ of society such as recommender algorithms, targeted adverts, translation software and so on, and point more clearly towards (and shout more loudly about!) areas that might be problematic.
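The input–output reasoning described above can itself be sketched in code. Below is a hypothetical example, invented for illustration: a toy ‘black box’ recommender whose internal rule we pretend not to know, plus a probing loop that feeds it systematically varied inputs and records what comes out. Real platforms are vastly more complex, but the method – vary the inputs, observe the outputs, infer the rule – is the same one you would apply when reverse engineering by hand.

```python
# A toy 'black box' recommender plus a probing loop. The recommender
# and its hidden rule are invented for illustration only.

def black_box_recommend(watch_history):
    """Hidden rule (pretend we can't read this source): recommend the
    category the user watched most recently; fall back to 'trending'
    for a brand-new user with no history."""
    return watch_history[-1] if watch_history else "trending"

# Probe with systematically varied inputs and record each output.
probes = [
    [],                  # new user, empty history
    ["news"],
    ["news", "music"],
    ["music", "news"],   # same items, different order
]
for history in probes:
    print(f"input: {history!r:22} -> output: {black_box_recommend(history)!r}")
```

Comparing the probes – in particular the two histories containing the same items in different orders – the output always matches the *last* item watched, which lets us infer a recency rule without ever reading the code. The same probing logic underpins real ‘algorithm audit’ studies of recommender systems and targeted advertising.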