Algorithm Social Divide
Beverley Skeggs
To cite this article: Beverley Skeggs (2020): Algorithms for “hers”: in whose interests?, Feminist
Media Studies, DOI: 10.1080/14680777.2020.1783798
Computational algorithms don’t just enable sped-up trade, they enable repeated trade; they are not based on a single form of exchange, such as the wage given for labour in a contract of employment and exploitation. Rather, they enable different combinations of our data points, ones that we have freely given through our browsers and that enable us to become a potential source of value. Unlike labour, there is no compulsion to give away information about ourselves; in fact, we often think we are just talking to friends. This is radically different from entering a labour contract, under which, in most places in the world, if we don’t work, we don’t eat. And we don’t just give away our information once: we cumulatively build up our information over time, through our friend networks, consumer patterns, TV viewing, step counting, indeed anything we do with a smartphone, laptop, or computer.4
Algorithms are not just put to use by advertisers. The next most frequent use and trade of one’s data points is by surveillance agencies of some kind. These range from national-level agencies such as the CIA and GCHQ, to state surveillance; to Palantir5; to snooping by global security companies, such as the recent breach of human rights activists’ WhatsApp accounts by the Israeli security company NSO; to local government checks on social media; to the use of algorithms in predictive policing6; to employment agencies. The separation between commodification and surveillance is porous. Many of the surveillance sites are private companies that sell your data among themselves, or, in the case of the Indian government, sell all their citizens’ data to other nations and to private security companies,7 in effect to anyone.
Trackers, traders, and surveillance agencies insert themselves through apps, browsers, and any computer use, and we let them in: did you click the “agree to use” button? Over 100 million Alexa devices had been sold by January 2019, devices powered by machine-learning algorithms that listen to and learn our every move. They are “pre-emptive.” They insert themselves into time that was once protected from exploitation, time that is not labour time. They make the boundaries between labour, exploitation, and life porous and ambiguous, just as we do every time we use social media. They build on our history and behaviour, trying to predict our every move. It is the prediction that entraps us in our structural classifications of gender, race, and class, making different intersectional “hers.”
Remember Belle and Lara? Belle was targeted for her lack of value. She was sold for financialization. Had Belle taken out the debt she was offered through advertising, she would have been further locked into her impoverished position. We know of the massive increases in female debt, that working-class women in the UK bear the burden of debt,8 and that Black women in the US bear an even larger debt burden (Louise Seamster 2019). Belle was not just monetized, with her data points sold to debt providers and agencies, but also commodified for financialization, i.e., turned into a product to be sold again and again, to accrue profit for debt financing agencies through astronomical interest charges. The algorithms are most definitely not in Belle’s interests: the effects of their stealthy operations are organized against her, and they would make her life much harder. The algorithms also made lack a key data signal to be aggregated into future tracking and trading of Belle, repeatedly concretizing her “lack.” These value-identifying algorithms (of which there are millions) are computational compound classifications of class, race, and gender (amongst others), constantly stratifying by stealth, operating as machine-learning abstractions of inequality and injustice.
To conclude, algorithms produce very different “hers.” At the extremes, many women are trapped in debt bondage whilst others are free to play, experiment, and even capitalize upon their own networks. By making some “hers” visible to debt, they reproduce centuries-old systems of valuation and classification. “Hers” are being algorithmically produced through different interests, mainly by those who pay to develop, use, and capitalize upon people for profit, for surveillance, or often both. Basic non-computational algorithms were used for accounting purposes within slave financing. Computational power extends and amplifies their function and potential. What is consistent is that although the computational algorithm may be (relatively) new, it builds on histories: histories of colonialism, capitalism, misogyny (Beverley Skeggs 2019). But in the present, algorithms are much more difficult to challenge: we can’t see them, they operate faster than the speed of light, through stealth, and we welcome them in; we even take pleasure in their effects. During our tracking project, we tried many tactics to confuse, obfuscate, torment, and game the algorithms that we saw. But we came to the conclusion that the only solution is “just say no!”, which, in a world where convenience has become an imperative, is difficult. But convenience for what, for whom, and why? To be tracked and traded more effectively? To free up oneself for more inequality and injustice?
Notes
1. With Dr Simon Yuill, https://www.thesociologicalreview.com/value-and-values-interaction-infrastructures-and-accumulation/; for a full description see the public lecture “You are being tracked, evaluated and sold: Digital Inequalities”, http://www.lse.ac.uk/International-Inequalities/Videos-Podcasts/You-Are-Being-Tracked
2. See the hilarious novel Red Plenty (Francis Spufford 2010) and the review by Nick Dyer-Witheford (2013).
3. See Sara Harrison (2019).
4. https://blogs.lse.ac.uk/equityDiversityInclusion/2017/09/wake-up-algorithms-are-trawling-
your-phone-while-you-sleep/
5. See: https://www.palantir.com/about/. Palantir was established by Peter Thiel, Donald Trump’s tech advisor, an early Facebook investor, and a founder of PayPal (to de-regulate finance across national borders); a well-known libertarian, he once argued in the Stanford Review that women should not be given the vote. As a key algo designer, one could argue that his interests do not align with A4 H.
6. On predictive policing see Mark Andrejevic (2013).
7. See Ranjan Balakumaran at https://realmedia.press/function-creep-fintech-india-aadhar-id-system-part-1-trading-faces/
8. http://www.lse.ac.uk/International-Inequalities/Videos-Podcasts/You-Are-Being-Tracked; Women’s Budget Group Report (2019) Household Debt and Gender, https://wbg.org.uk/wp-content/uploads/2019/10/DEBT-2019.pdf
Disclosure statement
No potential conflict of interest was reported by the author.
References
Andrejevic, Mark. 2013. Infoglut: How Too Much Information Is Changing the Way We Think and Know.
London: Routledge.
Dyer-Witheford, Nick. 2013. “Red Plenty Platforms.” Culture Machine 13. https://culturemachine.net/wp-content/uploads/2019/05/511-1153-1-PB.pdf.
Harrison, Sara. 2019. “Five Years of Tech Diversity Reports and Little Progress.” Wired. https://www.wired.com/story/five-years-tech-diversity-reports-little-progress/
Seamster, Louise. 2019. “Black Debt, White Debt.” Contexts 18 (1): 30–35.
Skeggs, Beverley. 2019. “The Forces that Shape Us: The Entangled Vine of Gender, Race and Class.”
The Sociological Review 67 (1): 28–35. doi:10.1177/0038026118821334.
Skeggs, Beverley, and Simon Yuill. 2016. “The Methodology of a Multi-modal Project Examining How
Facebook Infrastructures Social Relations.” Information, Communication & Society 19 (10):
1356–1372. doi:10.1080/1369118X.2015.1091026.
Spufford, Francis. 2010. Red Plenty. London: Faber and Faber.
Women’s Budget Group. 2019. Household Debt and Gender. https://wbg.org.uk/wp-content/uploads/2019/10/DEBT-2019.pdf