Illustration by Somnath Bhatt

Embodied resistance to technological demands

A guest post by Sareeta Amrute. Sareeta is an anthropologist exploring data, race, caste, and capitalism in global South Asia, Europe, and the United States. She is Associate Professor at the University of Washington. Twitter: @SareetaAmrute

This essay is part of our ongoing “AI Lexicon” project, a call for contributions to generate alternate narratives, positionalities, and understandings to the better known and widely circulated ways of talking about AI.


Social scientists amply document algorithmic harms and algorithmic bias, such as discrimination in hiring, medical settings, or the criminal justice system, and for good reason — these systems produce wide-ranging effects and are deployed on a massive scale. Yet less attention has been paid to how different forms of pleasure, affect, and desire produce and drive both normative and renegade repurposings of these systems. Pleasure, or pleasures, which I take to encompass the expressive life, the range of feelings and affective charges routed through technical systems, and the systems of drives that animate social, ecological, and technical worlds, needs to be thought of as an essential, and not always positive, aspect of our technological systems.

As AI systems develop and critiques of these systems mount, we will have to come to terms with the following realities: First, forms of desire for control, power, knowledge, and progress gave rise to techno-solutionism in the first place, and repudiating those forms will require cultivating other modes of pleasure. The belief that the problems produced by algorithms can be solved by ever newer forms of technology is deep-seated, and sits in a heady mixture of (white) male cultural norms, ideas about progress that treat those on the receiving end of technological harms as ‘backward,’ and colonial norms that separate out a particular technology from the larger environmental, economic, social, and cultural contexts in which it unfolds (Ricaurte 2019, Ullman 2017, Broussard 2018, Forsythe 2001, Heyward-Rotimi 2021).

All of these discourses are animated by feelings and impulses: for domination and control, yes, but also for a particular version of improvement (Berlant 2011). To take three salient examples: the desire to maintain online gaming as a male space has little to do with argument, and everything to do with creating a space of play for men (Losh 2016). In a different context, the desire, rooted in development discourse, to distribute welfare to the poor in an efficient and supposedly incorruptible manner contributed to the development of Aadhaar in India, arguably the world’s largest biometric surveillance system (Amrute 2020, Khera 2019, Singh and Jackson 2021). Similarly, the desire to know the population leads to the creation of vast policing databases that make Black bodies killable, an ongoing consequence of that drive toward comprehensive, racialized knowledge (Noble 2018, Weheliye 2014, Puar 2017). It would be very hard to disrupt such patterns of desire, rooted as they are in histories of inequity, racism, and caste oppression, through terms like bias and harm, since such terms fail to capture the cultural categories of affiliation and disgust and the uneven distribution of power and attention that also shape AI systems.

Second, new directions in algorithmic management will come from the junk science of correlating outward signs to states of mind, as the pleasures associated with the spread of technological solutions shift terrain and become subjects of those technological systems themselves. In other words, going forward, the dream of AI systems will be to capture not only outward characteristics but the correlation between outward signs and states of mind, in order to make predictions. To give a concrete example of this trend: an affect recognition system uses visual, auditory, and other biometric sensors to pick up behavioral cues that supposedly reveal a person’s underlying emotion. Such systems, were they put in place, would demand that the subjects wearing these sensors express themselves in a manner the sensors could pick up. Such trends emerge from a love of the technological imaginary of total control, coupled with a love of AI technologies’ ability to mimic human capacities (Atanasoski and Vora 2019, Latour 1996). It is out of this expression that such systems produce the evidence that makes affect recognition desirable for purchase by institutions such as police departments and corporations. At the same time, these systems have been largely discredited as junk science that should be banned (AI Now 2019). Such bans would move AI technologies from the domain of a desire for social control to the domain of articulating another purpose for our emotional life.
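To make that correlational logic concrete, consider a deliberately minimal sketch. It is hypothetical: the feature names, numbers, and emotion labels below are invented for illustration and are not drawn from any deployed system. What the sketch shows is that such a model only ever learns the statistical association between measurable outward signals and labels that human annotators have assigned; the inner state itself never enters the pipeline.

```python
# A minimal, hypothetical sketch of affect recognition's correlational
# logic. All feature names, values, and labels are invented for
# illustration; no real system's method is reproduced here.
from sklearn.linear_model import LogisticRegression

# Invented "behavioral cues" per observation: [brow_furrow, smile, voice_pitch]
observed_signals = [
    [0.9, 0.1, 0.7],
    [0.1, 0.8, 0.3],
    [0.8, 0.2, 0.9],
    [0.2, 0.9, 0.2],
]

# Labels supplied by annotators guessing at inner states -- the contested
# step, where an outward sign is made to stand in for a state of mind.
annotated_emotions = ["angry", "happy", "angry", "happy"]

# The model learns only the sign-to-label correlation.
model = LogisticRegression().fit(observed_signals, annotated_emotions)

# A "prediction" replays the annotators' guesses for a new subject, who
# must express themselves legibly for the sensors to register anything.
print(model.predict([[0.85, 0.15, 0.8]]))  # -> ['angry']
```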

Other kinds of pleasures will rise up alongside this exclusive focus on sentiment as a sign of intention. Understanding these developments lets us ask: what other pleasures, for instance of community and of the body, might disrupt these atomizing emotions? The emotions that have thus far dominated discourses around AI, from exuberance to disgust, are only one part of the emotional life of AI, which also includes widespread sentiments of resignation and endurance, along with the pleasures of reimagining these systems in ways that move us outside the overused cycles of hype, promise, and betrayal (Amrute 2018, Berardi 2009). A recent example of these other forms of pleasure comes from the Debt Collective, which is working to build a debtors’ union. Realizing that algorithmic systems traded on aspiration and shame to trap people, especially Black women, into taking out loans and paying back predatory lenders, the Debt Collective organizes resistance to debt burdens by revaluing the shared experience of shame as the power of solidarity with other debtors to negotiate debt forgiveness.

The terms that scholars like Latoya Peterson and Andre Brock use to describe pleasure in its multiplicity include eros and joy. While these terms have different connotations and sit within different conceptualizations of technology and society, scholars who wield them share an interest in unpacking what is produced in excess of a given technology’s purpose (Foucault 1990). As such, scholars thinking through pleasure as eros or joy move debate away from causal logics surrounding technologies that presume to know how negative effects within AI systems come into being. Instead, eros moves toward an understanding of embodied resistance to technology’s demands (Lorde 1978, Peterson 2021) as well as the dispersed yet powerful attachments that make social media spaces so fun to use, whether that use facilitates Black joy or white supremacy (Brock 2020, Rankine 2020). Above all, thinking about technological eros has to imagine technological pleasure from within, but orthogonal to, a given constellation of sociotechnical norms.

Eros should not be thought of as purely liberatory, or as outside the circle of technological control. Instead, this essay holds on to the radical potential of eros by locating the drive toward liberation within and alongside those more everyday, coercive, and sometimes nasty affects that these systems emerge from and create as they map our worlds (Amrute 2016, Ngai 2005). To take one example: while studies of AI systems for delivery drivers catalogue harms, such as the way driving and food delivery apps pressure drivers to prioritize speed, ignore the limits of their bodies, and flout road safety (Guest 2021), it is also noteworthy that drivers develop forms of mutual support and care, such as keeping their own account books and forming roadside societies (Raval 2020, Qadri 2020). Following the forms of pleasure that unfold in these sites provides clues to the kinds of bottom-up, self-organized movements that might challenge the hegemony of the algorithm, if they are developed and drawn on.

Understanding techno-pleasures — the realm of passion, pleasure, emotion, and excess that animates and flows through AI systems — helps us come to terms with how people use algorithmic systems, why patterns within these systems endure, and where alternatives to these patterns may be seated.


Citations

AI Now Institute 2019. AI Now 2019 Report. https://ainowinstitute.org/AI_Now_2019_Report.pdf

Amrute, Sareeta 2020. “Aadhaar and the Creation of Barriers to Welfare.” Interview with Reetika Khera, ‘After Veillance’ series, Interactions: The Magazine of the Association for Computing Machinery (ACM). http://interactions.acm.org/archive/view/november-december-2020/aadhaar-and-the-creation-of-barriers-to-welfare

_____ 2018. “Disgust.” https://culanth.org/fieldsights/disgust

_____ 2016. Encoding Race, Encoding Class: Indian IT Workers in Berlin. Durham: Duke University Press.

Atanasoski, Neda and Kalindi Vora 2019. Surrogate Humanity: Race, Robots, and the Politics of Technological Futures. Durham: Duke University Press.

Berardi, Franco ‘Bifo’ 2009. The Soul at Work: From Alienation to Autonomy. Los Angeles: Semiotext(e).

Berlant, Lauren 2011. Cruel Optimism. Durham: Duke University Press.

Brock, Andre 2020. Distributed Blackness: African American Cybercultures. New York: NYU Press.

Broussard, Meredith 2018. Artificial Unintelligence: How Computers Misunderstand the World. Cambridge, MA: MIT Press.

Forsythe, Diana 2001. Studying Those Who Study Us: An Anthropologist in the World of Artificial Intelligence. Stanford: Stanford University Press.

Foucault, Michel 1990. The History of Sexuality, Vol. 2: The Use of Pleasure. New York: Vintage.

Guest, Peter 2021. “We’re All Fighting the Giant: Gig Workers Around the World Are Finally Getting Organized.” Rest of World, September 2021. https://restofworld.org/2021/gig-workers-around-the-world-are-finally-organizing/

Khera, Reetika, ed. 2019. Dissent on Aadhaar: Big Data Meets Big Brother. Hyderabad: Orient Blackswan.

Latour, Bruno 1996. Aramis, or the Love of Technology. Cambridge, MA: Harvard University Press.

Lorde, Audre 1978. Uses of the Erotic: The Erotic as Power. London: Sage.

Losh, Elizabeth 2016. “Hiding Inside the Magic Circle: Gamergate and the End of Safe Space.” b2o: an online journal. https://www.boundary2.org/2016/08/elizabeth-losh-hiding-inside-the-magic-circle-gamergate-and-the-end-of-safe-space/

Marcuse, Herbert 1955. Eros and Civilization. Boston: Beacon Press.

Ngai, Sianne 2005. Ugly Feelings. Cambridge, MA: Harvard University Press.

Noble, Safiya 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.

Peterson, Latoya 2021. “Grounding Ourselves from Tech: How to Reclaim Pleasure in a Pandemonium.” https://calendar.usc.edu/event/grounding_ourselves_from_tech_how_to_reclaim_pleasure_in_a_pandemonium

Puar, Jasbir 2017. The Right to Maim: Debility, Capacity, Disability. Durham: Duke University Press.

Qadri, Rida 2020. “Algorithmized but Not Atomized? How Digital Platforms Engender New Forms of Worker Solidarity in Jakarta.” Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 144.

Rankine, Claudia 2020. Just Us: An American Conversation. Minneapolis: Graywolf Press.

Raval, Noopur 2020. “Hisaab Kitaab in Big Data: Finding Relief from Calculative Logics.” In Mertia, Sandeep, ed., Lives of Data: Essays on Computational Cultures from India. Amsterdam: Institute of Network Cultures.

Ricaurte, Paula 2019. “Data Epistemologies, the Coloniality of Power, and Resistance.” Television & New Media 20(4): 350–365.

Singh, Ranjit and Steven J. Jackson 2021. “Seeing Like an Infrastructure: Low-Resolution Citizens and the Aadhaar Identification Project.” Proceedings of the ACM on Human-Computer Interaction 5: 315.

Ullman, Ellen 2017. Life in Code: A Personal History of Technology. New York: Farrar, Straus and Giroux.

Weheliye, Alexander 2014. Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human. Durham: Duke University Press.