Why Queering Surveillance Capitalism with Art Matters: A Microfilm by Lynn Hershman Leeson


In the 21st century, data has become the most valuable commodity on Earth. Today’s mining lords are the Big Tech companies. Armed with their artificial intelligence (AI) algorithms, they dig into gigantic pools of data to sort out profiles that can be targeted by companies and governments. Shoshana Zuboff coined the term ‘surveillance capitalism’ for this new economic system that commodifies human experience for profit, highlighting its capacity to surveil, control, influence and predict human behavior. Surveillance capitalism puts privacy, democracy and civil liberties under threat. In a time of political polarization, artistic practices that subvert surveillance, referred to as surveillance art or ‘artveillance’, aim to raise awareness, critique, resist and counter surveillance, and thereby transform society.

In the microfilm Shadow Stalker (2019 - see below), artist Lynn Hershman Leeson highlights the consequences of integrating AI algorithms into the social sphere. The video is part of a larger interactive installation that aims to expose the biases in AI and to raise awareness of data protection and privacy. At the entrance of the exhibition space, visitors are invited to enter their e-mail address. An algorithm conceived especially for Shadow Stalker collects data linked to the e-mail address and projects the retrieved personal information, such as pictures, bank account numbers or telephone numbers, onto the exhibition walls. With this ‘digital shadow’, Hershman Leeson doxes the visitors to draw their attention to the vulnerability of their data and the ease with which their privacy can be compromised. The installation also includes a website where visitors can enter a zip code and see the percentage of crime predicted for that area by AI calculations. Shadow Stalker demonstrates how AI algorithms are used to surveil, control and predict human behavior, processes that often go unchecked.

Data collection is ubiquitous. From browsing the web to using apps, from facial recognition systems to the Internet of Things (IoT), and from financial transactions to healthcare records, telecommunications and drones, escaping the surveillance of data, or ‘dataveillance’, is hardly an option anymore. Moreover, dataveillance disproportionately affects marginalized groups of people who are underrepresented in the data the algorithms are trained on. The algorithms hence perpetuate and reinforce discriminatory norms about identities and bodies. Facial recognition systems, for instance, misclassify darker skin tones at higher rates, especially those of women; transgender persons are frequently misidentified; and persons with disabilities are often not recognized by the systems at all. These biases carry significant political and social implications, since such systems are used in many areas, including law enforcement, banking, employment, health and education.

Artists such as Hershman Leeson play a crucial role in exposing, disrupting and countering these systems. Ephemerals wants to probe and speculate on the implications of contemporary AI systems through art and queer theory, two open-ended terms that enable a greater hack into systems of surveillance and control.

Shadow Stalker (2019), a microfilm by Lynn Hershman Leeson that is part of a larger installation of the same name.

