Why Queering Surveillance Capitalism with Art Matters: A Microfilm by Lynn Hershman Leeson
In the microfilm Shadow Stalker (2019), artist Lynn Hershman Leeson highlights the consequences of integrating AI algorithms into the social sphere. The video is part of a larger interactive installation that aims to expose the biases embedded in AI and to raise awareness of data protection and privacy. At the entrance of the exhibition space, visitors are invited to give their e-mail address. An algorithm conceived especially for Shadow Stalker collects data linked to that e-mail address and projects the retrieved personal data, such as pictures and bank account numbers.
Data collection is ubiquitous. From browsing the web to using apps, from facial recognition systems to the Internet of Things (IoT), and from financial transactions to healthcare records, telecommunications and drones, escaping the surveillance of data, or ‘dataveillance’, is hardly an option anymore. Moreover, dataveillance disproportionately affects marginalized groups that are underrepresented in the data the algorithms are trained on. The algorithms hence perpetuate and reinforce discriminatory norms about identities and bodies. Facial recognition systems, for instance, are less accurate on darker skin tones, especially those of women; transgender persons are frequently misidentified, and persons with disabilities are often not recognized by the systems at all. These biases carry significant political and social implications, since such systems are deployed in many areas, including law enforcement.
Artists such as Hershman Leeson play a crucial role in exposing, disrupting and countering these systems. Ephemerals wants to probe and speculate on the implications of contemporary AI systems through art and queer theory, both open-ended terms that enable a deeper hack into systems of surveillance and control.