


WHY QUEERING SURVEILLANCE CAPITALISM WITH ART MATTERS. A VIDEO ESSAY BY LYNN HERSHMAN LEESON.



In the 21st century, data has become the most valuable commodity on Earth, and Big Tech companies are the new mining lords. Armed with artificial intelligence (AI) algorithms, they dig into gigantic pools of data to sort out profiles that can be targeted by companies and governments.

Shoshana Zuboff coined the term ‘surveillance capitalism’ for this new economic system that commodifies human experience for profit. She highlights its capacity to surveil, control, influence and predict human behavior.

Privacy, democracy and civil liberties are under serious threat, especially in a time of political polarization. That’s why Ephemerals wants to showcase artistic practices that subvert surveillance capitalism.

Art that questions surveillance, referred to as surveillance art or ‘artveillance’, aims to raise awareness of, critique, resist or counter surveillance and thereby transform society. As Ingram (2012) suggests, artistic practice on surveillance should be understood “not just [as] a form of resistance, refusal or critique but as an index of and contributor to political and spatial transformation”.

Data collection is ubiquitous. From browsing the web to using apps, from facial recognition systems to the Internet of Things (IoT), and from financial transactions to healthcare records, telecommunications and drones, escaping the surveillance of data, or ‘dataveillance’, is hardly an option anymore.

Moreover, dataveillance disproportionately affects marginalized groups of people who are underrepresented in the data the algorithms are trained on. The algorithms hence perpetuate and reinforce discriminatory norms about identities and bodies. Facial recognition systems, for instance, perform worse on dark skin tones, especially those of women, transgender persons are misidentified, and persons with disabilities are not recognized by the systems at all. These biases carry significant political and social implications, since the systems are used in many areas such as law enforcement, banking, employment, health and education.

In the video Shadow Stalker (see below) by the American artist Lynn Hershman Leeson, actress Tessa Thompson narrates the issues raised by predictive policing. She is followed by the Spirit of the Deep Web, who zooms out to the larger consequences of integrating AI algorithms into the social sphere.

The video is part of a larger interactive installation that aims to highlight the biases in AI, but also to raise awareness about data protection and privacy. At the entrance of the exhibition space, visitors are invited to give their e-mail address. An algorithm then collects data linked to that address and projects the retrieved personal data, such as pictures, bank account numbers or telephone numbers, onto the exhibition walls. With this ‘digital shadow’, Hershman Leeson doxes the visitors to draw their attention to the vulnerability of their data and how easily their privacy can be compromised.

Also part of the installation is a website where visitors can enter a zip code and see the percentage of predicted crimes in the area based on AI calculations. ‘Shadow Stalker’ demonstrates how AI algorithms are used to surveil, control and predict human behavior, processes that often go unchecked. Artists such as Hershman Leeson play a crucial role in exposing, disrupting and queering these systems.

The term ‘to queer’ has shifted in meaning over time. First used as an insult, it was reappropriated by non-conforming sexual and gender identities during the 1990s to question and transform the perception of sexual and gender identities. Nowadays, ‘to queer’ and the gerund ‘queering’ are often used more broadly to address the oppression of ‘non-conforming’ identities based on race, class, disability and more. It is with this intersectional meaning that Ephemerals uses the term.

Stark and Crawford (2019) explain that “as computational tools become standard, artists working to illuminate privacy and surveillance have an important role to play in the wider data ethics discussion”. In fact, many highly influential philosophers, such as Gilles Deleuze, who wrote extensively on surveillance, and Herbert Marcuse, who investigated the impact of technology on society, consider art one of the most powerful tools for bringing about social change.

Ephemerals wants to probe and speculate on the implications of contemporary AI systems through art and queer theory, both open-ended terms that enable a greater hack into systems of surveillance and control.

Shadow Stalker, a microfilm by Lynn Hershman Leeson that is part of a larger installation of the same name.





DATA AS THE CONTEMPORARY PANOPTICON:
A LOOK THROUGH SHU LEA CHEANG’S ‘3x3x6’



Shu Lea Cheang’s Foucault X (2019) film still


Shu Lea Cheang presented her site-specific installation ‘3x3x6’ (2019) at the Taiwanese pavilion of the 2019 Venice Biennale. The work was curated by queer theorist Paul B. Preciado, who also contributed to the writing and research behind it. The title 3x3x6 refers to the dimensions of a contemporary prison cell (3 x 3 metres) monitored by six cameras.

The work, which spread over various rooms of the 16th-century prison Palazzo delle Prigioni, touches upon the criminalization of non-conforming sexualities and contemporary uses of surveillance techniques.

The installation comprised ten videos, each centered on a character convicted for sexual misconduct, stretching from Casanova, who was imprisoned at this very site, to Sade and Foucault. They are presented in subverted, ‘counter-historical’ videos that deconstruct visual and legal hegemonies and demonstrate how norms of sexuality and gender are created.

Michel Foucault was imprisoned in Poland because of his homosexuality. A prolific writer, he not only laid the basis of surveillance studies with Discipline and Punish (1975) (Galic et al., 2017), but also paved the way for queer theory with The History of Sexuality (1976) (Spargo, 1999).

In Discipline and Punish, Foucault claims that power and control can be gained solely by observing someone. He illustrates this through the ‘panopticon’, an architectural prison concept designed by Jeremy Bentham in the 18th century that would allow a guard to watch over all inmates from one specific point. Not knowing when they are being watched, the prisoners would adapt their behavior at all times.

For Foucault, modern-day Western civilizations are disciplinary societies, with the concept of the panopticon at work throughout institutions such as prisons, schools, factories, hospitals and the military. Watching over individuals with the use of scientific knowledge turns them into ‘docile bodies’ that are easier to control, discipline, normalize and punish (Galic et al., 2017).

Building on the Foucauldian notion of panopticism, Gilles Deleuze anticipated the expansion of surveillance through data in Postscript on the Societies of Control (1990). He noticed a shift from Foucault’s ‘disciplinary societies’ to what he named ‘societies of control’ that were made possible through data collection (Beaulieu, 2006). 

Surveillance moved out of concealed structures and became ubiquitous and invisible. Control is no longer exercised only through the body, but also through the transformation of data into statistics (a word deriving from ‘state’), which enables governing powers to control their citizens (Galic et al., 2017). Today’s facial recognition cameras link body features with personal digital data.

In 3x3x6, Cheang installed facial recognition cameras at the entrance of her installation to capture the visitors’ faces. The images are projected in a later room, altered in an aesthetic way. Cheang underlines the ethical stakes of capturing someone’s face and identity and the opaque, subjective uses that can be made of them. She also hacks the technology by using the faces for aesthetic purposes instead of domination.

In the final room of the installation, visitors are invited to upload, through an app, a video of themselves dancing in support of a girl in Iran who was convicted for posting a video of herself dancing online. This interactivity integrates the visitors into the work, deepening their concern for and empathy with the topic.

Probing surveillance capitalism through gender and sexuality, as Cheang does, is “fundamental at mounting a critique of surveillance”, according to Kirstie Ball, scholar and co-founder of the journal ‘Surveillance and Society’ (Ball et al., 2009). Surveillance has historically been directed mainly at marginalized people such as political dissidents, slaves or sexually non-conforming people.

In The History of Sexuality, Foucault shows how scientific knowledge about sexuality, just as about criminality, enabled the modern control and domination of people through their sexuality.

He notes that in the 20th century people felt obliged to confess their sexual lives to their physicians or therapists. They thus internalized norms on sexuality and aimed to conform to them. Sexuality became a fundamental social construct of morality and identity, one dating back to the Victorian era, when heterosexuality was considered the norm (Foucault, 1976).

To counter heteronormativity, queer theorist José Esteban Muñoz proposed his theory of ‘disidentification’. It suggests that queer artists are able to show the gap between how the world is represented by majority culture and their own queer realities. Closing this gap enables a more inclusive, ‘queer utopian’ vision of society (Muñoz, 1999). For Muñoz, queerness is an ideality, a utopia within the current system of privileged white heteronormativity (Muñoz, 2009).

Shu Lea Cheang’s work can also be read through the writings of Paul B. Preciado. In Testo Junkie and The Countersexual Manifesto, he presents a counterhegemonic vision of sexuality that challenges normative constructions of gender and desire. The Countersexual Manifesto proposes a radical reimagining of sexual politics based on the rejection of heteronormativity and the embrace of non-normative forms of embodiment and desire. Preciado calls for a politics of pleasure that celebrates the diversity of sexualities and resists the regulatory forces of biopower.

With 3x3x6, Cheang challenges heteronormative norms and laws and reverses the gaze, from 'being viewed' by technology to 'exposing' through it, ultimately celebrating each person’s uniqueness. By disrupting surveillance technologies, Cheang highlights the reality that “we live in a data panopticon today” while instilling a queer utopian perspective into it.

Through these various strategies, Cheang disrupts technology to raise awareness of contemporary surveillance techniques, deconstructs heteronormative historical formations and invites us to collectively resist and queer contemporary surveillance systems in a symbolic and poetic way.

You can learn more about 3x3x6 in the video below.
Shu Lea Cheang discusses in 3x3x6 how data has become the contemporary panopticon.




THE CYBER (IN)VISIBILITY PARADOX
AS SEEN IN HITO STEYERL’S
FUCKING DIDACTIC EDUCATIONAL .MOV FILE


In How Not To Be Seen: A Fucking Didactic Educational .MOV File (2013), Hito Steyerl delves into the complexities of visibility and invisibility within contemporary surveilled societies. Through a satirical lens, she parodies self-help videos while offering uncanny strategies to remain unseen.

Steyerl highlights the inherent dangers associated with both being seen and not being seen. On one hand, the disappearance from public discourse poses a significant threat to marginalized groups and non-conforming identities, as it can lead to erasure and further marginalization. Conversely, constant surveillance and visibility subject individuals to vulnerability, echoing Foucault's notion of surveillance as a mechanism of power.

This paradox is exacerbated in a datafied world, where surveillance from higher authorities, sousveillance from peers, and counterveillance from the dominated towards the dominating powers coalesce into an "omniveillant" digital society. The ownership and control of personal data by Big Tech companies often clash with individual agency and privacy rights, further complicating the issue.

Steyerl's video prompts reflection on these complexities, challenging viewers to consider the implications of their own visibility and invisibility in an increasingly data-driven and surveilled world. Her strategies for avoiding visibility underscore the difficulty of maintaining privacy and autonomy amidst pervasive surveillance, particularly for those whose identities or perspectives diverge from the mainstream.



Sex workers are a good example of this visibility-invisibility paradox. While they can promote their services to a larger public, they also expose themselves to threats. To help sex workers with this issue, the collective Cypher Sex published How to Cypher Sex: A Manual for Collective Digital Self-Defense Guides, a manual specifically aimed at supporting sex workers in Belgium in hiding their identities.

In conversation, Donatella Portoghese from Constant, which supported the publication, explained that the collective carried out thorough research to understand the needs of sex workers in Belgium. They received requests for similar publications from other regions as well, but the self-defense guide is specifically designed for the Belgian context and cannot simply be duplicated, since it strongly depends on the local context (local privacy laws, communication platforms, service providers, etc.). It nonetheless provides a blueprint for camouflaging oneself from digital surveillance.

How to Cypher Sex: A Manual for Collective Digital Self-Defense Guides can be freely downloaded here.

How to Cypher Sex: A Manual for Collective Digital Self-Defense Guides by the collective Cypher Sex
Hito Steyerl on the dangers of being both visible and invisible in today’s surveillance capitalist world in How Not To Be Seen: A Fucking Didactic Educational .MOV File.



HOW TO QUEER YOUR BODY
LIKE MARY MAGGIC, JULIANA HUXTABLE
AND MARTINE SYMS



Artists use different strategies to explore the effects of technology and surveillance on the formation of identity. Mary Maggic, a non-binary Chinese-American artist and researcher, offers viewers a "6 Point Plan for Hormone Queering Resistance" (see below) in their manifesto for the MIT research project Open Source Estrogen, which ignites a discourse challenging conventional notions of gender and bodily autonomy.

By posing the speculative question "what if it was possible to make estrogen in the kitchen?", Maggic confronts the biopolitical structures and forms of biopower that seek to regulate and control bodies, akin to the way dataveillance extracts value from populations.

Echoing Donna Haraway's seminal work A Cyborg Manifesto (1985), Maggic underscores the fluidity between organic and synthetic entities within the human body. They critique the patriarchal prejudices embedded in industrial and pharmaceutical influences on bodily composition, framing the human body as a site of continual construction that embodies elements of cyborgism.

Furthermore, Maggic draws on Judith Butler's theory of 'gender performance' from 'Gender Trouble' (1990), which challenges essentialist notions of gender by emphasizing the performative nature of identity. By facilitating 'Molecular Queering Agency,' Maggic invites participants to engage in biohacking practices, symbolically disrupting traditional gender binaries and advocating for bodily autonomy.

Biohacking, comparable to Haraway's vision of cyborg agency, offers a glimpse into a posthumanist feminist future that transcends traditional feminist paradigms. In today's digital landscape, where physical and virtual identities converge, humans exist as cyborgs in online realities that further complicate notions of identity and agency.

That same digital landscape is where Juliana Huxtable explores themes of identity and technology. In Untitled in the Rage (Nibiru Cataclysm), Huxtable challenges conventional notions of beauty and gender by presenting herself in exaggerated and fantastical forms, blurring the boundaries between human and machine, organic and artificial. With this image, she interrogates the ways in which technology mediates our understanding of the self and disrupts traditional categorizations of gender and sexuality.

Juliana Huxtable, Untitled in the Rage (Nibiru Cataclysm)

Similarly, in Notes on Gesture, Martine Syms examines the relationship between language, gesture and identity. In this video installation, Syms combines found footage, scripted scenes and voiceover narration to explore the performative aspects of communication and the construction of identity through bodily expression.

In this work, Syms queers the body by deconstructing conventional modes of representation and challenging viewers to rethink their assumptions about race, gender and embodiment. Through her exploration of gesture and movement, Syms disrupts normative frameworks and invites viewers to consider the fluidity and multiplicity of identity and how these are mediated and interpreted through digital media.

Film still from Martine Syms’ Notes on Gesture





Mary Maggic’s Open Source Estrogen: a manifesto on hormone queering resistance.



Mary Maggic’s participatory performance Molecular Queering Agency.



ALGORITHMIC VIOLENCE BY AUTHORITARIAN REGIMES


AI surveillance technologies contribute to echo chambers, political polarization and the rise of extremist and authoritarian regimes throughout the world. These regimes can use AI-powered technologies such as facial recognition, databases or drones to target and attack groups of people more easily and more inhumanely.

Gaza has been the setting for recent devastating uses of such AI technologies, among them ‘Lavender’, a database used by the Israeli army to mark some 37,000 people as Hamas targets to be killed. Lavender has been reported to have an error rate of 10%, an error rate that appears to have been given little regard.

An extensive facial recognition programme based in part on Google Photos has also been deployed in the area, as has Google’s Project Nimbus, which has provoked public outcry over the unlawful collection of Palestinians’ personal data.

The activist website stopkiller.ai offers insights into these opaque proceedings. Artists have also been vocal about the Palestinian cause. In PalCoreCore (2023), the Palestinian multidisciplinary artist and researcher Dana Dawud steps out of the imagery of Palestinians as martyrs to create “an ode to the resilience of the human spirit against the forces of obliteration”. Through it, Dawud wants to underscore a different way of resisting without diminishing the reality of the ongoing genocide.


Watch PalCoreCore by Dana Dawud
Screenshot of PalCoreCore (2023) by Dana Dawud