The 1990s offer a rich history of technological innovation, from the World Wide Web to continuing advances in biotechnology, surveillance, and artificial intelligence. Yet these algorithmic technologies render users hypervisible while themselves remaining opaque. They also transform the individual into a data body—a being made of code, data, and information. The body thus becomes ripe for extraction, manipulation, and the predictive logic of algorithmic technologies, in which we are no longer understood as beings but simply as data. While many scholars have identified this phenomenon through the sciences and theory, it also stems from the field of art, with artists and artist-activist collectives such as Critical Art Ensemble. Analyzing this often-forgotten history reveals how artists and activists responded to this technological change by conceptualizing forms of resistance through artworks, performances, and activist projects. By looking back to the 1990s and to the legacies of these artist-activists as predecessors of recent examples of algorithmic resistance by a new generation of artists, we can consider how power persists and has evolved, how we continue to be rendered data bodies today, and what new forms of resistance may look like now.
Introduction
Recently, the concept of the dematerialization of the body into a digital self has become a central topic in a variety of fields, including information technology studies, surveillance studies, and critical theory.1 While many scholars have named this seemingly twenty-first-century phenomenon with a variety of neologisms, such as the “duodividual,” “data self,” “data proxy,” “the pixelated person,” “algorithmic selves,” “database self,” “algorithmic identities,” and “data selves,” the concept in fact stems from the fields of art and activism of the 1990s.2 The term “data body” was first articulated by the artist-activist collective Critical Art Ensemble (CAE) to describe how our bodies are translated into code and information in different ways by digital technologies. Biotechnology, surveillance, and artificial intelligence, for example, convert genes into code and faces into data points, while our online behaviors are used to make predictions that deliver personalized products, ads, and services. This occurs through a specific logic of capitalism that harnesses technology to reduce individuals to commodities in the pursuit of profit. CAE named this version of capitalism “pancapitalism”—the relationship between production, consumption, and order—and was one of the first to connect its effects specifically to the body.3 This history, however, is seldom discussed within recent analyses of our digital selves. Analyzing the term through the context of art history not only helps us to better understand its forgotten origins but also helps us envision possible modes of resistance.
While the data body has been defined throughout the years in tandem with the progression of algorithmic technologies, it also has nondigital beginnings. Colin Koopman’s recent How We Became Our Data: A Genealogy of the Informational Person (2019) identifies the “infoperson” and outlines how birth certificates, social security numbers, and the categorization of personality traits, intelligence, and race beginning in the nineteenth century predate today’s surveillance and datafication of the individual via algorithmic technologies.4 Important as well is Simone Browne’s seminal Dark Matters: On the Surveillance of Blackness (2015), in which she discusses how Black bodies have been subjected to early forms of surveillance since the transatlantic slave trade through regulatory systems of branding, slave passes, and quantification that anticipate the experiences of BIPOC individuals with biometric technologies today.5
This paper will analyze the lineages of the term “data body” within the fields of art, art history, and activism through a selection of key American artists and artist-activist collectives. The paper focuses on these artists because they reflect on their experiences of these technologies as they emerged within the context of the United States. Additionally, while there is a history of artists who embraced technology’s potential, such as the Australian performance artist Stelarc, who early on advocated for the “obsolescence of the body,” my focus on artists’ critical engagement with technology reveals a growing tension in the public’s reception of emerging technologies.6 While I take CAE’s term to frame and analyze works from the 1990s by the artist collective subRosa, I also look to the recent works and writings of other artists, like Zach Blas and Stephanie Dinkins, to analyze our contemporary moment. The works from the 1990s offer unique insight into the burgeoning entanglements between the body and technology as well as the various affordances and implications that these entanglements engender. Contemporary artists also imagine forms of resistance to algorithmic technologies through performative works, while others look for possibilities within these technologies to transform them to be in service to people of color. In addition, while some scholars in surveillance studies have looked to recent artistic forms of resistance as a possible model for the future, the artworks remain devoid of historical contextualization.7 Instead, this paper begins with an analysis of the history of artworks, activism, and writings by artists which, though overlooked, I argue have played and continue to play an important role at the forefront of critique.
1990s: The Data Body Emerges
CAE is a multidisciplinary artist-activist collective that emerged during the late 1980s. Beginning in Tallahassee, Florida, in 1987, the collective created public actions through multimedia events, performances, and artworks surrounding different social issues. These included Cultural Vaccines in collaboration with Gran Fury in 1989, a critique of the US handling of the HIV crisis; Peep Show alongside Prostitutes of New York (PONY) in 1990; and Fiesta Critica (with the Corn Maya Project in Indiantown, FL), a 1991 work that focused on the agricultural labor relations of migrant workers.8 CAE have also written many theoretical and political texts and presented them at conferences, festivals, and arts events such as Ars Electronica to disseminate their ideas to the wider public.9
Their writings demonstrate a progression of radical ideas that reflect a concern for the different technologies entering public consciousness during the 1990s, like the web and biotechnology. These ideas are connected by the term “tactical media,” a type of activism that uses various media to communicate a specific political message.10 Tactical media became popular during this period; it drew on earlier anarchist texts by Hakim Bey and was echoed in later artistic interventions across different media by artists and artist collectives including RTMark, the Yes Men, and the Institute for Applied Autonomy.11 CAE’s work was foundational not only in demystifying a new form of control derived from pancapitalism and challenging the utopian promises surrounding new technologies, but also in orienting individuals toward updated tactics of resistance.
Reflecting on the rise of the web, for example, CAE developed the concept of the “data body” to describe the transformation of the body in this supposedly emancipatory virtual space. They wrote many theoretical and political texts describing how different power vectors (particularly corporations and governments) had quickly infiltrated cyberspace. In The Electronic Disturbance (1994), for instance, they made three principal observations that I want to expand upon here: 1) authoritarianism had infiltrated cyberspace, resulting in a dislocated model of power; 2) this power subsequently led to the creation of the data body; and 3) this new form of power could only be contested within the digital sphere. Following these threads allows us to identify what type of power algorithmic technologies hold today, how they continue to transform our bodies into data, and what resistance could look like now.
I. Power
First, CAE argued that despite the freedom the internet promised as a medium for communication, access, and information, control by corporate interests and the government overshadowed its liberatory potential.12 One of the characteristics of this new form of authoritarian power and control was that it thrived on absence. In other words, in contrast to more archaic forms of power that relied on localized and physical spaces, power now existed virtually. This dematerialization recalls Gilles Deleuze’s pronouncement of a diffuse form of power working within flows and networks that anticipated the internet.13 Electronic space, CAE similarly claimed, creates a world without borders where control can flow easily, but it also has origins in military technology focused on strategies of decentralization and mobility.14 CAE pointed to the example of the Scythians during the Persian Wars, whose autonomous and nomadic culture resisted any type of subjugation precisely because of its mobility. They named this type of power “nomadic power”—a form of power that is mobile, distributed, and existing in an “ambiguous” space, and one that, according to them, also described capital’s power in cyberspace and the internet’s origins within the US military.15
As other scholars have pointed out following CAE, the paradox of freedom (free-market capitalism and communication) and control (order and surveillance) creates a specific dynamic in which, today, control comes in the form of online surveillance and freedom in the form of communication, information, and entertainment.16 As the internet became increasingly characterized as a medium for sharing content with the emergence of social media in Web 2.0, we came to participate in our own online surveillance, sharing vast amounts of information and supplying free labor to advertising platforms that sell this data to advertising clients.17 In other words, freedom and control exist simultaneously. Power, however, has arguably changed since the time of CAE’s writings. While they correctly predicted how “people are reduced to data, surveillance occurs on a global scale, minds are melded to screenal reality, and an authoritarian power emerges that thrives on absence,” they could not have foreseen the degree of our participation in this dynamic, much less our pacification and, therefore, the naturalization of surveillance.18
II. The Data Body
Second, the dispersal and dislocation of power has had an enormous effect on how we conceptualize our bodies in virtual spaces. Initially, CAE did not call this the “data body” but rather “electronic doppelgängers.” These are the result of the virtualization of the body, “separate from the individual” and “simultaneously present in numerous locations, interacting and recombining with others beyond the control of the individual and often to h/is detriment.”19 As an example, CAE describes a person going into a bank to apply for a loan. While this person has prepared by looking professional, anticipating the social cues, and memorizing a script, the loan officer instead looks to their electronic double, only to find that this person has been late on payments and has had a bad experience with a different bank.20 CAE develops this example within the context of performance, in which the person’s credit data, their electronic reality and electronic self, overturn the “theater of everyday life.”21 This remains true today, when our online data speak for us rather than our physical selves. Our habits, preferences, and activities, gleaned from our “digital breadcrumbs,” are used by companies like Google and Facebook to create consumer profiles.
Coined by CAE in a lecture at Ars Electronica (Linz, Austria) in fall 1995, the term “data body” was first published in their 1997 article “Utopian promises—Net realities” in the proceedings of the Interface 3 Conference (Hamburg, Germany) and reproduced widely, including in the collective’s 1998 book Flesh Machine: Cyborgs, Designer Babies, and New Eugenic Consciousness. It is worth quoting at length:
Payment was taken in the form of a loss of individual sovereignty, not just from those who use the Net, but from all people in technologically saturated societies. With the virtual body came its fascist sibling, the data body—a much more highly developed virtual form, and one that exists in complete service to the corporate and police state. The data body is the total collection of files connected to an individual. The data body has always existed in an immature form since the dawn of civilization. Authority has always kept records on its underlings. Indeed, some of the earliest records the Egyptologists have found are tax records. What brought the data body to maturity is the technological apparatus. With its immense storage capacity and its mechanisms for quickly ordering and retrieving information, no detail of social life is too insignificant to record and to scrutinize. From the moment we are born and our birth certificate goes online, until the day we die and our death certificate goes online, the trajectory of our individual lives is recorded in scrupulous detail. Education files, insurance files, tax files…travel files, criminal files, investment files, files into infinity.22
In other words, we are reduced to files, information, and records, and this is not a recent phenomenon but rather one that can be traced back centuries. The repercussions of this are twofold: 1) complete transparency for permanent surveillance, and 2) detailed information for marketers to target individuals for consumption.23 The degree to which this occurs is exacerbated by digital technologies that facilitate the instrumentalization of information for precisely these surveillance and consumerist ends. Today, more specifically, Big Data is captured differently and processed and analyzed in new ways using algorithms for analytics. As the surveillance scholar David Lyon warns, Big Data and what he calls the “data double” refer to the associations drawn from that data, which not only make us question ourselves but are also “informing us who we are, what we should desire or hope for, including who we should become.”24 Thus, far more than data used for surveillance or marketing ends, there are also implications on an ontological level, where our very existence, who we are, is being molded.
III. Resistance
Finally, CAE explicitly affirmed that resistance to this new authoritarian power had to take place in electronic space rather than relying on old strategies like physical protests or public sit-ins. The problem with these types of protest in the style of civil disobedience is that they cannot challenge nomadic power effectively. While civil disobedience is a form of nonviolent resistance in which protesters seek concessions or negotiations, it is bound to physical sites of power, such as the government buildings protesters block. As nomadic power retreats into electronic space, so too should the tactics of resistance. Electronic Civil Disobedience and Other Unpopular Ideas, CAE’s 1996 text, expounds on some of their earlier ideas about resistance. For example, they identify the need to disrupt the flow of information (capital) to which governments, corporations, and the military have access in order to destabilize those institutions.25 While hacking might come to mind as a form of electronic civil disobedience (ECD), and it was certainly emerging in the public consciousness with numerous cyber-attacks at the time, CAE quickly distinguished the intentional political activity of ECD from the criminality, immaturity, and self-interest of hacker culture.26
While devoting many texts to discussions of ECD and resistance at the digital level, CAE naturally moved to the field of biotech, as they were already interested in the body. As they explain in an interview, “there was a revolution, but it wasn’t radicalized,” and subversion could not keep up with the exponential growth of the internet.27 Instead, they focused more on performance, theater, and public participation in order to confront the biotech revolution. Flesh Machine (1997–98), for instance, is emblematic of many of the performance strategies they developed later. In a series of live performances that took place across Europe, they dressed in lab coats and, using a laboratory on stage, tested participants for their “reproductive potential.” In doing so, they not only demystified the scientific process for the public but also revealed the eugenicist implications masked by the reproductive technology industry. The tactic of educating the public through spectacle and participation became a tangible strategy that they could implement throughout their subsequent works to make various political statements.
While CAE only vaguely discusses the data body in an appendix to Flesh Machine, arguably their first text to shift toward a critique of biotechnology, the data body can still be analyzed through biotechnology and more recent technologies like surveillance and artificial intelligence. The data body is, after all, a being reduced to information, data, and files for surveillance and marketing ends. Similarly, as we shall see, biotechnology also transforms the body, or more specifically genes, into code through bioinformatics. CAE explained that the body is “invaded” by imaging technologies, thereby rendering it transparent, but this is also done for economic ends: marketing “flesh products and services” like reproductive technologies to individuals.28 In other words, the data body persists through the same logic of surveillance and marketing, and this continues through surveillance technologies and artificial intelligence. As we will see, however, artist-activists’ attempts to recuperate and rematerialize the body through performative demonstrations, like CAE’s embodied tactics of performance, challenge the datafication of the body by different technologies today.
Cyberfeminist Détournement
Performances and spectacles, including hoaxes, were common tactics used by tactical media artists during the late 1990s and into the 2000s. They were accompanied by online interventions like creating fake websites, hacking websites, and engaging in other online disturbances. In other words, these artists were working across virtual and real spaces, navigating terrains that take CAE’s shift from ECD to performance, living theater, and spectacle as a point of departure. Contemporaneously, the cyberfeminist collective subRosa, which emerged around Carnegie Mellon University in 1998, responded to the advancements of biotechnology in both online and offline artistic interventions. But subRosa’s activism and practice stand out because of their commitment to revealing the promissory rhetoric of biotechnology, its eugenicist subtext, and the surveillance and exploitation of women’s bodies in assisted reproductive technologies (ARTs). More specifically, their works rematerialize, through embodied artistic practices, a body that was otherwise understood as immaterial when perceived through informatics as a data body.29
Biotechnology is an interdisciplinary field that encompasses many different areas of research. In the 1990s alone, there were many scientific breakthroughs that proved polarizing both for the public and for artists. These included the first genetically modified food, the Flavr Savr tomato, sold in 1994; the cloned sheep Dolly, the first mammal cloned from an adult cell, in 1996; and the draft of the human genome map in 1998. While some artists explored the field of bio art, others, like subRosa and CAE, used performance and spectacle to offer a critique. subRosa was concerned with creating media interventions, publications, and websites that critiqued biotechnologies like ARTs and the exploitation of women’s labor within them. subRosa’s works demystify the marketing tactics of the biotechnical revolution, exposing their threat to bodily integrity through “détournement,” a tactic commonly used by tactical media artists.30 This tactic was especially useful in both online and offline performances to reveal the pancapitalist logic behind the industry. As subRosa identified, while early cyberfeminism quickly adopted Donna Haraway’s idea of the human-machine assemblage, the body turned into data raised concerns over its potential exploitation, instrumentalization, and informatization.
Early cyberfeminists made an impact on online culture by recuperating the fleshiness of a body otherwise presumed obsolete, through images of body parts and discussions of viruses infecting and infiltrating patriarchal cyberspace.31 subRosa, however, exposed early cyberfeminism’s lack of postcolonial thinking. Art historian María Fernández, who explored the lack of postcolonial theory within new media art and scholarship, reiterated this concern within cyberfeminism in her work with artist Faith Wilding.32 They questioned the naïve woman-machine hybrid initially expected to emancipate feminists by creating a postgendered world, revealing instead how technologies reinforce gender binaries, racism, and colonialist ideas. Using public interventions, they educated the public on how bodies (especially female bodies) perform labor in the biotech industry, how they are rendered commodities, how the patenting of life works, and how living tissue and fluids are reduced to data that then circulate within the global economy of the biosciences.
subRosa’s SmartMom (1999), by members Wilding and Hyla Willis, for example, used satire and a fake website to reveal not only the military origins of technologies but also how these technologies were infiltrating and surveilling women’s lives. SmartMom included artwork, writings, and fictional programs and was a détournement of the “Smart Shirt” technology developed by DARPA in the late 1990s, the Georgia Tech Wearable Motherboard™ (GTWM). This smart wearable technology was created to monitor the body’s vital signs during combat, using optical fibers to detect the impact of projectiles.33 SmartMom satirizes GTWM by proposing an adaptation of the technology to surveil and monitor the bodies of pregnant women.34 According to subRosa, while the problem with pregnant bodies was their “resistant” or “insubordinate” behavior toward surveillance, monitoring, and “cyborg adaptations,” the solution was the SmartMom Sensate Pregnancy Dress (1999). This fictional technology used code, fiberoptic lines, and radio transmitters to constantly surveil and monitor pregnant women’s bodies, as well as an imaging system that allowed a remote obstetrician to examine the body inside and out. By intervening telepresently, the doctor could regulate the mother’s biological functions if necessary. SmartMom could also act in a disciplinary way, dispensing shocks, physical punishments, and restraints if the mother acted “irresponsibly.” Another satirical program stemming from SmartMom was the Civilian Pregnancy Observation Program: Watching our Future Grow™ (1999), in which volunteers could observe and monitor pregnant women and report any irregular behavior by the mother. subRosa makes evident how the cyborg that many celebrated had been stripped of her power and reduced to her womb and reproductive potential, which no longer belonged to her but rather to a patriarchal society and government that surveils, colonizes, and exploits women’s bodies.
The artist collectives subRosa and CAE at times collaborated to demystify the implications of new technologies for the body. In works like Cult of the New Eve (1999–2000), CAE, Wilding of subRosa, and artist Paul Vanouse, for instance, created a fictional cult in response to the Human Genome Project (HGP). Cult of the New Eve was based specifically on the female DNA sample used by the HGP. Using a website and various performances, the artists placed the biotechnological industry on par with a cult to reveal its false utopian promises and its similarities with eugenics. Creating public theater that turned the audience into participants, for example by having them ingest bread and beer made “using recombinant yeast supplied to us by the Human Genome Project,” their goal was to overturn the rhetoric of the biotechnology industry and create skepticism among the public.35
SmartMom was ultimately also intended to become an installation, including video and audience participation.36 Showing how an online work becomes or inspires offline performances, this anticipates subRosa member Wilding’s collaboration on Cult of the New Eve. Biotechnologies, as revealed by subRosa, reconfigure and reorganize the body into data and are also tied to surveillance practices of monitoring the body via militarized imaging technologies. Elsewhere, in the context of bioinformatics, Eugene Thacker warns about the “database-body” and how “the very existence of genetic databases constitutes the formation of a significant type of body, doubly encoded by genetic sequences and computer bits, constantly at some distance from ‘the body itself.’”37 In response, subRosa rematerializes the body with performances and reacquaints the audience with the materiality of the tissue, fluids, and fleshy body probed and analyzed in laboratory experiments. More than this, by involving the audience, subRosa makes them active agents in their own learning. As we shall see, these tactics of performance and participation in art have continued since the turn of the century alongside advanced surveillance and artificial intelligence technologies.
Tactics of Opacity
The data body also emerges within advanced surveillance technologies like biometrics and facial recognition. The body is separated into discrete parts and made legible for machines, which then render individuals as unique sets of data while also fitting them within larger datasets. Privacy becomes a concern when these technologies make bodies transparent through scrutiny. Yet a newer generation of artists has adopted a tactic of opacity to overturn surveillance technologies in different ways. Many artists use masks to resist how surveillance technologies isolate, reduce, and fragment our faces into discrete parts, data points that are scrutinized and collected in databases. For example, in URME Surveillance (2013), Leo Selvaggio distributes 3D-printed masks of himself to the public so that they can walk the streets undetected. In How Not to Be Seen: A Fucking Didactic Educational .MOV File (2013), Hito Steyerl disappears into the green screen using paint while discussing a desire for invisibility. Adam Harvey uses camouflage through reflective cloaks that deter capture by heat-seeking drone cameras; in other works, he uses makeup and hair extensions to confuse facial recognition cameras. Zach Blas creates complex amorphous masks made from a collection of different faces, resulting in anonymity. Each of these artists plays within the margins and in-betweenness of visibility and invisibility. But despite their desire for opacity, they use highly visible methods, like the mask, to achieve it. Advanced technologies of capture are themselves opaque; they are hidden behind complex systems that the public generally cannot see or comprehend. Blas, however, uses the same techniques of opacity that companies hide behind to perform resistance to these technologies.
Blas borrows theorist Édouard Glissant’s philosophy of “opacity” and reenvisions it within the context of technics today. Rooted in identification, today’s technologies make it increasingly difficult for people to disappear. At the heart of the problem of facial recognition technologies is their reduction of identities to biological markers that are examined against universal or absolute truths. In Poetics of Relation (1990), Glissant examines the politics of visibility, recounting how individuals become fully intelligible and interpretable as objects for the development of Western knowledge. This consequently reduces them and renders them transparent, much as technology renders humans hypervisible objects of computation.38 CAE had previously discussed a similar idea in regard to how technologies render us transparent data bodies for surveillant control. “The right to opacity,” according to Glissant, is thus an ethical stance, one that becomes the foundation of “Relation” and, subsequently, liberation.39 Blas takes up this notion to conceive of an “informatic opacity” that resists such denigration.40
Blas’s works reflect this tactical use of opacity to deter identification and protect those most at risk. His Facial Weaponization Suite (2012–14), for instance, is a series of projects that seeks to resist facial recognition and biometric technologies by creating amorphous masks that these technologies cannot detect as human faces. Different masks were created to address the effects of facial recognition technologies on different groups: queer and transgender individuals, BIPOC, women, and immigrants. Fag Face Mask (2012), for instance, was created in response to scientific studies claiming that queerness could be detected by facial recognition technologies. One example of such a controversial study came out of Stanford University: in “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images” (2018), researchers Yilun Wang and Michal Kosinski claimed that software systems were better able to detect sexual orientation than humans.41 Such studies have ethical and political implications, especially for queer individuals, as they also “scientifically” reinforce and legitimize a determinism in the differences between how a “gay face” and a “heterosexual face” signify. Fag Face Mask aggregates queer men’s faces through three-dimensional face scanning with Microsoft’s Kinect, producing unrecognizable and amorphous masks that in turn become irreducible and undetectable by biometric scanning. This technique differs from the way nineteenth-century forensics and criminology, in particular the work of English polymath (and eugenicist) Francis Galton, attempted to establish an image of the average or “type” of criminal through composite portraiture.42 Blas’s interventions, as he notes, instead share in the collective style of protests by Anonymous, Pussy Riot, and even the Zapatistas, who take to the streets to perform concealment, opacity, and refusal.43 This performative gesture of refusal, especially alongside other performers, can be a powerful method of preserving our inherent right to opacity.
Blas’s critique of surveillance technologies offers a nuanced consideration of gender and sexuality within them. Alongside Blas, the editors of the 2009 special issue of Surveillance & Society dedicated to gender and sexuality also discuss how fundamental these categories are to critiquing surveillance technologies.44 The contributors to the issue highlight how surveilled subjects’ experiences are nuanced within oppressive systems of surveillance, challenging the notion of a homogeneous experience of surveillance and a singular form of coercion. Contributors also negotiate the statement “nothing to hide, nothing to fear,” which has become the leading discourse among those who seek to systematize surveillance technologies across nations. As the editors correctly point out, “in the Anglo-American north, the politics of what is hidden and what is revealed are imbued with gendered and sexualised politics of heteronormativity and shame, and of vulnerability and fear.”45 For queer and transgender individuals, concealment thus becomes equated with “shame.” Contributor Toby Beauchamp discusses this dynamic further in their book Going Stealth: Transgender Politics and U.S. Surveillance Practices (2019), explaining how gender-nonconforming individuals are perceived as an “inherently deceptive object of state and public scrutiny,” a perception that carries with it emotions of shame.46 However, as Beauchamp goes on to explain, the discourse of surveillance and security does not only render gender-nonconforming individuals suspect; the response to their perceived threat is generally an increase in surveillance through continued state profiling.47 Transgender and gender-nonconforming identities, as well as those of BIPOC individuals, are continuously produced, regulated, and managed by the discourses and technologies of surveillance.48 While Blas’s works certainly offer us a visible form of invisibility or opacity that could work during moments of capture, other types of resistance, like demanding representation in the datasets these technologies use and representation within the technology sector, can offer more powerful and long-lasting effects.
Embodied Possibilities, Technological Futurities
Pancapitalist power has continued to gain momentum through its exponential growth, acceleration, and drive toward consumption by harnessing technology. As we have seen, resistance to these processes is not homogenous. Artists use a variety of tactics, like creating fake websites, staging public hoaxes, and scanning faces to create masks, that disrupt the flow of information and the accessibility of data collection in order to critique the technologies of control. Other artists make visible the discourses and marketing tactics that companies and governments use to legitimize their oppression. Yet another group of artists strives for visibility and recognition within the very systems that target, profile, and render them hypervisible. In the context of BIPOC bodies as well as trans bodies, the question of visibility is much more complex. The experiences of BIPOC and trans individuals, as writer Os Keyes affirms, require a plurality of tactics operating on multiple levels of the sociopolitical sphere rather than concentrated or individual efforts by the few who can afford opacity.49 There are many ways BIPOC and trans bodies can become visible through inclusivity, recognition, and representation in the datasets programmers use, while the biases programmed into those systems are made transparent.
In the case of AI, the concept of the data body becomes more complex. No longer is it simply the datafication of the body, as seen in surveillance or bioinformatics; now, as Shoshana Zuboff has demonstrated, it is also the datafication of our behaviors by what she calls “instrumentarian power.”50 Slowly, the body is reduced to and understood simply through its genes, tissue, fluids, irises, fingerprints, and, most recently, its behaviors. The datafication of the body is used by machine learning in areas like health care and the biomedical sciences, insurance, school admissions, and predictive policing. These datasets are read by machine learning algorithms that organize and analyze them, find trends, and make predictions. Bias, however, arises from oversampling, undersampling, and otherwise skewed samples. Additionally, because these decisions are automated through software, such biases can result in loan rejections, the wrongful flagging of individuals as high-risk reoffenders, and hiring discrimination that disproportionately affects BIPOC individuals.51 As Cathy O’Neil asks in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (2016), have we simply “camouflaged” racism with technology?52
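To make this mechanism concrete, the following minimal sketch in Python (an entirely hypothetical illustration using synthetic data and scikit-learn, not drawn from any of the systems, datasets, or artworks discussed here) shows how a classifier trained on a skewed and historically biased sample simply reproduces that bias: two applicants with identical incomes receive very different approval probabilities because of group membership alone.

```python
# Hypothetical sketch: a skewed, historically biased training sample
# produces a biased "loan approval" model, even with identical incomes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Group A is oversampled (900 records); group B is undersampled (100 records).
n_a, n_b = 900, 100
income = np.concatenate([rng.normal(60, 10, n_a), rng.normal(60, 10, n_b)])  # same income distribution
group = np.concatenate([np.zeros(n_a), np.ones(n_b)])                        # 0 = group A, 1 = group B

# Historical labels encode past discrimination, not creditworthiness:
# group A was approved ~80% of the time, group B only ~30%.
approved = np.concatenate([rng.random(n_a) < 0.8, rng.random(n_b) < 0.3]).astype(int)

X = np.column_stack([income, group])
model = LogisticRegression().fit(X, approved)

# Identical incomes, different groups: the model reproduces the historical bias.
print("P(approve | group A):", model.predict_proba([[60, 0]])[0, 1])
print("P(approve | group B):", model.predict_proba([[60, 1]])[0, 1])
```

The point is not this particular model but the logic it illustrates: when the data body stands in for the person, whatever distortions the data carry are projected back onto the person as prediction.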
Today, Black female scholars and scholars of color including Ruha Benjamin, Simone Browne, Joy Buolamwini, Timnit Gebru, and Safiya Umoja Noble are at the forefront of discussions exploring different forms of surveillance, their histories, and their impact on Black lives.53 Black artists like Keith Piper have, since the 1990s, also created works that not only demonstrate how algorithmic technologies prey on BIPOC individuals but, more importantly, give us a glimpse of what a different future could look like. More recently, artists such as American Artist, Stephanie Dinkins, Rashaad Newsome, Karen Palmer, and Sondra Perry have used new media art to bring the experiences of Black lives, and alternative visions of the future within these technologies, to the forefront.
Some artists, however, have fought algorithmic bias by producing, through artworks, new forms of visibility that counter the effects of limited and homogenous datasets. Within the context of art, this strategy allows for nuance, representation, and complexity. Rather than becoming neo-Luddites, artists are transforming technology to bring visibility to underrepresented cultures and identities in creative ways that celebrate diversity. Dinkins, a multidisciplinary artist, for example, creates meaningful new discourses that highlight Black histories through embodied and performative works with AI. Following her series of recorded dialogues with BINA48—an African American AI robot by Hanson Robotics with whom she discussed questions of racism, robot rights, and robot-human relationships—Dinkins created Not The Only One (N’TOO) (2018–present). This AI chatbot is materialized as a medium-sized interactive vessel with reliefs of the faces of women in the artist’s family on its exterior. N’TOO uses a machine learning algorithm trained on the oral multigenerational histories of women from her family to create a technological memoir. The input data, the stories from her family and the context provided by Dinkins, form the basis of the AI; interaction with the audience allows it to evolve and learn. For example, it attempts to answer existential questions like “why do you exist?” that BINA48 otherwise did not know how to answer.54 At once an archive, a creative storytelling device, and a memoir, it is also specific to Black culture and histories, demonstrating the need for technology to “reflect the goals of the communities making them.”55 It also represents and makes visible Black voices that are otherwise underrepresented in the technology industry.
Artists like Dinkins use technology to show its possibilities outside the pancapitalist logic. For instance, while N’TOO uses data, it also centers the work’s interactivity on mutual learning and on reciprocity between the audience and the AI. The vessel depends on the intimate proximity of the audience to be activated. When the body is still reduced to data, decontextualized, and existing in digital flows and networks, resistance means recuperating the body’s diversity and complexity. More emphasis is needed on the intricacies of bodies in new media artworks. This sentiment is also important for rethinking technologies like AI. Dinkins reflects on this as well, thinking about how different authors, different stories, and different modes of thought could create an “AI-mediated world of trust, compassion, and creativity that serves the majority in a fair, inclusive, and equitable manner.”56 Yet it is not only about resistance but, more importantly, as Dinkins reminds us, about creating alternative and more equitable futures today.57
Conclusion
CAE’s articulation of the data body helped identify what it is, how it emerged, and, as we have seen, how it has developed with the progression of algorithmic technologies. They offer a framework from which to understand our contemporary condition, but also to understand how artists reimagine tactics to subvert the transformation of the body into code, including by rematerializing it through performances, spectacles, and audience participation. CAE’s shift from strictly online modes of subversion to performance also illuminates how, to combat pancapitalist power, we must orient ourselves toward embodied forms of resistance. In our present society, when we are increasingly translated into data bodies, discussions surrounding bod(ies), embodiment, identities, and nuance become vital. The dematerialization of ourselves and the world restructures not only the way we see the world but also how we understand and relate to it. Of particular concern is how this impacts us at an ontological level. New media art and its histories, from pioneering artists to contemporary ones, allow us to better understand our contemporary condition and enable us to imagine new ways of transforming it and of relating to technology that celebrate our complexity and lived experiences.
Notes
While I discuss the presumed dematerialization of the body to code via digital technologies, the “material turn” in media scholarship has been critical to my understanding of the real, tangible, and material qualities, effects, ecologies, and entanglements of media and “the digital.” See Jussi Parikka, A Geology of Media (Minneapolis: University of Minnesota Press, 2015); Jane Bennett, Vibrant Matter: A Political Ecology of Things (Durham, NC: Duke University Press, 2010); Matthew Fuller, Media Ecologies: Materialist Energies in Art and Technoculture (Cambridge, MA: MIT Press, 2005); John Durham Peters, The Marvelous Clouds: Toward a Philosophy of Elemental Media (Chicago: University of Chicago Press, 2015).
Bernard E. Harcourt, Exposed: Desire and Disobedience in the Digital Age (Cambridge, MA: Harvard University Press, 2015); Rob Horning, “Notes on the data self,” The New Inquiry (blog), February 2, 2012, https://thenewinquiry.com/blog/dumb-bullshit; Gavin J D Smith, “Surveillance, Data and Embodiment: On the Work of Being Watched,” Body & Society 22, no. 2 (January 21, 2016): 108–39, https://doi.org/10.1177/1357034X15623622; Dana Greenfield, “Deep Data: Notes on the n of 1,” in Quantified: Biosensing Technologies in Everyday Life, ed. Dawn Nafus (Cambridge, MA: MIT Press, 2016); Frank Pasquale, “The Algorithmic Self,” The Hedgehog Review 17, no. 1 (Spring 2015), https://hedgehogreview.com/issues/too-much-information/articles/the-algorithmic-self; Natasha Dow Schüll, “Data for Life: Wearable technology and the design of self-care,” BioSocieties 11, no. 3 (2016): 317–33, https://doi.org/10.1057/biosoc.2015.47; John Cheney-Lippold, We Are Data: Algorithms and the Making of Our Digital Selves (New York: New York University Press, 2017); Deborah Lupton, Data Selves: More-than-Human Perspectives (Cambridge, UK: Polity, 2020).
Critical Art Ensemble, Flesh Machine: Cyborgs, Designer Babies, and New Eugenic Consciousness (New York: Autonomedia, 1998), 11. Interestingly, they tie pancapitalism to colonization. They write, “Conversely, why should capital refuse an opportunity that appears to be the greatest market bonanza since colonization?” (31). This also anticipates the critique of data colonialism in Nick Couldry and Ulises A. Mejias’s later work The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism (Redwood City, CA: Stanford University Press, 2019).
Colin Koopman, How We Became Our Data: A Genealogy of the Informational Person (Chicago: University of Chicago Press, 2019).
Simone Browne, Dark Matters: On the Surveillance of Blackness (Durham, NC: Duke University Press, 2015).
Stelarc’s statement about the obsolescence of the body also stems from the mind/body problem and the privileging of the mind over the body as famously elaborated by René Descartes and later expanded upon in popular culture and novels by such authors as Hans Moravec and William Gibson. See Stelarc, “Earlier Texts,” Stelarc.org, http://stelarc.org/?catID=20317; René Descartes, “Meditations on First Philosophy (II and VI),” in Philosophy of Mind: Classical and Contemporary Readings, ed. David J. Chalmers (Oxford, UK: Oxford University Press, 2002); Hans Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge, MA: Harvard University Press, 1988); William Gibson, Neuromancer (New York: Ace Books, 1984).
The journal Surveillance & Society, for example, includes “artistic presentations” as part of its article submission process, and many scholars have explored the field of surveillance art. For some examples, see Katherine Barnard-Wills and David Barnard-Wills, “Invisible Surveillance in Visual Art,” Surveillance & Society 10, no. 3/4 (2012): 204–14, https://doi.org/10.24908/ss.v10i3/4.4328, and Andrea Mubi Brighenti, “Artveillance: At the Crossroads of Art and Surveillance,” Surveillance & Society 7, no. 2 (2010): 175–86, https://doi.org/10.24908/ss.v7i2.4142. Elsewhere, Shoshana Zuboff identifies artistic works by Ai Weiwei, Benjamin Grosser, Adam Harvey, Trevor Paglen, and Leo Selvaggio, among others, as forms of active resistance. See The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: PublicAffairs Books, 2019), 488–91.
Critical Art Ensemble, “Critical Art Ensemble Timeline,” TDR 44, no. 4 (Winter 2000): 132–35, www.jstor.org/stable/1146867.
Critical Art Ensemble, “Critical Art Ensemble Timeline.”
Geert Lovink, “The ABC of Tactical Media,” email thread, nettime, May 16, 1997, www.nettime.org/Lists-Archives/nettime-l-9705/msg00096.html.
Anarchist scholars like Hakim Bey, for example, advocated for carving out spaces for freedom while avoiding confrontation with the state. Hakim Bey, T.A.Z.: The Temporary Autonomous Zone, Ontological Anarchy, Poetic Terrorism (1985), https://theanarchistlibrary.org/library/hakim-bey-t-a-z-the-temporary-autonomous-zone-ontological-anarchy-poetic-terrorism#toc45.
Critical Art Ensemble, The Electronic Disturbance (New York: Autonomedia, 1994), 11.
In “Postscript on the Societies of Control” (1992), Deleuze illustrated a shift away from Michel Foucault’s enclosed spaces of disciplinary control of the eighteenth and nineteenth centuries toward twentieth-century societies of control. Whereas Foucault identified spaces like the factory, school, prison, hospital, and family as disciplinary structures, Deleuze argued that control had become decentralized, remarking that corporations had replaced the factory and were now more akin to a “spirit” or “gas” in their immateriality. The man of control, Deleuze reflects, “is undulatory, in orbit, in a continuous network. Everywhere surfing has already replaced the older sports.” Gilles Deleuze, “Postscript on the Societies of Control,” October 59 (Winter 1992): 3, 6.
Critical Art Ensemble, Flesh Machine, 142.
Critical Art Ensemble, The Electronic Disturbance, 14.
Wendy Hui Kyong Chun, Control and Freedom: Power and Paranoia in the Age of Fiber Optics (Cambridge, MA: MIT Press, 2006), 2–11. Timothy Murray also discusses the drawbacks of “cybersubjectivity,” namely its “remote control and remote division” by global capitalist power. See Timothy Murray, “Digital Terror: New Media Art and Rhizomatic In-Securities,” CTheory, April 14, 2004, https://journals.uvic.ca/index.php/ctheory/article/view/14543.
Christian Fuchs, “Critique of the Political Economy of Web 2.0 Surveillance,” in Internet and Surveillance: The Challenges of Web 2.0 and Social Media, ed. Christian Fuchs et al. (New York: Routledge, 2012), 35.
Critical Art Ensemble, The Electronic Disturbance, 3.
Critical Art Ensemble, The Electronic Disturbance, 58.
Critical Art Ensemble, The Electronic Disturbance, 59.
Critical Art Ensemble, The Electronic Disturbance, 59.
Critical Art Ensemble, Flesh Machine, 144–45. Fidèle A. Vlavo discusses CAE’s inversion of the body without organs and identifies how they recuperate bodily integrity and sovereignty. Fidèle A. Vlavo, Performing Digital Activism: New Aesthetics and Discourses of Resistance (New York: Routledge, 2018), 40–41.
Vlavo, Performing Digital Activism, 146.
David Lyon, “Surveillance, Snowden, and Big Data: Capacities, consequences, critique,” Big Data & Society 1, no. 2 (July 1, 2014): 7, https://doi.org/10.1177/2053951714541861.
Critical Art Ensemble, Electronic Civil Disobedience and Other Unpopular Ideas (New York: Autonomedia, 1996), 13.
Critical Art Ensemble, Electronic Civil Disobedience, 15.
Jon McKenzie and Rebecca Schneider, “Critical Art Ensemble, Tactical Media Practitioners,” TDR/The Drama Review 44, no. 4 (Winter 2000): 147, https://doi.org/10.1162/10542040051058537.
Critical Art Ensemble, Flesh Machine, 5.
Eugene Thacker, for instance, says, “What is unique about biotechnology generally and about the bioinformatics and pharmacogenomics industries specifically is that it is not only commodities that are becoming more immaterial, but also the bodies that consume those commodities.…[T]he hegemony of molecular genetics, when combined with new information and computer technologies, has created a context in which the biological body—‘life itself’—is increasingly understood and analyzed in informatic ways.” Eugene Thacker, The Global Genome: Biotechnology, Politics, and Culture (Cambridge, MA: MIT Press, 2006), 85–87.
Détournement was a 1950s French term popularized by the Letterist International and the Situationist International and typically involved a rerouting or hijacking technique that used satire and appropriated the images of the spectacle to overturn and critique systems of power. See Guy Debord and Gil J. Wolman, “A User’s Guide to Détournement,” Les Lèvres Nues 8 (1956), www.cddc.vt.edu/sionline/presitu/usersguide.html.
They were inspired by Donna Haraway’s “Cyborg Manifesto,” which applied the cyborg as a metaphor for the hybridized posthuman and post-gendered condition in an increasingly technological world, overturning such dualisms as female/male, human/animal, and human/machine. Sadie Plant, Zeroes + Ones: Digital Women and The New Technoculture (New York: Doubleday, 1997); VNS Matrix, “The Cyberfeminist Manifesto for the 21st Century,” VNS Matrix, https://vnsmatrix.net/projects/the-cyberfeminist-manifesto-for-the-21st-century; Donna J. Haraway, “A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century,” in Simians, Cyborgs and Women: The Reinvention of Nature (New York: Routledge, 1991).
María Fernández, “Postcolonial Media Theory,” Art Journal 58, no. 3 (Autumn 1999): 58–73, https://doi.org/10.2307/777861; María Fernández and Faith Wilding, “Situating Cyberfeminisms,” in Domain Errors!: Cyberfeminist Practices, ed. María Fernández, Faith Wilding, and Michelle M. Wright (New York: Autonomedia, 2002), 22, http://refugia.net/domainerrors/DE1a_situating.pdf.
Chandramohan Gopalsamy, Sungmee Park, Rangaswamy Rajamanickam, and Sundaresan Jayaraman, “The Wearable Motherboard™: The first generation adaptive and responsive textile structures (ARTS) for medical applications,” Virtual Reality 4 (1999): 152–68, https://doi.org/10.1007/BF01418152.
Faith Wilding and Hyla Willis, “SmartMom Rebooted: A Cyberfeminist Art Collective Reflects on Its Earliest Work of Internet Art,” Studies in the Maternal 8, no. 2 (2016): 2, https://doi.org/10.16995/sim.229.
Critical Art Ensemble, “Cult of the New Eve,” World-Information Institute, 2007, http://world-information.org/wio/program/objects/993058052/993058101.
Wilding and Willis, “SmartMom Rebooted.”
Eugene Thacker, “Redefining Bioinformatics: A Critical Analysis of Technoscientific Bodies,” Enculturation 3, no. 1 (Spring 2000), www.enculturation.net/3_1/thacker.html.
Glissant says, “If we examine the process of ‘understanding’ people and ideas from the perspective of Western thought, we discover that its basis is this requirement for transparency. In order to understand and thus accept you, I have to measure your solidity with the ideal scale providing me with grounds to make comparisons and, perhaps, judgments. I have to reduce.” Édouard Glissant, Poetics of Relation, trans. Betsy Wing (Ann Arbor: University of Michigan Press, 1997), 189–90.
Glissant, Poetics of Relation, 190.
Blas has an entry in this glossary. See Zach Blas, “Informatic Opacity” in Posthuman Glossary, ed. Rosi Braidotti and Maria Hlavajova (New York: Bloomsbury, 2018), 198–9.
Yilun Wang and Michal Kosinski, “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images,” Journal of Personality and Social Psychology 114, no. 2 (2018): 246–57, https://doi.org/10.1037/pspa0000098.
Allan Sekula, “The Body and the Archive,” October 39 (Winter 1986): 47, https://doi.org/10.2307/778312.
Zach Blas, “Escaping the Face: Biometric Facial Recognition and the Facial Weaponization Suite,” New Media Caucus Media-N Journal 9, no. 2, CAA Conference Edition (2013), http://median.newmediacaucus.org/caa-conference-edition-2013/escaping-the-face-biometric-facial-recognition-and-the-facial-weaponization-suite.
Kirstie S. Ball et al., “Surveillance Studies needs Gender and Sexuality,” Surveillance & Society 6, no. 4 (2009): 352–55, https://doi.org/10.24908/ss.v6i4.3266.
Ball et al., “Surveillance Studies,” 353.
Toby Beauchamp, Going Stealth: Transgender Politics and U.S. Surveillance Practices (Durham, NC: Duke University Press, 2019), 2. See also Toby Beauchamp, “Artful Concealment and Strategic Visibility: Transgender Bodies and U.S. State Surveillance After 9/11,” Surveillance & Society 6, no. 4 (2009): 356–66, https://doi.org/10.24908/ss.v6i4.3267.
Beauchamp, Going Stealth, 3.
Beauchamp, Going Stealth, 6.
Os Keyes, “Made-up Resistance,” Os’s Blog (blog), October 28, 2021, https://ironholds.org/madeup-resistance.
Zuboff, The Age of Surveillance Capitalism, 376.
Matthew Stewart, “Handling Discriminatory Biases in Data for Machine Learning,” Medium, March 30, 2019, https://towardsdatascience.com/machine-learning-and-discrimination-2ed1a8b01038.
Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown Books, 2016), 25.
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Cambridge, UK: Polity, 2019); Browne, Dark Matters: On The Surveillance of Blackness; Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York: New York University Press, 2018); Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” in Proceedings of the 1st Conference on Fairness, Accountability and Transparency (Conference on Fairness, Accountability and Transparency, PMLR, 2018), 77–91, http://proceedings.mlr.press/v81/buolamwini18a.html.
Tatum Dooley, “Stephanie Dinkins Is Turning Memoir Into AI,” Garage, August 15, 2019, https://garage.vice.com/en_us/article/43kdnm/stephanie-dinkins-is-turning-memoir-into-ai.
Stephanie Dinkins, “Not The Only One,” Stephanie Dinkins, www.stephaniedinkins.com/ntoo.html.
Stephanie Dinkins, “¿Human ÷ (Automation + Culture) = Partner?,” ASAP/Journal 4, no. 2 (2019): 296.
Stephanie Dinkins, “Afro-Now-Ism,” NOEMA, June 16, 2020, www.noemamag.com/afro-now-ism.