Introduction
This special issue of Afterimage explores critical intersections between creative practice and algorithmic culture. Contributors demonstrate the various ways that artists have worked in both historical and contemporary contexts to render the algorithmic intelligible, opening space for reflexive critique, meaningful resistance, and imaginative repurposing.
The issue responds to recent calls for a demystification of the algorithmic undercurrents of contemporary culture.1 The past decade has seen a growing public awareness of the role that algorithmic practices of classification, surveillance, and policing are playing in our everyday lives. Whether calling attention to the troubling ways that online platforms “see,” categorize, and target users in response to their tracked behaviors2 or exposing how the coupling of facial recognition technologies with machine learning is automating and expanding the efficacy of surveillance tactics across the internet and IRL,3 the invasive consequences of the algorithmic are becoming increasingly difficult to ignore.
Although efforts to demystify “the algorithm” have foregrounded the fact that algorithms are merely instructions that enable the automated operation of computational technologies,4 in the contemporary context they have become generative “culture machines.”5 Algorithmic technologies are increasingly responsible for enabling many of the practices that comprise our everyday lives.6 Countless scholars have identified the troubling consequences that algorithmic systems pose for racialized, gendered, sexualized, classed, and laboring subjects,7 yet developing strategies of resistance has been complicated by the seeming impossibility of critiquing or countering algorithmic operations8; unfolding outside of the phenomenal field of human experience,9 they are simply too small, too fast, and too expansive for people to grasp directly or comprehend exhaustively.10
Unpacking and making sense of the algorithmic has become pressing not only as a result of the impact that algorithmic technologies are having on bodies of all kinds, but also because we have reached a critical juncture where the traditionally human dimensions of these technologies are receding as algorithmic automation and the automation of the algorithmic take hold.11 What is needed is an expanded repertoire of methods for engaging meaningfully with algorithmic techniques.
Although researchers have turned to the work of community activists to undertake a similarly critical engagement with the algorithmic,12 this special issue looks to creative practice to overcome this impasse. Not only do researchers regularly leverage contemporary artworks as a means of illustrating scholarly claims,13 but artistic practice has a long history of exposing and critiquing technological media in general and the algorithmic in particular.14 From the “proto-algorithmic” instruction-based works of artists like Sol LeWitt, Yoko Ono, and Yvonne Rainer to critical engagements by contemporary media artists like Jake Elwes, Sondra Perry, Trevor Paglen, and Kate Crawford with the engines and outputs of algorithmic culture, creative practitioners and their artworks have worked to render algorithms intelligible, while also critiquing their broader political ramifications.
Despite the critical potential demonstrated by these and other works, very little research has sought to develop a comprehensive account of artists’ focused engagement with “the algorithm.” This gap in research and the need to further develop methods for algorithmic critique and possible resistance serve as the point of departure for the following special issue. In what follows, we begin by offering a brief review of critical scholarship on the algorithm, conceived in both pragmatic and cultural terms. This account of the algorithm and its cultural reverberations will be mapped across close readings of key illustrative works drawn from the histories of media art. This discussion is selective and intentionally limited in its breadth; it was guided by the desire to contextualize and further develop the articles and essays that comprise this special issue.
The Algorithm
The Pragmatist’s Algorithm
Efforts to demystify the algorithmic often draw parallels between algorithms and recipes. Offering a step-by-step guide toward a desired outcome, algorithms are “repeatable, practical solutions to problems.”15 Grounded within the computing context, algorithms might be understood more specifically as “encoded procedures for transforming input data into a desired output.”16 Ed Finn refers to this as the “pragmatic approach to algorithms.”17 Much like the ingredients that enable the actualization of a recipe, algorithms rely on their coupling with data to solve a problem. In fact, while the two can be pried apart for the sake of analysis and critique, “together, data structures and algorithms are two halves of the ontology of the world according to a computer.”18 Data must be collected and readied for its incorporation into algorithmic systems. In this vein, there is no such thing as “raw data.”19 Instead, data is actively generated, manufactured, and imagined. As Lisa Gitelman and Virginia Jackson explain, “data need to be imagined as data to exist and function as such.”20
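Considered at this level, an algorithm can be stated in a few lines of code. The following Python sketch is purely illustrative (its names and data are hypothetical), but it shows the pragmatist’s definition in miniature: an encoded, repeatable procedure that transforms input data into a desired output, here a ranking.

```python
# A purely illustrative sketch of the pragmatist's algorithm: an encoded,
# repeatable procedure that transforms input data into a desired output.
# All names and values are hypothetical.

def rank_by_engagement(events):
    """Aggregate raw interaction events (the data) and return a ranked list (the output)."""
    totals = {}
    for event in events:                                 # step 1: total the weights per item
        totals[event["item"]] = totals.get(event["item"], 0) + event["weight"]
    return sorted(totals, key=totals.get, reverse=True)  # step 2: order by aggregated weight

events = [
    {"item": "video_a", "weight": 3},  # a click
    {"item": "video_b", "weight": 5},  # a share
    {"item": "video_a", "weight": 1},  # a view
]
print(rank_by_engagement(events))      # ['video_b', 'video_a']
```

The procedure is inseparable from the data structure it expects: change how the events are imagined, recorded, and weighted, and the “desired output” changes with them.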
A close examination of how data is prepared for use within algorithmic systems can help us to expose and understand the algorithms that enable them to operate.21 Techniques and technologies of “datafication,” which focuses more narrowly on the transformation of online activity into quantified data for the purpose of real-time tracking and predictive analyses,22 are being woven into ever-more-intimate corners of our everyday lives as behavioral data is transformed into the lucrative raw material that fuels the engines of contemporary surveillance capitalism.23 This is not simply a matter of unbridled capitalist expansion through data collection and the generation of increasingly powerful predictive analytics. Instead, datafication helps to advance, domesticate, and naturalize24 the project of “dataveillance,”25 through which the aggregation and sorting of individuals’ data is used to facilitate and legitimize disciplinary and control practices.26 The sheer volume and intimate nature of the data that is generated and collected, paired with its seamless integration across platforms, have intensified its authority and perceived inescapability.
Particularly troubling in this case is the conflation of individuals with data that is produced by and about them. As Simone Browne outlines, personal data “is often marked by gender, nation, region, race, socioeconomic status, and other categories where the life chances of many…are ‘more circumscribed by the categories into which they fall. For some, those categories are particularly prejudicial.’”27 Much contemporary scholarship has therefore documented how data collection and categorization techniques racialize individual users in a manner that not only results in targeted advertising, but legitimizes practices associated with overpolicing, restricted access to credit and housing, and limited employment opportunities.
Just as algorithms rely on data to do the things that have garnered them so much attention within the contemporary context, so too does the mass of data collected and stored rely on algorithmic technologies and techniques to be rendered intelligible and “actionable.” Among the most important properties of algorithms is that they are designed to run automatically, “to act when triggered without any regular human intervention or oversight.”28 According to Yuk Hui,29 algorithmic automation is achieved either linearly, through “the automatization of instructions (or pure repetition),”30 or through recursive automatization, in which the algorithm refers to itself, evaluates itself, and (re)acts accordingly.31 The move from linear repetition to increasingly recursive functionality not only marks an increase in the complexity and sophistication of contemporary algorithms, but it aligns them with the development and rise of machine learning technologies and resulting forms of artificial intelligence. Automation through recursivity is pushing algorithmic operations “beyond the capacity of human beings,”32 rendering points of intervention conceptually and literally inaccessible.
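Hui’s distinction can be made concrete with a schematic example. In the hypothetical Python sketch below, the first function simply repeats an instruction across its inputs, while the second evaluates its own output and reacts to it until a condition is met (here, successive refinement of a square-root estimate); neither is drawn from any particular production system.

```python
# A schematic, hypothetical contrast between Hui's two modes of automation.

def linear_automation(items):
    """Pure repetition: the same instruction is applied to every input in turn."""
    return [item * 2 for item in items]

def recursive_automation(estimate, target, tolerance=1e-6):
    """Recursive automatization: the procedure refers to and evaluates itself,
    refining a square-root estimate until its own stopping check is satisfied."""
    next_estimate = 0.5 * (estimate + target / estimate)
    if abs(next_estimate - estimate) < tolerance:  # the algorithm evaluates its own result
        return next_estimate
    return recursive_automation(next_estimate, target, tolerance)

print(linear_automation([1, 2, 3]))    # [2, 4, 6]
print(recursive_automation(1.0, 2.0))  # ~1.4142135, an approximation of the square root of 2
```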
Algorithmic Culture
While the pragmatist focuses on the technical specificity of algorithms, it is rarely at this level that they are experienced. Just as critical examinations of data-driven practices help to expose the underlying design and political allegiances that inform algorithms, so too does a consideration of the relationship between algorithms and culture help to bring algorithms into clearer view.33 It is undeniable that algorithms play an increasingly central role in the articulation of everyday life in the twenty-first century. Rather than limiting the focus of algorithmic critique to the “bias and power in the technical workings of the algorithms themselves,”34 an approach grounded in the relationship between algorithms and culture recognizes the distributed and performative dynamism that is required for the algorithm(ic) to be actualized. It emphasizes the “recursive relationship between algorithms and people, acknowledging both the material underpinnings and mathematical logics of algorithms as ‘technical entities’ and the agency and sensemaking of people who experience, valorize, and perhaps tactically try to resist algorithmic operations”35 in their everyday lives.
Although algorithms and culture appear increasingly connected, the specific nature of this relationship has been approached differently by scholars working across media studies. Nick Seaver has differentiated between approaches in which algorithms are posited as being “in” culture, versus algorithms that are understood “as” culture. In the following section we will briefly unpack these perspectives in greater detail. We will first review the role that algorithms play in the production, circulation, and reception of cultural artifacts as well as the impact that algorithmic platforms have on the constitution of culture more broadly. This will be followed with a consideration of what it means for algorithms to be positioned as culture.
Algorithms in Culture
In the case of algorithms in culture, the pragmatist’s definition of algorithm as simply a solution to a problem is adopted; algorithms are treated as “discrete objects that may be located within cultural contexts or brought into conversation with cultural concerns.”36 From this perspective, algorithms “may shape culture (by altering flows of cultural material), and they may be shaped by culture (by embodying the biases of their creators),” but they remain distinct from it. Media studies scholars who have advanced accounts of “algorithmic culture” are aligned with this approach insofar as algorithms are understood as a “transformative force, exogenous to culture.”37 The focus in these cases is often on their contributions to the production and flow of cultural material, technical facilitation of cultural platforms, and entry into public discourse.38
Algorithms aid in the mediated production of cultural artifacts, thereby imprinting them with their underlying logics and protocols. Nowhere has this become more evident than in the recent public release of large language models (LLMs) and corresponding text-to-image generators. Trained on millions of text-image pairings, collected from ImageNet and scraped from the web, OpenAI’s DALL-E 2, for example, uses neural networks to automatically generate prompt-driven images. The images are made possible through the affordances of algorithmic processing and automation. They also bear the aesthetic mark of the algorithmic, as the resulting pictures often capture strange and uncanny machine learning solutions to users’ prompts. The same can be said to varying degrees of other cultural artifacts produced and/or disseminated in algorithmic contexts. While this is certainly the case for those objects and phenomena created within digital environments, accounts of the “New Aesthetic”39 and “Post-Internet Art”40 suggest that the algorithmic can increasingly be identified and mapped across nondigital objects and environments.
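At the level of everyday use, generative systems like DALL-E 2 reduce this machinery to a single prompt. The sketch below is a rough illustration only (it assumes the OpenAI Python client is installed and an API key is configured, and the parameters shown are merely plausible values), but it captures how little of the underlying system is visible to the person requesting an image.

```python
# A rough, illustrative sketch of prompt-driven image generation. It assumes the
# OpenAI Python client is installed and OPENAI_API_KEY is set in the environment;
# the prompt and parameters are hypothetical.
from openai import OpenAI

client = OpenAI()
response = client.images.generate(
    model="dall-e-2",
    prompt="a field of tulips painted as a seventeenth-century still life",
    n=1,               # number of images to request
    size="512x512",
)
print(response.data[0].url)  # a temporary URL to the generated image
```

Everything that makes the picture possible, from the training corpus to the model’s learned associations, remains concealed behind this one call.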
Among the cultural objects to which algorithms contribute technically are the platforms—the digital infrastructures, networked intermediaries, and attractive and extractive apparatuses that enable two or more groups to interact41—through which much contemporary culture unfolds and is shared. Algorithmically enabled functions drive and govern platforms’ operation. From Google’s early PageRank algorithm42 to the Netflix Prize43 and recent controversy over changes to Twitter’s “For You” algorithm,44 it has long been known that algorithms represent the proprietary “secret sauce,” the black-boxed goods that define, enable, and contribute to the value of online platforms. While platforms may claim to neutrally host content and represent the world objectively, critical here is the sense that baked into the algorithm(s) are platforms’ technical functions (real and imagined) as well as their underlying objectives and priorities, many of which are politically and financially inflected.45
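PageRank is one of the few such algorithms to have been described openly, and its core can be restated in a short, textbook-style sketch. The version below is illustrative only and bears little resemblance to the layered, proprietary ranking systems platforms actually run; it simply shows how a handful of lines can decide which pages rise to the top.

```python
# A simplified, textbook-style PageRank computed by power iteration.
# Illustrative only; real search ranking combines many additional signals.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            targets = outlinks if outlinks else pages   # dangling pages spread rank evenly
            share = rank[page] / len(targets)
            for target in targets:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(toy_web))  # "c" accumulates the most rank in this toy graph
```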
As their centrality to platforms suggests, algorithms are increasingly given the task of “assembling the social,” which Ted Striphas identifies as culture’s chief responsibility. It is not only that algorithms undergird the platforms through which much of our social lives take place, but in their pursuit of better prediction products, algorithmic tools also “discover” and assert “statistical correlations within sprawling corpuses of data, correlations that would appear to unite otherwise disparate and dispersed aggregates of people.”46 What results is an algorithmic formulation of the “crowd”—a grouping of individuals whose behavioral data and predicted futures result in shared practices of categorization. How individuals are categorized determines the future content and connections that they will be exposed to. This has been connected to a new form of algorithmic identity that is taking hold within these contexts, as algorithmic systems use “statistical commonality models to determine one’s gender, class, or race in an automatic manner at the same time as [they define] the actual meaning of gender, class, or race themselves.”47 To this end, it is not just that algorithms are intervening in culture as they draw the social together, but that the act of identifying social correlations also has a significant impact on processes associated with identity formation.
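A minimal sketch of how such a crowd might be assembled computationally is given below (the behavioral data and tooling choices are hypothetical): users are reduced to vectors of tracked behavior, grouped by statistical proximity, and the resulting cluster label, rather than the person, then determines what each is shown next.

```python
# A hypothetical sketch of "statistical commonality" assembling a crowd:
# users become behavioral vectors, and clustering discovers aggregates among them.
import numpy as np
from sklearn.cluster import KMeans

# rows = users, columns = tracked behaviors (e.g., minutes spent on three content genres)
behavior = np.array([
    [120,   5,  10],
    [110,   8,  15],
    [  4, 130,  90],
    [  9, 140,  80],
])

crowds = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(behavior)
print(crowds)  # e.g., [0 0 1 1]: two algorithmically "discovered" aggregates of users
```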
This tendency has been connected to the rise of “algorithmic governance,” in which algorithmically enabled forms of data collection, processing, and modeling are folded back onto the users from whom they are generated. Corresponding efforts to “anticipate and preemptively affect possible behaviours”48 result in a normative delimitation of individuation such that it is made to fall predictably within parameters prescribed through algorithmic analysis and modeling. Activities that fall outside of these parameters are regularly disregarded as anomalous noise or glitches in the system. Pascal D. König has considered how these processes are short-circuiting previously established modes of subjectivation insofar as there is now no place for those subjected to the preemptive coordination activity of algorithms to challenge the outcomes they bring about.49
Scholars have focused much critical attention on what these systems make visible, invisible, and hypervisible and to whom. For example, both Ruha Benjamin and Safiya Umoja Noble have identified how a range of algorithmic and non-algorithmic technologies have worked to render Black bodies invisible when they want to be seen and hypervisible when they are seeking privacy. Brooke Erin Duffy and Colten Meisner have recently detailed the impact that algorithmically determined invisibility can have for racialized, female, and LGBTQ+ content creators. Within these contexts a lack of visibility works indirectly to normalize the visible while also having quantifiable consequences for creators (i.e., reach, earnings). It is not only invisibility that can prove troubling. As Noble’s analysis of the sexualized search results for “black girls” on Google demonstrates, visibility bears with it the politics of representation and interpretation. This critique has been rendered all the more urgent as the formal particularities of the visible become implicated in the feedback mechanisms and governance structures that make up algorithmic systems. The danger is in part the creation of myopic filter bubbles, a phenomenon that contributes to the delineation (and increasing polarization) of identifiable communities and has been associated with the catalyzation of contemporary social and political movements50 as well as the global rise of right-wing populism.51
Algorithms as Culture
To understand algorithms as culture is to see them less as stable objects with exogenous effects than as “the manifold consequences of a variety of human practices.”52 Here, algorithms contribute to, are a product of, and have become inseparable from culture—itself understood as a coalescing outcome of actions rather than something fixed. To conceive of the algorithmic in this way is to shift from thinking of algorithms solely in symbolic terms and to begin identifying their convergence with the material unfolding of reality and everyday life. It is not simply that the algorithmic is increasingly mediating culture, but that culture—and all of the reality that it entails—is increasingly being replaced by algorithmic ways of being and knowing. Algorithmic culture marks the culmination of contingency between the algorithmic and the world, through the exteriorization, standardization, and globalization of algorithmic processing and the modes of reason that undergird it.53 In this sense, Hui warns, the computational is displacing the world; the fundamental incalculability of the world as such, as a result, “withdraws further from us, until the question itself disappears.”54 It is not just that algorithms are increasingly deputized to undertake the “forms of decision-making and contestation that comprise the ongoing struggle to determine the values, practices and artifacts—the culture, as it were—of specific social groups,”55 but that the “thing in itself” is increasingly silenced and hidden from view.
As this suggests, individuals too are reshaping themselves in an effort to render themselves recognizable to algorithmic systems. We may speak more clearly, pose unnaturally, or adjust our gait in accordance with algorithmic standards, parameters, and terms of legibility. Others perform similar forms of contortion to evade algorithmic recognition. Each of these identifies an instance of “tacit negotiation”56 in which users undertake a “mundane, strategic reorientation of practices”57 to elicit a desired relationship with algorithmic tools and technologies. Individuals are not unchanged by these negotiations. As Danaher et al. write, “algorithms are increasingly being used to nudge, bias, guide, provoke, control, manipulate and constrain human behaviour.”58 A number of scholars have undertaken ethnographic studies of the impact that this is having on individuals. Taina Bucher, for example, has analyzed twenty-five users’ affective experience with Facebook. In an effort to parse what she refers to as the “algorithmic imaginary,” an account of how people come to feel and think about algorithms in their everyday lives, she charts the effect that algorithms seem to have on experience, mood, and sensation.59 Similarly, Nicolas Malevé has detailed how microworkers who complete image annotation tasks to help train computer vision technologies ultimately “learn” to see at increasingly small fractions of a second—rates that are intended to approximate as closely as possible the temporal rate at which algorithmic systems are “seeing” images.60 Significant are the embodied, material, and ontogenic effects that algorithms are having.
As this sweeping review of algorithmic scholarship and criticism has sought to capture, the stakes of algorithmic critique cannot be overstated. From the algorithm’s technical contribution to contemporary computing, to its sociocultural impact, effect on processes of subjectivation and individuation, and obfuscation of the world as such, it has become a central component of contemporary life. The preceding discussion has identified scholarly analyses of code, data, cultural formations, visual culture, and personal narratives as potential entry points for critique. This special issue, as well as the exhibition and symposium from which it has emerged, identify creative practice and the histories of art as another avenue through which to undertake algorithmic critique.
Grounded within an understanding of the algorithm in technical and cultural terms, this special issue turns to art—making, history, and criticism—to further expose and critique the role of the algorithm in contemporary culture. The following section will offer a selective review of historical and contemporary interventions into the algorithm. It will first consider how artists of the mid-twentieth century began to contend with the instructional dimension(s) of the algorithmic, whether directly computational or not. This will be followed by an analysis of how artists associated with the net.art and software art movements engaged with the increasingly embodied impact that algorithms were having at the end of the twentieth century. Finally, the discussion will turn to a series of works drawn from the Contingent Systems exhibition that helped to shape this special issue and that take up contemporary experiences of the algorithm in/as culture.
Art and the Algorithm
Computer Art
Histories of algorithmic art, particularly those that intersect with computation, often begin with computer art, an area of creative practice that emerged in the mid-1960s and privileged the computer above all other media and technologies.61 Computer art began in public and private research labs and post-secondary institutions where a select group of engineers, scientists, and students started to “investigate the computer as a revolutionary image maker.”62 Among the most notable practitioners associated with the first and second generations of this movement were Max Bense (Germany), Béla Julesz (US), Hiroshi Kawano (Japan), Kenneth Knowlton (US), Leslie Mezei (Canada), Vera Molnár (Hungary), Frieder Nake (Germany), and A. Michael Noll (US).
Whereas other art movements of the time were responding to established art historical practices and tendencies, computer art intervened primarily in then-contemporary understandings and applications of computation.63 Visually, the works were quite varied, ranging from mathematical and scientific visualizations to abstract color fields and pictorial representations. Bell Labs pioneer Noll has explained that it was a computer-generated plot of data gone wrong that ultimately led him to begin experimenting with the creative potentials of the machine.64 Members of the Stuttgart School, by contrast, sought to realize Bense’s “Information Aesthetics,” a framework for measuring, evaluating, and creating works of art. Across these and other emergent practices, shared themes began to take shape, oriented largely around questions concerning authorship, aesthetics, chance and randomness, and artificial creativity.
In her account of the origins of computer art at Bell Labs, Zabet Patterson has problematized the centrality of the computer to computer art; comprising increasingly networked hardware, software, and the different technical states that the computer entails, the computer has never been a neatly contained medium. Nake, a member of the pioneering Stuttgart School, has shifted the framing of his work from “computer art” to “algorithmic art,” which he argues more accurately captures the processes and stakes of artmaking with a computer:
Algorithmic art starts with the development of an algorithm. That’s human work. The generative process ends in some material object, say ink on paper. That’s the machine’s work. What the machine realizes is one instance of a potentially infinite set of pieces. The artist, however, has described a concept: the entire class of those pieces.65
Nake’s sense that computer art is rooted within the human artist’s development of algorithms was not unique. Noll has echoed this sentiment; the framing enabled him to define and legitimize the artist’s contribution to computational artmaking while also asserting their ownership of the resulting computer-generated images. Kawano also highlighted the centrality of the algorithm to computer art, functionally and definitionally. He, like Nake and Noll, believed that the purpose of the artist within this context was to “teach the computer how to make art by programming an algorithm.”66 To accomplish this, the artist must be aware of the “algorithmic procedures” that underlie the creation of an artwork. Here, Kawano’s assessment echoes the information aesthetics theorized by Bense and enacted by computer artists associated with the Stuttgart School. In both cases, not only is computer art recognized as fundamentally algorithmic, but art in general is reconceived in algorithmic terms. Working with the computer was seen then to encourage reflexive engagement with the presumed logic and procedures that undergird artmaking while also enabling the artist to become more critically attuned to the idiosyncratic behaviors and operations of the computer.
In each of these cases, computer artists oriented much of their work around the instructional algorithm, which was positioned as the locus of creative intervention and conceptual consequence. Artmaking offered an experimental space in which to explore the relationship between human and computer (algorithm), often through questions related to authorship; experimentation with the potential for chance and randomness within computational environments was expected to reveal something about the computer’s capacity for improvisation and nonhuman agency. Similarly, the graphical outputs provided insight into the algorithmic nature of computational and noncomputational aesthetics.
Creative engagement with the instructional was common across mid–twentieth century art movements, from Conceptualism and Fluxus to Art and Technology. Echoing contemporary algorithmic criticism, approaches to the instructional ranged from pragmatic expressions of instruction in practice to theoretical examinations of the impact that instructional systems were having on, for example, culture, space, and the bodies within it. The next section will provide a brief overview of illustrative examples drawn from this history in an effort to further develop an account of art and/as algorithmic critique.
The Art of Instruction
Software–Information Technology: Its New Meaning for Art, opened at the Jewish Museum in New York on September 16, 1970. Curated by Jack Burnham, the exhibition was intended to demonstrate “the effects of contemporary control and communication techniques in the hands of artists.”67 While computer hardware featured centrally in the exhibition, the primary focus was on artists’ “plans and procedures for action”68; the traditional art object itself was similarly deprioritized. Burnham’s intention was to explore the revolutionary impact that information technologies were having on “personal and social sensibilities,”69 while pointing to “information technologies as a pervasive environment badly in need of the sensitivity traditionally associated with art.”70 Contributions to the exhibition included Ned Woodman and Theodor H. Nelson’s Labyrinth: An Interactive Catalogue (1970), the first public demonstration of a hypertext system that also revealed the system’s capacity for storing and visualizing user history; Seek (1969–70), the Architecture Machine Group’s (MIT) cybernetic environment responding to the activities of a small colony of gerbil inhabitants; Sonia Landy Sheridan’s demonstration of the creative potentials of automated reproduction using a photocopier in Interactive Paper Systems (1969–70); and numerous examples of information-oriented and instruction-driven works by conceptual artists like John Baldessari, Hans Haacke, Allan Kaprow, Joseph Kosuth, and Les Levine.
Writing about the exhibition three decades later, Edward Shanken has challenged canonical framings of conceptual art that sought to distance it from art-and-technology movements of the 1950s and ’60s by drawing explicit parallels between the mechanics and materiality of conceptualism and information technology. According to Shanken, instruction-driven pieces like those included in Software should be understood as early examples of work that engages with the algorithmic; as Burnham and Shanken detail, these works foreground the similar roles that programmatic directives—and by extension, language—play in both creative practice and the technical undercurrents of information systems.71
In addition to materializing and aestheticizing algorithmic structures, many of the works also engaged with themes that remain salient within contemporary media criticism; the artists used language and instructional systems to draw attention to the revolutionary impact that computers and telecommunications devices were having for public and private institutions, organizational and administrative structures, and the everyday lives of social actors. They also help to demonstrate the instrumental role that information, in hard and soft terms, was playing in “redefining the entire area of esthetic awareness,”72 within and beyond the confines of the art world.73
Shanken’s closure of the historical chasm between conceptual art and computer art movements of the mid-twentieth century offers an entry point through which to begin mapping an expanded genealogy of algorithmic art, guided by the histories of instruction-based work more broadly, whether aligned with computational technologies directly, or not. Tracing the intersection between language and instruction backward, we may look to the earlier “event score,” popularized by the Fluxus movement and associated with practices undertaken by artists like George Brecht, Stanley Brouwn, Ono, and La Monte Young, as another critical reference point in the histories of algorithmic art. Describing an approach similar to the instructional works featured in Software, event scores were conceived of as “generic, flexible outline[s] for an action defined by a set of loose coordinates pertaining to time, space, and materiality, which were to be fully fleshed out by the performer in the context of a specific performance situation.”74 In fact, Hannah Higgins has characterized event scores as “‘computer’ art of the human kind.”75
Adam Lauder’s contribution to this issue, “Martha Wilson’s Algorithmic Performative in Context,” explores the artist Martha Wilson’s strategies for gender performativity within the context of conceptual art practices occurring at Nova Scotia College of Art and Design in the 1970s. Lauder positions Wilson’s work as an additional genealogical point of departure within the histories of algorithmic art. Drawing attention to Wilson’s permutations of the gendered body within the formal and linguistic patterns of conceptual art, Lauder frames her actions both as anticipatory of algorithmic culture and as acts of resistance that pushed to renegotiate gender inequities in the art world; these works exposed the occlusion of difference within algorithmic visuality. Lauder makes further reference to Wilson’s critical interventions as operating within Erving Goffman’s game theory model for human interaction, while drawing attention to Wilson as an overlooked reference point within the history of computational art.
The Algorithm, Embodied and Disembodied
The histories of computer art, conceptual art, and Fluxus are often contrasted with the Art and Technology movement that emerged in the United States in the mid-1960s and culminated with a series of major exhibitions including The Machine as Seen at the End of the Mechanical Age (1968 at the Museum of Modern Art in New York), Cybernetic Serendipity (1968 at the Institute of Contemporary Arts in London), and Art and Technology (1971 at the Los Angeles County Museum of Art). Among the most well-known groups associated with this movement were the artists and engineers of Experiments in Art and Technology (E.A.T.). Co-founded by artists Robert Rauschenberg and Robert Whitman and Bell Labs engineers Billy Klüver and Fred Waldhauer, E.A.T. sought to stimulate collaboration between technological industry and the arts.76 Among the group’s earliest initiatives was 9 Evenings: Theatre and Engineering, a series of technologically oriented performances that took place at the 69th Regiment Armory in New York City in October 1966. The event consisted of ten performances, undertaken over the course of nine evenings. The participating artists were John Cage, Lucinda Childs, Öyvind Fahlström, Alex Hay, Deborah Hay, Steve Paxton, Yvonne Rainer, Robert Rauschenberg, David Tudor, and Robert Whitman, each of whom was given access to a suite of new tools and technologies while also being paired with a team of engineers from Bell Labs, who were to assist them in realizing the technical components of their performances.
Carriage Discreteness, Rainer’s contribution to the event, provides an illustrative example of how the Art and Technology movement identified and critiqued the algorithmic, drawing specific connections between the rise of algorithmically driven information communication technologies, the environment, and the bodies that exist within it. For Carriage Discreteness, an instruction-based performance piece, Rainer arranged a variety of stage props within a grid, drawn in chalk on the floor of New York’s Armory. Using walkie-talkies, she commanded performers through a scripted series of generic tasks, such as moving the props from one section of the grid to another. Around them, the environment shifted as a sixty-seven-step sequence of mechanically driven devices was triggered, unfolding and recalibrating the performance space in real time. The execution of both the performance and mechanical sequence was largely “pre-programmed” through a script that Rainer co-authored with physicist Per Biorn.77
By foregrounding the performers’ execution of a technologically relayed set of instructions, the work embodies the mid-twentieth century shift from the “mechanical era” to the “information age,” rendering it graspable and more accessible to critique. Echoing recent claims regarding the ubiquity, recursive operation, and human toll of algorithmic technologies, Rainer’s performers are apprehended, like digital cogs, within an algorithmically regimented environment that calls their humanity into question. Somewhat counterintuitively, it is through their own activity that the system is actualized, and their capture perpetuated. In contrast to dominant accounts of the work that claim a kind of seamless intersection between technology, script, and body, the walkie-talkies and pre-programmed mechanical elements did not work reliably; they malfunctioned, leaving performers to improvise much of the work at random.78 While the algorithmic is often framed as all-encompassing and inescapable, Carriage Discreteness reminds audiences of the imperceptible gaps in technological operation; it is within these spaces that ingenuity, recalibration, and freedom might be realized.
Examinations of the impact that information communication technologies were having on the body became prominent across the 1980s and ’90s. Central was an underlying uncertainty around the consequences that information and its algorithmic processing posed to materiality—of the art object, the body, in general. In 1972, art critic and curator Lucy Lippard declared the art object officially “dematerialized.” Responding to the rise of information, conceptualism, and the political over the preceding decade, Lippard claimed that the period had been characterized by a loss of interest, among both artists and critics, in the physical evolution of the work of art, resulting in “a profound dematerialization of art, especially of art as object.” She argued that by removing the object’s material grounds, art could circulate through artists’ networks, travelling exclusively as a set of ideas, instructions, and information.79
Dematerialization, particularly as it related to the algorithmically mediated art object, was also a central feature of software art, the histories of which have been decisively connected to computer, conceptual, and instruction-based art.80 First defined in 2001 by the jury of the Transmediale award for artistic software, software art encompasses “projects in which self-written algorithmic computer software (stand alone programmes or script-based applications) is not merely a functional tool, but is itself an artistic creation.”81 Software art engages with the aesthetic and social dimensions of computer software82—conceived of as “algorithmic programming code.”83 Associated artists sought to make “users aware that digital signs are structural hybrids of internal code and an external display that arbitrarily depends on algorithmic formatting.”84 By calling attention to the interactive layers of computing through the display of code, much software art aimed to expose the political and subtextual dimensions of what might otherwise be considered “neutral technical commands.”85 If “conventional programmes are instruments serving purely pragmatic purposes,” then the software of software art was to be “unpragmatic,” “irrational,” and “speculative.”86
Works that are particularly illustrative of software art include wwwwwwwww.jodi.org, in which net.artist duo Joan Heemskerk and Dirk Paesmans (JODI) subverted the seamless operation of the internet by revealing glitched-out versions of their website’s underlying code; Webstalker (1997–98), a web browser that displays websites’ HTML code alongside a map of their hyperlinked contents, by artist collective I/O/D87; and Mez Breeze’s difficult-to-parse “codework” poems (1993–present), written in Mezangelle, a self-crafted hybrid of programming language, informal speech, and poetic conventions disseminated most famously through the net.art email list 7-11 (1998). Revisionist histories have also identified a number of works that presciently contended with emerging formulations of the attention economy and surveillance capitalism, such as Ubermorgen’s88 Google-Will-Eat-Itself (2005), a bot-driven scheme that exploits, capitalizes upon, and attempts to undo the structure of Google’s AdSense initiative, and LAN’s89 TraceNoizer (2001), a service that produced and cloned fake websites (disinformation) in an effort to complicate personal tracking and surveillance.
There is significant overlap between the histories and practices of software art and net.art, another movement that took hold in the 1990s. Emerging across early web-based message boards and mailing lists, net.art refers to the first generation of art of and about the internet. While (as the preceding section attests) some of the work engaged explicitly with the technicalities of algorithms, much of the focus was on the cultural impact that algorithms and algorithmic systems were beginning to have. Selective examples of this can be mapped across the layered account of algorithmic culture described above.
In 1993, Heath Bunting founded Cybercafe, one of the many bulletin board systems (BBS) around which artists, activists, academics, subversives, and the like congregated, posting rants, poetry, short music files, and images. Reflecting on her initial experience with the Cybercafe BBS, Rachel Baker described the novelty and uncanniness of somebody “silently entering your home, your personal space through text on a screen.”90 Years later, Bunting would build Communication Creates Conflict (1995) for the Intercommunication Center’s (Japan) InterCommunication ’95 “On the Web: The Museum Inside the Network.” A web-based work, it invited visitors to submit messages to be delivered via postcard, fax, public placard, or leaflet. Bunting’s intervention continued his long-running effort to reimagine government- and corporate-controlled communication networks, creating opportunities for connection, playfulness, resistance, and subversion.
“Net.art” was supposedly coined in an email exchange between Vuk Ćosić and Alexei Shulgin as a result of a glitch in the system. Glitch—or a perceptible “error” in technical operations—became one of the dominant cultural aesthetics of net.art, often framed as an exposure of the underlying and typically black-boxed algorithms that drive digital and web-based technologies. JODI’s previously mentioned website serves as an example of this. Olia Lialina’s My Boyfriend Came Back from the War (1996) offers further insight into the impact that algorithmic technologies were having on the construction and dissemination of cultural artifacts associated with net.art.
A hyperlinked “net film,” Lialina’s canonical work required users to click their way through a still- and gif-illustrated nonlinear narrative, presented across a series of gridded windows within a browser. The form of the work was a product of the new affordances and limitations prescribed by web-based software and slow network connections. A similar structure and interactive format was employed by Shu Lea Cheang for BRANDON (1998), a work that navigates the life of Brandon Teena, a trans man who was brutalized and murdered in Nebraska in 1993, while serving as a broader meditation on “gender fusion and techno-body in both public space and cyberspace.”91 While the net had been advanced as a liberatory space, free of the biopolitics of the “meat world,” BRANDON reasserted the material limits of the body and persistence of gender-based oppression.
Software art treats algorithmic programming code as art, denaturalizing it and exploring its sociocultural ramifications. Key works disrupt the seamlessness accorded to algorithmic technologies by exposing their underlying code. Dismantling the black box in this way opens space for direct engagement and critique. Other works reorient and reimagine algorithmic realities through alternative designs, irrational code, and tactical repurposing. In each case, software art is aligned with the pragmatists’ approach to the algorithm. It offers greater insight into the technical operations of algorithmic systems, while also leveraging the algorithm as its means of intervention. Net.art, by contrast, engages with the cultural dimensions of the algorithmic. Many of its canonical works explored how algorithmic technologies were impacting the expression, dissemination, and experience of cultural artifacts and sociocultural platforms. Furthermore, the identities of users and the impact that algorithmic technologies were having on perceptions and experiences of the body were also often foregrounded.
Constanza Salazar and Felipe Rivas San Martín’s contributions to this special issue explore intersections between algorithmic technologies, materiality, and the body further. In “Challenging the ‘Data Body’ in New Media Art, 1990s–Present,” Salazar approaches innovative technologies of the 1990s to address histories of the web, surveillance, and artificial intelligence that have transformed individual users into data bodies—informational bodies that can be extracted and manipulated. Reflecting on structures of authority that infiltrate cyberspace and extract user information for capitalist gain, Salazar addresses performative acts of resistance that rematerialize the body through embodied artistic practice. Artist-activist collectives Critical Art Ensemble and subRosa lay the groundwork for enacting opposition and dismantling gender binaries, racism, and colonialist thinking within histories of algorithmic technology. Salazar offers a critical lens for combating datafication of the body today.
Rivas San Martín’s “Sexual Data: Deviations from the Scientific Image,” focuses on more recent convergences of the body and the algorithm within the context of machine learning. More specifically, Rivas San Martín’s text provides a critique of Stanford academics Yilun Wang and Michal Kosinski’s controversial study on the capacity for artificial intelligence to detect sexual orientation through facial recognition. With an overview of how their study intersects with surveillance capitalism, Rivas San Martín draws upon the work of Bruno Latour and of Klaus Amann and Karin Knorr Cetina to question the proposed neutrality of scientific images. Incorporating an overview of his project Sexual Data—as an attempt to distance these algorithmic images from their context within scientific research—Rivas San Martín offers a new path for their material translation and liberation through art intervention. The work functions to dismantle algorithmic operations in scientific discourse that try to quantify and neutralize the complexities of the human experience.
Artistic Interventions into Contemporary Algorithmic Culture
Nearly thirty years after the emergence of net.art, algorithms have shifted from being restricted technical engines to ubiquitous culture machines. Given the all-encompassing impact that algorithms are having on our daily lives and following the introduction to the algorithmic outlined above, critical questions have emerged regarding the ethics and sustainability of current data practices, artificial intelligence and machine learning techniques, algorithmic identity politics, platform capitalism and its affiliated modes of governance, and corresponding forms of surveillance and policing (to name a few). These questions guided the curation of Contingent Systems, an international group exhibition that ran from September 16, 2021, through November 20, 2021, at the Illingworth Kerr Gallery at the Alberta University of the Arts (Calgary, Canada). The exhibition featured work by eight artists and collectives whose practice renders the algorithmic intelligible, offering a means of better understanding the technical operations of algorithmic systems, as well as possible strategies for meaningful critique and resistance. Artists in the exhibition included FRAUD (Francisco Gallardo and Audrey Samson); Sarah Friend; Helen Knowles; Lauren Lee McCarthy; Anna Ridler; Stephanie Syjuco; zzyw (Zhenzhen Qi and Yang Wang); and LA Birdwatchers (Aljumaine Gayle, Suzanne Kite, Nicholas Shapiro, and Ladan Mohamed Siad). The remainder of this section offers an overview of the themes that the exhibition addressed, organized in relation to the conceptual arc of the exhibition, as well as a discussion of how the incorporated works engaged with key dimensions of the algorithmic. Close readings of selected works will be presented alongside introductions to essays contributed by participating artists in which they frame their practice and offer insight into their critical engagements with the algorithmic.
Recognizing the centrality that data and practices of datafication play in the operation of algorithms and expansion of the algorithmic in/as culture, the exhibition opened with a series of works that visualize practices of data collection, materialize the labor involved in readying data for use within algorithmic systems, and expose the increasingly personal spaces from which data is extracted and put into practice.
The exhibition opened with a view of Myriad (Tulips) (2018), a triptych installation of thousands of hand-labeled photographs of tulips. Compiled by Ridler (UK), the work draws attention to the skill, labor, and time that goes into constructing datasets. Used as the source material for two subsequent works produced by artificially intelligent software, Mosaic Virus (2018) and Mosaic Virus (2019), Myriad (Tulips) also helps to expose the human element that, while usually hidden by algorithmic processes, is central to machine learning. While much recent attention has been paid to the troubling homogeneity of datasets used to train facial recognition algorithms, Ridler’s selection of flowers as subject matter references overlooked histories within these technologies. The iris flower dataset, produced nearly a century ago by British statistician and vocal supporter of the eugenics movement Ronald Fisher, remains a prominent example of statistical classification techniques used in machine learning; it is a common dataset used to train machine learning programs running in the Python programming language.
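The banality of that inheritance is easy to demonstrate: Fisher’s iris measurements still ship with standard Python machine learning tooling, where a classifier can be trained on them in a handful of lines (a minimal sketch, assuming scikit-learn is installed).

```python
# A minimal sketch of how routinely the 1936 iris dataset is still used to
# train a classifier in Python; assumes scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)  # 150 flowers, 4 measurements each
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(model.score(X_test, y_test))  # accuracy on held-out flowers, typically above 0.9
```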
Finis-terra (2018), a work composed of two densely woven spring carpets suspended as hammocks, visualizes satellites’ “view from above,” calling attention to the automated gaze of datafication and connections between machine learning and unsustainable accumulation. From mechanization of labor to agricultural aesthetics, predictive analytics, and colonial extraction, in “Terra Analytica: The Automated Gaze of Corn and Soy” (in this issue) FRAUD teases apart the complex genealogical histories that informed the creation of their work. The duo advances “embodied-earth-thinking” as a critical counterpoint to the effect that the modes of algorithmic computation unpacked in the essay are having on sensing and knowing.
Whereas Myriad (Tulips) and Finis-terra foreground the processes involved in rendering the world into data, McCarthy’s (US) SOMEONE calls attention to the increasingly private and intimate spaces from which data is first extracted and then algorithmically processed and (re)applied. For approximately two months in 2019, McCarthy installed custom-designed smart devices—cameras, microphones, lights, and other appliances—in four individuals’ homes. From a command center in 205 Hudson Gallery in New York City, exhibition visitors were able to look into the homes using one of four laptops installed on a tabletop. In addition to surveying participants, visitors were also able to access and control their networked devices. If participants called out for “someone,” visitors were able to step in as their virtual assistant, fulfilling requests and answering questions. Installed as video documentation of the initial performance, SOMEONE explores the tension between a desire for privacy on the one hand and the drive toward algorithmically enabled convenience on the other. As we become increasingly accustomed to the idea that artificial intelligences and the corporations they work for are surveying and capitalizing on our private lives, McCarthy asks how a reminder of the human element that typically lies behind these actors might provoke a renewed sense of awareness and criticality.
A demonstration of the applications of data following algorithmic processing associated with machine learning was put on view in LENNA (2018) by artist duo zzyw (China). Following the key principles of the International Typographic Style—a design movement that emerged in Europe during the 1920s—the work is a custom-written algorithm that automates the design process. Executed by a computer and plot printer, the software is able to generate near-infinite poster designs, engaging with longstanding debates concerning authorship (who or what is the “author” of each individual poster design) and the presumed role of human creativity and ingenuity in artmaking. Following the histories of software art, the work is installed alongside a monitor that visualizes the operations of the algorithm in real time, offering audiences a glance inside the typically obfuscated “black box” of algorithmic automation and production.
The impact that algorithms are having on conceptualizations of the art object is examined further in Friend’s series of NFTs, Off (2021), presented in the gallery space as an auction-linked collection of black rectangular paintings on plexi—the dimensions of which were determined by the aspect ratio of their titular devices. An artist specializing in blockchain and the p2p web, Friend has recently produced a body of NFT editions—like Off—that challenge traditional conceptualizations of art ownership by requiring collectors to give the works away in order for their coded potential to be fully realized. She explores this topic further here through a conversation with Martin Zeilinger, a senior lecturer in Computational Arts and Technology at Abertay University and author of Tactical Entanglements: AI Art, Creative Agency, and the Limits of Intellectual Property (2021).
The agency and accountability of algorithmic systems is put on trial in Knowles’s The Trial of Superdebthunterbot (2016), a forty-five-minute film that documents a legal trial in which jurors must determine whether algorithmic bots have a “duty of care” and should be held criminally responsible in cases of “gross negligence” resulting in death. Knowles details the conceptual framing and production of the film in her essay contribution to this special issue.
Syjuco’s (US) contribution to the exhibition consisted of three intersecting components. Ungovernable (2017) includes three draped and obfuscated protest banners. The design of each banner replicates a photograph of anti-fascist protesters that Syjuco lifted from the internet. The text is presented as pictured—whether legible or not—while the protestors’ bodies have been edited from the scene. CITIZENS (2017) is a series of four staged portraits featuring individuals posed as black-clad and masked protesters. Each of the subjects belongs to a vulnerable community, upping the stakes of their involvement, surveillance, and identification in protest. Lastly, a series of still-life photographs, titled Chromakey Aftermath (2017), capture the material remnants of a protest painted in chromakey green and mounted to a wall of the same color. These works were created in response to the many post-2016 US presidential election protests that Syjuco attended. She was struck by the difference between the lived experience of the protests and their representation online and in the media. In response, this series of works surfaces and navigates the tension between the embodied experience, media distortion, and the political ramifications of protest in the age of algorithmic dissemination and surveillance.
The uneven application and differential risks that algorithmic methods of surveillance and discipline pose to targeted communities serve as the point of departure for artist-collective LA Birdwatchers’ eponymous work, LA BIRDWATCHERS (2021). Leveraging the potentials of tactical sousveillance, the collective gathered data in response to predictive policing practices in Los Angeles between October 2019 and October 2020, using it to produce an immersive and interactive sound and data visualization. The work foregrounds connections between policing and the military industrial complex, while also materializing the resistant potentials of counter-veillance. These concerns, along with the critical orientation, technological construction, and personal histories that inform LA Birdwatchers’ work, are discussed further in their contributed essay.
Much of the work in the exhibition foregrounded practices of visualization as their primary means of rendering the algorithmic knowable and accessible for critique. Alex Borkowski’s contribution to this special issue reorients this approach through a consideration of the critical potentials of voice-driven analyses. “Vocal Aesthetics, AI Imaginaries: Reconfiguring Smart Interfaces” offers a comprehensive overview of the paradigm shift within interface technology. Critiquing applications of Amazon’s Alexa, Apple’s Siri, and Google Assistant—a use of voice to humanize or neutralize the exchange between computer/algorithm and user—Borkowski recounts the history of speaking machines to interrogate the subjectivities and proposed rationalities that AI can provide. Pulling from artists Holly Herndon and collaborative duo Sofian Audry and Erin Gee, the article examines processes of machine learning and vocal performance that account for the glitch, interferences that can destabilize illusions of self-sufficiency and intelligence. Studying the production of voices through algorithmic technology, Borkowski examines the way that art can unsettle their “seamless” integration into our daily lives.
Conclusion
As our lives have become increasingly mediated by algorithms, so too has locating, grasping, and critiquing them become increasingly difficult. Whether as a result of their distributed and increasingly nonhuman authorship, the speed and scale of their technical operation, or the black boxing that restricts access to their inner workings, the inaccessibility of algorithms to critique has become an urgent problem given the all-encompassing impact that they are having in and as culture. Identifying and developing methods that better enable us to engage with the algorithmic is imperative. The articles and essays in this special issue of Afterimage do just that by demonstrating the methodological promise that creative practice, art history, and criticism offer to algorithmic critique. The artist essays provide materially grounded and practice-based accounts of research that render algorithmic operations intelligible while also questioning their broader cultural origins, applications, and consequences. The academic essays chart and unpack the longer histories and contemporary realities with which these works are aligned. As in this introduction, what comes into focus across this issue is how creative practice might be used as a method for undoing the obfuscatory simplicity of the pragmatist’s approach to engage directly with the algorithm in and as culture—the levels at which it takes hold, shaping the seeable and the sayable while ultimately reshaping what it means to be human.
Notes
Kate Crawford, Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (New Haven, CT: Yale University Press, 2021); Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Medford, MA: Polity, 2019).
Adrian MacKenzie and Anna Munster, “Platform Seeing: Image Ensembles and Their Invisualities,” Theory, Culture & Society 36, no. 5 (2019): 3–22, https://doi.org/10.1177/0263276419847.
Rebecca Heilweil, “The world’s scariest facial recognition company, explained,” Vox, May 8, 2020, www.vox.com/recode/2020/2/11/21131991/clearview-ai-facial-recognition-database-law-enforcement; Jason Leopold and Anthony Cormier, “The DEA Has Been Given Permission to Investigate People Protesting George Floyd’s Death,” BuzzFeed News, June 2, 2020, www.buzzfeednews.com/article/jasonleopold/george-floyd-police-brutality-protests-government.
Ted Striphas, “Algorithmic Culture,” European Journal of Cultural Studies 18, nos. 4–5 (2015): 395–412, https://doi.org/10.1177/1367549415577392.
Ed Finn, What Algorithms Want: Imagination in the Age of Computing (Cambridge, MA: MIT Press, 2017).
Mark B.N. Hansen, Feed-Forward: On the Future of Twenty-First Century Media (Chicago: University of Chicago Press, 2014).
Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York: New York University Press, 2018); Taina Bucher, If…Then: Algorithmic Power and Politics (Oxford, UK: Oxford University Press, 2018); Simone Browne, Dark Matters: On the Surveillance of Blackness (Durham, NC: Duke University Press, 2015).
Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: PublicAffairs, 2018).
Siegfried Zielinski, Deep Time of the Media: Toward an Archaeology of Hearing and Seeing by Technical Means (Cambridge, MA: MIT Press, 2008).
Michelle Henning, “Image Flow: Photography on Tap,” photographies 11, nos. 2–3 (2018): 133–48, https://doi.org/10.1080/17540763.2018.1445011.
Sarah Kember, “The Becoming-Photographer in Technoculture,” in Photography Reframed: New Visions in Contemporary Photographic Culture, ed. Ben Burbridge and Annebella Pollen (New York: Routledge, 2018), 225–33; Anthony McCosker and Rowan Wilken, Automating Vision: The Social Impact of the New Camera Consciousness (New York: Routledge, 2020).
Becky Kazansky and Stefania Milan, “‘Bodies not templates’: Contesting dominant algorithmic imaginaries,” New Media & Society 23, no. 2 (2021): 363–81, https://doi.org/10.1177/1461444820929316; Tuukka Lehtiniemi and Minna Ruckenstein, “The social imaginaries of data activism,” Big Data & Society 6, no. 1 (2019): 1–12, https://doi.org/10.1177/2053951718821146.
Ashley Scarlett, “Interpreting an Improper Materialism: On Aesthesis, Synesthesia and the Digital,” Digital Culture & Society 1, no. 1 (2015): 111–29, https://doi.org/10.14361/dcs-2015-0108.
Christiane Paul, A Companion to Digital Art (Oxford: Wiley Blackwell, 2016).
Finn, What Algorithms Want, 18.
Tarleton Gillespie, “The Relevance of Algorithms,” in Media Technologies: Essays on Communication, Materiality, and Society, ed. Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot (Cambridge, MA: MIT Press, 2014), 167.
Finn, What Algorithms Want, 18.
Gillespie, “The Relevance of Algorithms,” 167.
Lisa Gitelman and Virginia Jackson, “Introduction,” in “Raw Data” Is an Oxymoron, ed. Lisa Gitelman (Cambridge, MA: MIT Press, 2013), 1–14.
Gitelman and Jackson, “Introduction,” 3.
Gillespie, “The Relevance of Algorithms,” 171–72.
José van Dijck, “Datafication, Dataism and Dataveillance: Big Data Between Scientific Paradigm and Ideology,” Surveillance & Society 12, no. 2 (2014): 198, https://doi.org/10.24908/ss.v12i2.4776.
Zuboff, The Age of Surveillance Capitalism; Nick Srnicek, Platform Capitalism (Cambridge, UK: Polity Press, 2016).
David Lyon, “Surveillance Culture: Engagement, Exposure, and Ethics in Digital Modernity,” International Journal of Communication 11 (2017): 824–42, https://ijoc.org/index.php/ijoc/article/view/5527.
Rita Raley, “Dataveillance and Countervaillance,” in Gitelman, “Raw Data” Is an Oxymoron, 121–46; van Dijck, “Datafication”; Lyon, “Surveillance Culture.”
Raley, “Dataveillance,” 124.
Browne, Dark Matters, 17.
Gillespie, “The Relevance of Algorithms,” 170.
Yuk Hui, “Algorithmic Catastrophe—the Revenge of Contingency,” Parrhesia 23 (2015): 122–43.
Hui, “Algorithmic Catastrophe,” 138.
Kodwo Eshun, “Recursion, Interrupted,” e-flux Journal #109 (2020), www.e-flux.com/journal/109/331340/recursion-interrupted.
Hui, “Algorithmic Catastrophe,” 139.
Stine Lomborg and Patrick Heiberg Kapsch, “Decoding algorithms,” Media, Culture & Society 42, no. 5 (2020): 745–61.
Lomborg and Kapsch, “Decoding algorithms,” 747.
Lomborg and Kapsch, “Decoding algorithms,” 747.
Nick Seaver, “Algorithms as culture: Some tactics for the ethnography of algorithmic systems,” Big Data & Society (July–December 2017): 4, https://doi.org/10.1177/2053951717738104.
Seaver, “Algorithms as culture,” 5.
Seaver, “Algorithms as culture,” 5.
Christiane Paul and Malcolm Levy, “Genealogies of the New Aesthetic,” in Post-Digital Aesthetics: Art, Computation and Design, ed. David M. Berry and Michael Dieter (New York: Palgrave Macmillan, 2015).
Karen Archey and Robin Peckham, Art Post Internet, exh. cat. (Beijing: Ullens Center for Contemporary Art, 2014).
Seaver, “Algorithms as culture.”
Matteo Pasquinelli, “Google’s PageRank Algorithm: A Diagram of the Cognitive Capitalism and the Rentier of the Common Intellect,” in Deep Search: The Politics of Search beyond Google, ed. Konrad Becker and Felix Stalder (London: Transaction Publishers, 2009).
Finn, What Algorithms Want.
Kari Paul, “Elon Musk reportedly forced Twitter algorithm to boost his tweets after Super Bowl flop,” Guardian, February 15, 2023, www.theguardian.com/technology/2023/feb/15/elon-musk-changes-twitter-algorithm-super-bowl-slump-report.
Ryan Mac and Cecilia Kang, “Whistle-Blower Says Facebook ‘Chooses Profits Over Safety,’” New York Times, October 3, 2021, www.nytimes.com/2021/10/03/technology/whistle-blower-facebook-frances-haugen.html; Tal Orion Harel et al., “The Normalization of Hatred: Identity, Affective Polarization, and Dehumanization on Facebook in the Context of Intractable Political Conflict,” Social Media + Society 6, no. 2 (April–June 2020): 1–10, https://doi.org/10.1177/2056305120913983.
Striphas, “Algorithmic Culture,” 406.
John Cheney-Lippold, “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control,” Theory, Culture & Society 28, no. 6 (2011): 165.
Antoinette Rouvroy and Thomas Berns, “Algorithmic governmentality and prospects of emancipation: Disparateness as a precondition for individuation through relationships?,” Réseaux 177, no. 1 (2013): 163–96.
Pascal D. König, “Dissecting the Algorithmic Leviathan: On the Socio-Political Anatomy of Algorithmic Governance,” Philosophy & Technology 33 (2020): 468.
König, “Dissecting the Algorithmic Leviathan,” 468.
Wendy Chun, “Queerying Homophily,” in Pattern Discrimination, ed. Clemens Apprich et al. (Lüneburg, Germany: Meson Press, 2018), 60.
Seaver, “Algorithms as culture,” 4.
Hui, “Algorithmic Catastrophe.”
Yuk Hui, Art and Cosmotechnics (Minneapolis: University of Minnesota Press, 2021), 247.
Striphas, “Algorithmic Culture,” 406.
Gillespie, “The Relevance of Algorithms,” 184.
Gillespie, “The Relevance of Algorithms,” 184.
John Danaher et al., “Algorithmic Governance: Developing a research agenda through the power of collective intelligence,” Big Data & Society (July–December 2017): 1–21.
Taina Bucher, “The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms,” Information, Communication & Society 20, no. 1 (2017): 30–44, https://doi.org/10.1080/1369118X.2016.1154086.
Nicolas Malevé, “On the data set’s ruins,” AI & Society 36, no. 11 (2021): 1117–31.
Grant D. Taylor, When the Machine Made Art: The Troubled History of Computer Art (London: Bloomsbury, 2014).
Taylor, Machine Made Art, 45.
Zabet Patterson, Peripheral Vision: Bell Labs, the S-C 4020, and the Origins of Computer Art (Cambridge, MA: MIT Press, 2015); Taylor, Machine Made Art.
A. Michael Noll, “The Beginnings of Computer Art in the United States: A Memoir,” Leonardo 27, no. 1 (1994): 39–44.
Frieder Nake, “Algorithmic Art,” Leonardo 47, no. 2 (2014), https://doi.org/10.1162/LEON_a_00706.
Frank Dietrich, “Visual Intelligence: The First Decade of Computer Art (1965–1975),” Leonardo 19, no. 2 (1986): 162.
Jack Burnham, SOFTWARE–Information Technology: Its New Meaning for Art, exh. cat. (New York: Museum of Modern Art, 1970), 10.
Burnham, SOFTWARE, 12.
Burnham, SOFTWARE, 11.
Burnham, SOFTWARE, 14.
Edward Shanken, “Art in the Information Age: Technology and Conceptual Art,” Leonardo 35, no. 4 (2002): 433–38.
Burnham, SOFTWARE, 11.
Burnham, SOFTWARE, 14.
Natilee Harren, Fluxus Forms: Scores, Multiples, and the Eternal Network (Chicago: University of Chicago Press, 2020), 3.
Hannah Higgins in Harren, Fluxus Forms, 3.
Noah Wardrip-Fruin, “E.A.T.,” in The New Media Reader, ed. Noah Wardrip-Fruin and Nick Montfort (Cambridge, MA: MIT Press, 1998), 211–13.
Ira Murfin, “Talk Performance: Extemporaneous Speech, Artistic Discipline, and Media in the Post-1960s American Avant-garde” (PhD diss., Northwestern University, 2017).
Murfin, “Talk Performance.”
Lucy Lippard and John Chandler, “The Dematerialization of Art,” in Conceptual Art: A Critical Anthology, ed. Alexander Alberro and Blake Stimson (Cambridge, MA: MIT Press, 1999), 46.
Geoff Cox, Antithesis: The Dialectics of Software Art (Aarhus, Denmark: Digital Aesthetics Research Center, Aarhus University, 2010).
This definition was developed by the Transmediale awards committee, which consisted of Jean-Pierre Balpe, Florian Cramer, Ulrike Gabriel, and Gerfried Stocker, February 8, 2001, http://amsterdam.nettime.org/Lists-Archives/rohrpost-0101/msg00039.html.
Andreas Broeckmann, “On Software as Art,” in Sarai Reader 03: Shaping Technologies, ed. Jeebesh Bagchi et al. (Delhi: The Sarai Programme, CSDS), 215–18.
Florian Cramer and Ulrike Gabriel, “Software Art” (2001), www.netzliteratur.net/cramer/software_art_-_transmediale.html, 1.
Cramer and Gabriel, “Software Art,” 1.
Inke Arns, “Read_me, run_me, execute_me. Code as Executable Text: Software Art and its Focus on Program Code as Performative Text,” Medienkunstnetz.de, 2005, 3.
Arns, “Read_me, run_me, execute_me,” 3.
Members of I/O/D included Matthew Fuller, Colin Green, and Simon Pope.
The members of Ubermorgen (lizvlx and Luzius Bernhard) collaborated with the duo Alessandro Ludovico vs. Paolo Cirio on this project.
Members of LAN included Roman Abt, Marc Lee, Annina Rüst, Fabian Thommen, and Silvan Zurbruegg.
Michael Connor, “Between the Net and the Street: Rachel Baker discusses the network experiments of Heath Bunting,” Rhizome.org (April 10, 2017), https://rhizome.org/editorial/2017/apr/10/between-the-net-and-the-street.
Solomon R. Guggenheim Museum, “BRANDON, 1998–1999” (n.d.), http://brandon.guggenheim.org/shuleaWORKS/brandon.html.