This article engages with the work of Adrian Piper in order to explore relationships among visuality, data, race, and gender. It specifically pairs Piper's multimedia installation Cornered (1988) with her conception of pseudorationality, which she develops in her philosophical work on Immanuel Kant, and brings this pairing to bear on current discussions in software and critical algorithm studies. Cornered's exposure of the pseudorationality of racist perception and its visual grammar contributes to our understanding of the racist and gendered conditions of data's appearance, particularly in terms of data's reliance on the visual. Ultimately, I argue, Piper's work provokes us to call into question the stability of “data” itself.

“What white people see when they look at you is not visible. What they do see when they do look at you is what they have invested you with.”

—James Baldwin1 

“For what is software if not the very effort of making something explicit, of making something intangible visible, while at the same time rendering the visible (such as the machine) invisible?”

—Wendy Hui Kyong Chun2 

DATA, VISUALITY, RACE

“I am black,” states the female newscaster persona in the video component of Adrian Piper's Cornered (1988). The work is a multimedia installation: the video monitor is placed in the corner of the exhibition space, flanked on one side by a version of Piper's father's birth certificate identifying him as white and on the other by a second version identifying him as “octoroon.”3 The monitor itself sits at table height, and in front of it is a table turned on its side lengthwise, its four legs projecting outward toward the spectator. Ten chairs for potential viewers to sit on are arranged in a triangular “gunboat formation.”4 

Played by the artist herself, the newscaster persona is seated at a table or desk, also in a corner, against a white background. She appears from the waist up, her head tilted slightly to her right, looking into the camera. Her hair falls behind her shoulders, and she is wearing a blue top and a white pearl necklace. Her arms and elbows rest in front of her on the table's surface, and her right hand loosely cups her left. The image is very much composed according to the mise-en-scène of a nightly news program. After her initial statement that she is black, Piper's persona continues: “Now let's deal with this social fact, and the fact of my stating it, together. Maybe you don't see why we have to deal with them together. Maybe you think it's just my problem, and that I should deal with it by myself. But it's not just my problem. It's our problem.”5 

One of Piper's better-known artworks, Cornered exposes the role various media forms and genres (birth certificates, newscasts, video) play in visual and discursive determinations of race and gender.6 What might Cornered—and Piper's conception of pseudorationality that she develops throughout her work as an artist and a philosopher—teach us with respect to digital media's share in such determinations?

Recent discussions of the contemporary rise of data have drawn attention to the fact that racial and gendered biases are at work in data's collection and classification. As Kate Crawford documented in an opinion piece published in the New York Times in the summer of 2016, these biases have appeared in a wide range of instances: from the tagging of black people as gorillas by Google's photo app, to Nikon's camera software reading Asian people as blinking, to the inability of Hewlett-Packard's facial recognition software to track nonwhite faces.7 As upsetting as they may be, however, it would be a mistake to say that the root of these biases can be traced exclusively to the race and gender of the coders—the “white guys” in the title of Crawford's op-ed—who wrote the controversial algorithms. In fact, it would be a mistake to say that these are mistakes. As Tarleton Gillespie explains, such algorithms are first “trained” on preestablished data sets that “have been certified, either by the designers or by past user practices.”8 In the examples named above, as with all current forms of “computer vision” machine learning, the algorithms were trained to recognize patterns of people and objects not as they appear “in the wild,” as it is said, but first as they appear in already-existing photographs, patterns that the algorithms then apply to actual people and objects.9 Google's photo app, then, is only able to recognize human faces according to the rules by which they have already been recognized within the history of photographic representation. In this way, algorithms do not so much represent the objects they identify as generate them from previously constructed representations. As Antoinette Rouvroy puts it, “Algorithmic reality is formed inside the digital reality without any direct contact with the world it is aimed at representing.”10 Thus, it is not so much that algorithms work incorrectly as that they work correctly based on data that is in some way limited or even incorrect.
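To make this training-then-application pattern concrete, the following minimal sketch illustrates it in code. It is a hypothetical toy example, not the pipeline of any system named above: the images, labels, and classifier are invented stand-ins, meant only to show that a model fit on an already-labeled archive can assign new images nothing but the categories that archive already contains.

# Hypothetical sketch of the "train on an existing archive, then apply"
# pattern described above; all data and labels here are invented stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for a preestablished, already-certified archive: 200 tiny 8x8
# "photographs," each carrying a label drawn from a fixed set of categories.
archive_images = rng.random((200, 8, 8))
archive_labels = rng.integers(0, 3, size=200)  # e.g., 0 = "person", 1 = "cat", 2 = "tree"

# Training: the classifier internalizes whatever regularities (and omissions)
# the archive happens to encode.
model = LogisticRegression(max_iter=1000)
model.fit(archive_images.reshape(200, -1), archive_labels)

# Application "in the wild": a new image can only ever be assigned one of the
# labels that already existed in the training archive.
new_image = rng.random((1, 8, 8))
print(model.predict(new_image.reshape(1, -1)))  # prints a label from {0, 1, 2}

Whatever the new image actually shows, the only possible outputs are the archive's preexisting categories; the model generates its “recognition” from prior representations rather than from the object itself.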

The distinction between generation and representation is key to Wendy Hui Kyong Chun's critical engagement with the gendered and racialized history of computation and software.11 The nature of software, Chun demonstrates, is to generate “simulated visibility” (such as the OS desktop users regard as an actual desktop, the folder users treat as an actual folder, the letters in the word processing program's document window that give the impression that one is writing) while at the same time concealing both its generating activity and the machine performing the generating task.12 Yet this is just one layer of the onion-like play of visibility and invisibility that Chun argues characterizes computation. Even before algorithms compute data that has been formed through previous computational operations, they must be embedded in programs that have been coded using existing programming languages, which are of course themselves already programs—that is, simulations—and so function as yet another mediating, and thus concealing, layer between the user and the actual machine.13 And even before the creation of software, the original programmers and “computers” were women whose job was to carry out (to compute) commands typically given to them by men, a history that the automated computation performed by software helps to erase.14 Harbored within computation, then, is an entire discourse of mastery that is inseparable from historical events of human beings mastering one another.

All this leaves us in a perplexing spot regarding the apparent racial and gendered biases of data generation. If the biased results data algorithms generate are not due to the algorithms themselves but rather to the data archive upon which they are trained, is it simply a matter of selecting more comprehensive training data? Would Google's photo app have been more successful at identifying black people as people rather than as gorillas had its algorithm been given a training archive that included more black people?

Such a solution, though, ignores Chun's point that computation is a dialectic between visibility and invisibility. On the one hand, computation promises to make visible what, without computation, is assumed to be invisible, such as connections among apparently unconnected data. On the other hand, in carrying out its instructions, computation hides the fact that its activity engages—and receives its legitimation from—other data, which makes data inherently circular, if not narcissistic.15 This is why data abstraction, according to Chun, is essentially the hiding of information rather than the bringing of information to light.16 Behind data is simply more data—a fact that data hides.

So rather than searching for more comprehensive training data (an excuse that doesn't really pass the smell test: Did Google's photo app categorize black people as gorillas because its algorithm was trained on a data archive of photographs with gorillas instead of black people, or was it more the case that the archive was composed of photos likening black people to gorillas?), perhaps we should focus our attention on the data that data uses for its legitimation. In the case of computer vision, this means focusing on the media that algorithms treat as providing data upon which to train, specifically photography and the assumed objectivity its data offers. Once we turn our attention in this direction, we can see that the problem is not one of determining the correct data to use, but of examining how data gets recognized as data in the first place. If computer vision algorithms select their data from what has already made it into photographic representation, then how does photography select the objects it photographs? What is deemed worthy of being photographed, and how?

Such questions come into focus when read against Richard Dyer's book White (1997), which demonstrates that the history of photographic technology is the history of the privilege accorded to the lighting of white skin.17 Given this, it would be less accurate to say that today's digital recognition algorithms are racially biased than to say that the photographic data upon which they are trained is so. Consequently, contemporary data algorithms do not capture any real characteristics of the objects they supposedly perceive. Instead, they find the racist—and, to add Chun's findings, misogynist—conditions under which data has first been allowed to appear as “data.”18 

The notion that racial difference can really be registered by data becomes even more tenuous when we consider the way data is often referred to as “beautiful” or “elegant,” qualifiers from which the racially othered body has traditionally been barred.19 If data is supposed to be beautiful, then racial difference, given its history in Western aesthetics, is precisely what data should not be able to see. Or, whenever data “sees,” it is also not seeing. Or rather, racial difference is disqualified from being seen for what it is; it is instead seen as other than what it is. (The black body, for example, is seen as racially other, while the white body is seen to be neutral, “without race,” that is, not seen as raced.)20 This self-contradictory mode of seeing points to what I would suggest is the inherent visuality of data; it never delivers on its promise to make visible that which has until then remained imperceptible, because its main goal is to make its appearance as data, to serve as a spectacle.21 But this self-contradictory mode of seeing also suggests data's participation in that form of racist seeing described by James Baldwin in the epigraph above, which is actually a combination of the racially othered body's invisibility and hypervisibility. Racial difference, then, might not just be one problem among others for data; racial difference might function as that outside against which data coheres its integrity as data. Racial difference might therefore reveal a problem in the coherence of the idea of “data” itself.

Though it is often unclear what “data” is supposed to refer to, despite its ubiquitous usage in contemporary life, I am less concerned with defining what data is than with bringing out the assumptions that animate its usage. Whether data is the “raw stuff” of information, as Jeffrey Pomerantz defines it, or a conceptual frame that allows for information to appear as information, as Annette N. Markham argues, the idea of data not only implies a kind of seeing, but also offers assurances of objectivity and rationality.22 As I have been suggesting, however, the problem of race places these assurances into question and attenuates the meanings of “objectivity” and “rationality.” Subjective interpretation already inheres in the supposed objectivity of the data algorithms named at the opening of this essay. As a result, their rationality is only apparent and is instead arbitrary.

It is precisely at this moment, where rationality becomes merely an appearance of objectivity, that Piper's work can intervene in our understanding of data. Piper calls such apparent objectivity “pseudorationality”—the use of reason to force perceptual data that threatens the experiencing subject's existing understanding of the world into familiar, and therefore safe, categories. For Piper, racism is an exemplary instance of pseudorationality. By drawing on Piper's exposure of the pseudorationality of racist perception, we can uncover the racial and gendered conditions of data's appearance and ultimately call into question the stability of the idea of “data” itself. Once we have analyzed Piper's conception of pseudorationality, we can return to Cornered and see how it attempts to provoke an awareness of what Piper calls “pseudorational defenses,” which function, she argues, to assure the perceiving subject that its perception is objective. Is it possible to understand data as operating in a similar fashion, that is, as a pseudorational defense? If data only works with previously established data, then it only works with itself. Which is to say, it essentially works to affirm itself and to defend the idea of “data.”

PSEUDORATIONALITY

In “The Critique of Pure Racism,” a 1990 interview with the art critic Maurice Berger, Piper describes pseudorationality in the following terms:

We disassociate phenomena that are too threatening to be incorporated. We try to shape this unmanageable conceptual input into a neat and coherent view of the world. We do this in science, we do it in politics, and we do it with individuals who are alien to us—people who look different, who talk differently, who don't fit our conceptions of how people ought to be and look. When we see people like this, we try to impose our categories. Then the categories don't work. We get nervous. We try to reimpose the categories, tidying up our conception of the person so that he or she will fit those categories. This is the paradox of human rationality; if we didn't have it, we couldn't function at all. Yet these rational categories are invariably inadequate and insensitive to the uniqueness of an individual.23 

In referring to “phenomena” and “categories,” Piper is employing Immanuel Kant's terminology for the structure of human cognition. At a number of points in the Critique of Pure Reason (1781, 1787), Kant goes so far as to describe this structure as “architectonic.”24 While Kant's conception of human rationality is incredibly intricate, and at times ambiguous, it may be described in this basic way: we take what is given to us via our senses (“phenomena”) and unify these phenomena using basic concepts in our understanding (which Kant calls “categories”), yielding representations of the objects of experience. These representations are unified further into empirical concepts that then form the basis of human knowledge in general.

In her interview with Berger, Piper does not go into the technical aspects of Kant's language. She reserves that analysis for her two-volume Rationality and the Structure of the Self (2013).25 There, she identifies the term she translates as “pseudorational” as it appears in Kant's Critique of Pure Reason and in his Groundwork for the Metaphysics of Morals (1785): vernünftelnde.26 Vernünftelnde judgments, for example, are conclusions one draws from thinking that looks rational, but isn't. At the same time, this kind of reasoning is not exactly false rationality, either. As Piper implies in her conversation with Berger quoted above, when one engages in vernünftelnde thinking, one is still employing the rational mechanisms that make up human cognition. Pseudorationality, though not completely irrational, belongs on a continuum from rationality to irrationality. That is to say, pseudorationality belongs to the apparatus of human reason. According to Piper, the insidious characteristic of pseudorationality is that it offers an appearance of rationality and the illusion of both a coherent world and a unified self.27 

In Piper's account of Kant's epistemology, the main purpose of human rationality is to cohere experience in order to establish the consistency of reason itself. This self-reflective—and, I might add, narcissistic—function of rationality is necessary, Piper asserts, to assure oneself that one is a self who not only is rational but acts rationally in the world, that is, is a moral agent.28 Should the subject not be able to constitute this “self”-assurance by making its experiences intelligible to itself, and thereby also assuring certainty that it is the author of its experiences, the subject's integrity becomes threatened. If the subject becomes overwhelmed by phenomena that it cannot incorporate toward the end of rational self-consistency, psychosis ensues. At stake in human rationality, then, according to Piper, is “literal self-preservation.”29 

Because the chief consequence of not being able to establish one's rational consistency is the self's disintegration, the subject will do whatever it needs to do in order to defend itself against any and all phenomena that thwart the subject's attempts at self-mirroring. It will sort through phenomena in such a way as to confirm its desire that the world conform to its cognition of it. In tandem with this sorting, Piper says, “such an agent is disposed to avoid actions, behaviors, or experiences that undermine the unity of the self, and to react aversively when they are forced upon her or compelled by causal determinants (whether inner or outer) outside her control.”30 Xenophobia is one such aversive reaction to phenomena the subject is unable to cognize and that therefore threaten its coherence as a subject. In response to phenomena that overwhelm the subject's capacity to recognize, unify, and incorporate sensory input into intelligible experience, the subject responds defensively by forcing this input into categories that distort the phenomena being encountered. As a form of xenophobia, racism is a distortion, but one to which the subject resorts as a defense against the prospect of its dissolution. Instead of taking the foreign as an opening to reconsider its subjective identity, the subject chooses to incorporate only those phenomena that reinforce its identity, and to cast as offensive those phenomena that don't. Racism is thus a rational response to the world—a degraded, desperate, pseudorational response, but rational nonetheless.

While Piper's account of pseudorationality allows us to think of racism as a form of malfunctioning rationality, it raises interesting questions about how exactly one is to go about correcting this malfunction. Is it a matter of simply educating someone that they may be engaging in pseudorational thinking? Is it a matter of learning that one is treating “information” (phenomena) incorrectly? If so, then is this habit corrected by merely furnishing more information about how to handle information “properly”? Is it all a matter, in other words, of becoming “more” rational? And if it is simply about the degree of rationality, then how does one square this understanding of rationality with racism's clearly aggressive and often sadistic—that is to say, affective—character? Cornered is one work where Piper explores these questions, but it is also a piece that shows how the problem of pseudorationality is involved in the structure of the visual.

CORNERED: FROM APPARENT REASONING TO REASONING FROM APPEARANCE

The duration of Cornered's video is sixteen minutes, twenty seconds. After stating that she is black and that both the “social fact” of her blackness and “the fact of [her] stating it” are a problem shared by her and the viewer, Piper's newscaster persona goes over various reactions the viewer may or may not be having in response to her statement. This review is punctuated by a number of probing questions. For example, Piper's video persona mulls the possibility that the spectator might regard her statement that her racial identity is black as “making an unnecessary fuss,” and if that is the case, then, she suggests, addressing the viewer directly, “you must feel that the right and proper course of action for me to take is to pass for white.” After suggesting further that her choice to state that her racial identity is black might be perceived as a rejection of whiteness and a form of hostility to those who identify as white, Piper's video persona then asks, “Do you feel affronted? Or embarrassed? Or accused?” If this is indeed the case, then not only is this indicative of a problem the viewer has, says Piper's character, but this problem is also traceable to the viewer's “presumption that [she is] white. So perhaps, the solution is for you not to make that presumption. About anyone.”

While Piper's video persona clearly makes an argument about prejudicial looking, this argument is not just being made through Cornered's video address. The video's mise-en-scène and the rest of the work's visual elements cannot be separated from the argument being proffered to the viewer. As should be clear from the content of the video address, the possible positions attributed to the viewer, as well as the questions posed to the viewer, are predicated first of all on the fact that Piper's character appears to be white. (We might say that someone who is “visibly black,” as Piper puts it in the script, would not be imagined as having to declare, “I'm black.”) This is why she says the viewer might assume it would be better for her to pass as white and not declare that she is black.31 Piper herself has written of past experiences in which she was assumed to be white and was subjected to others’ racism because those present thought there were no black people around to hear it, but Cornered adds other elements that can be said to enhance the presentation of Piper's character as white.32 These elements include her manner of dress, which can be understood as signifying not only whiteness but also a certain class status. Together with a newscaster's tone, these elements reassure the viewer that she is not “angry, overbearing, pushy, [or] manipulative”—stereotypical signifiers of a black woman.33 The image Piper produces with her character is one of familiarity, of comfort, yet one that is no less dependent upon a very specific—and non-neutral—configuration of race, class, and gender.34 

Additional aspects of Cornered that confound the viewer's perceptual experience include the pairing of Piper's father's two birth certificates, which present conflicting information. Was the person who “incorrectly” identified her father as white at birth the same one who had to “correct” that information on the second certificate? Even before that, who is to say what is correct or incorrect? Together with the video's mise-en-scène, the birth certificates work at “unhooking racial identity from the realm of the visible,” as Peggy Phelan remarks, and also underscore the investment that the discursive determination of race has in the visible.35 

Despite its exposure of the knot tying together the visual with racial judgment, Cornered nonetheless stresses the difficulty of undoing this knot. The imbrication of racial judgment with the visual places the racially othered subject into a corner: Piper's video persona proceeds to connect the installation's literal cornered arrangement to her own feelings, on the one hand, of being cornered into the lose-lose scenario of either declaring her racial identity and suffering the aggressions of those who think she is making a fuss about race, or remaining silent and suffering racist remarks by those who think they are not in the company of black people.36 On the other hand, by refusing to bear the burden of the problem of race alone, Piper's character corners the spectator into confronting the problem of race as one that cannot be foisted simply onto racially othered bodies. Instead of race being the burden of the one being perceived, Cornered insists that it be a burden carried by the perceiver, who trusts their perceptual abilities and relies on visual cues—most immediately, skin color—to make judgments about a person's race. Race, the work implies, is a problem inhering in the visual itself and particularly in the visual scaffolding upon which information about race tacitly relies.

Once the work confronts the viewer with the limits of their perception of the race of Piper's video persona, it goes on to put into doubt the viewer's perception of their own race. Drawing on the history of miscegenation in the United States, Piper's character informs the viewer that “in fact, some researchers estimate that almost all purportedly white Americans have between 5% and 20% black ancestry.” She then continues:

Now, this country's entrenched conventions classify a person as black if they have any black ancestry. So most purportedly white Americans are, in fact, black. Think what this means for your own racial classification. If you've been identifying yourself as white, then the chances are really quite good that you're in fact black. What are you going to do about it?

Piper's newscaster persona then goes through some options the viewer has in light of the probability that they are actually black: Will they research their family ancestry and determine the “truth” of their racial identity? If they find out a “mistake has been made” and that they are actually black, will they then go out and communicate this fact to friends and colleagues, or to their boss? Or will they keep quiet and continue to enjoy passing as white? Or, if a mistake has not been made, if they are actually part of the “white minority,” will they go around proudly proclaiming this?

Cornered's overturned, barricade-like table thus quickly transforms into a turning of the tables on the viewer. The defensiveness of the table-as-barricade shifts from Piper to what the work anticipates will be the pseudorational defenses the viewer will take up after having been confronted with the destabilizing information of the likely truth of their racial identity. Piper calls this moment of confrontation and self-questioning the “indexical present”—the “here and now” that exists between the work and the viewer, between the addressor and the addressee.37 “Artwork that draws one into a relationship with the other in the indexical present,” she writes, “trades easy classification—and hence xenophobia—for a direct and immediate experience of the complexity of the other, and of one's own responses to her. Experiencing the other in the indexical present teaches one how to see.”38 For Piper, xenophobia is constituted by “failures of vision.”39 Unknown to those who suffer such “failures of vision” is the fact that they are working with insufficient data, a term Piper actually employs in describing pseudorationality.40 By bringing the viewer into the indexical present, works such as Cornered attempt to correct this failure, to show the viewer that the “data set” with which they had been working (on which they have been training) is both severely limited and limiting. Cornered is a pedagogy of racism's visual grammar, of racism's aesthetic ground.

But Cornered is not just a pedagogy of racism's visual grammar; it also frustrates this grammar, disrupting the possibility of making sense with it. As Phelan observes, frames proliferate throughout Cornered.41 From the framing of Piper's father's two birth certificates, to the framing of the video, to the framing of the viewer's possible reactions in the text of the video's script, Cornered does not lack for information. It offers an abundance of information, all of it pointing to other information yet still offering no epistemological closure, no direct access to the object (in this case, racial difference). The informational circuits Cornered presents are not unlike the digital circuits in which data traffics. As such, one might say Cornered disrupts the appearance pseudorationality gives of itself as the successful working of rationality. Cornered exposes pseudorationality as an apparent rationality and as a reasoning tied to appearance.

By exposing racist pseudorationality's dependence on the visual, however, Cornered also raises a question concerning rationality's overall dependence on the visual. If pseudorationality is based in a failure of vision, then this suggests that rationality is correct vision. To what extent, then, is rationality as such dependent on the visual? To what extent is it possible to know when one has attained a fully rational form of vision? Is it possible that visuality itself is pseudorational at its heart? This would mean that insofar as data—and rationality as such—is grounded in the visual, it is also pseudorational at its heart. At the very least, the stability of these terms—rationality and data—should now be in question.42 

THE PSEUDORATIONALITY OF DATA AND THE OBJECTIFICATION OF RACIST SEEING

Part of Cornered's critical force has to do with the way it implicates media in pseudorationality. If the declaration by Piper's character that she is black somehow takes the viewer by surprise, it is due not immediately to the appearance of Piper's character as white, but to how video (the medium) provides the conditions for her appearance as white. In Cornered, video helps to uphold what Piper calls the ideology and “myth of racial separation.”43 “We” are used to distinguishing race through such media (but also through print media, as Piper's father's two birth certificates demonstrate), but this is because “we” take for granted the power of media to present racial difference, trusting what the machine sees and conflating that with “our” own perception. In this way, the failed vision that is pseudorationality is inseparable from the way media gives us to see.

Thus, can it even be said that “we” are doing the seeing, or that we ever did? As Chun informs us, there are significant differences between older forms of media (photography, film, video) and computational media, the chief one being that the former are representational and the latter generative. The former presuppose the existence of an object being represented, while the latter create simulations that we interact with as actual objects. We know from Dyer's work on the history of photography, however, that such representational media are not objective. Of course, there is the matter of framing, of which one might argue the selection of training data for the programming of data algorithms is an extension. But as Dyer shows, there is also the history of film stock technology that took as its primary task the calibration of film to “properly” expose white skin, revealing that the history of photography and cinema is the history of taking the white body as the proper subject of representation.44 Whether in an older or newer form of media, what we are given to see has already been selected as being eligible to be seen by the machine, which means “our” seeing is not ours, at least not immediately. This furthermore means that even our racist and sexist forms of seeing are no longer ours; with computation, as Chun points out, we have instead outsourced them to machines to carry out in an automated way. Insofar as they are biases embodied by machinic operations, we have enabled racism and sexism to become truly objective.45 It all ends up being a convenient alibi: it's not us who are racist or sexist, it's our algorithms. Yet, how do we see the world? In racist and sexist terms, which we confirm with data.

It seems to me, then, that Cornered can contribute to our understanding of how data perpetuates the racist and sexist forms of seeing carried out by older media forms in two ways: first, by showing how media, as a means of making sense of the world, upholds the myth of racial separation and the unquestioned belief that this separation—this difference—can be captured visually, and second, by showing that the myth of racial separation—the pseudorationality of racist seeing—belongs to how we make sense of the world. As Cornered demonstrates, providing more information does not dispense with this myth. Since it is “information” as such that is suspect, providing more of it would simply beg the question. Instead, the task is to disrupt information's visual ground and data's perpetuation of this ground.

NOTES

1.
Studs Terkel, “An Interview with James Baldwin” (1961), in James Baldwin: The Last Interviews (New York: Melville House, 2014), 8.
2.
Wendy Hui Kyong Chun, “On Software, or the Persistence of Visual Knowledge,” Grey Room 18 (Winter 2004): 44.
3.
“Octoroon” (meaning one-eighth black) is the term Piper specifies in her description of Cornered. Adrian Piper, Out of Order, Out of Sight, vol. 1, Selected Writings in Meta-Art 1968–1992 (Cambridge, MA: MIT Press, 1996), xiii.
4.
“Gunboat formation” is the phrase used in the catalogue for the 1999 retrospective of Piper's work to describe Cornered's seating arrangement. Maurice Berger, ed., Adrian Piper: A Retrospective (New York: DAP, 1999), 176.
5.
I quote from Piper's script for the work as it appears in Adrian Piper, “Cornered: A Video Installation Project,” in Theory in Contemporary Art since 1985, ed. Simon Leung and Zoya Kocur (Malden, MA: Wiley-Blackwell, 2005), 182–86. All subsequent quotations of the video script are taken from this publication.
6.
Cornered has been written about extensively in the art historical scholarship and in exhibition reviews. As a result, installation views and still shots from the piece's video component abound in print and on the Internet. For a representative installation view of the work, see for instance the digital archive for the 2014 exhibition Take It or Leave It: Institution, Image, Ideology at the Hammer Museum in Los Angeles: https://hammer.ucla.edu/take-it-or-leave-it/art/cornered/. In print, see Berger, Adrian Piper: A Retrospective, 112.
7.
Kate Crawford, “Artificial Intelligence's White Guy Problem,” New York Times, June 25, 2016, accessed April 28, 2017, http://www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html.
8.
Tarleton Gillespie, “Algorithm,” in Digital Keywords: A Vocabulary of Information Society and Culture, ed. Benjamin Peters (Princeton, NJ: Princeton University Press, 2016), 20.
9.
Ibid., 20–21. See also Richard Szeliski, Computer Vision: Algorithms and Applications (London: Springer-Verlag, 2011); and Jentery Sayers, “Notes toward Speculative Computer Vision,” presentation for the 2016 Meeting of the American Studies Association, accessed November 30, 2016, https://github.com/jentery/asa/blob/master/talk.md. Sayers cites Szeliski in his historical overview of computer vision and how it appears to be supplanting human vision while still giving the impression that a human viewer, not a computer, is undertaking the act of seeing. Sayers suggests that in functioning as the ideal viewer of computer vision, this idea of the human helps sustain the appearance that the computer sees the same way human beings do. I thank Lauren Klein for drawing my attention to Sayers's paper.
10.
Antoinette Rouvroy, “The End(s) of Critique: Data-Behaviourism vs. Due-Process,” in Privacy, Due Process and the Computational Turn: The Philosophy of Law Meets the Philosophy of Technology, ed. M. Hildebrandt and K. de Vries (New York: Routledge, 2013), quoted in Ronald E. Day, Indexing It All: The [Subject] in the Age of Documentation, Information, and Data (Cambridge, MA: MIT Press, 2014), 133.
11.
Chun, “On Software, or the Persistence of Visual Knowledge,” 26–51; Wendy Hui Kyong Chun, “Race and/as Technology or How to Do Things to Race,” in Race after the Internet, ed. Lisa Nakamura and Peter A. Chow-White (New York: Routledge, 2012), 38–60.
12.
Chun, “On Software, or the Persistence of Visual Knowledge,” 40.
13.
Ibid., 28, 38. See also Gillespie, “Algorithm,” 21.
14.
Chun, “On Software, or the Persistence of Visual Knowledge,” 33ff.
15.
I have chosen to refer to data in the singular, thus keeping with its increasingly conventional designation as a mass noun, rather than as the plural of “datum,” which would be consistent with its employment in technical, scientific contexts.
16.
Chun, “On Software, or the Persistence of Visual Knowledge,” 37.
17.
Richard Dyer, White (New York: Routledge, 1997).
18.
A similar phenomenon has been observed in language recognition programs. See Hannah Devlin, “AI Programs Exhibit Racial and Gender Biases, Research Reveals,” Guardian, April 13, 2017, accessed April 28, 2017, https://www.theguardian.com/technology/2017/apr/13/ai-programs-exhibit-racist-and-sexist-biases-research-reveals.
19.
On the current habit of referring to data as “elegant” and “beautiful,” see Orit Halpern, Beautiful Data: A History of Vision and Reason since 1945 (Durham, NC: Duke University Press, 2014), 5. On the exclusion of racially othered bodies, especially the black female body, from the regime of beauty in Western aesthetics, see Lorraine O'Grady, “Olympia's Maid: Reclaiming Black Female Subjectivity,” in The Feminism and Visual Culture Reader, ed. Amelia Jones (New York: Routledge, 2003), 174–87; as well as Deborah Willis and Carla Williams, The Black Female Body: A Photographic History (Philadelphia: Temple University Press, 2002).
20.
I thank an anonymous reviewer of this article for this formulation and correction.
21.
See Halpern, Beautiful Data, 21–27. See also Melissa Gregg, “Inside the Data Spectacle,” Television and New Media 16, no. 1 (2015): 37–51. Gregg discusses the spectacularization of data and the ways this spectacularization becomes the real product of data, not what data is supposedly bringing to visibility.
22.
Jeffrey Pomerantz, Metadata (Cambridge, MA: MIT Press, 2015), 19ff; Annette N. Markham, “Undermining ‘Data’: A Critical Examination of a Core Term in Scientific Inquiry,” First Monday 18, no. 10 (October 7, 2013): http://firstmonday.org/article/view/4868/3749.
23.
Maurice Berger, “The Critique of Pure Racism: An Interview with Adrian Piper,” in Adrian Piper: A Retrospective, 88.
24.
Immanuel Kant, Critique of Pure Reason, trans. Norman Kemp Smith (London: Macmillan, 1929), 653ff.
25.
Adrian M. S. Piper, Rationality and the Structure of the Self, vol. 1, The Humean Conception, 2nd ed., and vol. 2, A Kantian Conception, 2nd ed. (Berlin: APRA Foundation, 2013).
26.
Piper, Rationality and the Structure of the Self, vol. 2, 289n1.
27.
Ibid., 290.
28.
Piper writes: “Were a rational agent unable to implicitly recognize himself as rational, he would be unable to make himself—or, therefore, the rest of his experience, including his actions—rationally intelligible to himself at all. Failure of implicit self-recognition would be equivalent to failure of rationality and therefore failure of agency for a fully effective intellect.” Ibid., 217.
29.
Ibid., 190ff.
30.
Ibid., 191.
31.
Originally, Piper explains, she “had wanted a white, Diane Sawyer newscaster-type actress to do the monologue and actually auditioned some white actresses.” However, she states, “I wasn't a good enough director to elicit the delivery I wanted. People were too strident, or too campy, or too serious. I just couldn't get across what I wanted them to do. So I decided to do it myself.” Adrian Piper, “Xenophobia and the Indexical Present II: Lecture,” in Out of Order, Out of Sight, vol. 1, 272–73.
32.
See Adrian Piper, “Passing for White, Passing for Black,” in Out of Order, Out of Sight, vol. 1, 275–307.
33.
These are qualities Piper herself names in response to a New York Times review of Cornered that she says paints her as a stereotypical black woman. See her letter to the editor at http://www.adrianpiper.com/ny_times.shtml.
34.
At one point in the video, Piper's persona remarks, “Because if someone can look and sound like me and still be black, then no one is safely, unquestionably white. No one.” Piper, “Cornered: A Video Installation Project,” 184.
35.
Peggy Phelan, Unmarked: The Politics of Performance (London and New York: Routledge, 1996), 8.
36.
Piper, “Cornered: A Video Installation Project,” 183–84.
37.
Adrian Piper, “Xenophobia and the Indexical Present I: Essay,” in Out of Order, Out of Sight, vol. 1, 248.
38.
Ibid.
39.
Ibid.
40.
Berger, “The Critique of Pure Racism,” 92. See also Adrian Piper, “Xenophobia and Kantian Rationalism,” in Feminist Interpretations of Immanuel Kant, ed. Robin May Schott (University Park: Pennsylvania State University Press, 1997), 21–73. There she writes: “[Xenophobia] is a paradigm example of reacting self-protectively to anomalous data that violates one's internally consistent conceptual scheme” (48).
41.
Phelan, Unmarked, 8.
42.
Here I want to acknowledge that I am parting ways with Piper on the validity of rationality. She maintains a belief in the power of rationality and objectivity, while I am much more suspicious of the possibility of rationality's employment outside the conditions of power. For a statement of Piper's position on rationality, see Berger, “The Critique of Pure Racism,” 84.
43.
Berger, “The Critique of Pure Racism,” 78.
44.
Dyer, White, 89–94.
45.
This is the continuity Katherine McKittrick sees from the slave ledger, as documented in Saidiya Hartman's essay “Venus in Two Acts,” to current predictive policing software, such as PredPol, which, McKittrick argues, assumes the disposability of black life as part of its algorithmic calculations. According to McKittrick, both the slave ledger and PredPol are modes of subjecting the black body to a biopolitical gaze. Katherine McKittrick, “On Algorithms and Curiosities,” keynote presentation for the Duke Gender, Sexuality, and Feminist Studies 11th annual Feminist Theory Workshop, March 25, 2017, Durham, North Carolina. See PredPol.com and Saidiya Hartman, “Venus in Two Acts,” small axe 26 (June 2008): 1–14.