This article explores the rise of deepfake satire as one of the most vibrant subgenres of experimentation within the expanding field of AI art. A combination of “deep learning” and the word “fake,” deepfake videos depict people doing or saying things that they never did or said. While deepfakes have been used to deceive and harm individuals as well as a mass audience, they have also been used toward alternative ends. Deepfake satire offers artful social critique, interrogating the individuals and institutions it portrays as much as the technology used to create it and the platforms on which it circulates. Videos range from snarky jabs at entertainment industry personalities to hard-hitting takedowns of tech entrepreneurs and authoritarian leaders. This speculative form of social imagining makes threatening and seemingly untouchable figures appear weak, vulnerable, and exposed, which can help envision a more equitable and just future. Close examination of specific projects within a broader media ecology leads to a nuanced understanding of both the menace and progressive possibilities of synthetic media—beyond simple “utopia”/“dystopia” binaries—in addition to expanding the grammar of computational art.

Over the last ten years, Artificial Intelligence (AI) technologies have increasingly shaped the production and circulation of moving image media across a wide range of platforms. No form of AI-enabled (or “synthetic”) media has been more hotly debated by journalists, scholars, politicians, or public intellectuals than deepfake videos. A portmanteau of “deep learning” and the word “fake,” deepfakes essentially show people doing or saying things they never did or said. When they first emerged circa 2017, press coverage noted the striking realism of the videos along with the accessibility of the tools to make them. Some particularly alarmist headlines described how the proliferation of deepfakes signaled the coming “collapse of reality,” a point at which it might no longer be possible to distinguish between an audiovisual record of an actual event and a falsified simulation.1 While the “information apocalypse” did not come to pass, deepfakes have made their presence felt in significant and, at times, unanticipated ways. To be sure, some of the most prominent applications of the technology have been used toward explicitly harmful and deceitful ends. Nonconsensual deepfake pornography, for example, constitutes a harrowing form of online violence that has offline consequences for women’s physical and mental health. At the same time, alternative forms of synthetic media are making inroads into studio Hollywood, museums of fine art and public history, and progressive advocacy campaigns. Scholars in cinema and media studies have long written about the misadventures of sentient machines on screen; however, the complex ways through which AI is influencing the craft of motion picture design, along with distribution and exhibition practices, have only recently begun to receive more attention.
As scholars Lisa Bode, Dominic Lees, and Dan Golding argue, deepfakes offer new possibilities for creative expression, but also prompt important questions about rights to likeness, the ethics of digital resurrection, the fraught connection between commerce and art, and the role of platforms in mediating the viewer’s encounter with a specific film.2

Deepfake satire is one of the most vibrant yet underexamined subgenres of synthetic media experimentation.3 Even as it boasts a highly public and promiscuous life, it tends to get lost between two fields of inquiry. First, research on screen comedy has focused on historical genres such as silent-era slapstick and classical Hollywood screwball films, legacy formats including the TV sitcom, and the performances of individual comics within the spheres of standup, sketch comedy, and late-night talk show series.4 There have also been some inventive, interdisciplinary studies on computational humor, but they have primarily concentrated on the development of machine learning tools that generate or detect jokes.5 Second, scholarly attention to the relationship between emerging tech and contemporary cinema has centered on high-profile Hollywood productions and the fan cultures that surround them. Innovative research on digital character creation and world-building has led to deeper understandings of the contentious practice of “de-aging” stars, how the industry’s gender and racial prejudices inform the making of machine learning tools, and the ways that AI-inflected films both flaunt and mask the work of below-the-line labor, especially that of animators, production designers, and visual effects artists.6

One of the key features of deepfake satire is that it draws on the complementary strengths of AI as a constellation of tools and satire as a cultural form that holds powerful figures up for evaluation and critique. The elasticity of facial and bodily design afforded by machine learning allows artists to sculpt their human subjects’ expressions and actions in myriad ways. As film scholar Jaimie Baron notes, these acts of audiovisual ventriloquism can result in an outlandish performance that serves a higher social purpose.7 Deepfake satire ranges from snarky jabs at entertainment industry talent to hard-hitting takedowns of tech entrepreneurs and authoritarian leaders such as former Brazilian President Jair Bolsonaro and Russian President Vladimir Putin. Videos lay bare societal inequities that are deliberately obscured from public view, as in Trey Parker and Matt Stone’s raucous sendup of American political culture, Sassy Justice with Fred Sassy (2020).

Deepfake satire derives its rhetorical force (and pleasurable allure) from the persuasiveness of the performance, while also signaling the artifice of its own creation. This helps viewers to see the video as commenting on the individuals and situations depicted and setting themes into sharp relief. Certainly, deepfake satire taps into a rich artistic tradition. Often embracing a combination of appropriation, play, and attack, it recalls the irreverent, left-liberal practice of “culture jamming.”8 Investigating the early use of the term in the 1990s, scholars Marilyn DeLaure and Moritz Fink describe how culture jamming originally applied to “a range of tactics used to critique, subvert, and otherwise ‘jam’ the workings of consumer culture.” They write that more recently, targets have come to include not only major players in the film, television, and advertising industries, but a wider breadth of cultural and political agents.9

Positioning deepfake satire within a broader moving image ecology ultimately helps to clarify the threats and possibilities of synthetic media. It also contributes to an understanding of AI beyond what Sofian Audry sees as the polarizing binary of AI-as-nightmare on the one hand, where robots will somehow “supplant humans as the superior intelligent species” and AI-as-fantasy on the other hand, where “techno-optimist choirs chant the libertarian utopia of a postwork, postdemocratic world…”10 Combatting the deluge of disinformation that thrives online, deepfake satire constitutes a speculative form of progressive social imagining, pointing toward an array of consequential truths.

Audiovisual simulations of human action and speech are nothing new. Still, the rise of broadband internet, heightened access to big data, and the invention of graphics processing units have allowed media makers to adopt deep learning methods in their pursuit of ever more realistic creations.11 The “face-swap” is the most common kind of deepfake. The goal is to replace a “target face” with a “source face,” thus giving the impression that an individual selected by the filmmaker was in fact saying or doing something depicted in the target person’s video. For example, the viral internet subgenre of “Cagefakes” features the face of actor Nicolas Cage swapped onto the bodies of other actors in hit films. Postproduction craft may then help to refine the performance, ensuring that the texture of the skin along with the micro-movements of the eyes, head, and neck appear natural. Making a convincing representation is indeed a labor-intensive endeavor. Nonetheless, a large number of apps have made face-swaps possible with a minimum of images. Reface superimposes faces onto gifs. MyHeritage’s Deep Nostalgia™ animates old photographs. WOMBO generates lip-synching portraits. FaceApp ages faces and Zao draws on film and TV clip libraries for voice modulations. DeepFaceLab is open-source software that allows for involved interaction with datasets to make elaborate deepfakes.
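The compositing step behind a face-swap can be sketched in a few lines. The toy example below (an illustration only, unrelated to any of the apps named above) pastes a resized “source face” into a “target” frame under a soft mask so the seam fades out; production deepfake pipelines instead train paired neural networks on thousands of frames of each face before this kind of blending occurs.

```python
# Toy sketch of face-swap compositing (illustration only; the tools named
# above use trained neural networks, not this simple paste-and-blend).
import numpy as np

def swap_region(target, source_face, x, y, size=64):
    """Blend a resized source face into the target frame at (x, y)."""
    # Nearest-neighbour resize of the source face to a size x size box.
    h, w = source_face.shape[:2]
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    face = source_face[rows][:, cols].astype(np.float32)
    # Soft radial mask: 1.0 at the centre fading to 0.0 at the edges,
    # so the paste blends in rather than leaving a hard rectangular seam.
    yy, xx = np.mgrid[0:size, 0:size]
    r = np.sqrt((yy - size / 2) ** 2 + (xx - size / 2) ** 2) / (size / 2)
    mask = np.clip(1.0 - r, 0.0, 1.0)[..., None]
    roi = target[y:y + size, x:x + size].astype(np.float32)
    target[y:y + size, x:x + size] = (mask * face + (1 - mask) * roi).astype(np.uint8)
    return target

# Synthetic data: a flat gray "frame" and a brighter "source face."
frame = np.full((128, 128, 3), 100, dtype=np.uint8)
face = np.full((80, 80, 3), 200, dtype=np.uint8)
out = swap_region(frame, face, 32, 32)
```

The postproduction refinement the paragraph describes (skin texture, micro-movements of eyes and neck) is precisely what such naive blending cannot deliver, which is why convincing deepfakes remain labor-intensive.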

Concern for face-swapping as a means of inflicting harm stems from one of its most damaging and widely practiced applications. Indeed, the word “deepfake” first emerged on Reddit in November 2017, in reference to nonconsensual pornography where the faces of female celebrities were placed on the bodies of adult film stars. As journalist Samantha Cole reported in Vice’s Motherboard publication, a user who went by the handle “u/deepfakes” started a forum by the same name to share source code and discuss techniques for producing such videos.12 Malicious deepfakes quickly moved from the fringe sectors of the internet—forums such as Reddit, 4chan, 8kun, and Voat—to mainstream social media and video-sharing platforms.13 Legal scholar Danielle Citron describes these videos as “say[ing] to individuals that their bodies are not their own and can make it difficult to stay online, get or keep a job, and feel safe.”14 Emily van der Nagel and Sophie Maddocks analyze deepfake pornography as a form of misogynistic control and a violation of civil rights. They argue that feminist advocacy geared toward legislation along with public pressure on platforms is necessary to reimagine models of accountability and consent online.15

While the widespread knowledge that deepfakes exist may lead figures in power to evade responsibility for their actions, deepfakes have already resulted in varying levels of debate and confusion. The controversy surrounding whether Gabonese President Ali Bongo’s 2019 New Year’s video address was, in fact, a deepfake or just a poorly made film of an awkwardly delivered speech led to an attempted government takeover. The 2021 coup in Myanmar involved the suspected use of a deepfake in the video testimony from detained political prisoner Phyo Min Thein.16 Russian operatives created a deepfake of Ukrainian President Volodymyr Zelenskyy in March 2022 in an effort to demoralize the country’s citizenry and international allies. The film showed Zelenskyy in his usual military fatigue T-shirt, telling soldiers to lay down their arms and surrender. A coordinated response, including Zelenskyy himself posting a selfie-styled message on Telegram and Facebook, quickly debunked the deceptive video.17

Finally! from Ukraine 24, labeled deepfake image of Volodymyr Zelensky, Facebook.

Even as the threat of malicious deepfakes remains quite real, alternative uses of the technology abound. AI has expanded the toolkits of artists in the entertainment industry. To design an emotive supervillain for the Marvel blockbuster Avengers: Infinity War (2018, directed by Anthony Russo and Joe Russo), the visual effects (VFX) house Digital Domain successfully braided together actor Josh Brolin’s expressive performance with a digitally created Thanos. For The Irishman (2019, directed by Martin Scorsese), the process of “de-aging” Robert De Niro, Al Pacino, and Joe Pesci involved Industrial Light & Magic using bespoke machine learning software to match frames from the actors’ past films with new scenes shot and rendered using computer-generated imagery.18 Industrial Light & Magic has been particularly active in this area, winding back the clock on Harrison Ford for the fifth installment in the Indiana Jones series (Indiana Jones and the Dial of Destiny, 2023, directed by James Mangold) as well as fashioning a youthful Luke Skywalker for the Star Wars spinoff television series The Mandalorian (2019) and The Book of Boba Fett (2021–22).

“De-aged” Harrison Ford from Indiana Jones and the Dial of Destiny (2023); image from James Hibberd, “Dial of Destiny Director James Mangold Explains How Time Has Changed Indiana Jones,” Hollywood Reporter, February 14, 2023.

Museums have been looking into synthetic media as a means to devise new kinds of exhibitions and as a subject in its own right. Dimensions in Testimony (2015–present), which has traveled to Jewish heritage and Holocaust museums around the world, allows visitors to converse with holographic avatars of Holocaust survivors. The Salvador Dalí Museum’s AI installation of the surrealist painter provides an initial point of orientation for visitors to learn about his life and art. The Museum of the Moving Image’s show Deepfake: Unstable Evidence on Screen (on view 2021–22) investigated the potential manipulative power of synthetic media as well as the artistic and civic uses of the same base technologies through a curated selection of case studies.19

Social justice and public health projects have also mobilized machine learning tools for strategic purposes. One of the most striking examples is the human rights documentary Welcome to Chechnya (2020, directed by David France), in which VFX supervisor Ryan Laney designed faces (“digital veils”) for the film’s persecuted LGBTQ+ subjects—an effort that protected their identity and still allowed them to retain a human face and communicate affectively.20 The initiative Malaria Must Die used synthetic audio to make football legend David Beckham speak nine languages in his address to the international community about the deadly disease.21 Lastly, the journalism rights organization, Propuesta Cívica, “resurrected” the murdered Mexican journalist Javier Valdez Cárdenas as part of their #SeguimosHablando campaign against state-backed violence toward the press.22

Image from Ian Failes, “How Welcome to Chechnya Used A.I. and Machine Learning Techniques to Mask the Doco’s Subjects,” Befores & Afters, March 7, 2021.

Considered broadly, satire is a hybrid mode of creative expression that draws on comedic tropes such as hyperbole, caricature, and irony to cast judgment on an individual, organization, or worldview.23 While parody primarily imitates a person or style for amusement, satire explicitly encourages critical evaluation of its target subject, oftentimes revealing a truth that may be hidden or obscured from public view. Writing in the mid-1700s, the English critic and lexicographer Samuel Johnson defined satire as exercising moral judgment, holding “wickedness and folly” up for censure.24 In this way, satire dovetails with how the Russian literary theorist Mikhail Bakhtin conceptualized the socially progressive dimensions of popular comedy—as effectively bringing “high subjects” such as state politics and official forms of history to a “plane equal with contemporary life, in an everyday environment, in the low language of contemporaneity.”25 Laughter—a frequent but not always present component of satire—can serve as a type of conscientious reflection within this experience, productively transporting an object “into a zone of crude contact where one can finger it familiarly on all sides, turn it upside down, inside out, peer at it from above and below, break open its external shell…”26 Elaborating on satire’s distinct mode of engagement, media scholars Jonathan Gray, Jeffrey P. Jones, and Ethan Thompson describe how its “calling card is the ability to produce social scorn or damning indictments through playful means…”27 “Play” implies a practice of toying or meddling with the target so as to deliver the hard-hitting critique in a way that appears lighthearted and humorous. This can function strategically, enabling satire to fly under the radar of censors and gatekeepers who may treat it as mere “entertainment.” Additionally, it speaks to satire’s mass appeal and even the pleasure it can bring to audiences.

Theorists have also explored satire’s role in mediating identity and its ability to upend oppressive societal norms surrounding race, gender, class, and sexual orientation. Analyzing the relationship between satire and African American identity, cultural studies scholar Danielle Fuentes Morgan writes that satire can “destabilize the mainstream acceptance of and propagation of the racial status quo” and “in doing so, satire creates new realms in which social justice might be enacted by disrupting social expectations and demonstrating the connection between laughter and ethical beliefs.”28 In turn, satire’s social uptake is a crucial part of how it functions. As comedy scholar Dannagal Goldthwaite Young investigates, the argument that a work of satire aims to make is ultimately realized and fully formulated by the audience, who not only decipher its meaning, but help shoulder responsibility for circulating the work of art out in the world and facilitating its resonance within a larger sociopolitical context.29

Throughout the long 2010s, the global weakening of civil society institutions, the expansion of the media industries, and the rise of both right-wing extremist movements and left-liberal calls for social justice have led to satire’s increasingly prominent place in public life. Cultural historian James E. Caron notes that satire retains postmodernism’s resistance to grand metanarratives and faith in scientific progress, along with an investment in localized forms of truth-telling and a capacity to contribute to meaningful social action. It possesses a “reformist” impulse that does not necessarily engender a policy change, but rather entails a “potential metanoia, a change in thinking, perception, or belief, even a repentance of the old way of thinking, perceiving, believing.”30 Just as the reference-rich dimensions of satire make it well-suited for an internet culture that thrives on remixing and sampling, its combination of play, performance, and critique places it within a long tradition of subversive art. Notable precedents include Dada and surrealist installations, John Heartfield’s anti-fascist photomontage, situationist detournement, Yippie street theatre, anti-globalization video art, and the irreverent sketch comedy and monologues of late-night TV.31

One of the first widely viewed projects was You Won’t Believe What Obama Says in This Video! (2018). It functions as a high-concept public service announcement, calling out the threat of tech-savvy disinformation. Jordan Peele’s Monkeypaw Productions partnered with Peele’s brother-in-law Jonah Peretti of BuzzFeed News to simulate an on-camera address by Barack Obama. With the help of producer Jared Sosa, the team used FakeApp along with Adobe After Effects for granular detailing of the former President’s head and neck. In the video, Peele’s vocal impression of Obama syncs with the facial movements of the character, warning viewers that “we’re entering an era in which our enemies can make it look like anyone is saying anything at any point and time, even if they would never say those things.”32 Obama then declares that Ben Carson resides “in the sunken place” and calls Donald Trump a “total and complete dipshit.” The concluding reveal situates Obama’s digital face side-by-side with that of Peele speaking the lines, demonstrating just how convincing this kind of realist fabrication can seem.

Jordan Peele’s Monkeypaw Productions and BuzzFeed News, You Won’t Believe What Obama Says in This Video!, April 17, 2018.

Peele’s wry mention of the “sunken place” recalls his social-horror film Get Out (2017), but the video’s larger conceptual reference point is his popular sketch, Obama’s Anger Translator, which he made with Keegan-Michael Key for their Comedy Central series, Key & Peele (2012–15). In the sketch, Key plays Luther, the embodied conscience of Obama (played by Peele), who attempts to translate a weekly presidential address. Performing an imagined Black persona through a repertoire of gestures, dress, and speech patterns, Luther expresses what Obama really thinks about issues ranging from the threat of North Korea to the Tea Party. The sketch foregrounds the contradictory position in which Obama was placed, at once holding the historic mantle of the first Black president and conforming to a postracial mythos that requires he appear safe and palatable to white politicians and a white electorate.33 You Won’t Believe involves a different kind of layered performance, a tongue-in-cheek treatment of Obama’s multiple identities and the role of media in shaping them. On one level, the video demonstrates the convincing power of deepfakes, encouraging viewers to be critical of what they see and hear. At the same time, the video serves as a performance of Obama talking back to his detractors in freewheeling fashion, conveying ideas and using language that might be closer to what he believes, but that the demands of his party, his office, and the larger white public would not allow. You Won’t Believe expresses a desire to see the real-life, post-presidential, but not post-political Obama assert himself in a time when Trump and his allies were using executive authority to dismantle civil society institutions and many central tenets of the democratic electoral process.34

While the film and TV industries have proven to be early adopters of AI, they have also been targets of derision. The entertainment news site Collider created an interview showcase, Above the Line, which reimagined the Hollywood Reporter’s roundtable series with a cast of deepfaked celebrities including Robert Downey Jr., George Lucas, Tom Cruise, Ewan McGregor, and Jeff Goldblum.35 Comedian Mark Ellis hosted the event. Each individual’s face is swapped with a professional impersonator whose body we see and voice we hear. The meandering and, at times, nonsensical conversation, superficially framed around “the streaming wars,” takes aim at the obsessions and self-aggrandizing tendencies of top industry talent. Lucas’s emphatic statement about hoarding copies of Star Wars DVDs in the early days of Netflix pokes fun at his aggressive stance on intellectual property. Cruise’s proclamation, “I walk to the beat of my own drum, I do what I want, I’m a producer these days, I do my own stunts, I do everything…all the Impossibles,” highlights the actor’s manic, often unhinged confidence and desire to flaunt his commitment to his roles. Downey Jr.’s contention that he deserved an Oscar for best actor because “I’ve given it my all, I’m a team player, I’m a nice guy…and I am Iron Man” underscores the Marvel Cinematic Universe’s efforts to align the actor, whose career had undergone a rollercoaster of highs and lows before the franchise, with the film’s Tony Stark character. The scatological comedy throughout the conversation—whether it is Lucas’s nonstop farting, references to the Star Wars franchise as a “diarrhetic,” or Downey Jr.’s admission that he literally peed his pants when watching Star Wars: A New Hope (1977, directed by George Lucas)—alludes to the entertainment industry’s overreliance on “bathroom humor.”

Above the Line works well as a one-off facetious sketch, but its architects stopped short of advancing a deeper interrogation of the industry. The verbal slapstick takes jabs at the stars and money-hungry studios, but also expresses the affection that fans feel for these individuals and the franchises that cast them. Critiques of the industry’s exploitation of creative labor, quiet acceptance of toxic masculinity, and neo-imperialist narratives remain beyond the scope and intention of the vignette. The interviewer, Ellis, plays the role of a comic straight man, helping to facilitate the humorous interactions, rather than taking a combative stance toward his interviewees.

“Above the Line: Deepfake Roundtable: Cruise, Downey Jr., Lucas & More—The Streaming Wars,” Collider, November 11, 2019.

Digital strategist and screenwriter William Yu’s media campaign #SeeAsAmStar mobilized deepfake technology toward overtly political ends. He made a series of scenes that featured the faces of Asian American actors on the bodies of white movie stars. Yu used FakeApp along with the most powerful gaming PC he could find (purchased at Costco because of their flexible return policy). He then posted the videos on Twitter with the hashtag #SeeAsAmStar. The scenes are no simple act of star worship. Nor do the films follow what media scholar Drew Ayers describes as a common form of toxic nostalgia, where the faces of contemporary white male actors are brought into blockbusters of the Ronald Reagan era starring Arnold Schwarzenegger, Chuck Norris, and Sylvester Stallone.36 Yu saw #SeeAsAmStar as a call for equity and as the “spiritual sequel” to his #StarringJohnCho campaign, which focused on photoshopping the face of the beloved Korean American actor John Cho into advertisements for upcoming films.37

In #SeeAsAmStar, Yu deepfaked Asian American actors into high-intensity films that showcase their talent. Cho (rather than Chris Evans) plays the titular superhero in Marvel’s Captain America: The Winter Soldier (2014, directed by Anthony Russo and Joe Russo). Constance Wu (rather than Scarlett Johansson) appears as the brilliant and lethal star of the sci-fi thriller Lucy (2014, directed by Luc Besson). Steven Yeun (rather than Joseph Gordon-Levitt) plays the charming and befuddled Tom in the quirky rom-com 500 Days of Summer (2009, directed by Marc Webb). Arden Cho (rather than Jennifer Lawrence) assumes the role of Katniss Everdeen in the dystopian action-fantasy The Hunger Games (2012, directed by Gary Ross).38 Centering Asian American talent, #SeeAsAmStar calls attention to the lack of Asian presence in Hollywood and argues that these Asian actors deserve big screen opportunities—that they ought to be considered for marquee roles.39 Yu claims that his use of emerging technology wasn’t so much about heaping praise on a particular star, but “about making a statement, about us being okay and getting used to what it means to see an Asian face in a lead role.”40

William Yu, #SeeAsAmStar campaign image, “How I Used Deepfake Tech to Make the Case for an Asian American Movie Star,” Medium, June 14, 2018.

Yu launched #SeeAsAmStar during Asian Pacific American Heritage Month in May 2018 and the campaign received enthusiastic attention from bloggers, journalists, and celebrities. Videos were also displayed in a 2019 multimedia exhibition of Yu’s art at the Pearl River Mart in New York City and the project helped to propel Yu’s screenwriting career.41 Unfortunately, #SeeAsAmStar did not achieve the same kind of visibility as #StarringJohnCho. Yu attributes this to a number of factors, including the lack of a catchy hashtag and the difficulty viewers experienced in sharing and commenting on the videos. Additionally, the campaign might have had a larger footprint if the videos had appeared edited together as a unified film, allowing viewers to see them more easily in relation to one another and grasp the cumulative power of the intervention. And even as Yu targeted influencers along with journalists to generate interest, partnering with progressive media organizations such as the Center for Asian American Media, Visual Communications, the Norman Lear Center, and the Center for Media & Social Impact could have helped to amplify #SeeAsAmStar’s message and position it within broader advocacy efforts for diverse and inclusive representation.42

Deepfakes have also been taken up by artists as part of installations. Hacktivist Bill Posters (Barnaby Francis) and computational designer Daniel Howe joined forces with the synthetic media startup CannyAI to create Spectre, a conceptual project about the “digital influence industry.”43 Six black steel-encased monoliths make up the core of the installation, which had its 2019 premiere at the Alternate Realities Site Gallery at the Sheffield DocFest. The sleek tablets refer in varying ways to sites of devotional practice: prehistoric Stonehenge, the mythic totem that begins 2001: A Space Odyssey (1968, directed by Stanley Kubrick), and touch-screen kiosks that appear everywhere from urban streets to shopping malls. Journalist Naomi Rea of Artnet News explains that data from visitors is extracted through a game, where they swipe up or down on brands they encounter, signaling a love or hatred of the company. This generates a personality profile that shapes the visitor’s experience, delivering increasingly microtargeted ads.44 The project was inspired by the Facebook–Cambridge Analytica scandal, where the UK consulting firm used data improperly harvested from the social media platform in an effort to influence voters in both the Brexit referendum and the 2016 US presidential election. The name Spectre is itself a reference to the online persona of Dr. Aleksandr Kogan, who sold millions of Facebook profiles to Cambridge Analytica.

Posters and Howe designed a series of deepfakes titled Big Dada to serve as promotion for Spectre and a video project in its own right. Posters considers deepfakes “the perfect art form for our absurdist reality.”45 The shorts were released on Instagram and then played together as a looped, single-channel video in galleries. Each film features a famous influencer, tech entrepreneur, or politician reflecting on their investment in data and the power it yields. In the video profiling Kim Kardashian, she quips that she doesn’t care about being disliked by some of her followers, for their engagement has resulted in the dollars that come with commanding a big social media fanbase. Kardashian’s concluding line, “I genuinely love the process of manipulating people online for money” references her commitment to media spectacle and the ways she wields her celebrity status to push products.46

A similarly stylized short of Facebook CEO Mark Zuckerberg used dialogue replacement to alter a speech he gave on CNN Business in September 2017. The original video portrays Zuckerberg, clad in his usual solid color T-shirt and framed by a CNN Tech banner, detailing the nine-step company plan to combat election interference.47 By contrast, the deepfake shows Zuckerberg in the same outfit and surroundings, looking directly at the camera and saying: “Imagine this for a second: one man with total control of billions of people’s stolen data, all their secrets, their lives, their futures. I owe it all to Spectre. Spectre showed me that whoever controls the data controls the future.”48 Zuckerberg says the opposite of what he was trying to communicate in the original broadcast (and during his hearing in front of the Senate Commerce and Judiciary Committee). Still, the address speaks to the company’s actual motivations and actions. In a crude form of straight talk, the deepfake lays bare how Facebook has reached commercial success. Posting the video to Instagram was meant to add an element of embarrassment, as the makers aimed to take advantage of Facebook’s pro–free speech stance, jabbing at the company while calling attention to its lax protocols for dealing with disinformation. Facebook had previously allowed a maliciously manipulated video depicting then House Speaker Nancy Pelosi slurring her words to remain on the platform with a minimum amount of information about how it had been doctored. Coverage of the Posters–Howe video in the New York Times, Vice, and the Guardian magnified the critique of Zuckerberg and increased pressure on the company to take a more ethical stance toward content moderation.49

Bill Posters, “Imagine This…” from the Big Dada series, part of the Spectre project, June 7, 2019, Instagram,


Posters teamed up with an anonymous collective of Brazilian artists to deepfake Jeff Bezos. Rather than revising a past public appearance, the video imagines the Amazon founder and CEO radically reorienting his company toward climate justice. The fictional Amazon Prime series GREEN HEART depicts Bezos pledging to devote profits from his business empire to help the Amazon rainforest.50 Braiding together images of climate activism with those of forest fires and floods, the video captures Bezos making a multidecade commitment to the biodiverse region. Looking directly at the camera, he claims that it is “time to act.” The portrayal of fantasy altruism cleverly reimagines Amazon’s mission, showing what Bezos could do if only he had a complete change of heart and mind. It also calls attention to the vast distance between the company’s namesake and its modus operandi of resource and labor extraction.

Deepfake videos can also generate a space, denied in institutionalized spheres of electoral politics and public debate, to interrogate autocratic leaders. Xi Jinping as Winnie the Pooh Dancing to Bat Out of Hell by Meat Loaf (2020) by The Fakening (aka Paul Shales) is a compilation of short performances by cosplay dancers dressed as Winnie the Pooh who don the digital face of Chinese President Xi Jinping.51 Shales is mainly known for fan face-swaps of Hollywood stars and bespoke music videos for the Strokes and Diplo.52 Xi as Pooh takes on the Chinese president and his administration, which has been cracking down on all forms of dissent against party leadership. Shales drew inspiration from a 2013 meme that juxtaposed a photograph of Xi walking with then President Obama and an illustration depicting the similarly peripatetic Disney characters Tigger and Winnie the Pooh. Aligning Xi with the portly, soft-spoken bear counters official iconography of the President. Additionally, Pooh’s sensitive demeanor and eagerness for intimate friendship bend heteronormative conceptions of masculinity. The meme quickly went viral on Sina Weibo, China’s equivalent of Twitter. It was soon scrubbed from the Chinese internet but flourished beyond the country’s borders. Winnie the Pooh was added to a list of official state “sensitive words” and, in 2018, Disney’s live-action film Christopher Robin (directed by Marc Forster) was banned throughout the country.53 Shales’s montage amplifies the Xi–Pooh connection in flamboyant fashion, extending the life of the critique and serving as a retort to the regime’s efforts to decouple the President from the animated character. The accompanying soundtrack of Meat Loaf’s album Bat Out of Hell (1977), itself a work of baroque kitsch expressing the desire to break free of all obligation, emphasizes a longing to challenge the administration’s restrictive cultural policies and censorship laws.

The Fakening, Xi Jinping as Winnie the Pooh Dancing to Bat Out of Hell by Meatloaf, May 11, 2020,


Brazilian artist Bruno Sartori makes similarly skilled deepfakes that call out the failed governance of Jair Bolsonaro. One of Sartori’s earliest videos shows Bolsonaro face-swapped with the diminutive Chapulín Colorado, the titular red grasshopper of the Mexican TV series El Chapulín Colorado (1973–79). The visuals alone are arresting, as blending Bolsonaro with Colorado aligns the then president with a buffoonish insect whose aspirations to become a superhero are repeatedly dashed through self-sabotage. The theme song for the series announces that Colorado is “more agile than a turtle, stronger than a mouse, nobler than a lettuce, his shield is [his] heart…He’s the Cherry Cricket!” In Sartori’s video, Bolsonaro delivers excerpts of a speech he gave during a trip to Dallas in 2019. The President twists his own campaign slogan, the brazenly nationalistic “Brazil above everything, God above all,” into “Brazil and the United States above everything.” At the time, journalists saw the shift as an obsequious effort to appease his American hosts. In Sartori’s video, the accompanying laugh track and jazz score heckle Bolsonaro, creating distance between the confident image that Bolsonaro wants to project and the shabby antihero that we see. Bolsonaro canta para Trump Without You (2019) further pokes fun at the Bolsonaro–Trump connection.54 The video shows Bolsonaro face-swapped into Whitney Houston’s music video for “I Will Always Love You,” serenading the former American president’s floating face. The saccharine-sweet scene captures Bolsonaro’s real-world sycophantic relationship to Trump.

In Lava uma Mão (2020), Bolsonaro, along with Minister Damares Alves, espouses the virtues of washing one’s hands and wearing a mask to control COVID-19. The song draws on “Lavar as Mãos,” a children’s melody promoting good hygiene from the TV series Castelo Rá-Tim-Bum. The deepfaked version calls attention to Bolsonaro’s complete lack of care for or attention to the pandemic as a public health crisis, which resulted in over 664,000 deaths.55 Working against the recommendations of the World Health Organization, Bolsonaro spread misinformation about the virus, vehemently opposing social distancing measures and blocking mask mandates in pursuit of a vaguely defined idea of herd immunity. According to Human Rights Watch, he vetoed provisions in 2020 that would have made mask wearing mandatory within prisons and churches.56 In an interview with Witness and MIT’s Co-Creation Studio for the series Deepfakery, Sartori reflects on how the critique is not supposed to be personal but about the consequences of Bolsonaro’s actions: “I always try to portray politics as being a matter of public interest…the image I am using is a public image, the image of the president, to portray a situation, with satire, to create a critique around the attitudes he takes.”57 One of the song’s verses delivers a self-indictment of failed leadership: “After posting that Corona is bullshit, before spreading on WhatsApp more fake news, wash one hand, wash the other.”

Bruno Sartori, Bolsonaro e Ministros Cantam “Lava uma Mão,” #Deepfake March 25, 2020,


Vladimir Putin has repeatedly been given a satirical deepfake treatment. Contributing to the stakes surrounding the manipulation of his image is the Kremlin’s vehement drive to stage-manage his identity and control messaging related to Russia’s invasion of Ukraine. State-run broadcast TV along with hacking and online disinformation campaigns have helped to legitimize the country’s kleptocracy and frame the West as an existential threat.58 These efforts make resistance urgent. As film historians Patricia R. Zimmermann and Masha Shpolberg have argued, vérité-style nonfiction by the Ukrainian media collective Babylon ’13 challenges Putin’s propaganda machine, providing a vital source of on-the-ground intelligence for local and international audiences.59 Pro-Ukrainian deepfake satire offers another weapon to be used on the cultural front against the Russian dictator. Films vary in subject from graphic fantasies of Ukrainian military victory to instances of Putin buffoonery.60 The artist who goes by the handle Ctrl Shift Face deepfaked Putin into a particularly violent moment in Quentin Tarantino’s Inglourious Basterds (2009). The scene shows a face-swapped Putin-as-Nazi officer appearing on his knees, seconds before being pummeled to death by Zelenskyy-as-Donny Donowitz (aka the “Bear Jew”). President Joe Biden-as-Lt. Aldo Raine looks on in quiet support throughout the execution. The original scene from the film depicts a twisted Jewish American revenge fantasy. This synthetic media reboot shows Putin being punished for his genocide of Ukrainians. Keeping the narrative and stylistic elements of Tarantino’s scene intact, including the whimsical vocal quips and a Spaghetti Western–style soundtrack, gives the performance a feeling of grotesque play. The video places in sharp relief the actual fascism underlying Putin’s worldview as well as the wartime superhero status of Zelenskyy, the Jewish actor-turned-politician-turned–military commander.

Ctrl Shift Face, Inglorious Bastard [DeepFake], March 2, 2022,


BestWishes AI, a company better known for fashioning AI-enabled greeting cards, made a deepfake in which Putin pleads with viewers to donate money and weapons to the Ukrainian cause. The Russian president looks straight at the camera to share the mocking assertion, “Hello, my name is Vladimir Putin, the most hated man in the world! Please, stand with Ukraine. They need your help and solidarity against my brutal and criminal invasion.”61 Extending this line of self-sabotage, the YouTuber known as “The Kremlin Official” made a deepfake of Putin singing the Radiohead song “Creep” at a karaoke session. Here, emo Putin appears diminutive and terribly unsure of himself, struggling to remember the lyrics and keep the beat as he sings about romantic desire and self-loathing.62

Amateur media maker Eugene Blanchard created a series of short films with the tagline “not so deep fake satire.” The films depict Putin delivering speeches or engaging in outlandish exchanges with an interlocutor. The humor arises from the contrast between what viewers know about Putin’s public image as a stoic strongman and how candid and weak he appears. In Putin’s Bedtime Story (2022), Putin plays the “big bad wolf” in the story of the three little pigs. Claiming that he wants free access and authority over Mariupol, he declares that he will blow down the house if the pigs do not “let him in.” The exchange reduces Putin’s military strategy to a distressing fairytale of an authoritarian leader trying to make a land grab to expand his empire. Lastly, in a deepfake made from news footage of Putin delivering an official address to the Russian parliament, he sings some verses of the song “You’ll Be Back” from Lin-Manuel Miranda’s hit musical Hamilton (2015). The song features the King George III character bemoaning the rebellious American colonists, who seek to break free of the crown. The King’s lament appears as the frustrated longing of a lover, confident that their beloved will return. The song has special resonance with the Russian invasion of Ukraine and the life-and-death tension between Putin’s statements about reunification and his dire actions. One of the last verses is especially poignant: “When push comes to shove, I’ll kill your friends and family to remind you of my love.”63

A common characteristic of these deepfakes is the sense of agency they give makers and their audiences to “talk back” to autocratic regimes that treat any alternative to the party line as a type of dangerous dissent. While individual leaders can appear powerful, distant, and untouchable, these deepfakes make them not just human-scaled but vulnerable and open to direct confrontation.

Sassy Justice with Fred Sassy serves as a high point for deepfake satire, both because of the incendiary way that it takes aim at a range of deserving targets and because of how it reflects on forms of knowledge production in our contemporary moment. Having long labored in the satirical tradition, South Park (1997–present) creators Trey Parker and Matt Stone teamed up with actor and voice artist Peter Serafinowicz to craft Sassy Justice.64 The web series follows Fred Sassy, a fictitious news correspondent who reports on synthetic media across interconnected vignettes. The show uses AI to poke fun at the interviewees and to interrogate the technology itself. Sassy appears as a deepfaked Donald Trump (played by Serafinowicz), camped up in an ill-fitting wig, ascot, and colorful pocket square.65 Parker and Stone’s original goal was to make a two-hour deepfake Hollywood film. Parker told the New York Times, “It really is this new form of animation for people like us, who like to construct things on a shot-by-shot level and have control over every single actor and voice.” They likened the degree of control they were able to exercise on Sassy Justice to their artisanal process of making South Park.66 The combination of the pandemic and technical challenges encouraged the team to forgo the full-length production and instead concentrate on a fifteen-minute video, something they could do remotely and fund without having to raise hundreds of millions of dollars. To execute their idea, the duo founded the studio Deep Voodoo, which they staffed with twenty technologists and computer graphic artists.

One of the first scenes showcases Fred Sassy interviewing a deepfaked Al Gore via Zoom. Reflecting on the malicious uses of disinformation, the former Vice President says that “deepfakes put words in people’s mouths and make people say things, like ‘vagina’ and ‘poop,’ and then you’ve got senators going around saying ‘vagina poop,’ but they didn’t say ‘vagina poop.’” The utterance illustrates that deepfakes can indeed make people say anything and also takes a jab at Gore’s own tendency to put his foot in his mouth. Fred’s flamboyant outfit and high-pitched lisp also contribute to the satirical dimension of the sketch. He projects an image of Trump that runs counter to the way the former President wants the world to see his strongman persona. So, too, does Fred’s performance call attention to Trump’s highly stylized self-presentation, from his elaborately coiffed hair to his spray-on bronzer. Another interview segment shows a deepfaked Michael Caine assuring Sassy that the human mind can indeed distinguish true from false, fabrication from authenticity, and that the perfect impersonation doesn’t exist. The persuasive nature of the Caine deepfake itself challenges Caine’s statements, signaling that we are already in a place where even the most discerning viewer can be duped. Fred also trains his investigative sights on a fictional US government–run deepfake program led by Jared Kushner, who has been face-swapped onto a child’s body. The program, as Kushner reveals by way of a message scrawled on a piece of construction paper, is titled the “Anti-Deepfake Club, No Girls Allowed!” Throughout the short conversation, Kushner can barely keep track of the questions, commenting on the need to get Trump re-elected and his own appointment to lead negotiations in the Middle East.
The depiction of Kushner-as-child points to his extreme incompetence in handling international politics and his lack of the experience, expertise, and basic critical thinking skills required in diplomacy. Rather, the video suggests, his main strength is his sycophantic devotion to his father’s real estate empire and his father-in-law’s brand.

Trey Parker, Matt Stone, and Peter Serafinowicz, Sassy Justice with Fred Sassy, Deep Voodoo studio, October 26, 2020,


Infomercials for a “bargain dialysis clinic” that are interwoven into the flow of Sassy’s investigation speak to a related form of deception. A deepfaked Zuckerberg asks, “Have You Been Diagnosed with Type 1 or Type 2 Kidney Failure?…Then come on down to Cheyenne Dialysis, where we’ve got all the deals for all the customers that made me the dialysis king of Cheyenne!” With a montage of miserable-looking patients in the background, Zuckerberg presents the outrageous costs of care as if he is offering a cheap deal: “Two-day, full kidney dialysis for $199,999…No insurance? No problem, we’ll work with you, using your mortgage, will, or other assets!” At once evoking a late-night cable TV infomercial and the absurd “Family Heart Center” TV ad from RoboCop (1987, directed by Paul Verhoeven), the performance foregrounds the fact that healthcare has become an absurdly expensive commodity, sold in exploitative fashion to everyday people. The commercial mocks how Zuckerberg and other captains of big tech present their services as if they were somehow free, but still extract valuable personal data from all who engage.

At the end of the web series, Fred launches into a sobering monologue about how a network of economic, political, and technological forces influence what viewers see online: “Things aren’t always what they seem, you have to use your own noodle, you can’t let anyone sell you on anything, not an idea, not a product.” As he continues to speak, however, periodic interruptions by an off-camera station censor force Sassy to soften his moralizing message, ultimately emphasizing his call for media literacy.67

What makes Sassy Justice so compelling is not only its high production values and polish, but the way it synthesizes so many kinds of critique. The buffoonish treatment of the film’s characters lays bare their arrogance, ignorance, and in some cases, the severe danger of their worldviews. Additionally, the journalistic investigation provides a dialogic framework to analyze the technology itself. The interviews place in sharp relief the threat of disinformation and the gullibility of mass audiences. These exchanges also signal how media outlets are underprepared to engage with these technologies, but nonetheless make the case that robust civic journalism, especially at the local level, has never been more necessary.68

An apt although perhaps unexpected way of understanding deepfake satire’s innovative form and social charge is to see it as answering a provocative call made decades ago by Werner Herzog in the offbeat documentary Werner Herzog Eats His Shoe (1980, directed by Les Blank). The film is ostensibly about the auteur making good on a bet with Errol Morris that he would literally eat his shoe if Morris ever completed his debut feature, Gates of Heaven (1978). However, the outrageous act is really the sideshow. The main attraction is Herzog holding forth about the role of the filmmaker in society and what it means to make creatively bold, thought-provoking moving images in the current media climate. Intercut with shots of Herzog nibbling on his well-stewed boot before a packed theater are his musings on the dumbing-down effects of pop culture. The disgruntled filmmaker does not simply rehash Frankfurt School attacks on the culture industry; rather, he connects his critiques to the desperate need for a new and incisive “grammar of images” that can both challenge the entangled relationship between late capitalism and cultural production and offer possible paths forward for cinema. From the vantage point of the late 1970s and ’80s, Herzog viewed the explosion of cable television, the dismantling of media regulations, and the rise of a new breed of conservative regimes as having far-reaching consequences for political action and everyday communication. Eyeing the deluge of cartoons, advertisements, talk shows, and blockbusters that were beginning to proliferate with increasing intensity, Herzog calls for “real war” by way of an alternative constellation of images that can liberate rather than stifle independent thought and expression.

Still from Werner Herzog Eats His Shoe (1980) by Les Blank.


Herzog’s ruminations remain prescient, as many of the changes in the early 1980s set the stage for our contemporary, more aggressively financialized and politically fractured media environment. Deepfake satire specifically feels Herzogian, although the point of connection seems to be less about the mirages that dance across his films (for example, Fata Morgana from 1971, Lessons of Darkness from 1992, and Grizzly Man from 2005) than about his concept of “ecstatic truth.” Herzog first laid out the idea at the Walker Art Center in 1999, where he delivered his twelve-point Minnesota Declaration. He asserted that merely recording reality constitutes the “truth of accountants,” a rote nonfiction document that is too passive and complicit. A cinema of ecstatic truth is, instead, a stylized depiction that produces insight into social reality. It does so by drawing on the entire toolbox of cinematic techniques.69 In this spirit, the caricatured bodies and voices that comprise deepfake satire demonstrate how comedy can serve as one of the most urgent forms of truth-telling and expose the grotesque lies that undergird systems of oppression around the world.70


See Regina Rini, “Deepfakes Are Coming. We Can No Longer Believe What We See,” New York Times, June 10, 2019; Nina Schick, Deepfakes: The Coming Infocalypse (New York: Twelve, 2020); Charlie Warzel, “Believable: The Terrifying Future of Fake News,” BuzzFeed News, February 11, 2018; Franklin Foer, “The Era of Fake Video Begins,” Atlantic, May 15, 2018; and Jackie Snow, “AI Could Set Us Back 100 Years When It Comes to How We Consume News,” MIT Technology Review, November 7, 2017.


Lisa Bode, Dominic Lees, and Dan Golding, “Introduction,” in “The Digital Face and Deepfakes on Screen,” Special Issue, Convergence: The International Journal of Research into New Media Technologies 27, no. 4 (2021): 849–53,


Henry Ajder and Joshua Glick, Just Joking: Deepfakes, Satire, and the Politics of Synthetic Media (Cambridge, MA: Witness/MIT, 2021). This public-facing inquiry offers an overview of both the malicious and progressive uses of synthetic media. It includes case studies and interview testimony from artists, activists, and legal scholars.


A brief survey of recent scholarship includes Caty Borum Chattoo and Lauren Feldman, A Comedian and an Activist Walk into a Bar: The Serious Role of Comedy in Social Justice (Berkeley: University of California Press, 2020); Caty Borum Chattoo, The Revolution Will Be Hilarious: Comedy for Social Change and Civic Power (New York: New York University Press, 2023); Maggie Hennefeld, Specters of Slapstick and Silent Film Comediennes (New York: Columbia University Press, 2018); Annie Berke, Their Own Best Creations: Women Writers in Postwar Television (Berkeley: University of California Press, 2022); Silas Kaine Ezell, Humor and Satire on Contemporary Television: Animation and the American Joke (New York: Routledge, 2023); Geoffrey Baym and Jeffrey Jones, eds., News Parody and Political Satire Across the Globe (New York: Routledge, 2012); Rob King, Hokum! The Early Sound Slapstick Short and Depression-Era Mass Culture (Berkeley: University of California Press, 2017); Maria San Filippo, After “Happily Ever After”: Romantic Comedy in the Post–Romantic Age (Detroit, MI: Wayne State University Press, 2021); and Nick Marx and Matt Sienkiewicz, eds., The Comedy Studies Reader (Austin: University of Texas Press, 2018).


See Julia M. Taylor, “Computational Treatments of Humor,” in The Routledge Handbook of Language and Humor, ed. Salvatore Attardo (New York: Routledge, 2017), 456–71; Graeme Ritchie, “Can Computers Create Humor?,” AI Magazine (September 2009), 71–81; and Rob King, “Cyborg Humor: On Humor as an Adversarial Network” (presentation, Society for Cinema and Media Studies, Denver, April 12, 2023).


Alice Maurice, ed., Faces on Screen: New Approaches (Edinburgh: Edinburgh University Press, 2022); Christopher Holliday, “Retroframing the Future: Digital De-Aging Technologies in Contemporary Hollywood Cinema,” JCMS: Journal of Cinema and Media Studies 61, no. 5 (2021–22): 210–37.


Jaimie Baron, Reuse, Misuse, Abuse: The Ethics of Audiovisual Appropriation in the Digital Era (New Brunswick, NJ: Rutgers University Press, 2020), 57.


Mark Dery popularized the term “culture jamming” in the early 1990s in reference to a wave of hoaxes, pranks, and agitprop aimed at globalization. See Mark Dery, “The Merry Pranksters and the Art of the Hoax,” New York Times, December 23, 1990; and Mark Dery, Culture Jamming: Jacking, Slashing, and Sniping in the Empire of Signs, pamphlet no. 25 (Westfield, NJ: Open Magazine), January 1, 1993,


Marilyn DeLaure and Moritz Fink, “Introduction,” in Culture Jamming: Activism and the Art of Cultural Resistance, ed. Marilyn DeLaure and Moritz Fink (New York: New York University Press, 2017), 6.


Sofian Audry, Art in the Age of Machine Learning (Cambridge, MA: MIT Press, 2021), 14.


Deep learning refers to the use of artificial neural networks (layers of clustered algorithms loosely akin to the human brain) to distinguish, sort, and extract patterns from large datasets. Deep learning sits within the AI field of machine learning, which aims to create “intelligent” computing systems by enabling them to learn from experience rather than operate according to strictly programmed instructions. John D. Kelleher, Deep Learning (Cambridge, MA: MIT Press, 2019), 1–37.


See Samantha Cole, “AI-Assisted Fake Porn is Here and We’re All Fucked,” Vice Motherboard, December 11, 2017; and Samantha Cole, “Reddit Just Shut Down the Deepfakes Subreddit,” Vice Motherboard, February 7, 2018. Malicious deepfakes contribute to the broader field of dis- and misinformation. See Melissa Zimdars and Kembrew McLeod, eds., Fake News: Understanding Media and Misinformation in the Digital Age (Cambridge, MA: MIT Press, 2020), 71–73; W. Lance Bennett and Steven Livingston, eds., The Disinformation Age: Politics, Technology, and Disruptive Communication in the United States (Cambridge, UK: Cambridge University Press, 2020); and Herman Wasserman and Dani Madrid-Morales, eds., Disinformation in the Global South (London: Wiley-Blackwell, 2022).

Deeptrace estimated that the number of deepfakes rose from 7,964 on December 18, 2019, to 85,047 by December 20, 2020, and that the vast majority of them were nonconsensual pornography that targeted women. See Henry Ajder et al., The State of Deepfakes: Landscape, Threats, and Impact, Deeptrace, September 2019; and Giorgio Patrini, The State of Deepfakes 2020: Update on Statistics and Trends, Deeptrace, March 2021, 1–6.


Danielle Citron, quoted in The State of Deepfakes: Landscape, Threats, and Impact, 6. Robert Chesney and Danielle K. Citron, “Deep Fakes: A Looming Crisis for Privacy, Democracy, and National Security,” California Law Review 107 (2019): 1753–1819. See also Regina Rini, “Deepfakes and the Epistemic Backstop,” Philosopher’s Imprint 20, no. 24 (August 2020): 1–16; and Witness Media Lab’s initiative, “Prepare, Don’t Panic: Synthetic Media and Deepfakes,”


Emily van der Nagel, “Verifying images: deepfakes, control, and consent,” Porn Studies 7, no. 4 (June 2020): 424–29. See also Sophie Maddocks, “Feminism, activism and non-consensual pornography: analyzing efforts to end ‘revenge porn’ in the United States,” Feminist Media Studies 22, no. 7 (2022): 1641–56; and Sophie Maddocks, “A Deepfake Porn Plot Intended to Silence Me: exploring continuities between pornographic and ‘political’ deep fakes,” Porn Studies 7, no. 4 (2020): 415–23,


See Ali Breland, “The Bizarre and Terrifying Case of the ‘Deepfake Video’ that Helped Bring an African Nation to the Brink,” Mother Jones, March 15, 2019. Deepfake technology has also been used in controversial ways to bolster candidates’ appeal. For example, in the 2020 legislative assembly elections in Delhi, a simulation of Bharatiya Janata Party President Manoj Tiwari spoke English and the Hindi dialect Haryanvi. In South Korea’s 2022 presidential race, opposition candidate Yoon Suk-yeol successfully used a synthetic version of himself, known as “AI Yoon,” to make him look more gregarious and comfortable spouting barbed quips at his opponent. See also Nilesh Christopher, “We’ve Just Seen the First Use of Deepfakes in an Indian Election Campaign,” Vice, February 18, 2020; and Timothy W. Martin and Dasl Yoon, “These Campaigns Hope ‘Deepfake’ Candidates Help Get Out the Vote,” Wall Street Journal, March 8, 2022,


See Digital Forensic Research Lab, “Russian War Report: Hacked news program and deepfake video spread false Zelenskyy claims,” Atlantic Council, March 16, 2022; and Bobby Allyn, “Deepfake video of Zelenskyy could be ‘tip of the iceberg’ in info war, experts warn,” NPR, March 16, 2022,


See Bryan Bishop, “How Avengers: Infinity War turned Josh Brolin into an Eight-Foot Purple Madman,” The Verge, May 10, 2018; and Bill Desowitz, “Advanced De-Aging VFX Are Crucial to ‘The Irishman,’ ‘Gemini Man,’ and ‘Captain Marvel,’” IndieWire, October 1, 2019,


For an insightful and nuanced account of the AI-museum intersection, see Mihaela Mihailova, “To Dally with Dalí: Deepfake (Inter)faces in the Art Museum,” in Convergence, ed. Bode et al., 882–98; and Eliza Levinson, “How Do We Know What’s Real in the Era of the Deepfake?,” Hyperallergic, April 25, 2022,


Joshua Rothkopf, “Deepfake Technology Enters the Documentary World,” New York Times, July 1, 2020; Patricia Thomson, “Digital Disguise,” Documentary, June 30, 2020; and Craig Hight, “Deepfakes and documentary practice in an age of misinformation,” Continuum: Journal of Media & Cultural Studies 36, no. 3 (2022): 1–18. Laney used some of these same techniques in Vice’s film Faceless (2022) and the BBC documentary Hong Kong’s Fight for Freedom (2022).


Jakob Marovt, “How we made David Beckham speak 9 languages,” Synthesia, April 26, 2023. An extension of this campaign involved a film that synthetically aged Beckham. Also see Sara Spary, “David Beckham digitally aged into his 70s in combating malaria campaign,” CNN News, December 3, 2020,


Beatriz García, “‘Clarify this Crime’: Mexican journalist resurrected with artificial intelligence,” Al Día, October 30, 2020,


Satire has its origins in the oral poetry and pictorial writings of Ancient Egypt, Rome, Greece, and Persia. See Jonathan Greenberg, The Cambridge Introduction to Satire (Cambridge, UK: Cambridge University Press, 2018), 3–51; and John T. Gilmore, Satire: the New Critical Idiom (London: Routledge, 2017), 1–17.


Samuel Johnson, Johnson’s Dictionary: A Modern Selection, ed. E.L. McAdam Jr. and George Milne (Mineola, NY: Dover, 2005), 357.


Mikhail Bakhtin, The Dialogic Imagination: Four Essays, ed. Michael Holquist (Austin: University of Texas Press, 1981), 21.


Bakhtin, The Dialogic Imagination, 23.


Jonathan Gray, Jeffrey P. Jones, and Ethan Thompson, eds., Satire TV: Politics and Comedy in the Post-Network Era (New York: New York University Press, 2009), 12–13.


Danielle Fuentes Morgan, “Introduction: The Satirical Mode and African American Identity,” in Laughing to Keep from Dying: African American Satire in the Twenty-First Century (Champaign: University of Illinois Press, 2020), 1–3.


Dannagal Goldthwaite Young, Irony and Outrage: The Polarized Landscape of Rage, Fear, and Laughter in the United States (Oxford, UK: Oxford University Press, 2020), 69–84.


James E. Caron, “The Quantum Paradox of Truthiness: Satire, Activism, and the Postmodern Condition,” Studies in American Humor 2, no. 2 (2016): 156. See also James E. Caron, “Satire Today: An Introduction to the Special Issue,” Studies in American Humor 5, no. 1 (2019): 6–12.


Authoritarian leaders and the right-wing extremist movements that sustain them use the term “satire” as a sword and a shield. They maliciously weaponize humor in the form of speeches, memes, gifs, and re-edited videos to launch personal attacks or spread misleading claims. These forms of media often take aim at the marginalized and vulnerable, manifesting as homophobic, xenophobic, and misogynistic messaging. Comedy scholars Caty Borum Chattoo and Lauren Feldman describe this as “punching down” rather than “punching up.” See Chattoo and Feldman, A Comedian and an Activist Walk into a Bar, 147. See also AFP, “Philippines’ Duterte: UN pull-out threat a ‘joke,’” Guardian, August 24, 2016; Aaron Rupar, “Trump Says ‘Russia, if you’re listening’ was a joke. There’s tape to prove otherwise,” Vox, September 29, 2020; and Nick Fuentes, quoted in Tom Dreisbach, “How Extremists Weaponize Irony to Spread Hate,” NPR, April 26, 2021,


Jordan Peele’s Monkeypaw Productions and Jonah Peretti from BuzzFeed News, You Won’t Believe What Obama Says In This Video!, April 17, 2018, video. See also Aja Romano, “Jordan Peele’s simulated Obama PSA is a double-edged warning against fake news,” Vox, April 18, 2018; and David Mack, “This PSA About Fake News From Barack Obama Is Not What It Appears,” BuzzFeed News, April 17, 2018. Also, see AI artist Bob de Jong and voice actor Boet Schouwink’s deepfake, This is Not Morgan Freeman–A Deepfake Singularity, video.


Lisa Guerrero, “Can I Live? Contemporary Black Satire and the State of Postmodern Double Consciousness,” Studies in American Humor 2, no. 2 (2016): 270; and Matt Fotis, Satire & the State: Sketch Comedy and the Presidency (New York: Routledge, 2020), 223–24.


What gives these videos some degree of legal protection as free speech is the public status of those depicted. Matthew F. Ferraro and Evelyn Aswad, “Part III: The Legal View,” in Ajder and Glick, Just Joking!


See “Deepfake Roundtable: Cruise, Downey Jr., Lucas & More—The Streaming Wars,” Collider, November 11, 2019, video; and James Vincent, “A Celebrity Deepfake Roundtable with Tom Cruise and Jeff Goldblum is as weird as it sounds,” The Verge, November 18, 2019.


See Drew Ayers, “The limits of transactional identity: Whiteness and embodiment in digital facial replacement,” Convergence 27, no. 4 (2021): 1022; and Susan Jeffords, Hard Bodies: Hollywood Masculinity in the Reagan Era (New Brunswick, NJ: Rutgers University Press, 1994), 1–30.


Building on the mid-2010s call for media reform (#OscarsSoWhite, #MeToo, etc.), Yu says that #StarringJohnCho was inspired by a series of incidents: whitewashed casting in Hollywood, Chris Rock’s racist jokes at the 2016 Oscars, and the results of UCLA’s Hollywood Diversity Report, which stated that actors of Asian descent played only 1% of lead roles in Hollywood. Cho appeared opposite Emilia Clarke in a poster advertisement for Me Before You (2016, directed by Thea Sharrock), in place of Chris Pratt in Avengers: Age of Ultron (2015, directed by Joss Whedon), and in place of Matt Damon in The Martian (2015, directed by Ridley Scott). Cho himself gave a shout-out to the campaign (and went on to star in Searching [2018, directed by Aneesh Chaganty] and series such as Cowboy Bebop [2021]). Director Jon M. Chu cited #StarringJohnCho as part of what inspired him to make Crazy Rich Asians (2018).


William Yu, #SeeAsAmStar Playlist, 2018, video.


Scarlett Johansson was cast as Major Motoko Kusanagi in Ghost in the Shell (2017, directed by Rupert Sanders) and Tilda Swinton played a Tibetan sorcerer, the Ancient One, in Marvel’s Doctor Strange (2016, directed by Scott Derrickson). See Ralph J. Bunche Center for African American Studies, UCLA Hollywood Diversity Report; Katie Rogers, “John Cho, Starring in Every Movie Ever Made? A Diversity Hashtag is Born,” New York Times, May 10, 2016; and Steve Rose, “Ghost in the Shell’s whitewashing: does Hollywood have an Asian problem?,” Guardian, March 31, 2017.


Yu quoted in Eric Francisco, “Hollywood Won’t Cast Asian-American Stars, but A.I. Machine Learning Can,” Inverse, May 17, 2018. See also Patrice Peck, “Asian Actors Can Play Any Hollywood Role, and the Incredible #SeeAsAmStar Campaign Proves It,” BuzzFeed, May 14, 2019; and Inkoo Kang, “Starring John Cho as Captain America,” Slate, May 7, 2018.


Keith Chow, “#StarringJohnCho Comes to Life in New York City Art Show,” Nerds of Color, May 17, 2019. See also William Yu’s website.


William Yu, “How I Used Deepfake Tech To Make The Case for An Asian American Movie Star,” Medium, June 14, 2018.


Bill Posters, account of Spectre (2019).


Naomi Rea, “Artists Create a Sinister ‘Deepfake’ of Mark Zuckerberg to Teach Facebook (and the Rest of Us) a Lesson About Digital Propaganda,” Artnet News, June 12, 2019.


Bill Posters, in “The Art of Interrogation: An Interview with Bill Posters,” Juxtapoz, June 19, 2020.


Big Dada served as a provocative test case for Facebook’s own policies against removing incendiary content from their platform. See Jonathan Shieber, “Facebook will not remove deepfakes of Mark Zuckerberg, Kim Kardashian and others from Instagram,” TechCrunch, June 11, 2019. See also “‘Faking the Powerful’ with Deepfakes: with Bill Posters, Daniel Howe and Stephanie Lepp,” Deepfakery Episode 1, produced by Katerina Cizek and Sam Gregory, August 28, 2020, video; Bill Posters’s website; and Bill Posters, “Spectre: a Detour Into Dataism,” June 16, 2019, video.


CNN Business, “How Facebook plans to fight election interference,” September 21, 2017, video.


Bill Posters, “Imagine This…,” 2019.


See Cade Metz, “A Fake Zuckerberg Video Challenges Facebook’s Rules,” New York Times, June 11, 2019; Luke O’Neil, “Doctored video of sinister Mark Zuckerberg puts Facebook to the test,” Guardian, June 11, 2019; and Samantha Cole, “This Deepfake of Mark Zuckerberg Tests Facebook’s Fake Video Policies,” Vice, June 11, 2019.


Bill Posters and anonymous Brazilian collective, GREEN HEART, Instagram post, October 1, 2019.


The Fakening (Paul Shales), “Xi Jinping as Winnie the Pooh Dancing to Bat Out of Hell by Meatloaf,” May 11, 2020, video. For more videos by The Fakening, see


The Fakening (Paul Shales) de-aged members of The Strokes for their Bad Decisions music video and face-swapped Diplo with Mark Wahlberg for his ’90s-era Calvin Klein underwear shoot. Jacob Kastrenakes, “When Diplo and The Strokes need a deepfake, they go to this guy,” The Verge, March 4, 2020.


See Benjamin Haas, “China Bans Winnie the Pooh film after comparisons to President Xi,” Guardian, August 6, 2018; James Fallows, “Update on Pooh, Tigger, and the 2 Presidents: Art Recreates Life, Not Vice Versa,” Atlantic, June 13, 2013; and Megha Rajagopalan and Talal Ansari, “China Has Moved to Get Rid of Presidential Term Limits,” BuzzFeed News, March 11, 2018.


Bruno Sartori, “Bolsonaro canta para Trump Without You” [Bolsonaro sings “Without You” for Trump], October 11, 2019, video.


Fernanda Canofre, “A Brazilian journalist uses deepfake to make political satire,” Global Voices, August 12, 2019. See also Raphael Tsavkko Garcia, “Deepfakes Are Being Used to Puncture Politicians’ Bluster,” Medium, December 5, 2019.


Maria Laura Canineu and César Muñoz, “The Toll of Bolsonaro’s Disastrous Covid-19 Response,” Human Rights Watch, October 27, 2021. See also “Lula canta Mariah Carey–Obsessed” [Lula sings Mariah Carey’s “Obsessed”], which mocks Bolsonaro’s unhealthy preoccupation with demonizing Lula (Luiz Inácio Lula da Silva), October 26, 2019.


Bruno Sartori in conversation with Victor Ribeiro, “Bruno Sartori: Deepfakes as satire and parody in Brazil,” Deepfakery, Episode 7, produced by Witness, November 9, 2020, video.


See Peter Pomerantsev, Nothing is True and Everything Is Possible: The Surreal Heart of the New Russia (New York: PublicAffairs, 2014); Masha Gessen, “Inside Putin’s Propaganda Machine,” New Yorker, May 18, 2022; Timothy Snyder, The Road to Unfreedom: Russia, Europe, America (New York: Crown, 2018); and Rina Chandran and Angelina Davydova, “Behind Russia’s ‘digital iron curtain,’ tech workarounds thrive,” Reuters, March 23, 2022.


Ukrainian media network Babylon ’13 formed during the 2013–14 pro-democracy protests in the Maidan, a massive uprising against then President Viktor Yanukovych’s rejection of the EU along with Russia’s attempts to absorb Ukraine into their geopolitical sphere of influence. Some of their earliest films were composed of eyewitness footage of the trauma unleashed by advancing troops and bombings. The films then expanded into more elaborate projects that braided together eyewitness footage with interviews, commentary, and archival images. See Masha Shpolberg, “Forging a Nation Under Fire,” and Dale Hudson and Patricia R. Zimmermann, “Small Media of Urgent Utility,” Docalogue, May 2022; and Patricia R. Zimmermann, “Cannes, Zelenskyy, Social Media, and the Mise en Abyme of Independent Media,” The Edge, May 30, 2022.


Deepfakes of Putin appeared online before the invasion of Ukraine. For example, the anti-corruption nonprofit RepresentUS released a PSA-style deepfake of Putin before the 2020 US election, warning against disinformation, video. Volodymyr Tykhyy, a member of Babylon ’13, crafted a satirical face-swap based on his feature-length fiction film, Lethal Kittens (2020). The deepfake Hollywood Embraces Lethal Kittens (2022) showcases a bevy of deepfaked Hollywood stars encouraging viewers to join the Ukrainian freedom struggle, video.


Bestwishes AI, Putin Deepfake: #StandWithUkraine, March 2, 2022, video.


Kremlin Official, “Putin Covers Radiohead’s ‘Creep,’” March 21, 2017, video.


thehardme, [deepfake] You’ll Be Back from Hamilton Starring Vladimir Putin, March 6, 2022, video.


Trey Parker and Matt Stone with Peter Serafinowicz, Sassy Justice With Fred Sassy, Deep Voodoo studio, October 26, 2020, video.


Peter Serafinowicz took cues from the 1993 Saturday Night Live sketch, “Sassy’s Sassiest Boys.” Additionally, Serafinowicz had first used this voice for his Sassy Trump YouTube montage, video. His redubbing of Trump speeches went viral, celebrated for its playful trolling of the president. Serafinowicz also served as the voice actor for numerous South Park characters and performed in live-action films and series (for example, Shaun of the Dead [2004, directed by Edgar Wright] and The Tick [2016, created by Ben Edlund]).


Trey Parker, quoted in Dave Itzkoff, “The ‘South Park’ Guys Break Down Their Viral Deepfake Video,” New York Times, October 29, 2020. See also Randall Colburn, “The creators of South Park launched an entire deepfake studio to make Sassy Justice,” The A.V. Club, October 30, 2020.


Sassy Justice served as a springboard for Stone and Parker to partner with Kendrick Lamar on The Heart Part 5 music video (2022). As Lamar raps, his face morphs into African American media figures, including O.J. Simpson, Jussie Smollett, Nipsey Hussle, and Kobe Bryant. Lamar’s lyrics explore the personal experiences and creative work of the individuals he briefly embodies, reflecting on the fraught expectations of celebrity and the pressures placed on high-achieving people of color. The constant shift in tone and address mirrors the visual slippages between personas, as Lamar expresses ambivalence, struggling to reconcile the complexities of these figures’ lives with their professional achievements. See Marc Hogan, “How Kendrick Lamar’s ‘The Heart Part 5’ Video Subverts Deepfake Technology,” Pitchfork, May 9, 2022.


The series Deep Fake Neighbour Wars (2023) expands on some of the themes of Sassy Justice and boasts notably high production values. The ITVX-created series is a sendup of the aggressive push of commercial nonfiction toward reality TV. The show simulates melodramatic encounters between incongruous pairings of celebrities. Name-brand movie stars, athletes, influencers, and musical artists are plucked from their lavish lifestyles and reassigned to a working-class job and a modest residence. See Dominic Lees, “Deep Fake Neighbour Wars: ITV’s comedy shows how AI can transform popular culture,” The Conversation, January 27, 2023.


Werner Herzog, quoted in Roger Ebert, “Herzog’s Minnesota Declaration: Defining ‘ecstatic truth,’” April 30, 1999.


The author would like to thank Afterimage editor Karen (Ren) vanMeenen along with the journal’s two outside readers for their valuable feedback on this article. Deep gratitude goes to Katerina Cizek, Sam Gregory, William Uricchio, Sarah Wolozin, and Henry Ajder for our stimulating conversations about this topic during my time at MIT’s Open Documentary Lab/Co-Creation Studio. Thanks also go to Mihaela Mihailova and Lisa Bode for their sharp insights about the history of deepfakes and the landscape of synthetic media. Annie Berke offered generous and helpful comments regarding the progressive dimensions of contemporary comedy.