Last year, a widely read technology blog turned heads with the deeply disturbing headline: “We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now.”1 While deliberately provocative, the headline was, and remains, unfortunately true. An unnamed individual on the popular discussion board Reddit superimposed the faces of celebrities such as Gal Gadot (Wonder Woman), Maisie Williams (Game of Thrones), and Daisy Ridley (Star Wars) onto the bodies of adult video stars in pornographic films.2 That Reddit poster’s handle, or moniker, was “deepfake.” Hence, the term deepfake now refers to a video that superimposes hyper-realistic faces onto the bodies of others with the intent of creating a new video with fake representations.3

The initial Reddit post containing the altered video led to the proliferation of computer-generated pornographic videos starring anyone and everyone. As The Atlantic correctly sums up, “[i]n a dank corner of the internet, it is possible to find actresses from Game of Thrones or Harry Potter engaged in all manner of sex acts.”4

Importantly, this image-based technology, which is simply an intelligent algorithm (explained in more depth infra), can perform similar mimicry with audio. In other words, it can match one’s vocal tone and speech pattern to user-generated scripts, à la lip-synching.

The purpose of this article is not to debate the morality of this technology and whether it ought to be legal to purchase or download; this article, instead, leaves these decisions to ethicists and, ultimately, policymakers.5 The article also does not attempt to address national security or political questions that this technology raises.6 Rather, this article endeavors to discuss the remedies available to private victims of this technology. Put plainly, how can Daisy Ridley et al. pursue legal recourse against their digital manipulators?
