The NO FAKES Act is a Necessary First Step to Uniform Protection Against Deepfakes

The prevalence of unauthorized deepfakes is an urgent issue, especially in the lead-up to and the aftermath of the 2024 election. In addition to celebrities being falsely represented in a campaign ad and by an AI voice assistant, sexually explicit deepfakes are victimizing girls across the country through the use of “nudification” apps. A recent study found that 98 percent of deepfake videos available online were pornographic, and 99 percent of those were of women and girls.

Given the substantial harms associated with deepfakes, the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act currently pending in Congress is a necessary step toward the uniform legal remedies that state laws and private agreements have failed to provide. A person’s fundamental right to control their own identity should not depend on where they live. As the first federal recognition of the right of publicity, the NO FAKES Act creates a uniform private right of action against the unauthorized use of deepfakes. Deepfakes (or digital replicas) are digitally created or manipulated images, videos, or audio recordings that falsely represent an individual’s voice or visual likeness. While deepfakes have beneficial uses, their misuse can have drastic consequences for our democracy and personal autonomy.

The NO FAKES Act can be improved in three ways. First, the safe harbor exemptions for developers and distributors of deepfake-generating programs and for online service providers (OSPs) should be limited in the context of sexually explicit deepfakes to provide adequate remedies to those injured by the most common use of deepfakes. Second, licensing terms should be more strictly limited to prevent the commercial exploitation of personal identity. Finally, statutory damages should be raised for individual and non-OSP defendants to strengthen the incentive to litigate and provide more effective deterrence against the unauthorized use of deepfakes.

The NO FAKES Act Provides Necessary Uniform Protection for Individuals
Most rights holders are unlikely to get adequate relief under existing federal or state law because such protection varies from jurisdiction to jurisdiction and is typically incomplete. Federal laws like the Copyright Act and the Communications Act allow legal recourse against unauthorized deepfakes only in specific circumstances, such as when the deepfake infringes a copyrighted work or is involved in telecommunications. States vary in the extent of their right-of-publicity laws, with Tennessee providing some of the strongest protections. The state recently enacted the ELVIS Act, which expands the right of publicity to cover deepfakes and extends liability to the distribution or transmission of technology intended to generate replicas of an individual’s voice or likeness. Other states regulate deepfakes only in specific instances, such as deepfakes of deceased people and sexually explicit deepfakes of minors. Additionally, some private agreements, like the recent SAG-AFTRA collective bargaining agreement in the entertainment industry, allow only limited control over the use of deepfakes. As a result, most parties harmed by deepfakes have limited or nonexistent avenues to legal relief.

A federal right of action is necessary to create uniform protection against unauthorized deepfakes. The bipartisan NO FAKES Act, introduced in July 2024 with broad support from advocacy groups and industry leaders, is an important first step that recognizes the “right to authorize the use of the voice or visual likeness of [an] individual in a digital replica.” The bill excludes uses protected by the First Amendment and limits liability to activities affecting or using interstate commerce, but these limitations are unavoidable constitutional constraints. The First Amendment exception also does not extend to sexually explicit content, which removes a barrier to litigation for the people most commonly injured by unauthorized deepfakes. Additionally, the interstate-commerce nexus is generally easy to establish: courts require only a showing that the deepfake was communicated over the internet or that the communication crossed state lines, which will be true in most cases.

The NO FAKES Act Should Limit Safe Harbor Exemptions for Sexually Explicit Deepfakes
The safe harbor provision of the bill exempts certain intermediaries from liability: developers and distributors of products and services capable of producing digital replicas, OSPs referring or linking to an unauthorized digital replica, and OSPs hosting user-uploaded material. There should be a carve-out from those exemptions for sexually explicit deepfakes because, in those cases, plaintiffs’ burden of overcoming the safe harbor is disproportionate to the harm suffered.

To hold a developer or distributor liable, the NO FAKES bill requires a showing that the program is primarily designed or promoted to produce unauthorized deepfakes, which places a substantial burden on plaintiffs. While product developers generally should not be held liable for third parties’ actions, developers and distributors of programs that are designed for, or commonly used to generate, sexually explicit deepfakes should know that users will likely use their programs to produce unauthorized content online. Because nonconsensual sexually explicit deepfakes constitute the vast majority of deepfakes available online, a separate provision should impose liability for programs intended to produce unauthorized sexually explicit deepfakes, and for programs commonly known to be used for that purpose, when the developer or distributor takes no reasonable precautions to prevent that type of use.

Under the current iteration of the NO FAKES Act, OSPs are exempt from liability if they remove an unauthorized deepfake after receiving a notice-and-takedown request. However, removal from a single service is unlikely to eliminate the deepfake from the internet, and the harm can continue indefinitely. OSPs should be required to proactively police sexually explicit deepfakes. For example, the bill could require, similar to the Digital Millennium Copyright Act (DMCA), that OSPs adopt and implement a reasonable policy (e.g., an automatic filtering system) for removing unauthorized sexually explicit deepfakes and terminating the accounts of repeat offenders.

Limiting safe harbor exemptions in these ways would encourage intermediaries to prevent nonconsensual sexually explicit deepfakes from reaching the internet in the first place.

The NO FAKES Act Should Provide Stricter Limitations on Licenses
While the bill addresses some ethical concerns raised by experts, it does not go far enough to protect a person’s identity from commercial exploitation by a third party. At a congressional hearing on the bill, University of Pennsylvania law professor Jennifer Rothman emphasized the threat to a person’s identity if corporations and individuals are given the opportunity to commodify it. Describing name, likeness, and voice as “indicia of identity,” Rothman argued that allowing the permanent transfer of the right of publicity is akin to involuntary servitude. To prevent that dangerous possibility, the NO FAKES Act makes the right to authorize digital replicas non-assignable during the principal’s lifetime and requires that a license granted by a living adult be in writing, last no more than ten years, and include a “reasonably specific description of the intended uses” of the deepfake.

A “reasonably specific description of the intended uses” is an ambiguous standard that could permit descriptions of general categories of uses rather than of specific uses. Furthermore, ten years is a long time for a person to remain uncertain about what their licensee may be empowered to do under the license terms, especially when courts’ interpretations of “reasonably specific” may shift during that period. The provision should be reworded to require that each license describe each intended use with reasonable specificity and be limited to a term of no more than five years. With these requirements, the principal can give knowing consent to the use of their identity and preserve their personal autonomy.

The NO FAKES Act Should Increase Statutory Damages for Individual and Non-OSP Defendants
The bill provides the following remedies: actual or statutory damages, injunctive relief, and, in the case of willful misconduct, punitive damages. Statutory damages are capped at $5,000 per deepfake for individual defendants, $25,000 per deepfake for defendants that are non-OSP entities, and $5,000 per violation (i.e., display, copy, or transmission) for OSP defendants.

Prevailing plaintiffs receive attorney’s fees, which can expand access to litigation for people with limited resources. However, the risk of losing presents a significant barrier for many individuals, especially given the potentially lengthy litigation process. The potential damages for OSPs could be substantial because a single deepfake could give rise to any number of violations, but plaintiffs would first have to overcome the substantial hurdle of the safe harbor provision. In addition, because there is no liability if the OSP takes reasonable steps to remove the unauthorized deepfake after being notified of its existence, plaintiffs are unlikely to get adequate relief from OSPs. Considering these barriers and the uncertainty of actual and punitive damages, statutory damages for individual and non-OSP defendants should be increased to $25,000 and $50,000, respectively, to provide more incentive to litigate. The increase would also create more effective deterrence at the source: individuals and non-OSP entities would be less likely to post unauthorized deepfakes online if the risk of monetary loss were greater.

Conclusion
The NO FAKES Act is an essential first step toward creating uniform relief for unauthorized deepfakes. To improve the bill, Congress should limit safe harbor exemptions for programs that generate nonconsensual sexually explicit deepfakes and for OSPs that link to or host that content; further restrict licensing terms; and increase statutory damages for individual and non-OSP defendants. These measures would provide more equitable remedies and target the source of the most serious harms associated with deepfakes.

Charlotte Brownell

GLTR Staff Editor; Georgetown University Law Center, J.D. expected 2026; University of Edinburgh, M.A. 2022.