Over the past several years, both Congress and state legislatures have introduced legislation concerning “digital replicas,” or deepfakes created by AI. Many of these bills have been aimed at protecting the name-image-likeness (NIL) rights of actors, singers, celebrities, and others. This article compares two versions of the federal NO FAKES bill with recently adopted California statutes.
NO FAKES Act of 2023
The “Nurture Originals, Foster Art, and Keep Entertainment Safe” (NO FAKES) Act of 2023 was introduced in October 2023 with the goal of establishing a property right in a digital replica. The bill received support from the recording industry (RIAA) and the actors’ union (SAG-AFTRA).
The bill defined a “digital replica” as a “newly created, computer-generated, electronic representation of the image, voice, or visual likeness of an individual that is [nearly indistinguishable] from the actual image, voice, or visual likeness of that individual; fixed in a sound recording or audiovisual work in which that individual did not actually perform or appear,” and defined an “individual” as “a human being, living or dead.” It defined “visual likeness” as the “actual visual image or likeness of an individual … that is readily identifiable” as the likeness of that individual.
The bill created a property right to authorize the use of an image or likeness in a digital replica, which would survive the individual and be inheritable for 70 years after death, and created a cause of action for unauthorized production or dissemination of digital replicas. Remedies would have included the greater of $5,000 per violation or actual damages.
Notably, the bill contained a disclaimer of preemption, explicitly leaving state laws in place. Additionally, the bill would have specified that the rights created were “intellectual property,” making Section 230 of the Communications Act inapplicable (per 47 U.S.C. § 230(e)(2)).
California Legislation
California AB 2602, signed into law on September 17, 2024, renders void as against public policy any contractual provision that allows for the use of a digital replica of an individual if (1) the replica is used in place of work the individual would otherwise have performed in person, (2) the provision does not include a reasonably specific description of the intended uses of the digital replica, and (3) the individual was not represented by legal counsel or by a union whose collective bargaining agreement covers digital replicas. The law applies to new performances after January 1, 2025 (but presumably covers older contracts that may govern such new performances).
California AB 1836, passed on August 31, 2024, signed into law on September 17, 2024, and likewise supported by RIAA and SAG-AFTRA, prohibits the use of a digital replica of a deceased personality’s voice or likeness without prior consent from the deceased personality’s representatives. As under the 2023 NO FAKES bill, the right lasts for 70 years after the personality’s death.
Both laws define “digital replica” as “a computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual that is embodied in a sound recording, image, audiovisual work, or transmission in which the actual individual either did not actually perform or appear, or the actual individual did perform or appear, but the fundamental character of the performance or appearance has been materially altered.” Terms such as “highly realistic,” “readily identifiable,” “fundamental character,” and “materially altered” are not further defined. In both laws, “digital replica” is also defined to exclude electronic reproduction, sampling, digital remastering, and the like, as authorized by the copyright holder of the work.
NO FAKES Act of 2024
Even while the California bills were pending, Senator Coons introduced the NO FAKES Act of 2024 (S. 4875) in July 2024. S. 4875 expands significantly beyond what would have been covered by the NO FAKES Act of 2023 and what is covered by the two California laws. It has received support from OpenAI, The Walt Disney Company, Warner Music Group, the Authors Guild, RIAA, the Motion Picture Association (MPA), Universal Music Group, and SAG-AFTRA.
Its definition of “digital replica” pulls elements from both the NO FAKES Act of 2023 and the California laws. Generally speaking, it tracks the California definition but adds “newly created” as a requirement.
It would create a property right to authorize the use of an individual’s likeness in a digital replica, a right that would continue to exist after the individual’s death. However, rather than a flat 70-year post-mortem term, the post-mortem right would last for an initial 10-year period and could be renewed for successive 5-year periods, up to 70 years total, only upon ongoing proof of “active and authorized public use.” Post-mortem rights would also have to be registered with the Register of Copyrights.
Although it creates civil liability for violation of the property right, liability arises only if the person using the digital replica knew that it was a digital replica and that it was unauthorized. The bill also creates safe harbors for (a) products and services capable of producing digital replicas, (b) referral or linking, and (c) online services hosting user-uploaded material. It authorizes civil remedies for the greater of specified dollar amounts or actual damages, and it places several other limitations on liability.
Importantly, the 2024 bill contains a provision preempting any cause of action under state law for the protection of an individual’s voice and likeness rights in connection with digital replicas. However, the preemption provision itself has three carve-outs: state statutes in existence as of January 2, 2025, state laws regulating deepfake pornography, and state laws regulating products or services capable of creating digital replicas. Additionally, like the 2023 bill, S. 4875 specifies that the rights created are “intellectual property,” making Section 230 of the Communications Act inapplicable (per 47 U.S.C. § 230(e)(2)).
Conclusion
The era of AI-generated deepfakes has exposed gaps and shortcomings in the traditional regulation of intellectual property rights and rights of publicity. Reading these bills in sequence sheds light on some of the challenges in clearly defining the rights that need to be protected, and on the risks of unintended consequences in that regulation.
As one example, consider the definition of “digital replica,” some version of which appears in all four bills. Several questions arise:
- How new is “newly-created”?
- How far afield from photorealism must an image be in order to fall shy of “highly realistic”?
- In the era of “you sound just like …” or “you look just like …” what does it mean to be “readily identifiable” as an individual? Must that determination be made solely with reference to the digital replica itself, or can other, suggestive, contextual evidence be considered?
- How should the “fundamental character” of a performance be determined?
Clarifying the legal framework in this area is important. Getting it right, and relatively complete, may take some time—even as the technology evolves and advances at lightning speed.