From Deepfake Drake to the SAG-AFTRA strike, 2023 shed light on a key source of anxiety among talent in the entertainment industry: how will generative AI impact the right of publicity over talent’s likeness? As innovative technology companies push the boundaries of what generative AI can do, talent and their representatives seek to protect the inherent value of their likeness.

Talent and their representatives are no strangers to the legal issues surrounding the protection of name, image, likeness, voice, and other identifiable elements of a public-facing person. Half of U.S. states have statutes protecting an individual’s right of publicity, and more recognize some form of common law protection of such a right, such as a common law tort for invasion of privacy or misappropriation of name or likeness.

For example, California provides a civil claim for unauthorized use of an individual’s “name, voice, signature, photograph, or likeness” on or in “products, merchandise, or goods,” or for “purposes of advertising or selling.” (Cal. Civ. Code § 3344.) California’s statutory scheme also includes a post-mortem right extending 70 years after death (Cal. Civ. Code § 3344.1), and California law provides that these statutory rights are in addition to a common law right of publicity that predates the statutes.

Meanwhile, Texas provides by statute a post-mortem property right in “name, voice, signature, photograph, or likeness” lasting 50 years after death, but has no statute protecting against misappropriation while the individual is living. Although its statute is limited to post-mortem rights, Texas maintains a common law right to privacy and requires a plaintiff to establish three elements of misappropriation: (1) appropriation of the plaintiff’s “name or likeness for the value associated with it”; (2) identifiability of the plaintiff from the publication; and (3) that the defendant received “some advantage or benefit.”

Despite the patchwork of right of publicity statutes across the country, and despite the practical overlap with related intellectual property regimes regulated at the federal level such as trademark and copyright, there is no federal right of publicity law. Calls for one predate AI by more than one might expect: in 1888, a bill was introduced in Congress seeking to prohibit “the use of likeness, portraits, or representations of females for advertising purposes without consent in writing.”

Current Proposed Acts: NO FAKES Act and No AI FRAUD Act

With increased attention to issues surrounding generative AI, Congress has been taking a closer look at federal privacy legislation. In 2023, a group of senators circulated a discussion draft of a bill titled the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act” (NO FAKES Act). While the proposed legislation generated excitement, it was criticized as overly broad in its protection and as lacking clarity about enforcement by individuals versus those granted rights.

In the House, the “No Artificial Intelligence Fake Replicas and Unauthorized Duplications Act of 2024” (No AI FRAUD Act) was introduced in January. Unlike the tort- and privacy-based right created under most state laws, this proposed bill creates a “property” right in a person’s likeness and voice. This covers a person’s actual voice or likeness as well as a simulation that is “readily identifiable” either from the simulation itself or “from other information displayed in connection with the likeness.” The proposed federal laws would not preempt existing state laws, so long-standing state precedent would continue to apply, raising the potential for significant conflicts between federal and state protections. The No AI FRAUD Act also addresses the Section 230 protections for internet service providers, expressly providing that those protections do not shield against liability under the bill, meaning there is no safe harbor in the current draft.

Proposed No AI FRAUD Act Overview

The proposed No AI FRAUD Act extends beyond the scope of the NO FAKES Act, allowing for action against anyone who (1) creates technology whose “primary purpose or function” is to produce digital voice replicas or digital depictions of particular, identified individuals; (2) “publishes, performs, transmits, or otherwise makes available to the public a digital voice replica or digital depiction” with knowledge that it was unauthorized; or (3) “materially contributes to, directs, or otherwise facilitates” either of those activities. The bill aims to impose greater liability for unauthorized digital reproductions of an individual’s image, voice, and visual likeness in audiovisual works or sound recordings.

The bill provides for statutory damages of $50,000 for cloning services, and $5,000 for digital depictions and digital voice replicas, or, in either case, recovery of actual damages and attributable profits. The proposed bill also allows for punitive damages and reasonable attorneys’ fees. Possible harms that can lead to damages include “financial or physical injury” or “elevated risk” of either, “severe emotional distress,” or “deception of the public, a court, or tribunal.”

The proposed bill allows suit to be brought by those holding an “exclusive personal services” contract with an individual as a recording artist, or an exclusive license to distribute sound recordings that capture the individual’s audio performances. Thus, entities such as labels or unions could act on behalf of the person whose likeness has been appropriated. This provision could enable group licensing of voice or likeness by entities holding licensing rights for multiple individuals. The proposed bill also makes the property rights it creates “freely transferable,” which could lead to long-term ownership interests akin to selling rights in a publishing catalog. There are concerns, however, about the lack of limitations on transfers in perpetuity, which could result in talent entering predatory contracts that sign away their voice or likeness rights. These issues are not addressed in the current draft, but it is hoped that final versions will consider them.

While predatory contracts are not currently contemplated, the proposed bill does include protections for minors. For instance, the proposed bill limits authorized “digital depiction[s]” or “digital voice replica[s]” to those where the individual was represented by counsel or is over the age of 18, unless a court approves the agreement or a collective bargaining agreement applies.

An additional feature of the proposed bill is the extension of post-mortem rights to “executors, heirs, transferees, or devisees” for an initial period of ten years after death. This period may terminate if there is no continuing commercial use, though the exact mechanism for termination is not specified.

Both the circulated NO FAKES Act in the Senate and the No AI FRAUD Act in the House make clear that disclaimers are not a defense to unauthorized depictions, voice replications, or cloning services.

As with defamation claims, the First Amendment is named as a potential defense to a cause of action under the proposed No AI FRAUD Act. However, the proposed act includes a vague and potentially unconstitutional balancing test in which the “public interest in access to the use shall be balanced against the intellectual property interest in the voice or likeness.” The bill outlines factors to consider, resembling a fair use defense, such as whether the use is “commercial,” whether it is “necessary for and relevant to the primary expressive purpose of the work,” and whether the “use competes with or otherwise adversely affects the value of the work of the owner or licensee of the voice or likeness rights.”

The bill also considers whether the use is “transformative,” a term not defined in the bill but used by some courts in right of publicity cases in the context of First Amendment defenses. The transformative-use concept originated in copyright’s fair use doctrine and has likely been narrowed in the publicity context following the Supreme Court’s decision in Andy Warhol Foundation for the Visual Arts, Inc. v. Goldsmith, 598 U.S. 508 (2023). The bill suggests that uses are permitted if they are “constitutionally protected commentary on a matter of public concern.”

The bill remains a hot bipartisan topic and is expected to undergo many iterations in the coming months. MWM’s Entertainment team and AI and Machine Learning team will continue to monitor developments as the bill reaches a final version or any similar bill is passed.