Proposed FTC Rule Would Hold AI Companies Liable for “Deepfake” Impersonation Scams — AI: The Washington Report
Thursday, February 29, 2024
  1. After over two years of consultation and review, on February 15, 2024, the Federal Trade Commission (FTC or Commission) finalized a rule prohibiting the impersonation of government agencies or businesses in matters “in or affecting commerce…”
  2. According to Chair Lina Khan, however, advances in AI had rendered the finalized rule insufficient. Accordingly, the FTC has proposed a supplemental rule that would make it a violation of the FTC Act to “materially and falsely pose as, directly or by implication, an individual, in or affecting commerce” or to “materially misrepresent, directly or by implication, affiliation with, including endorsement or sponsorship by, an individual, in or affecting commerce…”
  3. The proposed rule would also declare it “unlawful for a firm, such as an AI platform that creates images, video, or text, to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation.” This provision is significant because, if it is included in the final rule, enforcement agencies may expand the number of cases in which AI companies are held liable for unlawful conduct facilitated by their products and services.

Advances in AI Cause FTC to Initiate a Supplemental Rulemaking on Impersonation Fraud

In December 2021, the Federal Trade Commission (FTC or Commission) published an advance notice of proposed rulemaking for a rule that would prohibit the impersonation of government agencies or businesses. On February 15, 2024, the FTC finalized this rule with slight modifications.

However, advances in AI technology over the intervening two years had, in the view of the Khan Commission, rendered the rule insufficient. As such, on the same day that the Commission finalized the rule on government and business impersonation, the FTC published a Supplemental Notice of Proposed Rulemaking (SNPR) that would prohibit the impersonation of an individual in a matter affecting commerce.

Significantly, the proposed rule would also prohibit the provision of goods or services “with knowledge or reason to know” that they will be used in the impersonation of government agencies, businesses, or individuals. This provision is explicitly intended to crack down on any generative AI service that, in the words of the Commission, “threatens to turbocharge” the issue of impersonation fraud.

FTC Finalizes Trade Regulation on the Impersonation of Government Entities and Businesses

Following up on a December 2021 advance notice of proposed rulemaking, on February 15, 2024, the Commission published its final “Trade Regulation Rule on Impersonation of Government and Businesses” (“impersonation rule”).

Under the final rule, it is considered an unfair or deceptive act or practice, and thus a violation of the FTC Act, to “materially and falsely pose as, directly or by implication, a government entity or officer thereof, in or affecting commerce” or to “materially misrepresent, directly or by implication, affiliation with, including endorsement or sponsorship by, a government entity or officer thereof, in or affecting commerce…”

Along with prohibiting the impersonation of government entities, the rule also bars the impersonation of business entities. Specifically, the rule makes it an unfair or deceptive act or practice to “materially and falsely pose as, directly or by implication, a business or officer thereof, in or affecting commerce” or to “materially misrepresent, directly or by implication, affiliation with, including endorsement or sponsorship by, a business or officer thereof, in or affecting commerce…”

The rule’s specification that the conduct must be “in or affecting commerce” to be prohibited was added to the final rule in order to make it “abundantly clear that the scope of the final regulatory text is coterminous with the scope of the FTC’s authority under the FTC Act and they clearly specify the misconduct prohibited by the final rule.”

Advances in AI Supercharge Impersonation Risks

Since the Commission first published notice of the impersonation rule in December 2021, sophisticated generative AI tools have become easily accessible to consumers. One abuse of these tools that particularly concerns lawmakers and regulators is their use to convincingly impersonate individuals.

AI-assisted impersonation is often conducted through the creation of “deepfakes”: doctored images, videos, or recordings that make it appear as though an individual is saying or doing something that they did not actually say or do. The recent proliferation of AI tools has made the production of deepfakes easier than ever, and regulators and experts are worried that deepfakes are already being used to create revenge pornography, spread misinformation, and defraud consumers.

FTC Proposes AI-Age Update to the Impersonation Rule

It is the possibility of AI-assisted impersonation fraud that has led the FTC to propose an update to the impersonation rule. On February 15, 2024, the same day that it finalized the impersonation rule, the FTC published the SNPR on the impersonation of individuals.

The SNPR would make it an unfair or deceptive act or practice, and thus a violation of the FTC Act, to “materially and falsely pose as, directly or by implication, an individual, in or affecting commerce” or to “materially misrepresent, directly or by implication, affiliation with, including endorsement or sponsorship by, an individual, in or affecting commerce…”

The Commission’s press release states that the FTC is proposing this rule in response to “surging complaints around impersonation fraud” and the fact that “AI-generated deepfakes…threatens to turbocharge” the issue of impersonation fraud.

Along with barring individual impersonation fraud, the proposed rule would, as the FTC’s press release puts it, take the significant step of declaring it “unlawful for a firm, such as an AI platform that creates images, video, or text, to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation.” This prohibition would be based on the principle that one who “places in the hands of another a means of consummating a fraud or competing unfairly in violation of the Federal Trade Commission Act is himself guilty of a violation of the Act.”[1]

Specifically, the proposed rule would prohibit the provision of goods or services “with knowledge or reason to know that those goods or services will be used to…materially and falsely pose as, directly or by implication, a government entity or officer thereof, a business or officer thereof, or an individual, in or affecting commerce” or “materially misrepresent, directly or by implication, affiliation with, including endorsement or sponsorship by, a government entity or officer thereof, a business or officer thereof, or an individual, in or affecting commerce…”

Conclusion: Greater Liability for AI Companies on the FTC’s Agenda

The FTC’s proposed rule on the impersonation of individuals would not only substantially expand the Commission’s ability to seek monetary relief for victims of impersonation fraud but would also place providers of AI products and services under much greater liability for acts of impersonation fraud committed by their users. If the FTC includes this provision in the final rule, enforcement agencies such as the Department of Justice and Consumer Financial Protection Bureau may follow suit, holding AI companies liable for unlawful conduct facilitated by their products or services.

Providers of AI products and services should closely track the progress of this SNPR. Comments on the proposed rule will be due sixty days after it is published in the Federal Register,[2] and may be submitted through the Federal Register’s website.

Endnotes
[1] As established in C. Howard Hunt Pen Co. v. FTC, 197 F.2d 273, 281 (3d Cir. 1952).
[2] At the time of writing (February 27, 2024), the proposed rule is not yet published in the Federal Register.
