Federal Rule of Evidence 901 governs the authentication of evidence in court. Per the rule, “[t]o satisfy the requirement of authenticating or identifying an item of evidence, the proponent must produce evidence sufficient to support a finding that the item is what the proponent claims it is.” Historically, this requirement could be satisfied, for example, through the testimony of a witness with knowledge, comparison with an authenticated item by an expert witness or trier of fact, or identification of distinctive characteristics. The advent of deepfakes, however, has generated debate over whether additional safeguards are needed to protect the authenticity of evidence.
On May 2, 2025, the U.S. Judicial Conference’s Advisory Committee on Evidence Rules considered proposals to amend the Federal Rules of Evidence to address the challenges posed by AI-generated evidence (see our prior post regarding the Committee’s proposed Rule 707 – Machine-Generated Evidence). In addition to Rule 707, the Committee evaluated Rule 901(c), a new draft amendment that addresses deepfakes, i.e., altered or wholly fabricated AI-generated images, audio, or video that are difficult to distinguish from reality.
While recognizing the importance of detecting deepfakes to preserve the integrity of the judicial system, the Committee ultimately decided that a rule amendment was not necessary at this time, given the courts' existing methods for evaluating authenticity and the limited instances of deepfakes in the courtroom to date. Nonetheless, as a precaution, the Committee proposed Rule 901(c) for future consideration should circumstances change.
Rule 901. Authenticating or Identifying Evidence
*****
(c) Potentially Fabricated Evidence Created by Artificial Intelligence.
- (1) Showing Required Before an Inquiry into Fabrication. A party challenging the authenticity of an item of evidence on the ground that it has been fabricated, in whole or in part, by generative artificial intelligence must present evidence sufficient to support a finding of such fabrication to warrant an inquiry by the court.
- (2) Showing Required by the Proponent. If the opponent meets the requirement of (1), the item of evidence will be admissible only if the proponent demonstrates to the court that it is more likely than not authentic.
- (3) Applicability. This rule applies to items offered under either Rule 901 or 902.
In the notes section to the draft amendment, the Committee explained its rationale for Rule 901(c)’s two-step process for evaluating deepfake claims. First, the opponent must submit sufficient information for a reasonable person to infer that the proffered evidence was fabricated; a mere assertion that the evidence is a deepfake is not sufficient. Second, if the opponent meets that burden, the Committee explained that the proponent “must prove authenticity under a higher evidentiary standard than the prima facie standard ordinarily applied under Rule 901.”
The Committee acknowledged that Rule 901(c) does not specifically address another possible consequence of deepfakes: the risk that their mere existence leads juries to distrust genuine evidence. The Committee, however, cited Rule 403 (the “prejudice rule”) and judges’ role as gatekeepers to curb attorney assertions that “you cannot believe anything that you see.”
Clearly, deepfakes present significant challenges in the courtroom and risk eroding public confidence in our judicial system. LMS will continue to monitor this evolving topic, the tools used by judges to verify evidence authenticity, and any associated amendments to the rules of evidence.