Newly Updated Evaluation of Corporate Compliance Programs (ECCP) Addresses AI for the First Time
On September 23, 2024, the U.S. Department of Justice (DOJ) revised its Evaluation of Corporate Compliance Programs (ECCP). The revisions addressed several areas, including whistleblower protections and the role of data analysis. But of particular note, the updated ECCP now includes new expectations for how such programs address the risks of disruptive technologies, such as artificial intelligence (AI).
First published in 2017, the ECCP is designed to help federal prosecutors weigh the effectiveness of a corporation’s compliance program when determining whether to criminally prosecute a corporation and, if so, the structure of any resolution. This most recent revision is the first since March 2023.
ECCP and Corporate Compliance
The DOJ’s Justice Manual — a line prosecutor’s lodestar for determining if and how to bring federal criminal charges — sets forth specific factors to consider when investigating a corporation, including “the adequacy and effectiveness of the corporation’s compliance program at the time of the offense, as well as at the time of a charging decision.” This consideration is reinforced in the United States Sentencing Guidelines for the purposes of determining the appropriate criminal fine, if any (see U.S.S.G. §§ 8B2.1, 8C2.5(f), and 8C2.8(11)). The latest iteration of the ECCP reminds prosecutors to ask three fundamental questions when evaluating a corporation’s compliance program:
- Is the corporation’s compliance program well designed?
- Is the program adequately resourced and empowered to function effectively?
- Does the corporation’s compliance program work in practice?
To address these fundamental questions, the ECCP has prosecutors consider a variety of aspects of a corporation’s operations. Earlier iterations of the ECCP asked prosecutors to consider technological developments, including whether a corporation preserves and produces data from third-party messaging applications used by the corporation. Last month’s DOJ guidance significantly increases this focus by instructing prosecutors to evaluate how corporations are managing emerging technological risks, specifically AI, to ensure compliance with applicable laws. Among other things, the September 2024 ECCP asks:
- How does the company assess the potential impact of AI on its ability to comply with criminal laws?
- What is the company’s approach to AI governance in its commercial business and compliance program?
- How is the company curbing any potential negative or unintended consequences of AI?
- How is the company mitigating the potential for deliberate or reckless misuse of AI, including by company insiders?
- How does the company ensure accountability over its use of AI, including the baseline of human decision making used to assess AI and how the company trains its employees on its use?
Guidance for Prosecutors — and Companies
While the ECCP is a guide for federal prosecutors, it also signals the DOJ’s expectations to companies about their compliance programs. AI presents extraordinary opportunities for business and compliance programs alike, but it also presents risk. For example, a healthcare company that uses an AI system to assist with healthcare coding might face upcoding liability for its payment requests. Or a bank that does not adequately oversee AI-enabled anti-money laundering technologies may be liable for Bank Secrecy Act violations. Companies across regulated industries should expect the DOJ to ask not only about their own use of AI, but also about how they are mitigating risks posed by AI-enabled cybercrime or fraud schemes.
Takeaways
Companies should be crafting AI governance policies, relying on cross-functional teams to assess AI risks and opportunities, and routinely reassessing those policies to keep pace with the rapidly evolving technology. The DOJ now expects companies to integrate AI risk assessment into the fabric of their existing compliance programs and take proactive steps to mitigate those risks.