Most Law Firms Are Building on Leaky Foundations, and AI Will Expose It
by: Mark Doble of Alexi AI  
Tuesday, July 8, 2025

In May, the FBI issued a warning [1] about a group that has been targeting law firms across the United States with phishing and social engineering, exploiting the legal industry’s handling of confidential and sensitive data.

While the legal profession is no stranger to cybersecurity risks, this warning comes at a time when the sector is also undergoing a major technological shift. Many law firms are integrating AI into their workflows without fully addressing the structural vulnerabilities in their technology infrastructure. Caught between the pressure to “keep up” with innovation and the realities of outdated systems, law firms are exposed to a level of risk that many aren’t prepared to manage.

Shadow AI: a threat from within

As AI adoption accelerates, cybersecurity threats no longer come exclusively from external actors. One of the most pressing concerns is employees’ often invisible use of unapproved generative AI tools, a phenomenon known as “Shadow AI,” which frequently involves tools we’re all familiar with, like ChatGPT.

Shadow AI [2] refers to AI tools introduced into a workplace without the awareness or approval of IT, legal operations, or firm leadership. These tools are widely available and easy to access, which has led to their rapid adoption across all types of organizations. A recent study found that 99 percent [3] of enterprises are exposed to Shadow AI, and fewer than one in seven [4] have implemented any AI-specific security or governance controls.

This lack of oversight is particularly dangerous in the legal sector. Often under pressure to meet tight deadlines, legal professionals may use public AI tools to draft emails, conduct research, summarize documents, or brainstorm ideas. If those tools are not properly governed and secured, they can expose sensitive information to environments outside of the firm’s control, opening the door to unintended disclosure, data leakage, and regulatory exposure.

Law firms’ infrastructure is outdated

Many firms continue to rely on a mix of legacy software, public cloud services, and SaaS platforms that were not built with AI integration or legal confidentiality in mind.

While public cloud models offer scalability and ease of use, they often lack the transparency and control that legal organizations need. In many cases, law firms can’t easily determine where their data is stored, how it is processed, or whether it’s being used to train third-party AI models. Even with terms of service and privacy policies in place, the firm’s visibility into the actual movement and handling of data can be limited.

Generative AI tools also grow increasingly dependent on firm-specific knowledge and documents, since they need access to internal data to produce relevant and reliable outputs. Without a secure environment for these interactions, law firms risk breaching client confidentiality and professional obligations.

Private, isolated AI should be the new standard

Isolated AI environments can address these risks, providing dedicated, private systems designed specifically to support secure and governed AI use within law firms and other professional organizations.

An isolated environment prevents data from being transmitted to or stored on third-party servers outside of the firm’s jurisdiction. This containment is essential for meeting regulatory requirements and fulfilling obligations to clients.

These environments also offer a high degree of customization. AI tools can be trained or fine-tuned using firm-specific documents and knowledge bases without sending that data to external vendors. This results in more accurate and relevant outputs while preserving the integrity and privacy of the firm’s information.
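
To make this concrete, here is a minimal sketch of what in-house retrieval over firm documents could look like, assuming the open-source sentence-transformers library and an embedding model whose weights have already been downloaded into the firm’s environment. The documents and model name are illustrative stand-ins, not a product recommendation; the point is that every step, embedding and search alike, runs locally, so no document text leaves the firm’s control.

    # Minimal sketch: local document retrieval with no external API calls.
    # Assumes the sentence-transformers library and locally stored model weights.
    import numpy as np
    from sentence_transformers import SentenceTransformer

    # Illustrative stand-ins for documents from the firm's own systems.
    documents = [
        "Engagement letter for Acme Corp., dated March 2024.",
        "Internal memo on discovery obligations in Smith v. Jones.",
        "Template for a mutual non-disclosure agreement.",
    ]

    # The embedding model runs on firm hardware; nothing calls out to a vendor.
    model = SentenceTransformer("all-MiniLM-L6-v2")
    doc_vectors = model.encode(documents, normalize_embeddings=True)

    def search(query: str, top_k: int = 2) -> list[str]:
        """Return the documents most relevant to the query, computed locally."""
        query_vector = model.encode([query], normalize_embeddings=True)[0]
        scores = doc_vectors @ query_vector  # cosine similarity on unit vectors
        return [documents[i] for i in np.argsort(scores)[::-1][:top_k]]

    print(search("What are our standard NDA terms?"))

A production setup would add access controls, encryption at rest, and audit logging, but the architectural principle is the same: keep the data and the model inside a single trust boundary.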

Moving toward isolated environments does not require abandoning cloud computing entirely. Instead, it involves choosing infrastructure models that are purpose-built for security, compliance, and performance within regulated sectors. A properly implemented private cloud, for example, can offer the same flexibility and scalability as public alternatives, while providing the safeguards necessary for legal practice.

AI security is a shared responsibility

To use AI safely and effectively, law firms must also establish clear governance policies that define how these tools are adopted, used, and monitored.

AI governance should address multiple areas:

  • Tool approval processes that ensure all software is reviewed for compliance, security, and relevance before deployment (a simple example of such a gate is sketched after this list).
  • Training programs that help employees understand the risks of using unauthorized AI tools and promote safe alternatives.
  • Guidelines that define which tasks are appropriate for AI assistance and which require human oversight.
  • Ongoing audits and evaluations to measure the effectiveness and compliance of AI systems over time.
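
As one concrete illustration of the first item above, the sketch below gates outbound AI requests against a firm-maintained allowlist of approved endpoints and flags everything else for the audit process. All host names and the policy structure are hypothetical, invented for this example.

    # Minimal sketch of a tool-approval gate for outbound AI requests.
    # All host names and the policy itself are hypothetical examples.
    from urllib.parse import urlparse

    # Illustrative allowlist maintained under the firm's governance policy.
    APPROVED_AI_HOSTS = {
        "ai.internal.examplefirm.com",   # the firm's isolated AI environment
        "research.approved-vendor.com",  # a vetted, contracted vendor
    }

    def is_approved(url: str) -> bool:
        """Check whether an outbound AI request targets an approved host."""
        return urlparse(url).hostname in APPROVED_AI_HOSTS

    def route_request(url: str, prompt: str) -> None:
        """Forward approved requests; block and record everything else."""
        if not is_approved(url):
            # A real deployment would write this to an audit log.
            print(f"BLOCKED for review: {url}")
            return
        print(f"ALLOWED: forwarding prompt to {url}")
        # ... send the prompt to the approved endpoint here ...

    route_request("https://chat.unapproved-tool.example/v1/complete", "Draft an NDA")
    route_request("https://ai.internal.examplefirm.com/v1/complete", "Draft an NDA")

In practice this logic would live in a network proxy or secure gateway rather than in application code, but the policy it enforces is the same: only vetted tools, with every refusal recorded.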

This governance framework should not be limited to IT departments. It requires collaboration between legal leadership, technology teams, compliance officers, and client-facing staff. AI affects the firm’s risk profile, reputation, and operational integrity, which makes it a shared responsibility across the organization.

More AI use should mean more security

The more law firms rely on AI, the more critical it becomes to ensure that every aspect of its deployment (where it runs, what data it uses, and who controls it) is aligned with professional obligations and long-term business interests.

The rise of Shadow AI is a warning sign that internal systems and governance models have not kept pace with external innovation. 

Law firms need to transition away from general-purpose platforms and toward secure, isolated AI environments that are tailored to the demands of legal practice. This approach supports innovation without compromising the trust that clients place in their legal counsel.

The threats are here, the tools are advancing, and the risks are growing.

What’s needed now is a deliberate shift toward infrastructure and policies that treat AI with the same seriousness as the law itself.

Notes

  1. FBI Internet Crime Complaint Center (IC3), cybersecurity advisory, May 2025: https://www.ic3.gov/CSA/2025/250523.pdf
  2. IBM, “Shadow AI,” IBM Think: https://www.ibm.com/think/topics/shadow-ai
  3. Varonis, 2025 State of Data Security Report: https://142972.fs1.hubspotusercontent-na1.net/hubfs/142972/Files/reports/2025-varonis-state-of-data-security-report.pdf
  4. Virtualization Review, “AI Booms, But Cloud Security Lags: Just 13% Use AI-Specific Protections, Says Wiz,” June 16, 2025: https://virtualizationreview.com/articles/2025/06/16/ai-booms-but-cloud-security-lags-just-13-use-ai-specific-protections-says-wiz.aspx