
    AI is flooding the courts with more cases, more filings, and more fake citations




    AI use is becoming pervasive across the legal system. Both experienced legal staff and absolute novices are turning to ChatGPT and other tools to make the most persuasive case possible when they arrive in court, even when some of those claims turn out to be quite literally too good to be true.

    Last month, top law firm Sullivan & Cromwell was forced to apologize for filing fictitious case names and fabricated quotes in a legal document submitted in a case, as well as citing incorrect statutes in the U.S. Bankruptcy Code. “We deeply regret this occurred,” the firm wrote in an apologetic letter to the judge in a case about an alleged scam operation run out of Cambodia, which the defendant denies.

    It’s far from the only legal hitch blamed on AI. A 2025 High Court case in the U.K. saw a barrister submit 18 fictitious case-law citations out of 45 total. In another 2025 disciplinary case, a barrister used AI to prepare for a hearing and attempted to mask fabricated citations, while the widely publicized 2023 Mata v. Avianca case was among the first major examples of an attorney using ChatGPT to draft a legal filing that relied on entirely nonexistent judicial precedents.

    The impact of AI on the legal system is also starting to come into focus through new research examining the underlying numbers. A recent study suggests that U.S. federal courts are seeing significant increases in their caseloads, driven in part by litigants representing themselves.

    “The pro se share of all civil cases has been 11% for quite some time,” says Anand Shah, a researcher at the Massachusetts Institute of Technology who led the research. “And then in the post-AI world, we see it jumping all the way up to something like 18%.”

    At the same time, Shah and his co-author, Joshua Levy of the University of Southern California, analyzed the proportion of AI-generated text in complaints using a random sample of 1,600 filings drawn from an eight-year period. They found that AI-generated text rose from “basically 0%” before generative AI to about 18% in early 2026. “We were just floored,” says Shah.

    Digging deeper into the filings themselves, Shah and Levy found that the increase was concentrated in simpler, more templatable case types rather than highly technical areas like patent or securities law. Shah believes this may indicate that AI is helping people pursue cases they previously would not have attempted, because generating the framework of a legal argument and the accompanying documents has become possible with minimal effort.

    While anecdotal evidence suggests the AI influx is beginning to strain the legal system, Shah says the broader disruption has not yet fully materialized in the data. “Cases are not resolving any faster or slower, which itself is a little surprising,” he says. But he notes that the back-and-forth between opposing parties is increasing, dramatically expanding the number of filings judges must review. That number is up roughly 158%, Shah says.

    Just because judges are managing to work through their expanded workloads, at least for now, doesn’t mean the system can absorb the pressure indefinitely, Shah argues. Society, he says, needs to set boundaries around AI in the courts before the strain becomes severe enough to slow the legal system down.

    The adoption of AI isn’t entirely negative, according to Will Pearce of Orbital, a company that provides legal AI tools to the real estate sector. “There’s a complete paradigm shift, not only in legal, but just generally in terms of how society accesses and interprets information,” he says.

    Pearce argues that AI has been “incredibly empowering,” opening up a legal system once dominated by dense legalese and arcane processes to people who can now use AI tools to parse documents and work out possible next steps.

    But the risks remain significant. Shah says the lower courts are already under intense strain, and warns that the pressure is likely to grow quickly as AI models improve and more people realize they can use them to generate legal filings. “I don’t think we have a lot of time,” he says.

    That means more work is needed to establish rules and norms governing how and when AI should be used in the legal system. “We very much should not YOLO this transition of letting AI courts pop up willy-nilly and try a lot of stuff,” Shah warns.
