Courts Draw a Line: AI Evidence Faces Scrutiny, Amicus Transparency Tweaked

In the quiet but consequential halls of Washington, a U.S. judicial rulemaking body has taken a double-edged step—tightening transparency for court influencers while cracking open the door to regulate artificial intelligence in the courtroom.

On Tuesday, the Committee on Rules of Practice and Procedure, which steers how justice is administered in federal courts, approved what it called a “modest” new rule: groups filing amicus briefs (the friend-of-the-court arguments often submitted by advocacy organizations) must now name any donor who contributed more than $100 toward the brief and has been a member of the group for less than a year. Also required: groups formed within the past year must disclose their formation date. The tweak aims to address a rising concern that some organizations are created purely to sway a single case without leaving fingerprints.

But that’s where the reform stops.

A more sweeping transparency measure, which would have compelled disclosure whenever a litigant supplied a quarter or more of an organization’s annual revenue, was shelved. The vote? Razor-thin. The opposition? Heavyweights like the U.S. Chamber of Commerce. Critics, especially on Capitol Hill, are unimpressed.

“This doesn’t go far enough” has been the refrain from lawmakers like Senator Sheldon Whitehouse, who has long argued that dark money cloaks courtroom influence, with partisan funds funneled through ostensibly neutral nonprofits.

Still, the rule now heads to the Judicial Conference, the judiciary’s policy command center. From there, it makes its way to the U.S. Supreme Court—and possibly Congress, if lawmakers decide to step in.

Meanwhile, the judiciary took another, more forward-looking step. It’s seeking public comment on a proposed rule that would set boundaries around AI-generated evidence at trial. The question: Can machines speak the truth in a courtroom?

The proposal is straightforward: AI-generated evidence (say, software that analyzes trading behavior or compares creative works for copyright similarity) must meet the same reliability standard that Rule 702 of the Federal Rules of Evidence applies to human expert testimony, unless the evidence comes from a basic scientific instrument like a thermometer or radar gun.

Judge Jesse Furman, who heads the advisory group behind the AI rule, believes it’s time to take seriously the credibility questions surrounding machine-made conclusions—especially when introduced without an expert to explain them.

Judge John Bates, the outgoing chair of the rules committee, called the draft rule “a good first shot.” The only official resistance? The U.S. Department of Justice, which labeled the effort too sweeping.

With machine intelligence inching closer to the witness stand and hidden donors still slipping in through legal side doors, the judiciary appears to be wrestling—sometimes reluctantly—with the evolving frontiers of law and technology.
