AI-Generated Fabrication Sparks Legal Storm in Music Copyright Case

A California federal judge has ordered AI company Anthropic to address serious allegations involving an "AI-generated hallucination" in its defense against copyright claims. The controversy centers on the company's chatbot, Claude, and its alleged use of copyrighted lyrics from major music publishers, including Universal Music Group, Concord, and ABKCO, to train its system.

At a hearing this week, a lawyer for the plaintiffs revealed that a key Anthropic expert had cited an academic article that, according to the plaintiffs' investigation, does not exist. The expert, a data scientist at Anthropic, had used the citation to support the company's argument that Claude's reproduction of song lyrics was a rare event.

The judge, Susan van Keulen, expressed concern over the allegation, calling it a significant issue. While she acknowledged that it could have been an accidental error, she emphasized the difference between a simple misstep and the potential consequences of relying on fabricated sources created by AI. She ordered Anthropic to clarify the situation by Thursday.

The citation in question referred to an article in the journal The American Statistician, but the plaintiffs' lawyer, Matt Oppenheim, found that neither the article nor its purported authors existed. Oppenheim suggested that the expert, Olivia Chen, might have used Anthropic's own Claude system to generate the reference.

Anthropic's attorney, Sy Damle, responded that the citation error was a simple misstep, not an intentional act of fabrication, but conceded that the link in the filing led to an entirely different article by different authors. The episode adds fuel to the growing debate over the use of AI in legal and academic contexts, particularly given that AI systems like Claude can produce convincing but ultimately fabricated information.

This case is part of a larger wave of legal clashes between copyright holders and tech companies over the use of creative works to train AI systems, with the music publishers seeking to hold Anthropic accountable for allegedly using their lyrics without permission. The court battle is far from over, but it is clear that the legal system is taking AI's potential for "hallucination" seriously.
