In brief
- A judge ordered Anthropic to respond after its expert allegedly cited a non-existent academic article in a $75 million copyright lawsuit.
- The citation was meant to support Anthropic's claim that Claude rarely reproduces lyrics, but the plaintiffs said it was a "complete fabrication," likely produced using Claude itself.
- The case adds to mounting legal pressure on AI developers, with OpenAI, Meta, and Anthropic all facing lawsuits over training models on unlicensed copyrighted material.
Amazon-backed Anthropic has been accused of citing a fictitious academic article in court filings meant to defend the company against claims that it trained its AI model on copyrighted lyrics without permission.
The filing, submitted by Anthropic data scientist Olivia Chen, was part of the company's legal response to a $75 million lawsuit filed by Universal Music Group, Concord, ABKCO, and other major music publishers.
The publishers alleged in the 2023 lawsuit that Anthropic illegally used lyrics from hundreds of songs, including works by Beyoncé, The Rolling Stones, and The Beach Boys, to train its language model Claude.
Chen's declaration included a citation to an article from The American Statistician, intended to support Anthropic's argument that Claude reproduces copyrighted lyrics only under rare and specific conditions, according to a Reuters report.
During a hearing on Tuesday in San Jose, plaintiffs' attorney Matt Oppenheim called the citation a "complete fabrication," but said he did not believe Chen invented it intentionally, only that she likely used Claude itself to generate the source.
Anthropic's attorney, Sy Damle, said Chen's error appeared to be a miscitation rather than a fabrication, while criticizing the plaintiffs for raising the issue late in the proceedings.
Per Reuters, US Judge Susan van Keulen said the issue was a "very serious and grave" concern, noting that "there is a world of difference between a missed citation and a hallucination generated by AI."
She declined to allow Chen to be questioned immediately, but ordered Anthropic to formally respond to the allegation by Thursday.
Anthropic did not immediately respond to Decrypt's request for comment.
Anthropic in court
The lawsuit against Anthropic was filed in October 2023, with the publishers accusing Anthropic's Claude model of being trained on a vast amount of copyrighted lyrics and of reproducing them on request.
They sought damages, disclosure of the training set, and the destruction of infringing content.
Anthropic responded in January 2024, denying that its systems were designed to output copyrighted lyrics.
It called any such reproduction a "rare bug" and accused the publishers of failing to offer evidence that typical users encountered infringing content.
In August 2024, the company was hit with another lawsuit, this time by authors Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, who accused Anthropic of training Claude on pirated versions of their books.
GenAI and copyright
The case is part of a growing backlash against generative AI companies accused of incorporating copyrighted material into their training data without consent.
OpenAI faces multiple lawsuits from comedian Sarah Silverman, the Authors Guild, and The New York Times, accusing the company of using copyrighted books and articles to train its GPT models without approval or licenses.
Meta has been named in similar suits, with plaintiffs claiming its LLaMA models were trained on unlicensed literary works sourced from pirated datasets.
Meanwhile, in March, OpenAI and Google called on the Trump administration to ease copyright restrictions on AI training, calling them an obstacle to innovation in their formal proposals for the upcoming US AI "Action Plan."
In the UK, a government proposal that would allow artificial intelligence companies to use copyrighted work without permission hit a roadblock this week, after the House of Lords backed an amendment requiring AI companies to disclose which copyrighted material was used in their models.