Music Publishers Sue AI Company Anthropic For Copyright Infringement


Universal Music Publishing Group, Concord, and ABKCO have sued the artificial intelligence company Anthropic over alleged copyright infringement of the three publishers’ songs, Rolling Stone reports, and documents viewed by Pitchfork confirm. The companies filed their lawsuit against the San Francisco–based startup in a Tennessee federal court. The plaintiffs allege that Anthropic’s AI assistant Claude—a large language model (LLM) similar to OpenAI’s popular ChatGPT—infringed the publishers’ copyrights by training on their songs and reproducing the songs’ lyrics in its responses without a licensing agreement, as well as by removing copyright management information in violation of the Copyright Act of 1976.

The lawsuit cites 500 copyrighted works owned by the plaintiffs, including Sam Cooke’s “A Change Is Gonna Come,” the Police’s “Every Breath You Take,” and Beyoncé’s “Halo.” It alleges that the infringement is “systematic and widespread,” and that Anthropic is liable not only for Claude’s infringement but also for the infringing acts of its users.

Anthropic was founded by four former OpenAI employees in 2021 and has raised funding from companies such as Google and Zoom. In April 2022, the company raised $500 million from a group led by Sam Bankman-Fried, the founder of the failed cryptocurrency exchange FTX, who was later indicted on seven counts of conspiracy and fraud in connection with the exchange’s collapse. The Department of Justice alleges that the investment came from FTX customer funds. Last month, Anthropic announced that Amazon had invested “up to $4 billion,” securing a minority stake in the company.

Because the training sets for commercial LLMs are proprietary and not made public, the plaintiffs do not know exactly how or where Anthropic “scraped” their copyrighted material. Their claims are deduced largely from the fact that Claude outputs copyrighted material.
In the documentation for Claude 2, Anthropic said, “Claude models are trained on a proprietary mix of publicly available information from the Internet, datasets that we license from third party businesses, and data that our users affirmatively share or that crowd workers provide.” Because of the opaque nature of LLMs and their training data sets, the publishers claim they are “substantially and irreparably harmed in an amount not readily capable of determination.”

The publishers’ lawsuit cites examples of Claude serving up the lyrics to Katy Perry’s “Roar,” Gloria Gaynor’s “I Will Survive,” Garth Brooks’ “Friends in Low Places,” and the Rolling Stones’ “You Can’t Always Get What You Want” when prompted. It also cites examples of Claude generating lyrics for “new” songs that incorporate the lyrics of existing copyrighted works. The lawsuit alleges that when Claude was prompted to write a song about the death of Buddy Holly, it served up the lyrics to Don McLean’s “American Pie”; when prompted to write a song about moving from Philadelphia to Bel Air, it generated the lyrics to the theme song from The Fresh Prince of Bel-Air.

The resolution of the lawsuit carries significant implications for the application of copyright law to artificial intelligence tools like LLMs. While the U.S. Copyright Office has issued guidelines declaring that any portion of a work created by AI is ineligible for copyright protection, it has not determined the legality of training LLMs on copyrighted material. The publishers’ request for relief, if granted, could set a precedent that forces companies to exclude copyrighted works from their LLMs’ generated output, or even from their training data sets. The publishers are seeking a permanent injunction prohibiting infringement of the copyrighted works, various damages, and attorney’s fees.
They’re asking the court to force Anthropic to identify the publishers’ lyrics and other copyrighted works on which it has trained its AI models, disclose the methods by which it has collected and copied the training data (including any third parties with which it has shared the data), and destroy—under the court’s supervision—all infringing copies of the copyrighted works in its possession or control.
