A pivotal copyright class action is unfolding in the United States that could have significant implications for authors worldwide. In Bartz v. Anthropic, a federal court in California has ruled that a trial should proceed to determine whether Anthropic, the AI company behind the Claude language model, unlawfully used pirated books to train its AI systems. This case is attracting attention because it addresses how AI companies use copyrighted works for model training, an issue with potential global ramifications for writers, publishers, and other content creators.
⚖️ Background: The Case at a Glance
The lawsuit was initiated by authors Andrea Bartz, Charles Graeber, and MJ+KJ, Inc., who allege that Anthropic downloaded millions of books from the pirate websites Library Genesis (LibGen) and Pirate Library Mirror (PiLiMi) without permission. The court has certified a class comprising all legal or beneficial owners of copyright-registered books that were downloaded by Anthropic from these sites.
The court’s decision to certify a class is significant because it allows a single lawsuit to represent a large group of authors and rights holders. This process streamlines legal action for individuals whose works may have been affected and avoids the need for each author to file separate cases. A trial is scheduled for 1 December 2025 to examine the allegations of copyright infringement.
Class counsel representing the plaintiffs are Lieff Cabraser Heimann & Bernstein LLP and Susman Godfrey LLP. Authors who are members of the Authors Guild or who participate in the Authors Registry may already have their information submitted to the court under subpoena, which ensures they are notified of developments.
📌 Key Points for Authors
Class Certification
The court has certified a class action, allowing authors whose books were downloaded by Anthropic from LibGen or PiLiMi to be part of the lawsuit. Both authors and publishers can be class members if their works meet certain criteria.
Eligibility Criteria
To be included in the class, authors must meet several requirements, including:
Legal or beneficial ownership of copyright
An ISBN or ASIN for the work
Copyright registration with the U.S. Copyright Office within three months of publication or before the alleged infringement
Potential Damages
If the plaintiffs succeed at trial and prove willful infringement, statutory damages could range from $750 to $150,000 per title. This underlines the potential scale of compensation for affected authors and publishers.
No Action Required to Join
Authors don't need to file any claims to be included in the class. However, to ensure they receive official notices and have the opportunity to opt out or remain in the class, authors should provide their contact information and details of their works to the court-appointed class counsel.
🔍 How to Check if Your Books Are Affected
A list of affected works will be submitted to the court on 1 September 2025. You can check whether your books are included by visiting the official website of the class counsel. Providing your information helps ensure that your copyright interests are recognised and that you are notified of developments, including the trial and any potential settlement or damages awards.
🌍 Is This Only a U.S. Issue?
Although Bartz v. Anthropic is a U.S.-based case, similar concerns are emerging worldwide. In the UK, authors such as Lord Price have accused companies like Meta of using pirated books from LibGen to train AI systems without authorisation. These authors are pursuing legal action through UK courts, demonstrating the international dimension of copyright infringement in AI training.
For UK-based authors, this is a reminder that unauthorised AI training is not an issue limited to the United States. Even if your works haven't been used by Anthropic specifically, similar legal and ethical questions may arise regarding other AI models operating in Europe.
🧭 What Should UK Authors Do?
Stay Informed
Monitor developments in both the U.S. and UK regarding AI training and copyright laws. The legal landscape is evolving quickly, and early awareness of changes will help protect your rights.
Consult Legal Advice
If you believe your works have been used without authorisation, seek guidance from a solicitor experienced in copyright and intellectual property law. They can help you understand your options and the potential for action in your jurisdiction.
Engage with Advocacy Groups
Organisations such as the Authors Guild in the U.S. and the Society of Authors in the UK are actively involved in these cases. They provide resources, updates, and support for authors affected by AI copyright issues.
📢 Conclusion
Bartz v. Anthropic is a landmark case for authors concerned about the unauthorised use of their works in AI training. Whether you are in the U.S. or the UK, this case highlights the growing importance of copyright awareness in the age of artificial intelligence.
If you believe your works may have been downloaded by Anthropic, visit the class counsel website to provide your details. Doing so ensures that you are informed about the lawsuit and can receive your share of any compensation awarded.
✅ Bartz v. Anthropic: Quick Reference for Authors
What it is: A U.S. class action lawsuit against Anthropic over the alleged use of pirated books to train its AI model, Claude.
Who is involved: Authors Andrea Bartz, Charles Graeber, and MJ+KJ, Inc.; class counsel are Lieff Cabraser Heimann & Bernstein LLP and Susman Godfrey LLP.
Class eligibility: Authors or publishers whose copyright-registered books were downloaded from LibGen or PiLiMi. Must have ISBN/ASIN and valid copyright registration.
Trial date: 1 December 2025.
Potential damages: $750–$150,000 per title if willful infringement is proven.
Action required: None; class membership is automatic, but authors should submit contact information and book details to receive notices and updates.
Checking your works: Affected works list will be filed 1 September 2025; submitting your info helps confirm your inclusion.
UK relevance: While the lawsuit is U.S.-based, UK authors should monitor similar AI copyright issues and consider legal advice if their works are affected.
Stay connected: Engage with advocacy groups like the Authors Guild (U.S.) or Society of Authors (UK) for updates and support.