Following a complaint I submitted to their legal department (which is ongoing), The Hollywood Reporter modified the reporting I wrote about here. The original version, which called me a “fraudster,” has been changed; the ending of the article presently reads as follows (underlines added by me to highlight the changed text elements):
The authors also argue that Anthropic is depriving authors of book sales by facilitating the creation of rip-offs. When Kara Swisher released Burn Book earlier this year, Amazon was flooded with AI-generated copycats, according to the complaint. In another instance, author Jane Friedman discovered a “cache of garbage books” written under her name.
According to the lawsuit, authors have turned to Claude to generate “cheap book content,” and the complaint highlights an individual who has created dozens of books in a short period of time to make its case.
The authors claim that Anthropic used a dataset called “The Pile,” which incorporates nearly 200,000 books from a shadow library site, to train Claude. In July, Anthropic confirmed the use of the dataset to various publications, according to the lawsuit.
Anthropic didn’t immediately respond to a request for comment.
Aug. 23, 9 am: Updated to revise a paragraph within this story, as well as include more detail from the complaint and remove an incorrect reference to author Tim Boucher.