Harpercollins wants authors to sign away AI training rights
Rights come from power.
If you’d like an essay-formatted version of this post to read or share, here’s a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/11/18/rights-without-power/#careful-what-you-wish-for
Rights don’t give you power. People with power can claim rights. Giving a “right” to someone powerless just transfers it to someone more powerful than them. Nowhere is this more visible than in copyright fights, where creative workers are given new rights that are immediately hoovered up by their bosses.
It’s not clear whether copyright gives anyone the right to control whether their work is used to train an AI model. It’s very common for people (including high ranking officials in entertainment companies, and practicing lawyers who don’t practice IP law) to overestimate their understanding of copyright in general, and their knowledge of fair use in particular.
Here’s a hint: any time someone says “X can never be fair use,” they are wrong and don’t know what they’re talking about (same goes for “X is always fair use”). Likewise, anyone who says, “Fair use is assessed solely by considering the ‘four factors.’” That claim is your iron-clad sign that the speaker does not understand fair use:
https://pluralistic.net/2024/06/27/nuke-first/#ask-questions-never
But let’s say for the sake of argument that training a model on someone’s work is a copyright violation, and so training is a licensable activity, and AI companies must get permission from rightsholders before they use their copyrighted works to train a model.
Even if that’s not how copyright works today, it’s how things could work. No one came down off a mountain with two stone tablets bearing the text of 17 USC chiseled in very, very tiny writing. We totally overhauled copyright in 1976, and again in 1998. There’ve been several smaller alterations since.
We could easily write a new law that requires licensing for AI training, and it’s not hard to imagine that happening, given the current confluence of interests among creative workers (who are worried about AI pitchmen’s proclaimed intention to destroy their livelihoods) and entertainment companies (who are suing many AI companies).
Creative workers are an essential element of that coalition. Without those workers as moral standard-bearers, it’s hard to imagine the cause getting much traction. No one seriously believes that entertainment execs like Warner CEO David Zaslav actually care about creative works — this is a guy who happily deleted every copy of an unreleased major film with superb early notices because it was worth infinitesimally more as a tax-break than as a work of art:
https://collider.com/coyote-vs-acme-david-zaslav-never-seen/
The activists in this coalition commonly call it “anti AI.” But is it? Does David Zaslav — or any of the entertainment execs who are suing AI companies — want to prevent gen AI models from being used in the production of their products? No way — these guys love AI. Zaslav and his fellow movie execs held out against screenwriters demanding control over AI in the writers’ room for 148 days, and locked out their actors for another 118 days over the use of AI to replace actors. Studio execs forfeited at least $5 billion in a bid to insist on their right to use AI against workers.
Entertainment businesses love the idea of replacing their workers with AI. Now, that doesn’t mean that AI can replace workers: just because your boss can be sold an AI to do your job, it doesn’t mean that the AI he buys can actually do your job:
https://pluralistic.net/2024/07/25/accountability-sinks/#work-harder-not-smarter
So if we get the right to refuse to allow our work to be used to train a model, the “anti AI” coalition will fracture. Workers will (broadly) want to exercise that right to prevent AI models from being trained at all, while our bosses will want to exercise that right to be sure that they’re paid for AI training, and that they can steer production of the resulting model to maximize the number of workers they can fire after it’s done.
Hypothetically, creative workers could simply say to our bosses, “We will not sell you this right to authorize or refuse AI training that Congress just gave us.” But our bosses will then say, “Fine, you’re fired. We won’t hire you for this movie, or record your album, or publish your book.”
Given that there are only five major publishers, four major studios, three major labels, two ad-tech companies and one company that controls the whole ebook and audiobook market, a refusal to deal on the part of a small handful of firms effectively dooms you to obscurity.
As Rebecca Giblin and I write in our 2022 book Chokepoint Capitalism, giving more rights to a creative worker who has no bargaining power is like giving your bullied schoolkid more lunch money. No matter how much lunch money you give that kid, the bullies will take it and your kid will remain hungry. To get your kid lunch, you have to clear the bullies away from the gate. You need to make a structural change:
https://chokepointcapitalism.com/
Or, put another way: people with power can claim rights. But giving powerless people more rights doesn’t make them powerful — it just transfers those rights to the people they bargain against.
Or, put a third way: “just because you’re on their side, it doesn’t follow that they’re on your side” (h/t Teresa Nielsen Hayden):
Last month, Penguin Random House, the largest publisher in the history of human civilization, started including a copyright notice in its books advising all comers that it would not permit AI training with the material between the covers.
At the time, people who don’t like AI were very excited about this, even though it was — at most — a purely theatrical gesture. After all, if AI training isn’t fair use, then you don’t need a notice to turn it into a copyright infringement. If AI training is fair use, it remains fair use even if you add some text to the copyright notice.
But far more important was the fact that the less that Penguin Random House pays its authors, the more it can pay its shareholders and executives. PRH didn’t say it wouldn’t sell the right to train a model to an AI company — it only said that an AI company that wanted to train a model on its books would have to pay PRH first. In other words, just because you’re on their side, it doesn’t follow that they’re on your side.
When I wrote about PRH and its AI warning, I mentioned that I had personally seen one of the big five publishers hold up a book because a creator demanded a clause in their contract saying their work wouldn’t be used to train an AI.
There’s a good reason you’d want this in your contract; the standard contract contains bizarrely overreaching language seeking “rights in all media now known and yet to be devised throughout the universe”:
https://pluralistic.net/2022/06/19/reasonable-agreement/
But the publisher flat-out refused, and the creator fought and fought, and in the end, it became clear that this was a take-it-or-leave-it situation: the publisher would not include a “no AI training” clause in the contract.
One of the big five publishers is Rupert Murdoch’s Harpercollins. Murdoch is famously of the opinion that any kind of indexing or archiving of the work he publishes must require a license. He even demanded to be paid to have his newspapers indexed by search engines:
https://www.inquisitr.com/46786/epic-win-news-corp-likely-to-remove-content-from-google
No surprise, then, that Murdoch sued an AI company over training on Newscorp content.
But Rupert Murdoch doesn’t oppose the use of the material he publishes in AI training, nor is he opposed to the creation and use of models. Murdoch’s Harpercollins is now pressuring its authors to sign away their AI training rights, allowing their works to be used to train an AI model:
https://bsky.app/profile/kibblesmith.com/post/3laz4ryav3k2w
The deal is not negotiable, and the email demanding that authors opt into it warns that AI might make writers obsolete (remember, even if AI can’t do your job, an AI salesman can convince Rupert Murdoch — who is insatiably horny for not paying writers — that an AI is capable of doing your job):
https://www.avclub.com/harpercollins-selling-books-to-ai-language-training
And it’s not hard to see why an AI company might want this; after all, if they can lock in an exclusive deal to train a model on Harpercollins’ back catalog, their products will exclusively enjoy whatever advantage is to be had in that corpus.
In just a month, we’ve gone from “publishers won’t promise not to train a model on your work” to “publishers are letting an AI company train a model on your work, and will pay you a nonnegotiable pittance for it.” The next step is likely to be, “publishers require you to sign away the right to train a model on your work.”
The right to decide who can train a model on your work does you no good unless it comes with the power to exercise that right.
Rather than campaigning for the right to decide who can train a model on our work, we should be campaigning for the power to decide what terms we contract under. The Writers Guild spent 148 days on the picket line, a remarkable show of solidarity.
But the Guild’s real achievement was in securing the right to unionize at all — to create a sectoral bargaining unit that could represent all the writers, writing for all the studios. The achievements of our labor forebears, in the teeth of ruthless armed resistance, resulted in the legalization and formalization of unions. Never forget that the unions that exist today were criminal enterprises once upon a time, and the only reason they exist is because people risked prison, violence and murder to organize when doing so was a crime:
https://pluralistic.net/2024/11/11/rip-jane-mcalevey/#organize
The fights were worth fighting. The screenwriters comprehensively won the right to control AI in the writers’ room, because they had power:
https://pluralistic.net/2023/10/01/how-the-writers-guild-sunk-ais-ship/
Image:
Cryteria (modified)
https://commons.wikimedia.org/wiki/File:HAL9000.svg
CC BY 3.0
https://creativecommons.org/licenses/by/3.0/deed.en
—
Eva Rinaldi (modified)
https://commons.wikimedia.org/wiki/File:Rupert_Murdoch_-_Flickr_-_Eva_Rinaldi_Celebrity_and_Live_Music_Photographer.jpg
CC BY-SA 2.0
https://creativecommons.org/licenses/by-sa/2.0/deed.en