Opinion: As AI is embraced, what happens to the artists whose work was stolen to build it?
Amid the hype surrounding Apple’s new deal with OpenAI, one issue has been largely papered over: The AI company’s foundational models are, and have always been, built atop the theft of creative professionals’ work.
The arrangement with Apple isn’t the only news from OpenAI. Among recent updates and controversies including high-level defections, last month the company quietly announced Media Manager, scheduled for release in 2025. A tool purportedly designed to allow creators and content owners to control how their work is used, Media Manager is really a shameless attempt to evade responsibility for the theft of artists’ intellectual property that OpenAI is already profiting from.
OpenAI says this tool would allow creators to identify their work and choose whether to exclude it from AI training processes. But this does nothing to address the fact that the company built its foundational models using authors’ and other creators’ works without consent, compensation or control over how OpenAI users will be able to imitate the artists’ styles to create new works. As it’s described, Media Manager puts the burden on creators to protect their work and fails to address the company’s past legal and ethical transgressions. This overture is like having your valuables stolen from your home and then hearing the thief say, “Don’t worry, I’ll give you a chance to opt out of future burglaries ... next year.”
Writers, artists, journalists and other creative workers have consistently asked that OpenAI and other generative AI companies obtain creators’ consent before using their work to train artificial intelligence products, and that the organizations refrain from using works without express permission. Last July, more than 16,000 authors signed a letter to leading AI companies demanding that the businesses obtain permission and pay for works they use to train their AI. Yet OpenAI continues to trample on artists’ rights and rebuff their appeals, as we saw recently when it launched a ChatGPT audio assistant with a voice similar to Scarlett Johansson’s despite the actor’s clear and repeated refusals.
Although Johansson won her battle — OpenAI “paused” the offending voice from its offerings after the actor threatened legal action — the best chance for the wider community of artists is to band together. AI companies’ cavalier attitude toward creators’ rights and consent extends to people at all levels of fame.
Last year the Authors Guild, along with 17 other plaintiffs, sued OpenAI and Microsoft, demanding that authors receive what they are due. That suit is ongoing, and other creative professionals and copyright owners have also taken legal action. Among these are a class action filed by visual artists against Stability AI, Runway AI, Midjourney and DeviantArt, a lawsuit by music publishers against Anthropic for infringement of song lyrics, and suits in the U.S. and U.K. brought by Getty Images against Stability AI for copyright infringement of photographs.
AI companies often argue that it would be impossible for them to license all the content they need and that doing so would bring progress to a grinding halt. This is simply untrue. OpenAI has signed a succession of licensing agreements with publishers large and small. While the exact terms of these agreements are rarely made public, the estimated payouts pale in comparison with the vast sums the company readily spends on computing power and energy. Payments to authors would barely dent AI companies’ war chests, but royalties for AI training use would be a meaningful new revenue stream for a profession that’s already suffering.
Authors’ earnings have been in precipitous decline for more than a decade. In 2022, the median annual writing-related income for full-time writers was just over $20,000, down nearly 50% from 2009. And the data for 2023 look even more dire. AI-generated books, sometimes listed as written by real authors without the writer’s permission, flood Amazon, where anyone searching might buy them instead of the creative work the human author spent months or years writing. Meanwhile, OpenAI is valued at $80 billion, Anthropic at $18.4 billion and French AI startup Mistral at $6.2 billion. These companies claim they need our work to succeed but can’t afford to pay for it. Any human author can tell you that this narrative has blatant inconsistencies.
We cannot trust tech companies that swear their innovations are so important that they do not need to pay for one of the main ingredients — other people’s creative works. The “better future” we are being sold by OpenAI and others is, in fact, a dystopia. It’s time for creative professionals to stand together, demand what we are owed and determine our own futures.
Mary Rasenberger is the CEO of the Authors Guild.