“Beating up on screenwriters,” John Gregory Dunne wrote in 1996, “is a Hollywood blood sport; everyone in the business thinks he or she can write, if only time could be found.” To put it another way, everybody in show business thinks they could do what writers do, given a little uninterrupted headspace. “That writers find the time is evidence of their inferior position in the food chain,” Dunne quipped. He knew the territory; by 1996, he and his wife, Joan Didion, had been working as Hollywood screenwriters for about 30 years. In those decades, they’d also participated in four writers strikes, labor stoppages by the Writers Guild of America (WGA), the union that bargains on behalf of Hollywood’s many working writers.
All these years later, Dunne’s words read prophetically in the face of yet another Hollywood writers strike. The idea that screenwriting is easy stuff, that anyone can do it, that writers are dispensable — this is all old news. But the attitude takes on a new dimension when you’re presented with a tool that could enable the studios to crop writers right out of the picture, or at least minimize the need to pay them, and an entertainment landscape that might not mind the results.
That tool, of course, is AI.
Not since the advent of streaming has a technology stood to change the landscape of Hollywood so drastically. “About a year ago, I went to the Guild because I had questions about AI,” John August told me. August also knows the territory intimately. He’s a widely produced screenwriter (Go, Charlie’s Angels, Big Fish, and lots more), the co-host of the hugely popular Scriptnotes podcast, and a former board member of the WGA. His concerns are part of the reason AI is one of the issues the WGA is working to address in its negotiations with Hollywood’s studios. Friends had shown him a rudimentary text generator that they said could help write a script. “Oh, that’s interesting,” he remembers thinking. “But also potentially really problematic, because it raises a host of questions — like, who really wrote this thing?”
Disruptive technology has always been a sticking point in writers’ contracts. Back in 2007, the last time there was a strike, residuals from streaming services were a major area of discussion. At the time, a future in which most people would watch TV by streaming it from the internet, and in which half of all series writers would be working on projects that would never appear on broadcast TV at all, was nearly unthinkable. That’s why one of the major disputes had to do with whether writers would get residuals, a sizable source of steady income, when their work streamed. The studios said no; the writers said yes.
“To me, this seems like a similar level of shift,” August says.
The WGA was able to secure some residuals back then, but nowhere near the income that studios pay out for broadcast (which includes both network and cable). Had the WGA gotten its hands on a crystal ball, it might have fought harder to achieve parity between streaming and broadcast. But you know what they say about hindsight.
That’s a handy story to remember now. The threat AI poses to creative writers is hard to fully imagine, because right now, AI tools are still pretty rudimentary. You can ask an AI to write essays or ideas or screenplays, and what it spits out has the creativity of a medium-bright 10th grader, regurgitated from the content it’s been trained on and unreliable when it comes to things like facts. (As Ted Chiang put it in an excellent essay for the New Yorker, ChatGPT is a “blurry JPEG of the web.”) But anyone who’s been on the internet in the past year knows that these tools are evolving at an alarming rate — so alarming that a consortium of prominent AI researchers and tech leaders recently wrote an open letter calling for a six-month halt to AI experimentation so the human cost and dangers can be properly evaluated.
Some of Hollywood’s power players are clearly more than ready to embrace AI and its cost-cutting (read: job-cutting) potential. The shift toward AI use has been evident for years. Consider the use of AI engines to make decisions about greenlighting projects, or the generation of a second Will Smith for the 2019 action movie Gemini Man, in which Smith co-starred opposite a fully computer-generated replica of his younger self, the kind of effect generative AI promises to make far easier. Or consider Avengers: Endgame co-directors Joe and Anthony Russo’s ventures into filmmaking AI, which they believe will be capable of generating scarily narcissistic-sounding entertainment (you get to star in a movie with Marilyn Monroe, with a couple of button clicks) inside of a few years. (On that point, they’re not wrong.)
The WGA, on the other hand, is aware of the issue, and included it in its pattern of demands ahead of the overwhelming strike authorization vote. At the moment, the WGA’s contract (called the MBA, or Minimum Basic Agreement) only defines a “writer” as a “person,” which August quipped is “still, in 2023, a human being.” But those definitions could change, and the tech is evolving fast.
“So we felt it’s important to get two things defined in the contract more clearly,” August told me. The WGA has two main stipulations. First, the guild wants to make sure that “literary material” — the MBA term for screenplays, teleplays, outlines, treatments, and other things that people write — can’t be generated by an AI. In other words, ChatGPT and its cousins can’t be credited with writing a screenplay. If a movie made by a studio that has an agreement with the WGA has a writing credit — and that’s over 350 of America’s major studios and production companies — then the writer needs to be a person.
“Based on what we’re aiming for in this contract, there couldn’t be a movie that was released by a company that we work with that had no writer,” says August.
Second, the WGA says it’s imperative that “source material” can’t be something generated by an AI, either. This is especially important because studios frequently hire writers to adapt source material (like a novel, an article, or other IP) into new work to be produced as TV or films. However, the payment terms, particularly residual payouts, are different for an adaptation than for “literary material.” It’s very easy to imagine a situation in which a studio uses AI to generate ideas or drafts, claims those ideas are “source material,” and hires a writer to polish it up for a lower rate. “We believe that is not source material, any more than a Wikipedia article is source material,” says August. “That’s the crux of what we’re negotiating.”
In negotiations prior to the strike, the Alliance of Motion Picture and Television Producers (AMPTP), which bargains on behalf of the studios, refused the WGA’s demands around AI, instead countering with “annual meetings to discuss advancements in technology.”
This is all extra important because the appeal of AI to Hollywood, particularly as a replacement for writers, is obvious. For one, the industry is sitting atop a pile of data that tells it not just what people want in the aggregate, but what, precisely, individual consumers want. For now, the industry makes money by producing a product that’s as broadly appealing as possible. But suppose you could flip that: Netflix could use your viewing data not just to generate weirdly specific suggestions for you but to create on-the-fly entertainment that matches your interests. Sure, the results might be repetitive. But consider the extraordinary popularity of highly formulaic entertainment — procedurals, sitcoms, action flicks, Hallmark movies — and you can start to see the appeal for platforms whose main goal is to keep you watching.
Of course, that kind of personalization can’t be replicated (yet) in a theater, and there’s plenty of evidence that people like to see the same movie as their friends. But AI could help studios cut costs there, too. Hollywood’s other huge problem, one it has had since its inception, is that making movies requires employing a lot of people, and those people want to be compensated fairly for their labor and treated like humans: sleeping, eating, getting some vacation time. If you were faced with the possibility of removing some humans from the equation, employing instead a tireless machine that doesn’t need a salary and won’t go on strike when it’s being exploited, wouldn’t that be tempting?
The WGA can’t address all of those concerns, of course. “This contract is very specifically about the artistic and creative work we’re doing, to make sure that we’re protecting ourselves,” says August. It’s about compensation, he adds — “about how much we’re getting paid for original work, and how much we’re being paid in residuals. I see AI not so much as a threat to replace writers, but to push our pay lower.”
Part of the issue is that the WGA cannot fully prevent the use of AI-generated material. You could, for instance, imagine studios experimenting with having no writers credited on something that’s nonetheless scripted. That experiment would carry its own risks given the state of the technology right now, especially since AI engines currently don’t do a great job of distinguishing between information and ideas that are under copyright and those in the public domain.
But if the strike were to stretch on for many months, or if someone just decides to try some experiments, it’s not impossible that we’ll see some movies or shows with AI-generated screenplays. (Technically, any writer who worked on one would be scabbing, but it’s not hard to imagine some executive getting their intern to fix it up, or just doing it themselves because they took a screenwriting workshop in college.)
When I asked August about this possibility, he smiled. “I do imagine that we will see some material generated by some random executive,” he said. “We can’t protect them from their bad decisions. What we’re really trying to do is make sure that we’re protecting our members from abuses.”
Fair enough. As a writer (and a member of the WGA myself, though not the division that works under the MBA), I am concerned about AI’s potential. Maybe it’s my philosophical commitments, but I don’t expect the tools ever to turn out something as good as what a real human writer can achieve. I don’t think AI is going to be able to write Everything Everywhere All at Once, or Tár, or Succession. At best, it will be an okay imitation of things that humans have already written.
But here is the thing: Cheap imitations of good things are what power the entertainment industry. Audiences have shown themselves more than happy to gobble up the same dreck over and over, and get big mad when presented with something confusing or challenging. And labor agreements are only as good as the people who keep them.
So I do worry that writers will be not just exploited but cut out of the picture entirely, at least when it comes to the kinds of entertainment that risk-averse studios are willing to invest in, and especially if the WGA doesn’t manage to secure their place in this round of bargaining. There will always be a place in the movie business (and, maybe, the TV business) for people with original ideas and paradigm-shifting work. But whether they’ll get paid — whether the Jordan Peeles and Greta Gerwigs and Chloé Zhaos of the future will even get a chance to work — is the big question, and it’s one that lately I don’t feel good about.
Maybe you don’t care about the WGA, and that’s your prerogative. But it’s worth considering that this round of bargaining may have far-reaching implications in your field, too. AI tools can be extraordinarily useful, but when they’re used to replace humans and lower expectations so that we don’t even notice the lack, we’re teetering on the brink of something very dangerous. After all, if we’ve learned anything in the last decade of the internet, it should be that the people who own the platforms and the algorithms have extraordinary power to shift reality.
Luckily, it’s not just the WGA that knows this. Organizations like the Algorithmic Justice League are thinking long and hard about how to address the future that AI might bring about. And Hollywood might be on the leading edge. “In conversations with actor friends, they have similar concerns about the use of their likenesses, the use of their voices,” August notes. Writers, too, are worried about their voices, particularly at a time when barriers around who gets to write, speak, and see their work produced have been breaking down. “It would swing the pendulum vastly toward inauthenticity if they were to go down this road,” he says.
What’s at stake, ultimately, is what counts as “authentic,” and whether we’re willing to accept limits on how these tools are used. If things are left unchecked, I’m not optimistic. But I’m hopeful that we might come to see that there’s great potential in AI, if only we know when not to use it. There are some things you just can’t replace.
Update May 2, 9:30 am ET: This story was originally published on April 27 and has been updated to reflect that the WGA is now on strike.