In The Matrix, Neo (Keanu Reeves) wanders through crowded city streets, bumping past passersby and a woman in a red dress, before learning that they aren’t real people, but instead simulations.
In future Keanu Reeves movies, it’s possible that everyone around him might be simulated, too. On July 13, Hollywood producers advertised a “groundbreaking AI proposal” involving the “use of digital replicas or…digital alterations of a performance.” The SAG-AFTRA union lambasted the proposal, accusing the studios of simply trying to replace background actors with AI. Studios could scan an actor, pay them for a day, and then simply use AI to insert them into the rest of the film, Duncan Crabtree-Ireland, SAG-AFTRA’s chief negotiator, said in a press conference. The Alliance of Motion Picture and Television Producers responded that this characterization was inaccurate and that they would “establish a comprehensive set of provisions that require informed consent and fair compensation when a ‘digital replica’” or similar AI technology is used.
While the scenario Crabtree-Ireland described may sound far-off or dystopian, it’s basically already technologically possible. Generative AI companies like Runway and Stability AI have released products that allow filmmakers to create all sorts of hyper-realistic images out of written prompts. And those consumer-facing products pale in comparison to the advanced tools that major studios have at their disposal. Studios can already use AI to render scenes of packed nightclubs or sprawling battlegrounds—and do so more cheaply than paying for dozens of actual actors, AI experts say.
But even AI experts who believe the technology will eventually be a net good for creators and workers in film say that replacing background actors with AI is a bad idea. “That is a great example of a terrible way to use AI in the industry,” says Tye Sheridan, an actor and entrepreneur who co-founded the AI start-up Wonder Dynamics. “We need to come together as a community to know where it poses its threat, and where it can potentially launch the next great artists of our generation.”
Before AI tools were available, Hollywood artists used CGI—or traditional computer graphics techniques—to change actors’ appearances. Carrie Fisher, who died in 2016, appeared posthumously as Princess Leia in subsequent Star Wars movies thanks to expert VFX teams performing digital wizardry upon archival footage of the late actor. More recently, The Flash contained scenes with Christopher Reeve's Superman, who was depicted via a similar blend of film and technology.
But AI processes, which require less human intervention than CGI techniques, are becoming cheaper and more widely available, says Nikola Todorovic, Wonder Dynamics’ CEO and co-founder. “With Carrie Fisher, that had to be a VFX studio, which put a huge budget and thousands of artists behind it,” he says. “Before, it was more expensive to do it than hire an actor. Now, it’s less expensive, so that’s why the studios are like, ‘Oh, scan them once, and do it every time.’”
According to Collider, studios have already been using AI technology to render background characters for several years, including in the upcoming films Captain America: Brave New World and Netflix’s The Residence. In April, the Marvel director Joe Russo predicted that AIs will be able to create movies within two years. Just last month, filmmakers generated a 12-minute movie solely with AI imagery—although its eerie close-ups of human faces make it obvious that the footage is not real.
Empowering Indie Filmmakers
Filmmakers around the world have already begun testing the abilities of AI to create on-screen characters, with eye-popping success. It took the Berlin-based director Martin Haerlin about three days to create a now-viral video in which he seamlessly transforms from a wealthy British aristocrat into a talking ape into a female MMA fighter with a snap of his finger.
Haerlin, who mostly directs commercials and music videos, started playing around with the AI tools Runway and Elevenlabs in the midst of a sharp decrease in advertising budgets this year. Haerlin filmed himself at his house, then input the footage into Runway, asking the AI to transport him into various historical or sci-fi settings. “This was a revelation for me and an empowerment, because all of a sudden, I could tell stories without the pressure of the crew, of a producer, of being chosen by a client or an advertising agency,” he says.
Jahmel Reynolds, a Los Angeles-based filmmaker, has been similarly emboldened. Reynolds has been using Stable Diffusion and Runway to create sci-fi scenes of marauding giant robots and Power Rangers-like motorcycle gangs. He’s currently working on a short film, Helmet City, created entirely in collaboration with AI, which fuses sci-fi and hip-hop aesthetics.
Reynolds still says that generative AI rendering, as advanced as it is, can’t achieve full realism. “The movements look awkward: I haven’t seen a technology that’s been able to do that well, in a way where you can’t make the distinction,” he says. “There’s still a level of the uncanny valley.”
Actor Tye Sheridan, who starred in the 2018 metaverse sci-fi film Ready Player One, co-founded Wonder Dynamics with Nikola Todorovic in 2017 precisely to aid small-scale filmmakers like Haerlin and Reynolds. Wonder Dynamics’ AI-driven software allows users to film a scene, then replace the on-screen actor with another character, whether it be a cartoon or an alien. That character then mimics the actor’s motions and even facial expressions.
The goal of Wonder Dynamics technology, its creators say, is to empower independent sci-fi filmmakers to dream bigger, and to create worlds like Avatar or Ready Player One without needing massive studio budgets. Sheridan recalls spending eight weeks in motion capture suits on the set of the latter movie, which then required dozens of artists to process all of the data. “We don’t know where the next Spielberg, who might be some kid in some village somewhere, is from. Right now it’s almost impossible to discover some of these voices,” Todorovic says. “We want to build tech to give access globally, as opposed to people having to move to L.A. and break into the industry a certain way.”
Risks For Actors
But it is exactly this sort of technology that also seems to threaten the livelihoods of actors. Martin Haerlin says that production companies have already started soliciting him to create AI videos to cut down on costs and the number of actors involved. “They all think AI is like a magic wand; that now there's one person who can replace everything, and can make a video very easily,” he says.
Fran Drescher, the SAG-AFTRA president, says that the movie studios want to use AI technology in lieu of paying actors full-time. (On July 21, the studios released a chart disputing this characterization.) The scenario Drescher describes seems not so different from the dystopia depicted in a recent episode of Black Mirror, “Joan is Awful,” in which a streaming service instantaneously creates emotionally manipulative content featuring AI replicas of the actors Salma Hayek and Cate Blanchett.
“Our livelihood is our likeness—the way we act, the way we speak, the gestures we make, that’s what we’re selling,” Drescher told TIME in an interview. “And that’s what they want to rip off.”
The results of SAG-AFTRA and the AMPTP’s negotiations could disproportionately affect the large class of non-prominent, working-class actors, who must often take on background roles in order to gain experience, network, and pay the bills. Drescher told TIME that 86% of the union’s 160,000 members don’t even earn the roughly $26,000 a year needed to qualify for health benefits.
And while prominent actors likely would be able to hire good entertainment lawyers to negotiate favorable contracts with regard to AI, working-class actors may not, leaving them exposed to reputational damage if an AI replica performs worse than they would. At the same time, even big-name actors are concerned, including Tom Cruise, who joined negotiations to press the producers on SAG-AFTRA’s concerns around AI, according to the Hollywood Reporter.
All four AI filmmakers interviewed for this story agreed that protections for actors and other film workers are essential. “AI has been very cool and empowering for someone like me. But the flip side is larger companies using it for their best interests, in a way that isn’t fair to background actors,” Jahmel Reynolds says.
Todorovic and Sheridan created Wonder Dynamics’ technology specifically so that actors’ performances would remain central and irreplaceable to the films that use it. Filmmakers can use Wonder Dynamics to turn an actor into an alien—but not an actor into another actor.
“We’re not generating art out of thin air,” Todorovic says. “We don’t want to be a part of building a future where actors are sitting at home, licensing their likeness, and they’re in five movies at the same time.”
At this juncture, AI-driven upheaval in film seems inevitable. Haerlin predicts that “many, many jobs will be lost during the next months or years.” But he hopes that actors will be protected—and that analog and AI movies will be able to exist side by side and serve different purposes. “It’s maybe comparable to rugs,” he says. “You can buy a rug from IKEA that is machine-made. And then you can buy a handmade rug, which is maybe more beautiful and sophisticated, but it’s way more expensive.”
Correction, July 26
The original version of this story mischaracterized Stable Diffusion. It is a tool, co-created by Stability AI; it is not a company.