A version of this article was published in TIME’s newsletter Into the Metaverse. Subscribe for a weekly guide to the future of the Internet. You can find past issues of the newsletter here.
If you were watching the Coachella YouTube livestream on Saturday night, you might have done a double take as giant leafy trees and a Godzilla-sized parrot slowly rose above the stage of the electronic artist Flume. Were they giant inflatables? Mirages on a 200-foot-tall LED screen? Sheer hallucinations of the mind?
None of the above. This year, Coachella partnered with Unreal Engine—Epic Games’ 3D software development tool, which I wrote about in this newsletter two weeks ago—to create what organizers say is the first livestream to add augmented reality (AR) tech into a music festival performance. Unreal Engine worked with Flume’s artistic team and other technical collaborators to create massive psychedelic 3D images that blended in seamlessly with his stage design and set, floating around the artist and into the Indio sky.
But nobody at the festival could see those enormous parrots—only viewers at home. The result, while only lasting a few minutes, serves as a template for how live event planners might wield metaverse technology going forward to create unique experiences for at-home viewers. Many metaverse builders believe that live events will be increasingly hybrid with both digital and real-world components—and that immersive tools might help make each version of the event distinctive and desirable in its own right. “It doesn’t make sense to just recreate the live music experience virtually,” says Sam Schoonover, the innovation lead for Coachella. “Instead, you should give fans something new and different that they can’t do in real life.”
For the last couple of years, AR visuals have been making their way into live broadcasts, although mostly as small gimmicks. Riot Games brought a giant dragon to the opening ceremony of the 2017 League of Legends World Championship final; a camera followed the shrieking beast as it flew around fans in the stadium. Last September, a giant panther bounded across the Carolina Panthers' stadium in a similar fashion. (The panther was also created with Unreal Engine.)
Schoonover has been trying to bring similar effects to Coachella's livestream for years in an effort to broaden its audience beyond the confines of the Empire Polo Club. "The online audience for shows is growing exponentially to the point where there's maybe 10 or 20 times more people who are watching the show through a livestream than at the festival," Schoonover says. "Because the at-home experience is never going to compare to the at-festival experience, we want to give artists new ways to express themselves and scale their viewership around the world."
However, previous efforts at AR experimentation at Coachella were stymied by the cost of production and the lack of interest from performers. This year, it took a partnership with Epic—which is focused on lowering the barriers to entry for 3D creators—and the buy-in of Flume—an electronic musician who has long emphasized visual craftsmanship at his concerts—to bring the project to fruition. Key players in this process included the artist Jonathan Zawada, who has worked extensively on audio-visual projects with Flume, including on NFTs, and the director Michael Hili, who directed Flume’s extremely trippy recent music video, “Say Nothing.” Several other production companies were also involved, including All of It Now.
The result: enormous Australian birds (Flume is Australian), brightly colored flowers, and leafy trees swaying in the wind above the stage and teeming crowd. Three broadcast cameras equipped with additional hardware tracking allowed the production team to insert those 3D graphics into the video feed in real time.
The graphics are only the beginning of what might be created in AR for live concert settings, Schoonover says. Future performers could have lighting effects surrounding their faces at all times, for instance, or sync their dance moves to those of surrounding avatars. It’s easy to imagine production designers adding, in real time, the type of effects that the omnipresent music video director Cole Bennett adds to his videos in post-production, or a Snoop Dogg performance in which he’s flanked by his characters from the Sandbox metaverse.
And Schoonover says these AR experiences will jump another level when AR glasses are normalized. Eventually, you might be able to see the concert in 3D, from the festival grounds, surrounded by floating AR birds, plants, and whatever else 3D artists dream up. “When it comes to people wanting to get that Coachella experience from their couches, this is the entry point,” he says.