A version of this article was published in TIME’s newsletter Into the Metaverse.
For years, the 3D software development tool Unreal Engine has powered some of the biggest video games on the market, from Fortnite to Valorant, as well as television shows like The Mandalorian and even Porsche’s engineering work. On Tuesday, Epic Games launched the public release of Unreal Engine 5, the engine’s first major update in eight years.
The company promises that Unreal Engine 5 will become the bedrock for the next generation of Web3 development, from metaverse experiences to movies and, of course, video games.
Unreal Engine is the second-most widely used video game engine, trailing only Unity, and is known for its depth of features and visual quality. Unreal Engine 5 augments those strengths with hyper-intricate 3D detail, facial realism, and large-scale world-building. Its release opens the door, for example, for Disney to create a live Mandalorian video game that looks nearly as real as the show itself, says Kim Libreri, the CTO at Epic Games.
But top developers at Epic Games and beyond argue that UE5’s biggest impact will be felt not by the biggest studios but by smaller, independent developers, who can now make high-quality games at much lower cost. Starting today, UE5 is free to download and use, with Epic collecting a 5% royalty only on the gross revenue a product earns beyond its first $1 million.
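As a rough illustration of how those terms work out, here is a minimal sketch of the royalty arithmetic, assuming the 5% rate applies only to revenue above the $1 million threshold; the game revenues in the example are hypothetical:

```python
ROYALTY_RATE = 0.05          # Epic's cut under the UE5 terms described above
EXEMPT_REVENUE = 1_000_000   # first $1M in gross revenue is royalty-free

def ue5_royalty(gross_revenue: float) -> float:
    """Royalty owed on a product's lifetime gross revenue, assuming
    the 5% rate applies only to revenue beyond the first $1 million."""
    return max(0.0, gross_revenue - EXEMPT_REVENUE) * ROYALTY_RATE

# A hypothetical indie game grossing $3M would owe 5% of $2M: $100,000.
print(ue5_royalty(3_000_000))  # 100000.0
print(ue5_royalty(800_000))    # 0.0 -- under the threshold, Epic takes nothing
```

For a small studio, in other words, the engine is effectively free until a game becomes a commercial success.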
“We want to allow anyone to make great-looking stuff and take the drudgery out,” Libreri tells TIME. “Nobody should have to make a chair or a rock at this point: we want people to focus on what is truly important, like great gameplay and great artistry.”
Leading up to its release, TIME got exclusive access to several developers and artists who had already started using a preview version of Unreal Engine 5. They praised the system and discussed its potential to usher in an array of advancements across industries, including the metaverse. Here’s what’s under the hood.
High-definition 3D imagery
In December, Epic teased the release of UE5 with a demo featuring Keanu Reeves and Carrie-Anne Moss of the Matrix franchise. The video showed Reeves and Moss transforming back into their bodies from 23 years ago, when the original Matrix came out, and then being transported into a virtual city to fight off a slew of bad guys. The graphics of the city are startlingly lifelike: the way the sun glints off the top of a car or a wet highway, for instance, or the depth and texture of intricately carved Art Deco reliefs and rusty chain-link fences.
These visual details are boosted by two new technologies in UE5: Lumen, which emulates natural light, and Nanite, which allows for incredibly precise 3D detail. “In the past, as you got closer to surfaces, the realism would break down: you could see that it’s a flat surface with a lot of texture detail as opposed to 3D geometry,” says Nick Penwarden, vice president of engineering at Epic. “Now, the artist can keep chiseling down and put in as much detail as they possibly can.”
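To see what Nanite automates, it helps to look at the older technique it replaces: hand-tuned level-of-detail (LOD) switching, where artists built several versions of each object and the engine swapped in coarser ones as the camera pulled back. The sketch below is a generic illustration of that manual approach, not Epic’s API; the mesh names and distance thresholds are invented:

```python
# Generic distance-based LOD selection -- the hand-authored technique that
# Nanite's automatic, per-pixel geometry streaming replaces.
# Mesh names and distance thresholds are illustrative, not from UE5.

LODS = [
    (25.0, "statue_high.mesh"),         # under 25 m: full-detail sculpt
    (100.0, "statue_mid.mesh"),         # 25-100 m: decimated mesh
    (float("inf"), "statue_low.mesh"),  # beyond: flat, textured proxy
]

def pick_lod(distance_m: float) -> str:
    for max_dist, mesh in LODS:
        if distance_m < max_dist:
            return mesh
    return LODS[-1][1]

print(pick_lod(10.0))   # statue_high.mesh
print(pick_lod(500.0))  # statue_low.mesh -- the "flat surface" Penwarden describes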
There’s a real-world link between The Matrix and UE5: Libreri, now Epic’s CTO, served as a visual effects supervisor of the Matrix franchise, presiding over the “bullet time” technology in the original film. “A lot of us at Epic share the philosophy that the real world and the virtual world can look the same,” he says. “Our whole tagline was: ‘What is real? What is a movie, and what is a game?’”
Crossing the uncanny valley
To create the Matrix Awakens demo, Reeves and Moss flew to a studio in Serbia that specializes in digital scanning, where they had 3D scans taken of their faces and bodies. These scans were then incorporated into another of Epic’s new technologies: Metahuman, which creates lifelike avatars. Until now, creating digital humans has been expensive and complex for developers and filmmakers. Epic’s Metahuman app provides templates and tools to create characters in minutes, letting you customize the crinkles around their eyes, add freckle patterns, even change the shape of their irises. (Some observers worry, however, that Metahuman will ease the creation of deepfakes.)
This technology is especially exciting for filmmakers like Aaron Sims, an artist who used to work with physical prosthetics and makeup for Hollywood movies like Gremlins 2 and Men in Black, and now builds characters and beasts for video games and his own films. “We can take the realism all the way down to the pore,” says Sims. “As someone who used to make puppets and prosthetics, now I can do anything I want—and the foam isn’t going to rip.”
What’s next for the metaverse
To show off the depth of the new UE5, Epic is releasing the entire city from the Matrix demo so that developers can build games and experiences on top of it. The world will be populated by 20,000 metahumans driving cars and walking the city streets, with each block rendered in vivid detail, down to every leaf and brick.
Epic hopes this release demonstrates Unreal Engine’s metaverse capabilities, in which high-definition, large-scale worlds can be built easily. Another new feature, World Partition, breaks enormous maps into parcels that stream in as needed, so a regular gamer can play them without an expensive rig. “We’re also releasing tutorials to show developers that if you’re starting from scratch and want to make your own fantasy city, this is how we did it,” Libreri says.
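The general idea behind that kind of streaming, dividing a map into a spatial grid and keeping only the cells near the player in memory, can be sketched in a few lines. This is a hedged, generic illustration of grid-based world streaming, not Unreal’s actual implementation; the cell size and load radius are made-up values:

```python
# Generic grid-based world streaming, the broad idea behind features like
# World Partition: only the map cells near the player stay loaded.
# The 128 m cell size and 2-cell radius are illustrative assumptions.

CELL_SIZE = 128.0  # meters per square streaming cell
LOAD_RADIUS = 2    # keep cells within 2 cells of the player loaded

def cells_to_load(player_x: float, player_y: float) -> set[tuple[int, int]]:
    cx, cy = int(player_x // CELL_SIZE), int(player_y // CELL_SIZE)
    return {
        (cx + dx, cy + dy)
        for dx in range(-LOAD_RADIUS, LOAD_RADIUS + 1)
        for dy in range(-LOAD_RADIUS, LOAD_RADIUS + 1)
    }

# As the player moves, cells falling outside this set are unloaded and new
# ones stream in, so an enormous map fits a modest machine's memory budget.
print(len(cells_to_load(1000.0, 1000.0)))  # 25 cells resident at once
```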
With the templates for virtual worlds ready to go, it’s up to companies and developers to fill them with things and events. Libreri anticipates that UE5 will also enable a robust ecosystem of digital twins, in which real-world objects and environments are replicated virtually. Many industries have begun using UE5 to create prototypes, from car companies like Porsche to architecture firms to manufacturing plants. Because those designs already live in UE5, it is nearly seamless for Porsche to make a virtual 911 that drives inside the Matrix city, for example.
Hybrid live-virtual events are also on the way. Libreri is excited, for instance, about the possibility of concerts that take place in real life, with the performer wearing a motion-capture suit, and are streamed to viewers at home in real time. He also mentions live-virtual game shows and “gamified musical concerts.” “I think that the next evolution of social connectivity is going to happen through these live events,” he says.
But just because incredibly lifelike worlds and events can be built in UE5 doesn’t mean every game or metaverse environment will suddenly become intricately realistic. Developers still need to account for the fact that many devices, including many smartphones, don’t have the capacity to run highly sophisticated graphics. “The more you push fidelity, the fewer devices you can support,” says Jacob Navok, the CEO of Genvid Technologies, which develops tech tools for streaming. (Navok is also a co-writer of Matthew Ball’s influential essays on the metaverse.) “Fortnite and Minecraft have proven that visual fidelity is not necessarily the thing that gets people excited to spend billions of hours inside of virtual worlds.”
The impact on Fortnite is TBD
Epic’s flagship game is Fortnite, which has 350 million registered users worldwide. In December, Epic switched Fortnite over to UE5, but gamers noticed few discernible changes. Libreri says this was by design: Epic wanted to show how seamless the transition between engines could be. While he is cagey about future changes to Fortnite, he says the move to UE5 could allow Battle Royale Island to grow in size and contestant capacity, and open the game up to an array of visual possibilities. “I’d love to see more industrial crossovers. I’d love to experiment with what photorealism can mean in a stylized world,” he says.
Empowering smaller artists
Whether or not Fortnite changes in the next year, UE5 has already shifted the artistic process of many smaller creators. One of them is Daniel Martinger, a 29-year-old Swede who began using a preview version of the engine in December. Martinger had previously used Unity to create 3D environments and surfaces, but that engine required some coding, which pulled him away from the visual side of creation.
For his first UE5 project, Martinger decided to create a carpenter’s cellar, complete with shopworn tools and rough surfaces. Working alone several days a week for three months, he modeled each tool hanging on the shelves individually, dulling ax blades and making slight dents in wooden tables. The resulting video was widely shared around the internet and hailed for its realism.
“You can play around a lot: using painting tools, blending textures. It’s easier with lighting,” Martinger says. “It feels like Unreal opens up so many doors for me.”
Navok says that Epic’s counterintuitive business strategy of giving its product away free to smaller developers relies on the belief that the time and resources spent on virtual worlds will continue to rise dramatically. “If we were in a ‘fixed pie’ world, I would say they’re building tools that allow them to maintain their status quo as the number two engine in the world,” he says. “But Tim [Sweeney, the CEO of Epic Games] is betting that we’re in a ‘growing pie’ world: that the number of developers will multiply next year and the year after, who will need to decide between building on Unity, Unreal, or somewhere else. And that is why they’re focused on cost savings.”
Impact on the film industry
Meanwhile, the impact of UE5 on the film industry is likely to expand once filmmakers realize its capability to render digital objects and scenes quickly and relatively cheaply. Over the past few years, Unreal Engine has been used by big-budget shows including Westworld and The Mandalorian, whose showrunners commissioned Epic to erect stages surrounded by massive LED walls. Those high-definition walls stood in for complex sci-fi sets, appearing on camera as fully three-dimensional. Westworld, for example, was able to shoot a helicopter flying over a city while everyone stayed planted firmly on the ground. The technique is likely to become more and more common: Epic says there are now more than 250 in-camera visual-effects stages across the world, up from fewer than a dozen just two years ago.
Sims, the filmmaker, is using UE5 in a different way: to create entire projects in his home studio. Sims can ask a friend to put on a motion capture suit, watch the corresponding monster inside UE5 in real time, and then adjust his visuals and storytelling accordingly. Last year, he created an entire short film in UE5, The Eye: Calenthek. While Sims had originally budgeted three to four months to create the film traditionally, the process inside UE5 took six weeks. “In traditional digital effects, you built something and then it went through a whole process before you knew what it really looked like,” he says. “Now, you don’t need an army of thousands of people to create things anymore, and you get instant gratification.”
Sims calls using UE5 a “slight addiction.” “On movie sets, it’s often the case that everything that could go wrong, goes wrong,” he says. “There’s a lot of stress involved, and so much pressure having everyone on set. I feel like these virtual sets we’re creating are a way to be more creative, not have the same stress, and have no limitations.”