One of DirectX's three co-creators, Alex St. John, explains why Apple's Metal is such a blow to OpenGL, and what it means in the long run.
Back in the 1990s, I remember enjoying then-Microsoft-bigwig Alex St. John’s intelligent screeds (and occasional rants), back when DirectX was still this wild, unruly, nascent thing and 3D accelerator cards in PCs from 3dfx and PowerVR and Rendition were wafting through the industry like ozone after a thunderstorm. He was bold and colorful and controversial and willing to get into public spats with rivals — perhaps most visibly Doom creator John Carmack, a longtime OpenGL evangelist — without apologies. Those were 3D’s halcyon days for early adopters like me.
I haven’t kept up with the guy in years, but he apparently maintains a blog, named after his old handle (“The Saint”), where he’s still doing his thing. Yesterday, he wrote a fascinating reaction piece to Apple’s surprise Metal reveal on Monday. Metal is to iOS what DirectX is to Windows: a way for Apple to get developers closer to the iPhone and iPad’s A7 processor, resulting (claims Apple) in dramatic performance increases. The tradeoff, of course, is that it’s proprietary.
Enter St. John, who delves into the political side of Metal’s raison d’être and what it says about the industry’s trajectory. I’ve tried to sum up his key points:
- OpenGL drivers are just “a grab bag of broken inconsistent functionality” without standard hardware definitions.
- Apple’s pretty much responsible for defining OpenGL as it exists in the mobile space today, thanks to the iPhone (and in part because Apple never developed its own DirectX-like API).
- But by supporting OpenGL, Apple’s made it easier for game developers to switch between iOS and Android; thus a proprietary API like Metal is as much about insulating Apple as it is about getting developers closer to the hardware.
- Today’s GPUs are so fast they’re held back by lagging CPU technology, and “bloated” legacy 3D APIs aren’t helping matters.
- Most of what we’ve seen to date in the history of 3D gaming involves abstraction, not real-world physics modeling. That abstraction hurts 3D development: developers have to decouple the visual side of the physics simulation from everything else, then manually knit the two back together. Here’s St. John: “The intimate link between light physics and other physics is largely broken in modern games because the graphics pipeline largely abstracts the visual elements of physics simulation from other aspects of physics forcing developers to awkwardly attempt to recouple them in the game by manually stitching them together.”
- Cloud-based parallelism employed to render virtual worlds may be the future, enabling leaps in processing power that won’t occur as rapidly on the client side (indeed, we’ve already seen demonstrations of games in which some of the visual assets are rendered on distant servers, then laid into the client application in real time).
- The future looks increasingly CUDA-like (giving developers direct access to a GPU’s parallelism). Here’s St. John again: “While the rest of the game community is trying to adopt Mantle, DirectX 12 or Metal, I’ll be re-learning my ray-tracing and quantum physics because I believe those roads all ultimately lead to a more CUDA like API for cloud based game design. It will just take the market a while to realize that.”
So is OpenGL really doomed? Even St. John backpedals on that one halfway in, admitting that DirectX and OpenGL could be “overhauled with time to look much more like Nvidia’s CUDA.” We’ll see.
(If you want a deeper primer on what Metal is and how low-level APIs work, AnandTech’s explainer is the best and most accessible I’ve seen.)