
Sony’s Virtual Reality Expert Explains How PlayStation VR Works


Sony’s PlayStation VR headset arrives on October 13 for its PlayStation 4 games console, and it’s already garnering rave reviews—including our own, where we’ve called it “the first-gen virtual reality headset to beat.”

TIME spoke with Sony Senior Research Engineer Dr. Richard Marks about the new platform (Marks has been with Sony for 17 years, and played a central role in the inception of Sony’s EyeToy and PlayStation Move motion controllers). In our chat, he offered a high-level overview of how the technology works, and what drove the company to make certain choices regarding it.

PlayStation VR can produce strikingly clear images for two reasons

“There are two basic things here, the panel and the lenses. Those are the two main things that affect visual quality,” says Marks. “Apple has popularized the term ‘Retina display,’ which is this idea that you can’t see the pixels in an electronic screen. Now imagine you’re going to blow them up into this huge field of view, like you’re looking at a giant TV. When you do that, suddenly you can see the pixels again because they’re so big.”

“You’ve probably heard of the screen door effect, which is one of the most popular negatives about virtual reality. That’s the black space between the pixels, and it looks kind of like a screen, and that’s what you don’t want. So you want your panel to have very little black and lots of color, meaning the red, green and blue. Our panel just has a better red, green and blue ratio compared to the blackness. That’s one factor. Having more light-emitting portions of the panel and less of those black circuit electronics is good.”
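To put that magnification in perspective, here is a rough back-of-the-envelope sketch (in Python) of how angular pixel density collapses when a panel is spread across a wide field of view. The resolution and field-of-view figures are illustrative assumptions, not Sony’s official specifications.

```python
# Rough illustration: how many pixels cover each degree of your view.
# The same class of panel looks far coarser when its image is
# stretched across a VR field of view than on a distant screen.

def pixels_per_degree(horizontal_pixels, horizontal_fov_degrees):
    """Average horizontal pixels per degree of visual angle."""
    return horizontal_pixels / horizontal_fov_degrees

# Assumed numbers, for illustration only:
tv_view = pixels_per_degree(1920, 30)   # a 1080p TV filling ~30 degrees of view
vr_eye = pixels_per_degree(960, 100)    # half of a 1920-wide panel over ~100 degrees per eye

print(f"TV at a distance: ~{tv_view:.0f} pixels per degree")    # ~64: pixels hard to see
print(f"VR headset, per eye: ~{vr_eye:.0f} pixels per degree")  # ~10: pixels and gaps visible
```

With only around ten pixels per degree, the unlit gaps between pixels become visible too, which is the screen-door effect Marks describes.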

The PlayStation VR headset. Tyler Essary for TIME

“And then you have the refresh rate, the screen update rate, which for us is 120 frames per second. It’s updating what’s on the screen very often, and that update rate makes the image seem very solid to you when you’re looking at it.”
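As a quick illustration of what that update rate implies for rendering, the sketch below computes the time budget per frame at a few refresh rates; only the 120 Hz figure comes from Marks, the others are common points of comparison.

```python
# Time available to render and present each frame, in milliseconds.
for hz in (60, 90, 120):
    frame_budget_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {frame_budget_ms:.2f} ms per frame")

# At 120 Hz the whole pipeline gets roughly 8.33 ms per frame,
# which is why the image looks so stable but is demanding to drive.
```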

“The next thing is the optics, and the optics should be really well matched to the pixels. In our headset they’re blowing everything up but also softening the view, so it looks like a continuous image and you don’t see the discrete pixels. You don’t want to soften it too much, because you’ll get a blurry image, and you don’t want to soften it too little, because then you’ll see the really sharp edges of things. So that’s another factor.”

“And the other part of the optics component is that there’s something called the eye box, and that’s about how much you can move your eyes around in front of the lenses and still get a good image. We wanted it to be such that when you put it on, there was a big range, so you didn’t have to be exactly in the right spot to still get a good image. We have such a big eye box that we don’t require you to move the lenses at all.”

Lighter isn’t always better

“If you’re going to start from scratch and say, ‘let’s make a head-mounted display,’ you’d think you’d want the most minimal, lightest thing possible. But that’s not necessarily right if you want the product to be rugged and consumer friendly and easy to use. Even in terms of comfort, what our experience with other products over many years showed us is that having it be balanced and not pulling your neck is probably the most important thing.”

“Also, and this varies by culture, but there’s a strong bias to not having anything touching your face. When something’s touching your face, you feel a little bit claustrophobic about it in a different way. And then also it traps heat in there, and having your face get hot is obviously uncomfortable.”

“So a lot of it is not having it touch your face, not pull your neck forward. And also pressure on your face, besides just touching it. When you push on your face, that can be really uncomfortable and lead to headaches. So having all the weight distributed on the top of your head instead and pushing down through your body rather than thrust up against your face and torquing your neck, that’s the fundamental design behind our headset.”


The PlayStation Move controllers benefit from you not being able to see them

“The tracking of the Move’s been tweaked some, but it’s also that it’s running on PlayStation 4 now, so we have more compute power,” says Marks. “One of the biggest things is that it has the new camera, and it’s a higher resolution stereo camera, so it can triangulate the depth better. Whereas on PlayStation 3 it was a single camera and had to use the size of the sphere as its way to get depth, which isn’t as accurate.”
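Marks is contrasting two ways of recovering depth: triangulating from the offset between two camera views, versus inferring distance from how large the Move’s known-size sphere appears to a single camera. The sketch below shows the basic geometry of both; the focal length, camera baseline, and pixel measurements are made-up values for illustration only.

```python
# Two simple ways to estimate how far away a tracked object is.

def depth_from_stereo(focal_px, baseline_m, disparity_px):
    """Stereo triangulation: depth = focal length * baseline / disparity."""
    return focal_px * baseline_m / disparity_px

def depth_from_sphere_size(focal_px, sphere_diameter_m, diameter_px):
    """Single camera: infer depth from how big a known-size sphere looks."""
    return focal_px * sphere_diameter_m / diameter_px

# Illustrative, made-up parameters:
f_px = 700.0        # focal length expressed in pixels
baseline_m = 0.08   # separation between the two stereo lenses, in metres
sphere_d_m = 0.045  # diameter of the Move's glowing sphere, in metres

print(depth_from_stereo(f_px, baseline_m, disparity_px=28.0))       # ~2.0 m
print(depth_from_sphere_size(f_px, sphere_d_m, diameter_px=15.75))  # ~2.0 m

# With these numbers, a one-pixel measurement error moves the stereo
# estimate less than it moves the size-based estimate at the same
# distance, which hints at the accuracy difference Marks describes.
```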

“The other thing is that when you’re in virtual reality, you can’t see your hand or the real Move obviously, so you don’t have a reference to compare it to. So there can be a bit more latency, and you won’t notice it.”

The company focused on sitting/standing experiences in part to be more developer-friendly

“I’m a technology person, so I like the technology Valve’s promoting with the lighthouse boxes and walking around,” says Marks. “But it feels like it’s much more suited to location-based experiences, where you have a VR arcade or something with someone overseeing the experience for you.”

“When you have it as just a single person, the cable’s in the way, you have to set it up and find the space you need, and not everyone has the same amount of space, so everyone sets it up differently. And then the developer has to author an experience that deals with that different setup. That’s asking a lot from developers, especially at this stage, when we’re very new to VR, to make it do the right thing in everyone’s home. Seated or standing without walking is much more constrained, so developers don’t have to factor in the different room spaces.”


And the shape of virtual reality to come…

“I think on the display side, the resolution is the most straightforward thing to address,” says Marks. “Two sets of things need to happen. You need to improve the display resolution, and then you need the graphics to drive that resolution. That’s why the graphics companies are very excited about the future of VR [laughs].”
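To see why those two things have to move together, here is a small sketch multiplying pixel count by refresh rate; the resolutions are generic examples, not announced headset specifications.

```python
# Pixels a GPU must shade every second = width * height * refresh rate.
displays = [
    ("1920 x 1080 @ 120 Hz", 1920, 1080, 120),
    ("2560 x 1440 @ 120 Hz", 2560, 1440, 120),
    ("3840 x 2160 @ 120 Hz", 3840, 2160, 120),
]

for name, width, height, hz in displays:
    pixels_per_second = width * height * hz
    print(f"{name}: {pixels_per_second / 1e6:,.0f} million pixels per second")

# Quadrupling the pixel count quadruples the fill-rate demand,
# which is why better displays need correspondingly better graphics hardware.
```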

“I think what you’ll maybe be surprised at, the area that will probably get a lot of excitement and energy, is input in VR. So how you’re interacting with things. You’re seeing stuff like eye tracking already being talked about from some smaller companies, and then other kinds of input, hand tracking. All those things and then haptic feedback and things like that. Those are all other areas that aren’t on an axis where they necessarily get better with time, so it’s hard to predict them.”


Write to Matt Peckham at matt.peckham@time.com