“Look down at your hands. Can you see them?”
The question comes from David Holz, chief technology officer and co-founder of Leap Motion. In any other scenario, the query would seem odd, if not outright insane. But today, I’m sitting in a conference room with a virtual reality (VR) headset strapped to my face. The goggles take me out of the physical world and immerse me in a virtual one that’s a little like being inside Tron.
Typically, being in virtual reality means you can’t see anything in the real world, hands included. That makes it tough to interact with objects in the digital space. Some VR headset makers have addressed this problem with joysticks or gamepads similar to traditional video game controllers. But today, for the first time, I have honest-to-goodness virtual hands, complete with wiggling fingers. The difference is subtle and powerful all at once.
“Now try picking up that cube,” Holz continues. I oblige, reaching into a seemingly infinite space to grab the block in front of me. Then I fling it across a vast nothingness, because it seems like the thing to do.
Leap Motion’s hands-on demo is powered by the company’s new Orion tracking system. Orion uses a combination of cameras and software to identify users’ hands in the physical space, then imports them into virtual worlds.
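In rough terms, a system like Orion turns camera-tracked hand positions (measured in physical units relative to the sensor) into coordinates inside the virtual scene. The sketch below illustrates that mapping in miniature; the class and function names are hypothetical, not Leap Motion's actual API, and a real pipeline would also handle finger joints, orientation, and per-frame updates.

```python
from dataclasses import dataclass

@dataclass
class TrackedHand:
    """Palm position in millimeters, relative to the tracking camera."""
    x: float
    y: float
    z: float

def to_virtual_space(hand: TrackedHand, scale: float = 0.001) -> tuple:
    """Map a tracked palm position (millimeters, camera space) into
    virtual-world coordinates (meters, scene space).

    A real system would also apply the headset's pose so the virtual
    hand stays aligned with where the user's physical hand appears.
    """
    return (hand.x * scale, hand.y * scale, hand.z * scale)

# A palm detected 200 mm above and 150 mm in front of the sensor
palm = TrackedHand(x=0.0, y=200.0, z=-150.0)
print(to_virtual_space(palm))
```

Running this prints a position of roughly (0.0, 0.2, -0.15) meters, which a VR engine could then use to place a rendered hand model in the scene each frame.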
It’s a proof-of-concept more than anything else. But Holz and his team are hoping that when the beta version is made available Feb. 17, developers will build more interesting experiences that bring hands into VR for the first time. If they do, it could help Leap Motion capture a slice of the booming virtual reality hardware market, estimated to be worth $70 billion by 2020.
One obvious use case: Video games. Imagine hurling boulders at virtual enemies, then pushing open a door to catch your breath. Education is a possibility here, too. Students could learn about astrophysics by interacting with virtual planets the same way they might build a solar system mobile.
Still, using Orion took some getting used to. Reaching out with my hands to lift, move, or shove the objects around me felt more natural than doing so with a controller. But there’s one big difference between my real hands and my virtual ones: the latter lack a sense of touch, which sometimes breaks VR’s illusion.
Other firms, too, are experimenting with ways to bring hand movement into virtual reality. Thalmic Labs’ Myo armband, for instance, works with the Oculus Rift to enable controller-free gameplay. But Leap Motion’s technology differs in that it uses cameras to track users’ hands and fingers directly, meaning it can detect finer-grained movements. The Myo, by comparison, reads muscle activity in the forearm to pick up a limited set of gestures.
While the Orion demo was impressive, it’s up to VR hardware and software creators to integrate the technology into their offerings. The company says it’s working with several VR hardware firms to directly integrate its gadget into headsets, but it won’t offer specific details. In the meantime, Leap Motion’s Orion software will be compatible with its motion controller for laptop and desktop computers.