While VR headsets are pretty good these days, the technologies that help us feel embodied in VR worlds aren’t quite at the same level of refinement. Touch controllers have recently become a standard part of the VR experience. Whether it’s a Windows Mixed Reality headset or indeed the Oculus Quest we’re discussing here, touch motion controllers are not an optional extra.
The problem is that headsets like the Quest are meant to be mobile: something you’d take with you, like a smartphone or a Nintendo Switch. Nintendo solved their controller problem by designing rails that attach the controllers to the main unit in handheld mode, but that obviously won’t work here.
It turns out that one solution to having two extra bits of kit to lug around with your Quest might be to simply ditch them.
The Oculus Quest uses onboard cameras for inside-out motion tracking. This is becoming the norm for all VR headsets, but it’s especially important for a standalone mobile HMD, since there’s no way to set up an external tracking camera. Inside-out tracking allows proper six-degrees-of-freedom (6DOF) movement, both position and orientation, to be tracked.
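To make "six degrees of freedom" concrete: a 6DOF pose is three translational axes plus three rotational ones, commonly stored as a position vector and a unit quaternion. The sketch below (plain Python, no Oculus SDK; all names are illustrative) shows how such poses compose, which is the core operation any inside-out tracker performs when it places headset-relative content into the world.

```python
import math

# A 6DOF pose: 3 translational degrees of freedom (x, y, z) plus
# 3 rotational ones, stored here as a unit quaternion (w, x, y, z).

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def rotate(q, v):
    """Rotate vector v by unit quaternion q (computes q * v * q_conjugate)."""
    w, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

def compose(pose_a, pose_b):
    """Apply pose_b in pose_a's frame: rotate, then translate."""
    pos_a, rot_a = pose_a
    pos_b, rot_b = pose_b
    px, py, pz = rotate(rot_a, pos_b)
    return ((pos_a[0] + px, pos_a[1] + py, pos_a[2] + pz),
            quat_mul(rot_a, rot_b))

# Headset one metre forward (-Z), turned 90 degrees about the vertical axis.
half = math.radians(90) / 2
headset = ((0.0, 0.0, -1.0), (math.cos(half), 0.0, math.sin(half), 0.0))
# A point one metre in front of the headset, in headset-local coordinates.
local = ((0.0, 0.0, -1.0), (1.0, 0.0, 0.0, 0.0))
world_pos, _ = compose(headset, local)  # where that point sits in the world
```

A pure position tracker (3DOF) would only know `pos_a`; the quaternion half is what the camera-based tracking adds on top of the IMU.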
Now Oculus is repurposing those same sensors for visual hand tracking. It’s coming to the Quest in 2020 as an experimental feature. It uses deep learning to determine the position of your hands in 3D space, which is then digitized for use in VR. We’ve seen something similar with the Leap Motion device attached to the front of VR headsets. However, the Leap Motion was a highly-specialized sensor, whereas these are just regular cameras, with AI machine vision doing the heavy lifting and adding a game-changing feature to an existing product.
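Broadly, pipelines like this have a network predict hand keypoints in each camera image, which are then lifted into 3D. The sketch below is a hypothetical illustration of just the lifting step, using a standard pinhole camera back-projection; the network itself is treated as a black box, and the intrinsics and keypoint values are made-up example numbers, not anything from Oculus.

```python
# Hypothetical sketch: lifting predicted 2D hand keypoints into 3D camera
# space with a pinhole camera model. The detection network is assumed to
# output (pixel_u, pixel_v, depth_in_metres) per keypoint.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Convert a pixel (u, v) at a given depth (metres) to camera-space 3D."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Assumed intrinsics for a 640x480 tracking camera (focal length and
# principal point in pixels).
FX = FY = 450.0
CX, CY = 320.0, 240.0

# Pretend the network returned these triples for three of the hand's
# keypoints (e.g. wrist, index base, index tip).
keypoints_2d = [
    (320.0, 240.0, 0.40),
    (350.0, 200.0, 0.38),
    (380.0, 170.0, 0.36),
]
hand_3d = [backproject(u, v, d, FX, FY, CX, CY) for u, v, d in keypoints_2d]
```

From points like these, a kinematic hand skeleton can be fitted so the avatar’s fingers match yours, all without any hardware beyond the cameras the headset already carries.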
If you want to explore the technical details of how it all works, Oculus has a pretty cool article here. What we really can’t wait for is to see this tech in action, putting us one step closer to a true next-gen VR experience.