One of my favorite things about using Apple’s Vision Pro, and something that makes it feel uniquely futuristic, is that it doesn’t have controllers. Instead, it tracks my hands. Its basic gesture controls, like pinching and swiping, are fantastic.
In more complex 3D immersive spaces, though, the hands-only gestural language starts to fall apart. Apple has worked its 2D navigation system across all of visionOS, but the deeper 3D interactions aren’t fully there yet.
Meta’s Quest headsets, Apple’s closest competition, primarily use physical controllers but also offer controller-free hand tracking, and sometimes Meta’s hand tracking feels better than the Vision Pro’s for 3D interactions like grabbing objects in space. These are early days for mixed-reality-capable, hand-tracking VR headsets, and a conversation I had with one of VR’s biggest game developers suggested how much might still change soon.
Games as a doorway to new ideas
Owlchemy Labs, acquired by Google in 2017, created the classic VR games Job Simulator and Vacation Simulator. Both of those games are headed to the Vision Pro this year, adapted to work completely with hand tracking – no controller required.
Owlchemy has been exploring hand tracking for a while: Vacation Simulator already offers it in an experimental mode on the Quest, and last year I tried a demo that experimented with more advanced hand-tracking interactions, using pinch-based gestures to move objects and squeezing letters to type on virtual keyboards, months before Apple gave its first Vision Pro demos.
So far in 2024, mixed reality headsets feel split between hand-tracking-only and controller-optional designs. The Meta Quest headsets and Apple’s Vision…