Meta's new Quest 2 demo shows the magic of hand tracking

Meta's official demo First Hand is an impressive showcase of what current hand tracking makes possible.

It has been almost three years since Meta introduced optical hand tracking for the Meta Quest. Since then, the technology has been continuously improved, and it recently made a significant quality leap with Hand Tracking 2.0 (test).

Now Meta has released "First Hand", an official hand tracking demo intended to showcase the state of the art. Its graphic style and accessibility are inspired by the Touch controller demo First Contact, which was released in 2016 and remains a great introduction to virtual reality. That's why Meta rereleased it for Quest in 2019.

The best hand interactions in a demo

First Hand is meant to showcase the magic of controllerless interaction. It builds on the Interaction SDK, which provides developers with tools and pre-built interaction patterns that make it easier to implement hand interactions, with or without controllers.

According to Meta, First Hand contains some of the most magical, robust, and intuitive hand interactions that are also applicable to many types of content.

You find yourself in a futuristic lab, opening a construction kit by typing in a code. This works amazingly well. After that, you interact with a hologram interface reminiscent of Minority Report. Step by step you build robotic gloves with which you can pull objects, shoot laser beams, and activate a protective shield. All this with your own hands.


First Hand: A must-see demo for VR fans

The highlight of the demo is a mini-game in which you have to use your newfound skills. A flying robot creates holograms of targets to hit while it shoots projectiles at you, which you can deflect with the shield. The demo is polished and definitely worth downloading, whether to try out for yourself or for demonstration purposes.

As much progress as Meta Quest 2's hand tracking has made in the past few years, it still has technical limitations that detract from the experience: tracking dropouts, a lack of precision, and still-high latency. I'm curious to see how much better hand tracking will be with Project Cambria (info). Meta's upcoming headset has a depth sensor that should improve hand tracking.

What I also still sorely miss is haptic feedback. Hand interactions without physical resistance feel strange. In 2024, a Meta wristband with haptic feedback could remedy this.

You can download First Hand from the App Lab.

Source: Oculus Developer Blog