First Hand: Hands-on with Meta's hand tracking demo for VR and MR
The free tech demo First Hand showcases hand tracking in virtual reality and mixed reality. I took a look at First Hand on the Quest 3.
Alongside the release of the Quest 3, Meta offers a free demo that shows how hand tracking can replace VR controllers in VR and MR.
First Hand can be downloaded for free from the Meta Store and works on the Quest 3, Quest Pro and Quest 2. I played First Hand on the Quest 3.
A demo with a story – First Hand
Even though First Hand is just a tech demo, it has at least a small story built in. Before I start, I have to scan my apartment, because First Hand begins in mixed reality.
As soon as the play area is scanned, strange energy creatures appear in my apartment. They explain that I am the so-called Creator and that I have to help them solve a few problems using hand tracking.
In First Hand, the focus is not just on showing that my hands are tracked, but on using them for interactions that would otherwise require the Touch Plus controllers. I navigate menus with my index finger.
To pick up plugs or flip switches, I have to make a grasping motion that the hand tracking recognizes.
After getting a machine running in mixed reality, I can choose between four locations, where different tasks await me in virtual reality or mixed reality.
In First Hand, I explore four different areas
In three of the four areas, I have to build something and then put it to use. For example, I assemble a glider and use it to collect packages, or I build a robot and send it through mixed reality portals to retrieve objects.
In the third environment, I build a glove that allows me to shoot lasers like Iron Man by holding out my palm. All of this works smoothly and without technical problems. However, the tasks only take a few minutes and are not particularly complex.
In the fourth world, I can move around a desert city using teleportation, which is supposed to work by pinching my index finger and thumb together. However, I couldn't get First Hand to recognize this gesture, which was my only issue with the demo.
As an alternative, the controllers' analog sticks can be used to move between the teleportation points, and that works perfectly. You can see how the motion controls are supposed to work in the following video.
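A quick aside for anyone wondering how a pinch gesture like this is detected under the hood: at its core, it is usually just a check that the tracked thumb tip and index fingertip are close enough together. First Hand itself is built on Meta's Interaction SDK, which handles this internally; the sketch below only illustrates the principle, using Google's MediaPipe Hands and a webcam as a stand-in hand tracker and an arbitrary distance threshold.

```python
# Minimal sketch of pinch detection: NOT how First Hand or the Quest does it
# internally, just the common "fingertip distance below a threshold" idea,
# demonstrated with MediaPipe Hands and a webcam as a stand-in hand tracker.
import math

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
PINCH_THRESHOLD = 0.05  # arbitrary threshold in normalized image coordinates

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            thumb = lm[mp_hands.HandLandmark.THUMB_TIP]
            index = lm[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            # A pinch counts as "thumb tip close to index fingertip".
            if math.dist((thumb.x, thumb.y), (index.x, index.y)) < PINCH_THRESHOLD:
                print("Pinch detected - this is where a teleport would trigger")
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

On a Quest, the same comparison would run on the 3D joint poses delivered by the headset's hand tracking rather than on 2D image coordinates, which is also why the threshold is something each app has to tune for itself.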
For one task in the city, I first have to knock on a wall with my fist and then give a robot a hand signal with my little finger. This hand tracking puzzle feels really intuitive.
Once all the problems in the four locations are solved, I have to use some of the skills I've learned back in the hub world to open various portals. After that, First Hand ends.
Is hand tracking the future of VR gaming?
First Hand entertained me for about an hour and is definitely worth a look as a free demo. Recognizing individual fingers and hand poses as an alternative input method can make VR and MR games even more versatile and immersive.
However, I would have liked to see more puzzles that rely on precise finger tracking, which should really have been the focus of this demo. Most of the puzzles are solved by picking up objects or pressing switches and buttons, which works just as well in VR games without hand tracking.
First Hand is just a demo that briefly shows, in simple scenarios, how hand tracking can stand in for VR controllers. In a full VR game, these interactions would have to build on each other and go deeper. Nevertheless, First Hand offers a promising outlook on the future of VR applications with hand tracking.
You can get First Hand in the Quest Store.
Note: Links to online stores in articles can be affiliate links. If you buy through such a link, MIXED receives a commission from the provider. The price does not change for you.