Developer demonstrates realistic avatar on Meta Quest Pro

Meta Quest Pro can transfer eye, body and face movements to avatars. Meta's sample projects show what is possible.

Quest Pro (review) is the first Meta headset with eye and face tracking. With the brand-new Movement SDK, developers can now use these hardware features in their apps to make avatars appear more realistic and expressive.

The Movement SDK has three components:

  • Face Tracking (Meta Quest Pro only; see the sketch after this list),
  • Eye Tracking (Meta Quest Pro only), and
  • Body Tracking (Meta Quest Pro and Meta Quest 2).
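
Face and eye tracking deliver per-frame data rather than finished animations: face tracking, for instance, provides a set of expression (blendshape) weights that an app maps onto its avatar. As a rough, hedged illustration (not Meta's Movement SDK code, which targets Unity), a native app could read those weights through the underlying OpenXR extension XR_FB_face_tracking, assuming an XrInstance and XrSession that were created with the extension enabled and a predicted display time from the frame loop:

```cpp
#include <openxr/openxr.h>
#include <cstdio>

void ReadFaceWeights(XrInstance instance, XrSession session, XrTime displayTime) {
    // Extension entry points are not exported directly; fetch them by name.
    PFN_xrCreateFaceTrackerFB xrCreateFaceTrackerFB = nullptr;
    PFN_xrGetFaceExpressionWeightsFB xrGetFaceExpressionWeightsFB = nullptr;
    PFN_xrDestroyFaceTrackerFB xrDestroyFaceTrackerFB = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateFaceTrackerFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateFaceTrackerFB));
    xrGetInstanceProcAddr(instance, "xrGetFaceExpressionWeightsFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrGetFaceExpressionWeightsFB));
    xrGetInstanceProcAddr(instance, "xrDestroyFaceTrackerFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrDestroyFaceTrackerFB));

    // Create a face tracker with the default expression set.
    XrFaceTrackerCreateInfoFB createInfo{XR_TYPE_FACE_TRACKER_CREATE_INFO_FB};
    createInfo.faceExpressionSet = XR_FACE_EXPRESSION_SET_DEFAULT_FB;
    XrFaceTrackerFB faceTracker = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateFaceTrackerFB(session, &createInfo, &faceTracker))) {
        return;
    }

    // Query the per-blendshape weights for the current frame.
    float weights[XR_FACE_EXPRESSION_COUNT_FB] = {};
    float confidences[XR_FACE_CONFIDENCE_COUNT_FB] = {};
    XrFaceExpressionInfoFB info{XR_TYPE_FACE_EXPRESSION_INFO_FB};
    info.time = displayTime;
    XrFaceExpressionWeightsFB result{XR_TYPE_FACE_EXPRESSION_WEIGHTS_FB};
    result.weightCount = XR_FACE_EXPRESSION_COUNT_FB;
    result.weights = weights;
    result.confidenceCount = XR_FACE_CONFIDENCE_COUNT_FB;
    result.confidences = confidences;

    if (XR_SUCCEEDED(xrGetFaceExpressionWeightsFB(faceTracker, &info, &result))) {
        // Each weight drives one avatar blendshape, e.g. how far the jaw is open.
        std::printf("jaw drop: %.2f\n", weights[XR_FACE_EXPRESSION_JAW_DROP_FB]);
    }

    xrDestroyFaceTrackerFB(faceTracker);
}
```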

Meta uses the term "body tracking" somewhat loosely: both headsets track only head and hand movements, i.e. three body points in space. They do not capture movements of the hips, legs, or feet.

Body tracking transfers these captured body points to avatars. The goal: an AI model trained on human movements fills in the rest, so that the motion of the entire body looks as realistic as possible.
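
On the native side, this estimated body pose is exposed through the OpenXR extension XR_FB_body_tracking. The following is a minimal sketch under that assumption, again not Movement SDK code: it presumes an already-initialized XrInstance and XrSession with the extension enabled, a reference XrSpace, and the frame's predicted display time, and simply reads the runtime's estimated joint poses.

```cpp
#include <openxr/openxr.h>
#include <cstdio>
#include <vector>

void ReadBodyJoints(XrInstance instance, XrSession session,
                    XrSpace baseSpace, XrTime displayTime) {
    // Load the extension functions by name.
    PFN_xrCreateBodyTrackerFB xrCreateBodyTrackerFB = nullptr;
    PFN_xrLocateBodyJointsFB xrLocateBodyJointsFB = nullptr;
    PFN_xrDestroyBodyTrackerFB xrDestroyBodyTrackerFB = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateBodyTrackerFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateBodyTrackerFB));
    xrGetInstanceProcAddr(instance, "xrLocateBodyJointsFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrLocateBodyJointsFB));
    xrGetInstanceProcAddr(instance, "xrDestroyBodyTrackerFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrDestroyBodyTrackerFB));

    // Create a body tracker with the default joint set.
    XrBodyTrackerCreateInfoFB createInfo{XR_TYPE_BODY_TRACKER_CREATE_INFO_FB};
    createInfo.bodyJointSet = XR_BODY_JOINT_SET_DEFAULT_FB;
    XrBodyTrackerFB bodyTracker = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateBodyTrackerFB(session, &createInfo, &bodyTracker))) {
        return;
    }

    // Ask the runtime for the estimated joint poses at the given time.
    std::vector<XrBodyJointLocationFB> joints(XR_BODY_JOINT_COUNT_FB);
    XrBodyJointLocationsFB locations{XR_TYPE_BODY_JOINT_LOCATIONS_FB};
    locations.jointCount = static_cast<uint32_t>(joints.size());
    locations.jointLocations = joints.data();

    XrBodyJointsLocateInfoFB locateInfo{XR_TYPE_BODY_JOINTS_LOCATE_INFO_FB};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = displayTime;

    if (XR_SUCCEEDED(xrLocateBodyJointsFB(bodyTracker, &locateInfo, &locations)) &&
        locations.isActive) {
        // Each joint carries a pose plus validity flags; a real app would feed
        // these into its avatar rig instead of printing them.
        const XrBodyJointLocationFB& head = joints[XR_BODY_JOINT_HEAD_FB];
        if (head.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT) {
            std::printf("head at (%.2f, %.2f, %.2f)\n",
                        head.pose.position.x, head.pose.position.y,
                        head.pose.position.z);
        }
    }

    xrDestroyBodyTrackerFB(bodyTracker);
}
```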

Until recently, this feature was reserved for Meta's Horizon products. Now it's available to all developers, so it could begin to breathe life into avatars in other VR apps as well.

Meta publishes sample avatar projects

On GitHub, Meta offers sample code that lets developers explore the different aspects of the Movement SDK through concrete examples.

The sample code includes the following components:

  • the alien character Aura, which demonstrates the possibilities of eye and face tracking,
  • a realistic avatar designed to show how face tracking, eye tracking, and body tracking work together to create a compelling AAA gaming experience, and
  • two demos showing how body tracking interacts with virtual objects and how captured body points are retargeted to different humanoid character models.

Meta has occasionally shown off Aura and the realistic avatar in the past. With the sample code, both are now available to developers for experimentation and demonstration purposes.

The Movement SDK in action

XR developer and YouTuber Dilmer Valecillos demonstrates the features of the Movement SDK in a video. He also tries out Meta's sample code and walks through the individual components.

Why doesn't Meta offer proper demos for the new eye and face tracking? That's a good question. No demo included with the headset provides an entertaining introduction to the impressive new hardware features. The omission underlines that Meta Quest Pro is aimed primarily at developers, but it remains hard to excuse given the device's high price.