Meta Quest Pro: What the new eye and face tracking can do

With Quest Pro, Meta introduces new interfaces for avatars that better reflect the facial expressions and body movements of users.

At Meta Connect 2022, Meta introduced the Movement SDK, which allows developers to implement the following features in their VR apps:

  • Face Tracking (with Meta Quest Pro)
  • Eye Tracking (with Meta Quest Pro)
  • Body Tracking (with Meta Quest Pro and Meta Quest 2)

In the following sections, I briefly introduce the new features.

Quest Pro's Face & Eye Tracking

This feature draws on five new infrared sensors inside the Meta Quest Pro: three aimed at the eyes and the upper part of the face, and two at the lower part of the face.

A pre-trained AI model derives facial expressions from the abstract sensor data based on the Facial Action Coding System (FACS), a widely used coding method for human facial expressions. Developers can map these facial expressions to a wide variety of avatars, whether cartoonish or realistic, human or fantasy.
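To illustrate the idea, here is a minimal sketch of what mapping FACS-style action-unit weights onto an avatar's blend shapes could look like. It does not use the actual Movement SDK API; the action-unit keys, the Alien Aura mapping table, and the function names are assumptions for illustration only.

```python
# Minimal sketch (not the Movement SDK API): mapping FACS-style action-unit
# weights from a face tracker onto an avatar's blend shapes.
# The AU keys, the Alien Aura mapping, and the blend-shape names are made up.

# Per-avatar table: FACS action unit -> blend-shape name on this avatar's mesh.
ALIEN_AURA_MAP = {
    "AU06_cheek_raiser":      "cheekRaise",
    "AU12_lip_corner_puller": "mouthSmile",
    "AU26_jaw_drop":          "jawOpen",
    "AU45_blink":             "eyesClosed",
}

def apply_expression(au_weights, mapping):
    """Convert action-unit weights (0.0-1.0) into blend-shape weights for one avatar."""
    blend_shapes = {}
    for au, weight in au_weights.items():
        target = mapping.get(au)
        if target is not None:
            # Clamp before driving the avatar mesh.
            blend_shapes[target] = max(0.0, min(1.0, weight))
    return blend_shapes

# One tracked frame: a broad smile with a slightly open jaw.
frame = {"AU12_lip_corner_puller": 0.9, "AU26_jaw_drop": 0.3}
print(apply_expression(frame, ALIEN_AURA_MAP))  # {'mouthSmile': 0.9, 'jawOpen': 0.3}
```

Because the mapping table is defined per avatar, the same tracker output can drive anything from a realistic human face to a stylized alien.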

Meta has created two avatar examples for developers to experiment with. One is the Alien Aura, which can represent an impressive range of facial expressions.

Eye tracking also allows for eye contact or new forms of interaction. Face and eye tracking are off by default and must be activated by the user in the headset settings. According to Meta, sensor data remains on the device and is deleted as soon as it is no longer needed for data processing.
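As a rough illustration of such gaze-based interaction, the following sketch checks whether a tracked gaze direction points at another avatar's head, which an app could use to trigger an eye-contact cue. The vector math is generic; the data source, threshold, and coordinate conventions are assumptions, not part of the Movement SDK.

```python
# Minimal sketch: deciding whether a tracked gaze direction points at another
# avatar's head, which an app could use for eye contact cues. The data source,
# threshold, and coordinates are assumptions.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v) if length > 0 else (0.0, 0.0, 1.0)

def is_eye_contact(gaze_dir, own_head_pos, other_head_pos, cos_threshold=0.95):
    """True if the gaze ray points (almost) straight at the other avatar's head."""
    to_other = normalize(tuple(b - a for a, b in zip(own_head_pos, other_head_pos)))
    cos_angle = sum(g * t for g, t in zip(normalize(gaze_dir), to_other))
    return cos_angle >= cos_threshold

# Example: looking roughly at an avatar standing two meters ahead.
print(is_eye_contact((0.05, 0.0, 1.0), (0.0, 1.6, 0.0), (0.1, 1.6, 2.0)))  # True
```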

Body Tracking

Body tracking is supported by the Meta Quest 2 as well as the Meta Quest Pro. The term is somewhat misleading: in fact, this feature only covers head and hand movements. The headsets do not record movements of the hips, legs, or feet.

The body tracking feature of the Movement SDK transfers the movement data to avatars in such a way that they look as natural as possible.

Machine learning plays a part here as well. "We used a large dataset of real human movements to correct for the errors that are common with inverse kinematics or IK-based approaches," explains Meta product manager Vibhor Saxena. Inverse kinematics comes from robotics and refers to estimating the spatial configuration of limbs from a limited number of data points.
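For readers unfamiliar with IK, the following sketch shows the basic idea on a single arm: from only the shoulder and wrist positions, the elbow bend of a two-segment arm can be estimated with the law of cosines. This is a generic textbook calculation, not Meta's model, and the segment lengths are made-up values.

```python
# Generic two-bone IK example (not Meta's model): with only the shoulder and
# wrist positions known, estimate the elbow bend of a two-segment arm with the
# law of cosines. The segment lengths are made-up values in meters.
import math

def elbow_angle(shoulder, wrist, upper_arm=0.30, forearm=0.27):
    """Return the elbow bend in degrees for an arm reaching from shoulder to wrist."""
    reach = math.dist(shoulder, wrist)
    # Clamp to what the arm can physically cover (fully bent .. fully stretched).
    reach = max(abs(upper_arm - forearm), min(upper_arm + forearm, reach))
    # Law of cosines: reach^2 = u^2 + f^2 - 2*u*f*cos(elbow)
    cos_elbow = (upper_arm**2 + forearm**2 - reach**2) / (2 * upper_arm * forearm)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_elbow))))

# Hand close to the shoulder -> sharply bent elbow; hand far away -> nearly straight.
print(round(elbow_angle((0.0, 1.4, 0.0), (0.15, 1.4, 0.0))))  # ~30 degrees
print(round(elbow_angle((0.0, 1.4, 0.0), (0.55, 1.4, 0.0))))  # ~150 degrees
```

Pure IK of this kind often produces implausible poses, which is exactly the error Meta says its learned model corrects for.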

Existing Meta Horizon products already use this body tracking feature. Now it is available to developers, enabling more realistic avatars in VR apps beyond Horizon Home, Worlds, and Workrooms without developers having to program their own IK models. The system is flexible: it can seamlessly switch between hand and controller tracking and can be mapped to differently shaped skeletons.
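The following sketch illustrates those two properties in generic terms: falling back from hand tracking to controller tracking, and retargeting tracked joints onto a rig with different bone names. All function, joint, and bone names are hypothetical and not the Movement SDK API.

```python
# Minimal sketch of two properties mentioned above: falling back between hand
# and controller tracking, and retargeting tracked joints onto a skeleton with
# different bone names. All names here are hypothetical, not the Movement SDK API.

def active_hand_pose(hand_pose, controller_pose):
    """Prefer tracked hands; fall back to the controller pose if hands are lost."""
    return hand_pose if hand_pose is not None else controller_pose

# Retargeting table: tracked joint name -> bone name in one particular avatar rig.
ROBOT_RIG = {
    "head":       "Bip01_Head",
    "left_hand":  "Bip01_L_Hand",
    "right_hand": "Bip01_R_Hand",
}

def retarget(tracked_joints, rig):
    """Copy tracked joint positions onto the corresponding bones of a target skeleton."""
    return {rig[joint]: pose for joint, pose in tracked_joints.items() if joint in rig}

# The same tracking data can drive rigs with entirely different bone names.
tracked = {"head": (0.0, 1.7, 0.0), "left_hand": (-0.3, 1.2, 0.4)}
print(retarget(tracked, ROBOT_RIG))
# {'Bip01_Head': (0.0, 1.7, 0.0), 'Bip01_L_Hand': (-0.3, 1.2, 0.4)}
```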

"We used a large data set of real human motions to learn and correct the errors that are commonly seen with inverse kinematic or IK-based approaches," Saxena promises. He might be referring to avatars with simulated legs, among other things, which will probably also fall under the Movement SDK when it is released.

If you want to delve deeper into the topic, the following developer session explains the innovations of the new SDK.

The Movement SDK is part of the Presence Platform, Meta's collective term for immersive interfaces and SDKs. Meta divides these into three pillars: mixed reality (Insight SDK), interactions (Interaction SDK and Voice SDK), and social presence. The Movement SDK is the first to belong to the latter.