Meta Quest 3 has a much better understanding of your room

Meta Quest 3 takes the spatial understanding of Quest headsets to a new level. Meta explains what exactly is new.

The information comes from an official support page that Meta may have published too early. Meta Quest 3 will be revealed in detail on September 27 at Meta Connect 2023. The page was first spotted by Twitter user Luna.

Meta Quest 3 is optimized for mixed reality and will be able to blend the physical environment and digital elements more believably than previous Quest headsets thanks to high-quality color passthrough and an integrated depth sensor.

The foundation of mixed reality is the device's ability to understand the spatial layout of a room and to classify and distinguish the objects within it. It does this by collecting spatial data.

What spatial data is and what it does

On the support page, Meta goes into detail about the spatial data that Quest headsets, and Meta Quest 3 in particular, collect. To use mixed reality features, you must first grant spatial data access to any application that relies on them. If you don't, the mixed reality experience may be degraded or the application may not launch at all.
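
As a sketch of what granting spatial data access looks like from an app's side, here is a minimal Kotlin snippet for an Android-based Quest app that checks and requests the permission at runtime. The permission string and request code are assumptions for illustration; consult Meta's current documentation for the exact name your SDK version uses.

```kotlin
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Assumed permission name for spatial data access on Quest; verify against Meta's docs.
const val SPATIAL_DATA_PERMISSION = "com.oculus.permission.USE_SCENE"
const val SPATIAL_DATA_REQUEST_CODE = 1001

/** Checks for spatial data access and prompts the user if it has not been granted yet. */
fun ensureSpatialDataPermission(activity: Activity) {
    val granted = ContextCompat.checkSelfPermission(activity, SPATIAL_DATA_PERMISSION) ==
        PackageManager.PERMISSION_GRANTED
    if (!granted) {
        // Without this grant, mixed reality features may degrade or the app may not start.
        ActivityCompat.requestPermissions(
            activity,
            arrayOf(SPATIAL_DATA_PERMISSION),
            SPATIAL_DATA_REQUEST_CODE
        )
    }
}
```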

The depth sensor (center) of the Meta Quest 3 enables a better mixed reality experience. | Image: Meta

"Spatial data refers to the information collected about the size, shape, and location of walls, surfaces, and objects in a physical space," Meta writes on the page. "Apps that blend virtual and real-world environments use spatial data to understand the space around you and where you are within that space."

According to Meta, the headset creates a digital model of your environment by

  • recognizing objects and surfaces
  • labeling objects (examples: table, couch, window)
  • estimating the size, shape, and distance of those objects in relation to each other and to your headset (see the sketch after this list).
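
To make the three steps above concrete, here is a rough Kotlin data model of what such a room description could contain. The type and field names are illustrative assumptions, not Meta's SDK API.

```kotlin
import kotlin.math.sqrt

// A simple 3D point/size, relative to the headset's tracking origin.
data class Vector3(val x: Float, val y: Float, val z: Float)

/** One recognized surface or object in the room, with a semantic label. */
data class SceneObject(
    val label: String,   // e.g. "TABLE", "COUCH", "WINDOW"
    val center: Vector3, // estimated position of the object
    val size: Vector3    // estimated width, height, and depth
)

/** The headset's digital model of the room: a list of labeled, measured objects. */
data class RoomModel(val objects: List<SceneObject>) {
    /** Distance between the centers of two recognized objects. */
    fun distanceBetween(a: SceneObject, b: SceneObject): Float {
        val dx = a.center.x - b.center.x
        val dy = a.center.y - b.center.y
        val dz = a.center.z - b.center.z
        return sqrt(dx * dx + dy * dy + dz * dz)
    }
}
```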

The collection of spatial data enables a range of applications. Digital objects can:

  • be attached to or placed on physical objects,
  • bounce off or otherwise interact with physical objects, and
  • move more realistically within a space and past physical objects (see the physics sketch below).
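
Here is that physics sketch in simplified Kotlin. It treats a recognized surface as a flat, horizontal plane and bounces a virtual object off it; a real app would use the full mesh and a physics engine, so this is only an assumption-laden illustration.

```kotlin
// A virtual ball with vertical position and velocity (meters, meters per second).
data class Ball(var height: Float, var verticalVelocity: Float)

/**
 * Advances the ball by one time step and bounces it off a physical surface
 * whose height comes from the headset's spatial data.
 */
fun stepBall(
    ball: Ball,
    surfaceHeight: Float,        // e.g. the top of a detected table
    dt: Float,                   // time step in seconds
    gravity: Float = -9.81f,
    restitution: Float = 0.6f    // how much energy survives the bounce
) {
    ball.verticalVelocity += gravity * dt
    ball.height += ball.verticalVelocity * dt
    if (ball.height < surfaceHeight && ball.verticalVelocity < 0f) {
        // The ball hit the physical surface: clamp it and reflect the velocity.
        ball.height = surfaceHeight
        ball.verticalVelocity = -ball.verticalVelocity * restitution
    }
}
```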

Meta Quest 3 supports new types of spatial data

Spatial data comes in three types: Scene data, Mesh data, and Depth data.

Scene data, Meta explains, is a simplified model of a room and enables more physical awareness of a user’s surroundings.

Mesh data includes information about the shape and structure of physical objects and allows realistic interactions between digital and physical objects.

Finally, Depth data contains information about the distance between objects and enables a realistic rendering of virtual objects in your room, including occlusion. Only Meta Quest 3 supports all three types of spatial data.
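
One way to picture how these three types differ is as a small Kotlin type hierarchy. The field names are assumptions for the sketch and do not mirror Meta's SDK types.

```kotlin
sealed class SpatialData {
    /** Scene data: a simplified room model made of labeled objects and surfaces. */
    data class Scene(val labeledObjects: List<String>) : SpatialData()

    /** Mesh data: the shape and structure of physical objects as a triangle mesh. */
    data class Mesh(val vertices: List<Float>, val triangleIndices: List<Int>) : SpatialData()

    /** Depth data: per-pixel distances used to occlude virtual objects correctly. */
    data class Depth(val widthPx: Int, val heightPx: Int, val depthMeters: List<Float>) : SpatialData()
}
```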

This table shows which types of spatial data Meta Quest headsets support. | Image: Meta

This is due to Meta Quest 3's depth sensor, which is integrated into a Quest headset for the first time. A short video illustrates this process and the types of interactions it makes possible.
