Meta to showcase its powerful new mixed reality API next week

At next week's GDC, VR studios will provide a first look at the possibilities of Meta's new Passthrough API.
The Game Developers Conference (GDC) takes place from March 17th to 21st in San Francisco. Meta will be present at the conference, hosting more than a dozen sessions and events.
At an event titled Merge Realities, Multiply Wonder: Expert Guidance on Mixed Reality Development, Meta will introduce the Passthrough API to developers. The new API supports "real-time use of customized computer-vision to be deployed in games" using the passthrough camera feed.
Meta promises "real-world examples of developers who are already leveraging our latest Passthrough Camera API to take their apps to the next level. See how they're using this powerful tool to create seamless and interactive MR experiences that blur the lines between reality and fantasy."
Meta will also give "an exclusive look at some of the exciting upcoming roadmap items that will revolutionize MR development. Discover how these new features will make it easier to create stunningly realistic and engaging MR experiences."
The event will take place on March 18, 2025. We hope it will coincide with the announcement of the general availability of the Passthrough API.
What is the Passthrough API?
Currently, mixed reality apps can only access abstract data about the physical environment, such as a rough 3D mesh of the room and bounding boxes representing individual pieces of furniture.
The Passthrough API, announced in September 2024, will give mixed reality apps access to the passthrough camera feed for the first time. This will enable apps to better recognize, analyze, and respond to their environment and individual objects - which is crucial for advanced mixed reality.
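Meta hasn't shared implementation details yet, so the following is purely an illustration: assuming the stream were exposed through Android's standard Camera2 API on the headset, a minimal frame loop in Kotlin might look like this. The camera selection, resolution, and permission handling are assumptions, not confirmed details of Meta's API.

```kotlin
import android.content.Context
import android.graphics.ImageFormat
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.media.Image
import android.media.ImageReader
import android.os.Handler

// Sketch: open a camera device and stream frames into an ImageReader.
// Assumptions: the passthrough feed shows up as a regular Camera2 device,
// and the app already holds the CAMERA permission plus whatever additional
// headset permission Meta ends up requiring.
fun startPassthroughStream(context: Context, handler: Handler, onFrame: (Image) -> Unit) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager

    // Assumption: pick the first listed camera as the passthrough feed.
    val cameraId = manager.cameraIdList.firstOrNull() ?: return

    // Receive frames as YUV images, keeping only a few buffers in flight.
    val reader = ImageReader.newInstance(1280, 960, ImageFormat.YUV_420_888, 2)
    reader.setOnImageAvailableListener({ r ->
        r.acquireLatestImage()?.let { image ->
            onFrame(image)   // hand the frame to app logic (detection, tracking, ...)
            image.close()
        }
    }, handler)

    manager.openCamera(cameraId, object : CameraDevice.StateCallback() {
        override fun onOpened(device: CameraDevice) {
            // Repeating preview request that targets the ImageReader surface.
            val builder = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
            builder.addTarget(reader.surface)
            device.createCaptureSession(listOf(reader.surface),
                object : CameraCaptureSession.StateCallback() {
                    override fun onConfigured(session: CameraCaptureSession) {
                        session.setRepeatingRequest(builder.build(), null, handler)
                    }
                    override fun onConfigureFailed(session: CameraCaptureSession) {}
                }, handler)
        }
        override fun onDisconnected(device: CameraDevice) = device.close()
        override fun onError(device: CameraDevice, error: Int) = device.close()
    }, handler)
}
```

Meta could just as well surface the feed through its own SDK rather than Camera2; the point of the sketch is simply that apps would receive raw camera frames to process however they like.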
In a support article that was published prematurely (first spotted by Luna), Meta gives examples of how apps can benefit from camera access.
- Object recognition. Developers can create apps that recognize and use specific objects within your real environment. For example, a digital board game that incorporates physical game pieces or boards.
- Location recognition. Developers can create experiences that respond differently depending on where the camera feed shows you are located. For example, indoors or outdoors, at a famous landmark, or in a specific type of room.
- Other machine learning functionality. Developers can run custom machine learning models against data from the real-time camera feed (see the hypothetical sketch after this list). This could be used for retexturing/shading, games involving participants who are not wearing headsets, person/animal detection, or any number of custom industrial/training use cases.
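To make that last point concrete, here is a hypothetical sketch of feeding such frames into a custom TensorFlow Lite model. The model file, the 224×224 RGB input size, and the output layout are placeholders chosen for illustration, not anything Meta has specified.

```kotlin
import android.media.Image
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Sketch: run a custom TensorFlow Lite model on each passthrough frame.
// The model, its input size, and its output layout depend entirely on
// what a developer ships with the app.
class FrameClassifier(modelFile: File, private val numClasses: Int) {
    private val interpreter = Interpreter(modelFile)

    // Placeholder preprocessing: convert a YUV camera frame into the
    // normalized RGB float buffer this example model expects (224x224x3).
    private fun preprocess(image: Image): ByteBuffer {
        val input = ByteBuffer.allocateDirect(4 * 224 * 224 * 3)
            .order(ByteOrder.nativeOrder())
        // YUV -> RGB conversion and normalization would go here.
        return input
    }

    // Returns per-class scores for one frame; this could be called from
    // the onFrame callback of the camera sketch above.
    fun classify(image: Image): FloatArray {
        val output = Array(1) { FloatArray(numClasses) }
        interpreter.run(preprocess(image), output)
        return output[0]
    }
}
```

In practice, inference like this would run off the render thread so that frame processing never stalls the headset's display loop.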
The Passthrough API has been announced for early 2025. Head of Reality Labs Andrew Bosworth said a month ago that this timeline is still valid and that some developers already have access to the API and are experimenting with it. So we should see the fruits of those experiments next week.
What new possibilities are you hoping for from the Passthrough API? Join the conversation on Facebook, Bluesky, or X, or share your opinion in the comments below.