Meta's Codec Avatars become more lifelike thanks to Gaussian splatting

Meta is experimenting with Gaussian splatting to make Codec Avatars look even more realistic than before.

With Codec Avatars, Meta hopes to one day enable photorealistic telepresence. The technology relies on VR headsets with eye and face tracking and sophisticated AI techniques. Mark Zuckerberg and Lex Fridman recently demonstrated the current state of the technology in a podcast.

One of the biggest technical hurdles to bringing Codec Avatars to market is that users currently need a detailed 3D scan of themselves, captured in a special studio in a process that takes hours. In the future, Meta hopes that photos taken with a smartphone will be good enough. Another obstacle is rendering the photorealistic avatars on standalone VR headsets like Meta Quest, which lack the necessary computing power. To solve this problem, Meta is working on a dedicated chip that could help render the avatars in real time.

The latest version of the avatar technology, Codec Avatars 2.0, has now received an update that uses Gaussian splatting for more realistic and efficient lighting of the avatars. Meta presents the new process in a research paper entitled "Relightable Gaussian Codec Avatars".

A new breakthrough

Gaussian splatting is a machine learning technique that reconstructs a 3D scene from a set of images of a real location and represents it as a cloud of 3D Gaussians. Unlike the technically related NeRFs (Neural Radiance Fields), Gaussian splats can be rendered in real time.
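To give a rough sense of the idea, here is a toy 2D sketch in Python (not Meta's implementation, and greatly simplified compared to real 3D Gaussian splatting): each splat is a blurry colored blob with a position, size, depth, and opacity, and the image is formed by painting the splats over each other in depth order. The splat fields (`x`, `y`, `sigma`, `depth`, `opacity`, `color`) are illustrative names chosen for this sketch.

```python
import math

def gaussian_weight(dx, dy, sigma):
    """Gaussian falloff of a splat at offset (dx, dy) from its center."""
    return math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))

def render(splats, width, height):
    """Composite 2D Gaussian splats into an RGB image (painter's algorithm)."""
    # Start from a black background; each pixel is a mutable [r, g, b] list.
    image = [[[0.0, 0.0, 0.0] for _ in range(width)] for _ in range(height)]
    # Paint the farthest splats first, so nearer splats cover them.
    for s in sorted(splats, key=lambda s: -s["depth"]):
        for y in range(height):
            for x in range(width):
                # Per-pixel opacity falls off with distance from the splat center.
                a = s["opacity"] * gaussian_weight(x - s["x"], y - s["y"], s["sigma"])
                px = image[y][x]
                for c in range(3):
                    # Standard "over" blending of the splat onto the pixel.
                    px[c] = a * s["color"][c] + (1.0 - a) * px[c]
    return image

# Example: one fully opaque red splat centered in a 5x5 image.
splat = {"x": 2, "y": 2, "sigma": 1.0, "depth": 1.0,
         "opacity": 1.0, "color": (1.0, 0.0, 0.0)}
image = render([splat], 5, 5)
```

Real Gaussian splatting works in 3D with anisotropic Gaussians projected to the screen, and the splat parameters are optimized from input photos rather than hand-written; the sketch only illustrates the compositing step that makes real-time rendering cheap.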

Meta researchers have applied Gaussian splatting to Codec Avatars with astonishing results: the avatars are lit more realistically than before, and even the finest details, such as individual strands of hair and skin pores, are rendered faithfully and respond to changing lighting conditions.

"Our method outperforms existing approaches without compromising real-time performance," write the researchers. They demonstrate the feasibility of their approach using a standard PC VR headset. The following video shows the technology in action.

More information can be found on the research project's website.