Meta's codec avatars get photorealistic clothes
The faces of Meta's codec avatars are already quite convincing. In a new research presentation, the virtual outfits impress as well.
Meta has been steadily refining its experimental Codec Avatars since 2019. If all goes well, the photorealistic figures will one day stand in for their users in the virtual world: when someone puts on a VR headset, a digital twin appears in virtual reality and moves almost exactly like the real person. Among other things, Meta relies on eye- and mouth-tracking sensors built into the headset.
An AI model uses this data to create a lifelike animated image of the person, including eye movements, facial expressions, and teeth. Codec Avatars are still an internal research project, and producing such a representation in real time demands a lot of computing power. However, the barriers to entry have already been lowered significantly: thanks to advances in AI rendering, codec avatars can now run on a Quest 2.
Realistic clothes for Meta's AI avatars
In the future, realistic clothing simulation is supposed to make the avatars look even more authentic in motion. A paper written in collaboration with researchers from Carnegie Mellon University and the University of Minnesota explains how this is to be achieved.
In the paper "Dressing Avatars: Deep Photorealistic Appearance for Physically Simulated Clothing" by Donglai Xiang and his co-authors, artificial intelligence once again plays an important role. A neural network calculates how, for example, a loose dress drapes over different body shapes as the wearer turns. The clothing is still computed separately: the new system runs in parallel with the usual Codec Avatar pipeline.
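The paper combines a physics-based cloth simulation with a learned appearance model; the basic idea of a pose-driven garment network running alongside the body avatar can be illustrated with a minimal sketch. Everything below (names, dimensions, architecture) is a hypothetical illustration under those assumptions, not Meta's actual code.

# Hypothetical sketch: a small network that, given body pose parameters,
# predicts per-vertex offsets for a garment mesh, evaluated separately
# from (and in parallel with) the body avatar. Names and dimensions are
# illustrative only.
import torch
import torch.nn as nn

NUM_POSE_PARAMS = 72        # e.g. joint rotations of the body model (assumed)
NUM_GARMENT_VERTS = 5000    # vertices of the garment mesh (assumed)

class GarmentDeformationNet(nn.Module):
    """Maps a body pose vector to 3D offsets for every garment vertex."""
    def __init__(self):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(NUM_POSE_PARAMS, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, NUM_GARMENT_VERTS * 3),
        )

    def forward(self, pose: torch.Tensor) -> torch.Tensor:
        # pose: (batch, NUM_POSE_PARAMS) -> offsets: (batch, verts, 3)
        return self.mlp(pose).view(-1, NUM_GARMENT_VERTS, 3)

# Toy usage: deform a rest-pose garment by the predicted offsets.
net = GarmentDeformationNet()
rest_garment = torch.zeros(NUM_GARMENT_VERTS, 3)   # placeholder rest shape
pose = torch.randn(1, NUM_POSE_PARAMS)             # placeholder body pose
deformed = rest_garment + net(pose)[0]             # posed garment vertices
print(deformed.shape)  # torch.Size([5000, 3])

In the actual research system, the deformed garment is then rendered photorealistically on top of the body avatar; the sketch only shows why the clothing can be treated as a separate, parallel component.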
Lighting effects, shadows, and ambient occlusion round out the visual impression. The size of the virtual garments can be freely adjusted, and the software can layer different types of fabric. In the demo video, even an oversized T-shirt glides realistically over a dress.
Too demanding for Meta Quest 2?
So far, a powerful workstation with high-end graphics cards is needed for the computation: in one example, three GeForce RTX 3090s run the simulation at a smooth 30 frames per second. However, significant optimizations could eventually arrive here as well.
No one should expect a release anytime soon, though. In May 2022, research director Yaser Sheikh said that Metaverse telephony, which could one day let AR headsets project people into the environment as realistic holograms, is still "five miracles" away from market readiness.
Incidentally, anyone who would rather not appear in virtual reality as a lifelike version of themselves need not worry for now. In an interview with Lex Fridman, Mark Zuckerberg said that the realistic codec avatars are only intended as a complement to the existing cartoon-style avatars.