Meta to reveal its next-gen VR research at Siggraph

At Siggraph 2022, Meta offers a look at its current VR research. Here is an overview of the most interesting presentations.

The annual conference specializes in computer graphics and interactive technology. It provides a stage for exciting experiments, prototypes, and research projects as well as numerous lectures, discussion panels, and courses. This year's Siggraph will take place in Vancouver from August 8 to 11.

In June, Meta introduced a series of new VR headset prototypes. At Siggraph, Reality Labs researchers will give presentations on these and related projects.

HDR-VR: The Starburst prototype

Starburst is the codename of Meta's HDR prototype. The bulky VR headset hangs from the ceiling and is held by two handles. The prototype has a lamp with a luminance of 20,000 nits installed. For comparison: modern HDR TVs reach several thousand nits, while Quest 2 manages just 100 nits. According to Meta, the high brightness level can realistically simulate lighting conditions indoors or at night.

Meta developed this prototype to find out what effect HDR has on immersion. The researchers believe that HDR contributes more to visual realism than high resolution and varifocal displays.

Siggraph visitors will be able to try out the prototype on site. A research paper reveals details about Starburst's technology and operation.

Realistic VR Passthrough: NeuralPassthrough

Passthrough technology has many advantages over see-through AR optics, but brings its own set of problems.

One of these problems is that pass-through cameras cannot be positioned exactly at the spatial position of the eyes. The slightly shifted perspective on the world can lead to a form of motion sickness with prolonged use.

Meta's solution is AI-based view synthesis. At Siggraph, the researchers will present a method optimized for VR passthrough that synthesizes perspective-correct viewpoints in real time and with high visual fidelity.
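The geometric core of the problem can be illustrated without any neural network: if the depth of each pixel is known, the camera image can be warped toward the eye's position by shifting pixels according to their disparity. The sketch below is a deliberately simplified, purely geometric illustration (horizontal offset only, nearest-neighbor splatting, no hole filling); NeuralPassthrough itself uses learned stereo depth estimation and neural image reconstruction. All names and values here are illustrative assumptions, not from Meta's paper.

```python
import numpy as np

def reproject_to_eye(depth, f, baseline_x):
    """Warp a camera view toward the eye's viewpoint (toy version).

    Each pixel shifts horizontally by disparity = f * baseline_x / depth.
    Returns, per output pixel, the source column it was copied from,
    or -1 where the new viewpoint exposes previously hidden geometry
    (a "disocclusion hole" that real systems must inpaint).
    """
    h, w = depth.shape
    xs = np.arange(w)
    disparity = (f * baseline_x / depth).round().astype(int)
    out = np.full((h, w), -1)
    for y in range(h):
        order = np.argsort(-depth[y])   # splat far pixels first,
        src = xs[order]                 # so near pixels overwrite them
        tx = src + disparity[y, order]
        valid = (tx >= 0) & (tx < w)
        out[y, tx[valid]] = src[valid]
    return out

# Toy scene: a near object (depth 1) in front of a far background (depth 10)
depth = np.full((3, 8), 10.0)
depth[:, 3:5] = 1.0
mapping = reproject_to_eye(depth, f=4.0, baseline_x=0.5)
# Near pixels shift by 2 columns, the background barely moves,
# and -1 entries mark the disocclusion holes left behind.
```

Even this toy version shows why the problem is hard: the warp leaves holes where the eye can see around the near object, which is exactly what a learned method must fill in plausibly.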


Those interested in learning more about the Siggraph presentation can read the scientific paper on NeuralPassthrough.

Perfect VR Optics: Distortion Simulator

Lenses cause image distortions that must be corrected by software. The challenge lies in developing the correction algorithms: display researchers must first produce lenses and headsets before they can test and tune the corresponding software in practice. Iterations can therefore take weeks or months.
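The software correction described above is typically a "pre-distortion": the rendered image is warped with a polynomial radial model so that the lens's own distortion cancels it out. A minimal sketch of that idea follows; the coefficients `k1` and `k2` are illustrative placeholders, not values from any Meta headset.

```python
import numpy as np

def predistort(xy, k1=0.22, k2=0.08):
    """Barrel pre-distortion of normalized screen coordinates.

    Pushes points outward as a polynomial function of their squared
    distance from the lens center, so that a pincushion-distorting
    lens maps them back to straight lines. k1, k2 are assumed values.
    """
    r2 = np.sum(xy ** 2, axis=-1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)

# A 5x5 grid of normalized screen coordinates in [-1, 1]
ys, xs = np.mgrid[-1:1:5j, -1:1:5j]
grid = np.stack([xs, ys], axis=-1)

warped = predistort(grid)
# The center stays put; the corners are pushed outward.
assert np.allclose(warped[2, 2], [0.0, 0.0])
assert np.all(np.abs(warped[0, 0]) > np.abs(grid[0, 0]))
```

The simulator's value is that finding the right coefficients for a new lens design no longer requires building the physical lens first.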

To speed up this process, Reality Labs has developed a distortion simulator. It can be used to test different lenses, resolutions, and fields of view without having to build and put on a virtual reality headset.

At Siggraph, Meta will showcase the distortion simulator. A scientific paper accompanies the presentation.

Codec Avatars

There could also be an update on Meta's photorealistic telepresence avatars. Meta recently optimized its AI model to the point where a smartphone scan of the face is sufficient to generate an avatar.

Research director Yaser Sheikh will give a talk at Siggraph on the latest codec avatar advances. He will discuss the systems required to visually and acoustically train codec avatars, as well as future challenges to achieve metaverse telephony "at scale."