At Siggraph 2022, Meta offers a glimpse into its current VR research. Here is an overview of the most interesting presentations.
The annual conference specializes in computer graphics and interactive technology. It provides a stage for exciting experiments, prototypes, and research projects as well as numerous lectures, discussion panels, and courses. This year’s Siggraph will take place in Vancouver from August 8 to 11.
Starburst is the codename of Meta’s HDR prototype. The bulky VR headset hangs from the ceiling and is held by two handles. The prototype is fitted with a lamp that reaches a peak brightness of 20,000 nits. For comparison: modern HDR TVs reach several thousand nits, while Quest 2 manages just 100 nits. According to Meta, this brightness level can realistically simulate lighting conditions indoors or at night.
Meta developed this prototype to find out what effect HDR has on immersion. The researchers believe that HDR contributes more to visual realism than high resolution and varifocal displays.
One problem with camera passthrough is that the cameras cannot be positioned exactly at the spatial position of the eyes. The slightly shifted perspective on the world can lead to a form of motion sickness with prolonged use.
Meta’s solution is AI-based gaze synthesis. At Siggraph, the researchers will present a method optimized for VR passthrough that synthesizes perspective-correct viewpoints in real-time and with high visual fidelity.
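Meta’s actual method is a learned, neural approach; the geometric core of the problem it solves can nonetheless be sketched with a classic depth-based reprojection. The intrinsics, the camera-to-eye offset, and the helper names below are illustrative assumptions, not Meta’s pipeline:

```python
import numpy as np

# Hypothetical sketch: mapping a passthrough-camera pixel to the eye's
# viewpoint with a pinhole model and a per-pixel depth estimate.
# All values here are assumptions for illustration.

# Assumed camera intrinsics (focal length in pixels, principal point).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])

# Assumed offset between camera and eye, in meters (the camera sits
# slightly in front of and above the eye; rotation assumed identity).
t_cam_to_eye = np.array([0.0, -0.01, -0.02])

def reproject(u, v, depth):
    """Map a camera pixel (u, v) at the given depth to eye-view pixel coords."""
    # Back-project the pixel into a 3D point in camera space.
    p_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Shift the point into the eye's coordinate frame.
    p_eye = p_cam + t_cam_to_eye
    # Project back onto the image plane of the virtual eye camera.
    uvw = K @ p_eye
    return uvw[:2] / uvw[2]

# A nearby point shifts noticeably; a distant point barely moves,
# which is why the perspective error is most disturbing at arm's length.
print(reproject(320, 240, 0.5))   # near object: visible parallax shift
print(reproject(320, 240, 10.0))  # far object: almost unchanged
```

This also hints at why the problem is hard: per-pixel depth is noisy and incomplete in practice, which is where the AI-based synthesis comes in.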
Lenses cause image distortions that must be corrected by software. The challenge lies in developing the correction algorithms: display researchers must first produce lenses and headsets before they can test and tune the corresponding software in practice. A single iteration can therefore take weeks or months.
To speed up this process, Reality Labs has developed a distortion simulator. It can be used to test different lenses, resolutions, and fields of view without having to build and put on a virtual reality headset.
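The article does not describe how the simulator works internally, but the kind of software correction it helps develop can be illustrated with the standard Brown-Conrady radial distortion model. The coefficients below are made-up illustration values; a real pipeline calibrates them per lens design:

```python
# Minimal sketch of software lens-distortion correction using the
# Brown-Conrady radial model. k1 and k2 are assumed values for
# illustration, not coefficients of any real headset lens.
k1, k2 = -0.25, 0.05

def distort(x, y):
    """Forward model: where the lens maps an ideal (normalized) point."""
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * f, y * f

def undistort(x_d, y_d, iterations=10):
    """Invert the forward model by fixed-point iteration."""
    x, y = x_d, y_d
    for _ in range(iterations):
        r2 = x * x + y * y
        f = 1.0 + k1 * r2 + k2 * r2 * r2
        # Re-estimate the ideal point from the distorted observation.
        x, y = x_d / f, y_d / f
    return x, y

# Round trip: distort an ideal point, then recover it in software.
xd, yd = distort(0.4, 0.3)
print(undistort(xd, yd))  # recovers approximately (0.4, 0.3)
```

A simulator lets researchers plug different coefficient sets, resolutions, and fields of view into exactly this kind of correction loop without waiting for physical lens prototypes.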
Research director Yaser Sheikh will give a talk at Siggraph on the latest codec avatar advances. He will discuss the systems required to visually and acoustically train codec avatars, as well as future challenges to achieve metaverse telephony “at scale.”