Meta's Flamera headset prototype fights distortion with "bug eyes"
In camera pass-through AR, how does light reach your eyes from the right perspective? Meta's mixed reality headset prototype offers a unique answer.
Mixed reality headsets like the Quest Pro or the Apple Vision Pro have a problem. Their external cameras sit a few centimetres in front of your eyes, so the camera view of the outside world that gets blended with computer graphics is not captured from the correct perspective. Algorithms have to correct the deviation by reprojecting the image.
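To get a feel for what that correction involves, the sketch below shows the most basic form of depth-based reprojection: every pixel is shifted by a disparity derived from its depth and the camera-to-eye offset. This is a generic Python illustration, not Meta's actual pipeline; the pinhole-camera model, the function name and the parameters are all assumptions.

```python
import numpy as np

def reproject_to_eye(image, depth, focal_px, offset_m):
    """Warp a passthrough camera frame toward the eye's viewpoint (sketch only).

    image:    (H, W, 3) colour frame from the front-facing camera
    depth:    (H, W) per-pixel depth in metres
    focal_px: focal length in pixels (pinhole-camera assumption)
    offset_m: horizontal camera-to-eye offset in metres (assumed value)
    """
    h, w = depth.shape
    out = np.zeros_like(image)
    # Nearby objects get the largest disparity, which is why naive
    # reprojection distorts hands and tabletops the most.
    disparity = focal_px * offset_m / np.maximum(depth, 1e-3)
    ys = np.repeat(np.arange(h)[:, None], w, axis=1)
    xs = np.tile(np.arange(w), (h, 1))
    new_x = np.clip(np.round(xs + disparity).astype(int), 0, w - 1)
    out[ys, new_x] = image  # forward warp; real systems must also fill holes
    return out
```

Because the shift depends entirely on the depth estimate, any error in the depth map turns directly into the warping artefacts this kind of reprojection is known for.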
Meta wants to solve this problem with an approach that makes users look like insects. The light-field camera technology used in the Flamera prototype headset, with its many small lenses, is not new, but Meta has adapted it for its own purposes. The researchers placed numerous apertures behind the lens grid, one behind each lens. These let through to the image sensor only those rays of light that would reach the eye if the headset were not there.
Meta's vision for distortion-free pass-through AR
Meta calls this technique "light field passthrough." Even with this approach, the image is first captured by the sensor and then processed to fit the display and lens in front of the user's eyes. However, Meta says the result is more accurate in perspective because the outside world is captured by the curved lens array on the front of the headset.
The design often combines the light passing through two adjacent lenses. The apertures then ensure that only the light relevant to the viewing direction gets through. In a video, Meta shows how the other rays are blocked.
Conventional light-field cameras, by contrast, capture far more rays than are needed, spreading the limited sensor resolution too thin and resulting in a much lower image resolution. With the new design, only the relevant parts of the light field land on the sensor pixels, says Meta.
The raw sensor data looks like small circles of light, each containing only a portion of the desired view of the outside world, almost like looking through a colander. Flamera rearranges the pixels and reconstructs the full image with the help of a depth map. According to Meta, the result is much less prone to distortion than on current headsets, opening up new possibilities for augmented reality applications.
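As a rough intuition for that rearrangement step, the toy sketch below copies each small image circle from the raw sensor into its place in the output view and averages overlaps. The circle metadata format is invented for illustration, and the depth-map refinement Meta describes is omitted; this is not the published Flamera reconstruction.

```python
import numpy as np

def reassemble_view(sensor, circles, out_shape):
    """Rearrange pixels from small sensor circles into one image (toy example).

    sensor:    (H, W, 3) raw sensor frame made up of many image circles
    circles:   iterable of (y0, x0, size, oy, ox) tuples: a size-by-size patch
               at (y0, x0) on the sensor belongs at (oy, ox) in the output
               (this metadata layout is a made-up assumption)
    out_shape: (H_out, W_out) of the reconstructed view
    """
    out = np.zeros((*out_shape, 3), dtype=np.float32)
    hits = np.zeros(out_shape, dtype=np.float32)
    for y0, x0, size, oy, ox in circles:
        patch = sensor[y0:y0 + size, x0:x0 + size].astype(np.float32)
        out[oy:oy + size, ox:ox + size] += patch
        hits[oy:oy + size, ox:ox + size] += 1.0
    # Average where neighbouring circles overlap; in the real system a depth
    # map refines where each pixel lands instead of this fixed placement.
    return out / np.maximum(hits, 1.0)[..., None]
```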
"By starting our headset design from scratch instead of modifying an existing design, we ended up with a camera that looks quite unique but can enable better passthrough image quality and lower latency", explains researcher Grace Kuo. The name Flamera, stands for "flat camera".
As thin as possible
One challenge on the road to the current prototype was getting bulky components, such as the electronics on the camera sensor, out of the way. "The Flamera optical design works best when the headset is thin, which lets us put the passthrough cameras as close to the user’s eyes as possible," Kuo says.
Another advantage of the new design is that it requires significantly less computing power. In the earlier Neural Passthrough project, an entire workstation was needed to compensate for artefacts caused by the AI calculations. In the Flamera design, the camera hardware and the reprojection algorithms are built to work together from the outset.
Display Systems Research Director Douglas Lanman admits that the prototype still looks pretty wacky. But it already shows one way that perspective-correct pass-through AR could work in the future. If you want to see it for yourself, you can check out the hardware at the SIGGRAPH conference from August 6 to 10.