Apple WWDC Keynote - the most important AR announcements

Apple has unveiled its potential headset processor, talked gaming and surround sound, and shown how we will be able to use our smartphones to measure our surroundings in 3D in the future - for better AR?

As suspected in advance, there were no details about the potential mixed reality headset at Apple's WWDC keynote. Apple's keynote presenters, CEO Tim Cook among them, said little about AR and VR overall. Still, there were some announcements that could be relevant to Apple's future XR strategy.

Potential headset processor and game upscaling

Apple revealed the M2 processor, which is optimized for high performance with the lowest possible power consumption. This makes the M2 an obvious choice for Apple's possible mixed reality headset. However, the M2 will first be used in Apple's new MacBook Air and MacBook Pro (13-inch).

According to Apple, the 5-nanometer chip outperforms the M1 by 18 percent in CPU performance, 35 percent in graphics, and 40 percent in AI calculations ("Neural Engine"). It also offers 50 percent more memory bandwidth (100 GB/s) and supports up to 24 GB of unified memory.

According to Apple, the M2 can stream multiple 4K and 8K videos simultaneously, which could be interesting for XR video broadcasts of sporting events or for office work on the Mac with XR extensions, for example. Apple is reportedly working with Hollywood directors on video content for the headset.

Apple dedicated part of the keynote to gaming, traditionally a domain of Windows PCs. The new M2 chip is said to run current games such as Resident Evil Village in high resolution and smoothly - thanks in part to a new upscaling process for Metal. Apple relies on spatial upscaling, similar to AMD's FidelityFX Super Resolution, instead of the so far higher-quality AI upscaling of Nvidia's DLSS.
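Apple did not walk through the API on stage, but the new upscaling appears to ship as part of the MetalFX framework. A minimal sketch of setting up its spatial scaler, assuming a 1280×720 render target upscaled to 1440p; texture handles, sizes, and formats here are placeholders:

```swift
import Metal
import MetalFX

// Sketch: create a MetalFX spatial scaler that upscales a
// low-resolution render target to display resolution.
func makeSpatialScaler(device: MTLDevice) -> MTLFXSpatialScaler? {
    let desc = MTLFXSpatialScalerDescriptor()
    desc.inputWidth = 1280          // render resolution (placeholder)
    desc.inputHeight = 720
    desc.outputWidth = 2560         // display resolution (placeholder)
    desc.outputHeight = 1440
    desc.colorTextureFormat = .rgba16Float
    desc.outputTextureFormat = .rgba16Float
    desc.colorProcessingMode = .perceptual
    return desc.makeSpatialScaler(device: device)
}

// Per frame: point the scaler at the rendered frame and encode
// the upscale pass into the command buffer.
func encodeUpscale(scaler: MTLFXSpatialScaler,
                   rendered: MTLTexture,
                   upscaled: MTLTexture,
                   commandBuffer: MTLCommandBuffer) {
    scaler.colorTexture = rendered
    scaler.inputContentWidth = rendered.width
    scaler.inputContentHeight = rendered.height
    scaler.outputTexture = upscaled
    scaler.encode(commandBuffer: commandBuffer)
}
```

The per-frame cost is a single encode call on the command buffer, after which the upscaled texture can be presented.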

Spatial sound and spatial measurements

Apple also wants to improve surround sound with iOS 16. Owners of newer iPhones with the TrueDepth camera system can scan their ears; based on this scan, Apple creates a personalized surround sound profile for that person. This higher-quality spatial audio could improve future AR and VR experiences.
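The personalized profile itself is generated and applied at the system level, so there is no developer call for the ear scan. What apps control is where sound sources sit in 3D space; a minimal sketch using AVAudioEngine, in which the audio file name and the positions are placeholders:

```swift
import AVFoundation

// Sketch: position a mono sound source in 3D space around the
// listener. "chime.caf" is a placeholder asset.
func playSpatialDemo() throws {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // Mono sources are required for 3D spatialization.
    let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100,
                             channels: 1)
    engine.connect(player, to: environment, format: mono)
    engine.connect(environment, to: engine.mainMixerNode,
                   format: engine.mainMixerNode.outputFormat(forBus: 0))

    // Place the source one meter to the right of and two meters
    // in front of the listener.
    player.position = AVAudio3DPoint(x: 1, y: 0, z: -2)
    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

    let file = try AVAudioFile(forReading: Bundle.main.url(
        forResource: "chime", withExtension: "caf")!)
    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}
```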

Probably the most interesting announcement in the AR context is the new Swift API RoomPlan. Using the camera and LiDAR scanner in the iPhone and iPad, it automatically generates a 3D plan of the environment.

Meta offers a similar feature for Quest 2, but only for floors and walls. Apple's solution also automatically adds characteristic objects such as furniture or kitchen fixtures to the map. Meta could follow suit here with Cambria.

The 3D map of the environment is a basis for high-quality augmented reality at home or in the office. It lets users place digital objects such as virtual screens in the physical world in a visually realistic, persistent, and precise way. Developers can also export the 3D scans in USD or USDZ format for further processing in other applications.
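Going by Apple's RoomPlan documentation, a scan that ends in a USDZ export might look roughly like the following sketch; UI and error handling are omitted, and details of the API may differ:

```swift
import RoomPlan

// Sketch: run a RoomPlan scan and export the result as USDZ.
final class RoomScanner: RoomCaptureSessionDelegate {
    private let session = RoomCaptureSession()
    private let builder = RoomBuilder(options: [.beautifyObjects])

    func start() {
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    func stop() { session.stop() }

    // Called when the scan finishes; build the final 3D model.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData,
                        error: Error?) {
        guard error == nil else { return }
        Task {
            let room = try await builder.capturedRoom(from: data)
            // Export the parametric scan as USDZ for other apps.
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("Room.usdz")
            try room.export(to: url)
        }
    }
}
```

The exported file contains the walls, openings, and recognized furniture as structured 3D geometry rather than a raw mesh, which is what makes it usable as a floor plan in downstream tools.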

More AR announcements could follow - WWDC continues through this Friday, June 10. On June 9, for example, a deep dive into ARKit 6 is on the schedule. Apple promises 4K video recording of AR experiences, support for HDR video and high-resolution background image capture, and location anchors. Apple also wants to talk about features of high-quality AR apps and about 3D scanning.
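One of the announced ARKit 6 capabilities can already be sketched: opting into the higher-resolution camera feed on supported devices. The property name follows Apple's developer documentation; treat the snippet as illustrative:

```swift
import ARKit

// Sketch: request the recommended 4K video format where the
// hardware supports it, falling back to the default otherwise.
func makeWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if let fourK = ARWorldTrackingConfiguration
        .recommendedVideoFormatFor4KResolution {
        config.videoFormat = fourK
    }
    return config
}
```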

Sources: RoomPlan, Apple M2