Apple Vision Pro offers great opportunities for innovative processes

The opportunity for Apple Vision Pro lies not in the hardware, but in its integration into the Apple ecosystem, says our guest author.

By Fabian Kreuzer

In product development, rapid testing is critical for sound evaluation. This is especially challenging for concepts that are better validated in a real-world environment or in 3D. In product design, Virtual Reality is already being used to evaluate concepts, but it often involves significant additional work.

The power of Apple Vision Pro lies not in its hardware, but in its seamless integration into the existing Apple ecosystem. Rather than creating an additional workload when switching to spatial computing, it simply adds a few clicks to the existing work process.

Efficient testing of embedded interfaces with Figma and Apple Vision Pro

Currently, testing the concepts of hardware interfaces in the context of a product requires a significant amount of time and money. For example, when developing a touch interface for a coffee maker, the right display, electronics, and programming must be integrated into the prototype to create a realistic setup for user testing.

This collaboration between disciplines such as modeling, electronics, programming, and design makes concept testing time-consuming and costly.

Picture: Fabian Kreuzer

The interaction between macOS and visionOS could now create an efficient way to place and test Figma or ProtoPie prototypes on the product without additional effort. This would significantly speed up the innovation process.

Interaction concepts and visualizations could be developed and tested together with the hardware at every stage. User testing could be done on foam models, 3D-printed prototypes, and near-series products. Any display size and shape could be tested without additional effort. There would be no need to order, implement, and program possible display variants until all options have been evaluated. This would open up significant innovation potential for industries such as automotive, medical, consumer electronics and home appliances to validate ideas quickly and cost-effectively.

Picture: Fabian Kreuzer

This specific example makes it clear that the success of Apple Vision Pro would probably not come from a new application, but rather from the meaningful extension of existing products and applications with the help of spatial computing.

The symbiosis of Vision Pro with products such as the MacBook, iPad, Apple Pencil, iPhone and Apple Watch would likely enhance existing applications.

Mac

Picture: Fabian Kreuzer

With Mac Virtual Display, work efficiency when working from home, on the road, or during a "workation" would no longer be limited by screen size. For example, existing applications could be placed freely around the Mac, and the potential of three-dimensional visualization could be fully exploited. Using CAD software, products could be designed as usual and simultaneously visualized and discussed in three dimensions.

Combining hardware and software prototypes in product development would also provide new and efficient ways to test concepts. For example, interfaces could be designed with Figma and interactions tested directly on hardware prototypes.


iPad

Picture: Fabian Kreuzer

Working with the iPad in Mixed Reality would offer similar benefits to working with a MacBook.

In addition, the iPad would offer greater mobility, gyro-sensor integration, and further interaction options such as the touchscreen or the Apple Pencil.

iPhone

Picture: Fabian Kreuzer

A big advantage of integrating the iPhone into XR would be its familiar, quick, and easy interaction model: there would be no need to learn a new interaction logic. In addition, the iPhone is the most obvious way to access personal content. The iPhone interface would gain XR functionality when viewed through the Vision Pro.

Of course, there would also be new possibilities, for example integrating the iPhone and its existing sensors into gaming applications.

Apple Watch

Picture: Fabian Kreuzer

Integrating the Apple Watch into XR applications would primarily provide the ability to analyze physical stress and movement and adapt the XR experience accordingly, for example, to generate personalized training content.

AirPods

Picture: Fabian Kreuzer

Optimal XR experiences would require a realistic soundscape with 3D audio, so it makes sense to integrate existing audio devices such as the AirPods into the XR experience.

About the author: Fabian Kreuzer is a freelance interaction designer and an expert in holistic, brand-defining, and multi-sensory user experiences. As Competence Lead Interaction Innovation and as a design lead, he has realized numerous UX and UI projects for international design agencies and clients.