Apple Vision Pro may be able to read your mind

A former Apple employee gives insight into the development of the Apple Vision Pro and talks about “mind reading” through AI.

Sterling Crispin says he worked as a Neurotechnology Prototyping Researcher in Apple's Technology Development Group, where he contributed to the development of the recently announced Vision Pro VR/AR headset. Among other things, he worked on technology that uses artificial intelligence to predict user behavior.

According to Crispin, much of the technology developed during his time at Apple is confidential. However, he said that because some aspects of his work became public through patents, he is now allowed to talk about them. “Generally as a whole, a lot of the work I did involved detecting the mental state of users based on data from their body and brain when they were in immersive experiences,” Crispin wrote on Twitter.

Vision Pro development: Mind reading through AI?

According to Crispin, these states were inferred from eye movements, electrical activity in the brain, heart rate and rhythm, muscle activity, blood density in the brain, blood pressure, skin conductance, and more. Using this data, “AI models are trying to predict if you are feeling curious, mind wandering, scared, paying attention, remembering a past experience, or some other cognitive state,” Crispin said.
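Crispin doesn't describe the models themselves, but the pipeline he outlines (physiological features in, a coarse cognitive-state label out) can be sketched. The feature names, weights, thresholds, and labels below are purely illustrative assumptions standing in for a trained classifier; Apple's actual models are not public:

```python
# Illustrative sketch only: Apple's models are not public. The feature names,
# weights, and state labels are hypothetical, chosen to mirror the signal
# types Crispin lists (eye movement, heart rate, skin conductance, etc.).
from dataclasses import dataclass

@dataclass
class BiosignalSample:
    pupil_diameter_mm: float    # from eye tracking
    heart_rate_bpm: float       # heartbeat and rhythm
    muscle_activity: float      # normalized muscle-tension level, 0..1
    skin_conductance_us: float  # electrodermal activity, microsiemens

def predict_state(sample: BiosignalSample) -> str:
    """Map one feature vector to a coarse cognitive-state label.

    A real system would run a trained classifier over time-series windows;
    this hand-tuned rule merely stands in for that model.
    """
    # Crude arousal index: elevated heart rate, sweat, and muscle tension.
    arousal = (0.4 * (sample.heart_rate_bpm - 60) / 40
               + 0.3 * sample.skin_conductance_us / 10
               + 0.3 * sample.muscle_activity)
    # Pupil dilation as a rough proxy for attention/engagement.
    attention = (sample.pupil_diameter_mm - 3.0) / 3.0

    if arousal > 0.8:
        return "scared"
    if attention > 0.5:
        return "paying attention" if arousal > 0.3 else "curious"
    return "mind wandering"

print(predict_state(BiosignalSample(4.8, 72, 0.2, 4.0)))  # -> curious
```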

This kind of AI analysis of users in mixed reality or virtual reality applications is meant to detect not only emotional states but also cognitive intentions. In his thread, the researcher reveals that he worked on technologies such as predicting a user's decisions before they act, a kind of mind reading by AI.

“Your pupil reacts before you click, in part because you expect something will happen after you click. So, you can create biofeedback with a user's brain by monitoring their eye behavior, and redesigning the UI in real time to create more of this anticipatory pupil response. It’s a crude brain computer interface via the eyes,” Crispin explained.
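As a thought experiment, the loop Crispin sketches could look something like this: monitor pupil diameter against a rolling baseline, flag an anticipatory dilation, and feed that back into the UI. Everything below, from class names to window size and threshold, is a hypothetical illustration, not Apple's code:

```python
# Hypothetical sketch of the biofeedback loop Crispin describes: detect an
# anticipatory pupil dilation against a rolling baseline and adapt the UI.
from collections import deque

class AnticipationDetector:
    def __init__(self, window: int = 30, threshold_mm: float = 0.15):
        self.baseline = deque(maxlen=window)  # recent pupil-diameter samples
        self.threshold_mm = threshold_mm      # dilation counted as anticipation

    def update(self, pupil_diameter_mm: float) -> bool:
        """Return True when the pupil dilates notably above its rolling mean."""
        anticipating = (
            len(self.baseline) == self.baseline.maxlen
            and pupil_diameter_mm - sum(self.baseline) / len(self.baseline)
            > self.threshold_mm
        )
        self.baseline.append(pupil_diameter_mm)
        return anticipating

def render_frame(ui_state: dict, anticipating: bool) -> dict:
    # "Redesigning the UI in real time": e.g., pre-highlight the gazed target
    # so the anticipated click is confirmed, reinforcing the pupil response.
    ui_state["highlight_gaze_target"] = anticipating
    return ui_state

detector = AnticipationDetector()
for diameter in [3.0] * 30 + [3.4]:  # stable baseline, then a dilation
    state = render_frame({}, detector.update(diameter))
print(state)  # {'highlight_gaze_target': True}
```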

Does Vision Pro know how focused you are?

Crispin also worked on predicting a user’s state of concentration or relaxation. Machine learning applied to body and brain signals can predict how focused or relaxed someone is, or how intensely they are learning, he explains. These insights could have been used in Apple's immersive office or meditation environments.

According to Crispin, virtual environments could be updated in real time based on this data to aid concentration. “So, imagine an adaptive immersive environment that helps you learn, or work, or relax by changing what you’re seeing and hearing in the background.” So far, Apple has only introduced a dial (the Digital Crown) on the headset's frame that lets users adjust the level of immersion themselves.
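To make the idea concrete, here is a minimal sketch of such an adaptive environment: a focus estimate in [0, 1], however obtained, drives a few scene parameters. The parameter names are invented for illustration and are not a documented visionOS API; the shipping headset exposes only the manual immersion dial:

```python
# Minimal sketch of an adaptive immersive environment. The focus estimate
# could come from a model like the one sketched earlier; all parameter
# names here are assumptions for illustration, not real visionOS settings.
def adapt_environment(focus: float) -> dict:
    """Dim distractions when focus drops; restore the scene as it recovers."""
    focus = min(max(focus, 0.0), 1.0)         # clamp the estimate
    return {
        "background_blur": 1.0 - focus,       # blur surroundings when drifting
        "ambient_volume": 0.2 + 0.5 * focus,  # soften the soundscape, too
        "mute_notifications": focus < 0.4,    # hold interruptions until refocused
    }

print(adapt_environment(0.25))
# e.g. {'background_blur': 0.75, 'ambient_volume': 0.325, 'mute_notifications': True}
```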

Crispin, who left Apple at the end of 2021, doesn't know how many of the technologies described actually made it into the Vision Pro, or to what extent: “I’m really curious what made the cut and what will be released later on.”

Source: Twitter