Quest 2 fitness app Supernatural tracks knee strikes by analyzing head and hand movements

The fitness app Supernatural detects knee strikes. How is that possible on Meta Quest 2? The studio's founder explains.

Supernatural is one of the most successful VR apps for Meta Quest 2. It's so successful that Meta bought the studio at the end of 2021. Or at least tried to: the US competition authority FTC is currently trying to block the acquisition.

Supernatural is reminiscent of Beat Saber: You smash flying objects to the rhythm of the music. Unlike Beat Saber, however, the focus is entirely on fitness. The VR app offers hundreds of workouts and real trainers, but requires a $20 monthly fitness subscription.

Outside the U.S. and Canada, Supernatural is not yet available due to licensing issues. The VR app offers well-known music tracks from all genres, which have only been licensed for North America so far.

The latest update introduces a new category of targets: To land a hit, you have to smash oncoming objects with your knee. This is supposed to activate your lower body and core.

The question is how the VR app accomplishes this. Meta Quest 2 only captures head and hand movements. The device cannot really recognize what the lower body does.

There has been a lot of speculation around the feature. One theory said that the VR app uses sensor data from the lower tracking cameras to detect the position of the knees. In the Facebook group for Supernatural, studio founder Chris Milk gives an answer:


“Knee strikes are tracked using an algorithm that analyzes the movement of your head and hands to infer whether or not you are performing the move,” Milk writes.

According to Milk, Supernatural does not use a new tracking interface. Apparently, an AI model trained on knee strikes, which recognizes from subtle head and hand movement patterns when the user is performing the move, is sufficient.
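Milk does not describe the algorithm in detail, but the idea of inferring a knee strike from head and hand motion alone can be illustrated with a simple heuristic. The sketch below is purely hypothetical: the features (head dip, downward hand pull, peak hand velocity) and all thresholds are illustrative assumptions, not Supernatural's actual model.

```python
def detect_knee_strike(head_y, hand_y, dt=1 / 72):
    """Hypothetical sketch: infer a knee strike from head/hand motion only.

    head_y: head heights (meters) over a short time window
    hand_y: heights of the striking-side hand over the same window
    dt: time between samples (Quest 2 tracks at 72 Hz)

    A knee strike typically produces a slight head dip plus the hand
    pulling down fast toward the rising knee. A real system would feed
    such features into a trained classifier instead of fixed thresholds.
    """
    head_drop = head_y[0] - min(head_y)   # how far the head dips in the window
    hand_drop = hand_y[0] - min(hand_y)   # how far the hand pulls downward
    peak_hand_speed = max(                # fastest downward hand velocity (m/s)
        (hand_y[i] - hand_y[i + 1]) / dt for i in range(len(hand_y) - 1)
    )
    # Thresholds are illustrative guesses, not Supernatural's real values.
    return head_drop > 0.03 and hand_drop > 0.25 and peak_hand_speed > 1.0


# A window with a head bob and a fast downward hand pull reads as a strike;
# standing still does not.
strike = detect_knee_strike(
    head_y=[1.70, 1.69, 1.67, 1.66, 1.66],
    hand_y=[1.00, 0.90, 0.75, 0.65, 0.60],
)
idle = detect_knee_strike(head_y=[1.70] * 5, hand_y=[1.00] * 5)
```

In this toy example, `strike` evaluates to `True` and `idle` to `False`. The point is only that head and hand trajectories carry enough signal about a lower-body move for an inference model to pick up, as Milk describes.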

AI motion prediction: much untapped potential

Milk acknowledges that the tracking is not perfect. “If you just stand there they should whip by you, but if you are trying to hit them, or hitting them with the opposite knee, they will still likely explode. Overall, it works surprisingly well, giving you a great sense of accomplishment when the targets explode just as you’re driving your knees through them,” Milk writes.

Milk also says they hope to offer more accurate lower-body tracking when new tracking functionality comes out in the future.

In a research paper, Meta recently demonstrated how well artificial intelligence can predict movement. According to the paper, an AI model can believably animate a full-body avatar based solely on sensor data from the headset and the two controllers, although latency is still high. Similar technology should help simulate legs for Meta avatars in the near future.

Sources: Twitter