Meta brings hand and body tracking to Instagram AR

Meta brings new effects to its AR platform Spark AR: Using depth perception, filter creators can influence music and spatial effects.

Meta is expanding its augmented reality technology for Instagram: creatives can now use hand and body tracking when building new Instagram filters, for example for musical effects, gesture controls, or short dance interludes.

The required version 136 of the developer software Spark AR Studio has been available since April 25. With it, creatives gain new depth mapping features for depth detection as well as more controllable occlusion of virtual objects.

Musical Instagram filters with Spark AR

In practice, the filters let Instagram users play with music in front of their smartphone camera. Like a DJ, they can gradually filter the highs out of a song by moving a hand up or down in front of the camera. They can also shape sounds with effects such as an oscillator for tone generation or a vocoder for a robotic singing voice.
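Conceptually, the DJ-style effect maps a tracked hand position to a filter cutoff frequency. The following is a minimal Python sketch of that idea, not Spark AR's actual API; the function names, frequency range, and one-pole filter are illustrative assumptions.

```python
import math

def hand_to_cutoff(hand_y, f_min=200.0, f_max=8000.0):
    """Map a normalized hand height (0 = bottom of frame, 1 = top)
    to a low-pass cutoff frequency. The mapping is exponential so
    the sweep sounds perceptually even."""
    hand_y = min(max(hand_y, 0.0), 1.0)
    return f_min * (f_max / f_min) ** hand_y

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100):
    """Apply a simple one-pole low-pass filter at the given cutoff,
    removing the highs the lower the cutoff goes."""
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out
```

Lowering the hand drives the cutoff down, so ever more of the song's high end is filtered away, which matches the "waving up or down" interaction described above.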

Developing these Instagram filters by connecting visual modules is somewhat reminiscent of entry-level music software like Bitwig or Ableton Live. A new audio engine simplifies the composition of sound-based filters.

So-called “asset patches” are also now available. These are program modules into which interactions or animations can be built. They are used, for example, when mixing the volume of different voices (mixer, gain, and compressor). Individual gestures, such as holding a finger in front of the mouth, briefly mute the sound.
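The gain and gesture-mute behavior described here can be sketched in a few lines. This is an illustrative Python analogy for how such patches combine, not Spark AR's patch API; the decibel conversion and the "shush" flag are assumptions.

```python
def apply_gain(samples, gain_db):
    """Scale audio samples by a gain given in decibels,
    like a gain patch in a mixer chain."""
    factor = 10.0 ** (gain_db / 20.0)
    return [s * factor for s in samples]

def mute_if_gesture(samples, shush_detected):
    """Silence the output while a 'finger in front of the mouth'
    gesture is detected, otherwise pass audio through unchanged."""
    return [0.0] * len(samples) if shush_detected else samples
```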

The new depth mapping function “Camera Depth Texture” measures distances and exposes this information as a texture. It thus enables spatial lighting effects and post-processing effects.
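A typical post-processing use of such a depth texture is distance-based fog: each pixel's color is blended toward a fog color depending on how far away it is. The Python sketch below illustrates the principle per pixel; the near/far range and fog color are illustrative assumptions, not Spark AR defaults.

```python
def depth_fog(color, depth, near=0.3, far=5.0, fog=(0.8, 0.8, 0.9)):
    """Blend an RGB pixel color toward a fog color based on its
    camera depth (in meters): nearby pixels stay untouched,
    distant pixels fade fully into fog."""
    t = min(max((depth - near) / (far - near), 0.0), 1.0)
    return tuple(c * (1.0 - t) + f * t for c, f in zip(color, fog))
```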

Spatial effects via camera or LiDAR

In some higher-end smartphones, Meta’s AR software uses integrated LiDAR sensors for depth detection. Other models extract depth information from the smartphone’s camera image. These algorithms were primarily developed for better photos, but are also used for optical distance measurement.

The occlusion feature mentioned at the start has also become more accessible. Depending on the viewing angle, virtual objects disappear partially or even completely behind real objects like walls or shelves. This makes digital objects appear more realistic in real space.
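At its core, occlusion is a per-pixel depth comparison: wherever the real-world surface (from the camera depth map) is closer to the camera than the virtual object, the camera image wins. A minimal Python sketch of that test follows; the function names and depth bias are illustrative, not Spark AR's implementation.

```python
def occluded(virtual_depth, real_depth, bias=0.01):
    """A virtual fragment is hidden when real geometry sits closer
    to the camera at that pixel (bias avoids depth-fighting)."""
    return real_depth + bias < virtual_depth

def composite_pixel(camera_color, virtual_color, virtual_depth, real_depth):
    """Per-pixel compositing: draw the virtual object only where it
    is not occluded by the real world."""
    if occluded(virtual_depth, real_depth):
        return camera_color
    return virtual_color
```

Because the test runs per pixel, a virtual object can vanish only partially, exactly the behavior described for walls and shelves above.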

Two new templates are also included in the update: “Piano Project” and “Audio Visualizer” simplify test runs with the new effects.

More new features in version 136 can be found on the official Spark AR blog, where you can also get instructions on how to create your own effects using Spark AR Studio. The software is available for Windows and Mac. Playback apps are available for iOS, Android, PC, Mac, and Meta Quest (2).

Preparing for Meta Cambria

Meta introduced Instagram’s AR filters section in 2019 and says hundreds of millions use the AR effects in its app ecosystem every month. TikTok offers similar AR software with Effect Studio, as does Snap with Lens Studio.

A new use case for Meta’s AR filter database full of user content could emerge before the end of the year: when Meta's first high-end VR headset launches as Project Cambria, AR filters could be embedded into reality far more immersively than a smartphone allows.

Sources: Spark AR Blog