Meta Quest (2): Meta brings high-quality hand interactions

Hand interactions are essential to many VR apps, but costly to develop. Now Meta offers high-quality standard solutions.

The Interaction SDK is now available in an experimental version. The experimental status means that developers can try out the new interface, but cannot yet ship it in apps distributed through the Oculus Store or App Lab; the software first has to mature.

The Interaction SDK supports both touch controllers and optical hand tracking. The purpose is to provide developers with a basic set of high-quality hand interactions. This saves a lot of time and resources that studios can put into content development instead.

A versatile tool for hand interactions

The SDK is flexible: developers can pick and choose the hand interactions they need for their VR apps.

Among other things, the software simulates grabbing, throwing, and passing digital objects. An included tool lets developers author the natural-looking hand and finger poses that occur when grasping objects, with far less effort than before.
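
To illustrate the general idea, here is a minimal TypeScript sketch of a proximity-based grab. The types and names are purely hypothetical and do not mirror the actual Interaction SDK, which ships as Unity components: while the hand pinches, the nearest object within reach attaches to the pinch point and follows it until release.

```typescript
// Illustrative sketch only: hypothetical types, not the real Interaction SDK
// API (which ships as Unity components).

type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const length = (v: Vec3): number => Math.sqrt(v.x * v.x + v.y * v.y + v.z * v.z);

interface Grabbable {
  id: string;
  position: Vec3;
  grabRadius: number; // how close the pinch point must be to pick the object up
}

// Proximity-based grab: while the hand pinches, hold the nearest object in
// reach; it follows the pinch point until the pinch is released.
class HandGrabInteractor {
  private held: Grabbable | null = null;

  update(pinchPoint: Vec3, isPinching: boolean, candidates: Grabbable[]): void {
    if (!isPinching) {
      // Pinch released: drop the object. A full implementation would hand the
      // release velocity to the physics engine so objects can be thrown.
      this.held = null;
      return;
    }
    if (this.held === null) {
      // Pinch started: select the closest object whose grab radius contains
      // the pinch point.
      let bestDist = Infinity;
      for (const candidate of candidates) {
        const d = length(sub(candidate.position, pinchPoint));
        if (d <= candidate.grabRadius && d < bestDist) {
          this.held = candidate;
          bestDist = d;
        }
      }
    } else {
      // Still pinching: the held object follows the hand.
      this.held.position = pinchPoint;
    }
  }
}
```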

The software also supports gesture recognition and simulates interactions with virtual buttons, keys, and user interfaces. Controllers and hands can also be used to aim at and select controls precisely.
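
Precise aiming usually comes down to casting a ray from the hand or controller and selecting the closest target it hits. The following sketch is again purely illustrative rather than the SDK's real API; it assumes targets are approximated by selection spheres and that the ray direction is normalized.

```typescript
// Illustrative sketch only: hypothetical types, not the real SDK API.
// Targets are approximated as spheres; `direction` must be normalized.

type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const length = (v: Vec3): number => Math.sqrt(dot(v, v));

interface RayTarget {
  id: string;
  center: Vec3;
  radius: number; // selection sphere around a button or UI element
}

// Cast a ray from the hand or controller and return the closest target it
// hits, or null if the ray misses everything.
function pickByRay(origin: Vec3, direction: Vec3, targets: RayTarget[]): RayTarget | null {
  let best: RayTarget | null = null;
  let bestAlong = Infinity;

  for (const target of targets) {
    const toTarget = sub(target.center, origin);
    const along = dot(toTarget, direction); // distance along the ray to the closest point
    if (along < 0) continue;                // target is behind the pointer

    const closestPoint: Vec3 = {
      x: origin.x + direction.x * along,
      y: origin.y + direction.y * along,
      z: origin.z + direction.z * along,
    };
    const miss = length(sub(target.center, closestPoint)); // perpendicular distance

    if (miss <= target.radius && along < bestAlong) {
      best = target;
      bestAlong = along;
    }
  }
  return best;
}
```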

Meta tested the SDK with several studios that implemented hand interactions in their VR games. Their feedback fed into the further development and improvement of the software. Partners include the studios behind Chess Club VR, ForeVR Darts, and Finger Gun; on the Oculus developer blog, they praise the capabilities and efficiency of the new interface.

Keyboard tracking for VR apps

The Tracked Keyboard SDK is being released at the same time as the Interaction SDK. Unlike the Interaction SDK, it is generally available: developers may implement it in their Store apps right away.

This interface integrates Meta's keyboard tracking, which visually detects physical keyboards and brings them into virtual reality as a digital copy. The user's hands are captured and rendered as well. The Logitech K830 and Apple's Magic Keyboard are currently supported; key input is transmitted via Bluetooth.
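
Conceptually, an app consuming such an interface keeps a virtual keyboard model aligned with the tracked pose of the physical device while receiving key input separately over Bluetooth. The sketch below illustrates that split with hypothetical interfaces; it is not the actual Tracked Keyboard SDK API.

```typescript
// Illustrative sketch only: hypothetical interfaces, not the actual
// Tracked Keyboard SDK API.

interface Pose {
  position: { x: number; y: number; z: number };
  rotation: { x: number; y: number; z: number; w: number }; // quaternion
}

interface KeyboardTrackingSource {
  // Latest visually tracked pose of the physical keyboard, or null while the
  // headset cameras cannot see it.
  getKeyboardPose(): Pose | null;
}

interface VirtualKeyboardModel {
  setPose(pose: Pose): void;
  setVisible(visible: boolean): void;
}

// Per frame: keep the digital copy aligned with the physical keyboard, and
// forward key input that arrives over Bluetooth, as on a regular PC.
function updateTrackedKeyboard(
  tracking: KeyboardTrackingSource,
  model: VirtualKeyboardModel,
  pendingBluetoothKeys: string[],
  onKey: (key: string) => void,
): void {
  const pose = tracking.getKeyboardPose();
  if (pose !== null) {
    model.setPose(pose);     // align the virtual keyboard with the real one
    model.setVisible(true);
  } else {
    model.setVisible(false); // hide the copy while tracking is lost
  }

  // Key events come in over Bluetooth independently of the visual tracking.
  for (const key of pendingBluetoothKeys) {
    onKey(key);
  }
  pendingBluetoothKeys.length = 0; // clear the processed buffer
}
```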

Keyboard tracking is intended to make working in VR easier and is likely to be used primarily in productivity apps. Meta also tested this in advance with developers, including vSpatial.

Meta Quest 2: A flood of new interfaces

The Interaction SDK and Tracked Keyboard are part of the Presence Platform, a group of new mixed-reality and interface SDKs unveiled at Connect 2021.

The following overview shows the Presence Platform elements announced so far and their status. "Generally available" means that an interface may already be integrated into Store apps.

  • Insight SDK for mixed reality apps. It includes:
    • the Passthrough API for using the passthrough mode (status: generally available),
    • Spatial Anchors for permanent spatial anchoring of digital objects in physical space (status: experimentally available), and
    • Scene Understanding for machine recognition and classification of physical objects (status: not available).
  • Voice SDK for voice control (status: generally available).
  • Interaction SDK for better hand interactions (status: experimentally available).
  • Tracked Keyboard SDK for keyboard tracking (status: generally available).

Source: Oculus Developer Blog