Metaverse moderation: Microsoft tightens security measures
Meta took a lot of criticism for moderation on its own Metaverse platform. Microsoft wants to avoid similar controversies and is now taking drastic action.
Microsoft’s VR and AR chief Alex Kipman today announced sweeping changes to the company’s own Metaverse platform, AltspaceVR, designed to increase user safety.
The following changes are effective immediately:
- The social hubs will be completely shut down. The social hubs are the public and largely unmoderated playgrounds of the Metaverse platform, where users can meet and get to know each other without a prior appointment.
- The safety bubble, which prevents other avatars from entering a user’s personal space, is now enabled by default.
- Anyone attending an event is now automatically muted.
In the coming weeks, Microsoft also plans to implement the following:
- The age rating for events will be improved and moderation in general will be strengthened.
- In the future, users will have to log in with a Microsoft account.
- The company plans to integrate parental controls that let parents grant or restrict AltspaceVR access for children and teens over 13.
Microsoft to establish Metaverse guardrails
On the AltspaceVR blog, Kipman justifies the sweeping changes: “As platforms like AltspaceVR evolve, it is important that we look at existing experiences and evaluate whether they’re adequately serving the needs of customers today and in the future,” Kipman writes. “This includes helping people better connect with those who have shared common interests while also ensuring the spaces they access are safe from inappropriate behavior and harassment.”
Everyone should feel safe on platforms like AltspaceVR, which is why Microsoft has a responsibility to “establish guardrails,” Kipman says. He calls AltspaceVR a building block for the future of the Metaverse.
Microsoft reacts preemptively
The security measures are a reaction to criticism surrounding Meta’s Horizon Worlds Metaverse platform, which launched in the U.S. and Canada in late 2021. Users reported physical harassment and toxic behavior, some of which went unpunished.
Meta responded to the harassment reports by introducing a personal safety zone. However, it is unclear how Meta intends to prevent inappropriate behavior more broadly, since 3D spaces pose entirely new moderation challenges. Bans and monitoring will only partially solve the problem. According to user reports, Meta currently seems to rely on non-interference, hoping the issue resolves itself through users muting or blocking each other.
By shutting down public Metaverse rooms and enabling the safety bubble by default, Microsoft aims to at least mitigate these issues.
The introduction of child and youth protection mechanisms is also a response to negative press: A few weeks ago, the Guardian reported that the British data protection authority ICO wants to summon Meta. The reason: the Meta Quest 2 VR headset allegedly offers insufficient protection and exposes children to dangerous content, especially in the Metaverse.
Read more about Meta and Metaverse:
- Meta: Employees are now “metamates”
- Love in the Metaverse: This documentary wows critics
- Meta rolls out improved Metaverse avatars