The Metaverse Police: A VR content moderator shares his insights

A veteran content moderator shares insights into his work in virtual reality and the challenges he faces on a daily basis.

In her newsletter The Technocrat, Tate Ryan-Mosley talks to Ravi Yekkanti, a content moderator for social VR experiences like Horizon Worlds. Yekkanti works for an outside company that moderates virtual worlds for a variety of vendors. In his job, he says, not a day goes by without him encountering bad behavior.

Moderation in VR: fundamental differences from social media

Yekkanti has been a content moderator since 2014, and he stresses that moderating VR content is very different from moderating traditional content. VR moderation feels very real because you are policing players' behavior directly, Yekkanti says.

The big difference from “normal” online platforms is that it is not just text and images that need to be reviewed: VR moderators primarily evaluate speech and behavior in real time. Moderation in virtual reality is therefore still considered relatively uncharted territory, as illustrated by an experiment that revealed serious moderation problems at Meta in early 2022.

The relatively young VR platform Horizon Worlds has repeatedly had to deal with harassment. Meta is trying to combat it with safety measures such as personal interaction limits and 18+ areas.

On the hunt for discrimination and hate in VR

As a moderator, Yekkanti is himself part of the virtual world and is not recognizable as an “official”: VR moderators act incognito. If they were marked in any way, users could consciously adjust their behavior.

Blending in also means that his own behavior and appearance can draw negative reactions from certain players. His Indian accent alone makes him a target for ridicule, discrimination, and bullying, Yekkanti says.

Some players just want to be bad

To prepare for moderation in the metaverse, moderators undergo technical and mental training. They learn how to remain undetected, how to start conversations, and how to help other players navigate.

At the same time, they are prepared to deal with problematic behavior. Well-trained moderators are becoming increasingly important to social VR platform operators, as harassment is not uncommon in the metaverse. Meta CTO Andrew Bosworth has described toxic behavior as an existential threat to Meta's metaverse plans.

Yekkanti experiences this behavior on an almost daily basis. “Not all players behave the way you want them to behave. Sometimes people come just to be nasty. We prepare by going over different kinds of scenarios that you can come across and how to best handle them,” the moderator explains.

How content moderators handle violations

Content moderators gather all the relevant information, such as the name of the game, the participants, the length of each session, and the history of the conversation. They must also judge when a line has been crossed.

For example, using profanity out of frustration is considered borderline behavior. “We still track it because there may be children on the platform,” Yekkanti says. If it gets too personal, he says, he definitely has to step in as a moderator.

For serious violations of the code of conduct, he says, there are several options, such as muting or removing players from the game. In addition, such incidents are reported to the customers, who can then take further action.
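
The workflow Yekkanti describes (gather the evidence, judge the severity, escalate) can be sketched in a few lines of code. The following Python is purely illustrative: the record fields mirror the information listed above, but every name in it (Incident, Severity, handle) is hypothetical rather than any vendor's actual tooling.

from dataclasses import dataclass, field
from enum import Enum


class Severity(Enum):
    BORDERLINE = "borderline"  # e.g. profanity out of frustration: tracked, not punished
    PERSONAL = "personal"      # targeted at another player: the moderator steps in
    SERIOUS = "serious"        # code-of-conduct violation: mute/remove and report


@dataclass
class Incident:
    """The evidence a moderator gathers before acting."""
    game: str
    participants: list[str]
    session_length_minutes: int
    conversation_history: list[str] = field(default_factory=list)
    severity: Severity = Severity.BORDERLINE


def handle(incident: Incident) -> list[str]:
    """Map the judged severity to the actions mentioned in the article."""
    actions = ["log incident"]  # even borderline behavior is tracked
    if incident.severity is Severity.PERSONAL:
        actions.append("moderator steps in")
    elif incident.severity is Severity.SERIOUS:
        actions.append("mute or remove player")
        actions.append("report to customer for further action")
    return actions


# Example: a serious violation gets escalated to the customer.
# "ExampleWorld" and the player names are made up for illustration.
print(handle(Incident("ExampleWorld", ["player_a", "player_b"], 25, severity=Severity.SERIOUS)))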

Content moderation can save lives

Despite the problems, the moderator emphasizes that his work is both fun and important. One example is the successful rescue of a kidnapped child who had posted a plea for help on an online platform.

Yekkanti alerted the emergency services, and the child was rescued. The experience taught him that his work has a real impact beyond the virtual world and contributes to the safety of users.

Source: MIT Technology Review