News from the world of VR, AR, 3D

Meta tries to deal with bad user behavior in virtual reality

Meta has announced a new feature that gives people's avatars more personal space in its VR worlds.

The metaverse is still at the concept stage, but early attempts to build virtual worlds are already running into an age-old problem: harassment.

Bloomberg technology columnist Parmy Olson spoke to the BBC's Tech Tent programme about her own "creepy" experiences. Another woman has compared a traumatic experience in virtual reality to sexual assault.

Meta has announced a new Personal Boundary feature, rolling out from February 4th. It stops avatars from coming within a set distance of each other, giving people more personal space and making unwanted interactions easier to avoid. The feature will prevent others from "invading your avatar's personal space," Meta said. "If someone tries to get too close to you, the system will halt their forward movement once they reach the boundary." It is available in Meta's Horizon Worlds and Horizon Venues software.
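Meta has not published implementation details, but the behaviour it describes, halting an approaching avatar once it reaches another avatar's boundary, amounts to a simple distance clamp. The Python sketch below is purely illustrative: the function name and coordinate scheme are invented here, and the radius is an assumption based on launch coverage, which described a boundary of roughly two virtual feet around each avatar, so nearly four feet between two people.

    import math

    # Illustrative only: not Meta's code. The radius is an assumption
    # based on launch reports of ~2 ft (about 0.6 m) around each avatar.
    BOUNDARY_RADIUS = 0.6  # metres, per avatar

    def clamp_movement(current_pos, other_pos, proposed_pos):
        """Hold a moving avatar at another avatar's boundary.

        Positions are (x, z) ground-plane coordinates. If the proposed
        step would bring the two avatars closer than their combined
        boundary radii, the mover is held on the boundary instead.
        """
        min_dist = 2 * BOUNDARY_RADIUS   # both boundaries combined (~4 ft)
        dx = proposed_pos[0] - other_pos[0]
        dz = proposed_pos[1] - other_pos[1]
        dist = math.hypot(dx, dz)
        if dist >= min_dist:
            return proposed_pos          # far enough away: allow the move
        if dist == 0.0:
            return current_pos           # degenerate overlap: don't move
        scale = min_dist / dist          # project back out onto the boundary
        return (other_pos[0] + dx * scale, other_pos[1] + dz * scale)

    # An avatar at (1.0, 0.0) tries to step to (0.3, 0.0), inside another
    # avatar's boundary at the origin; it is held at (1.2, 0.0) instead.
    print(clamp_movement((1.0, 0.0), (0.0, 0.0), (0.3, 0.0)))

In a real engine a check like this would run every frame, inside the locomotion step and against every nearby avatar, but the core idea is the one Meta describes: movement that would cross the boundary is stopped at the boundary.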

The firm said the feature was "a powerful example of how virtual reality can help people interact comfortably with each other," but acknowledged that there is still a lot of work to be done.

"I've had moments where I felt awkward as a woman," Parmy Olson said of her VR interactions. She visited Meta's Horizon Worlds, a virtual-reality platform where anyone aged 18 and over can create an avatar and chat. "I immediately saw that I was the only woman, the only female avatar. The men came up to me and stared at me silently," she told Tech Tent. "Then they started taking pictures of me and handing the photos to me, and there was a moment when a guy came right up and said something to me. In virtual reality, if someone is near you, the voice sounds as if the person is literally talking right into your ear. It stunned me."

She had a similarly uncomfortable experience on Microsoft's social virtual-reality platform. "I was talking to another woman, and a few minutes into our chat a guy came up, started talking to us, followed us around and said inappropriate things, and we had to block him," she said. "Since then, I've heard from other women who have had similar experiences." While she wouldn't call it harassment, she said, it was "creepy and embarrassing."

Nina Jane Patel went further this week, telling the Daily Mail she had been harassed in Horizon Venues and comparing the experience to sexual assault. She described how a group of male avatars "groped" her and subjected her to a stream of sexual innuendo. They took a picture of her and sent a message saying: "Don't pretend you didn't like it."

Meta told the newspaper it was sorry. "We want everyone to have a positive experience, to be able to easily find the safety tools that can help in a situation like this, and to help us investigate and take action."

Content moderation in the nascent metaverse will be challenging, with Meta CTO Andrew Bosworth acknowledging that it will offer both "big opportunities and big threats."

"An insult can feel a lot more real there, because it's more like a physical space," he told the BBC late last year. However, he also said that people in virtual reality would have "much more power" over their surroundings. "If I blocked you, you would cease to exist for me, and your ability to harm me would immediately disappear."

He wondered whether people chatting in virtual reality would want the kind of moderation that exists on platforms like Facebook. "Do you really want the system, or the person standing next to you, to listen in on you? Probably not. So I think we have a privacy trade-off: if you want a high level of content safety, or what we would call integrity, that trades off against privacy."

In Meta's vision of a metaverse where different spaces are run by different companies, the trade-off becomes even harder once people move from Meta's own virtual world into others. "I can't make any guarantees about either the privacy or the integrity of that conversation," he said.

Parmy Olson agreed that "it will be very difficult for Facebook, Microsoft and other companies to police this." "When you scan text for hate speech, it is difficult but doable: you can use machine-learning algorithms. Processing visual information about an avatar, or how close it is standing to another person, will require a lot of investment in computing; a lot of processing power will be required."

Facebook is investing $10 billion in its metaverse plans, and part of that amount will be needed to create new ways to moderate content.

"We have learned a lot over the last 15 years, and we are going to bring all of that knowledge to bear to do our best to give people a lot of control over their own experience," Bosworth told the BBC.

Dr Beth Singler, an anthropologist at the University of Cambridge who studies the ethics of virtual worlds, said Facebook can no longer claim not to know what happens in online spaces. She thinks Meta should learn from games, since titles such as Second Life and World of Warcraft have offered virtual worlds for years, including limits on who avatars can talk to.

She believes Meta's decision to give avatars no legs may also be deliberate: the most likely explanation is technical, the lack of leg sensors, but it could also be a way to limit the "below the belt" problems that a fully embodied presence might invite. Strict rules about what avatars can look like will, however, create challenges of their own for those "trying to express a certain identity," she added.

