
Trust and safety: Leaders from Roblox and EA say Web3 has learned from past lessons

The gaming industry has a justly earned reputation for ugly user behavior, from hate groups to grooming and illegal goods. With the metaverse and the influx of user-generated content, there’s a whole new avenue for harmful content and for people intent on causing harm.

But along with this new stage of gaming and technology comes an opportunity to do things differently, and do them better, particularly when it comes to trust and safety for minors. During GamesBeat Summit Next, in a panel sponsored by trust and safety solution company ActiveFence, leaders in the trust, safety and community space came together to talk about where the responsibility for a safer metaverse lies: with creators, platforms, developers and guardians.

Safety has to be a three-legged stool, said Tami Bhaumik, VP of civility and partnerships at Roblox. There has to be responsibility from a platform standpoint; Roblox, for example, provides safety tools. And because democratization and UGC are the future, platforms also have a vested interest in empowering creators and developers to create the cultural nuance for the experiences they’re developing. The third leg of that stool is government regulation.

“But I also believe that regulation has to be evidence-based,” she said. “It has to be based in facts and a collaboration with industry, versus a lot of the sensationalized headlines you read out there that make a lot of these regulators and legislators write legislation that’s far off, and is quite frankly a detriment to everyone.”

Those headlines and that legislation tend to spring from instances where something slips through the cracks despite moderation, which happens often enough that some guardians are frustrated and don’t feel listened to. It’s a balancing act in the trenches, said Chris Norris, senior director of positive play at Electronic Arts.

“We obviously want to make policy clear. We want to make codes of conduct clear,” he said. “At the same time, we also want to empower the community to be able to self-regulate. There needs to be strong moderation layers as well. At the same time, I want to make sure that we’re not being overly prescriptive about what happens in the space, especially in a world in which we want people to be able to express themselves.”

Moderating enormous communities must come with the understanding that an audience of that size will inevitably include bad actors, said Tomer Poran, VP of solution strategy at ActiveFence.

“Platforms can’t stop all the bad guys, all the bad actors, all the bad activities,” he said. “It’s this situation where a best effort is what’s demanded. The duty of care. Platforms are putting in the right programs, the right teams, the right functions inside their organization, the right capabilities, whether outsourced or in-house. If they have those in place, that’s really what we as the public, the creator layer, the developer and creator layer, can expect from the platform.”

One of the issues has been that too many parents and teachers don’t even know that account restrictions and parental controls exist, and across platforms, the percentage of uptake on parental controls is very low, Bhaumik said.

“That’s a problem, because the technology companies in and of themselves have great intent,” she said. “They have some of the smartest engineers working on innovation and technology in safety. But if they’re not being used and there’s not a basic education level, then there’s always going to be a problem.”

But whatever the community is, it’s the platform’s responsibility to manage it in accordance with that audience’s preferences. Generally speaking, expecting G-rated behavior in an M-rated game doesn’t fly, Norris said.

“And back to developers, how are you thoughtfully designing for the community you want, and how does that show up, whether it’s in policy and code of conduct, whether it’s in game features or platform features?” he said. “Think about what this allows people to do, what the affordances are, and how those might potentially impact the guardrails you’re trying to set up as a function of policy and code of conduct.”

In the end, safety shouldn’t be a competitive advantage across the industry or across platforms, Norris added — these things should be table stakes.

“Generally in the video game industry, we’ve been an industry of ‘don’t.’ Here are the five pages of things we don’t want you to do,” he said. “We haven’t articulated, what do we want you to do? What sort of community do we want? How are we thinking about all the ways in which this medium can be social and connective and emotive for a lot of people?”


Author: VB Staff
Source: VentureBeat
