
Beyond the banhammer: How AI is changing content moderation

Presented by Cohere


How do you build a more inviting and inclusive game community? In this VB On-Demand event, AI/ML experts from Cohere and Google Cloud dive into managing player-generated content at scale to increase retention, promote positivity, and build a fun game community.

Watch free on-demand here!


Building and nurturing strong communities is crucial in a crowded game market, says David Wynn, head of solutions consulting at Google Cloud for Games. Last year, more than 10,000 new games were released on Steam alone, and that record looks likely to be broken again this year. As studios and publishers battle for players’ attention, they are finding that communities can make the experience stickier and more meaningful to their player base. But that holds only if the community doesn’t fall prey to the toxicity that plagues so many online spaces.

“Building a community helps bring in fundamental human aspects of talking with friends, sharing experiences and building relationships,” Wynn says. “If that works as part of the game experience that you’re trying to create, it becomes all the more imperative to make sure that you design it correctly, and that you make sure it’s a good experience for everyone involved.”

The challenges are fundamental ones, baked into human interaction in any crowded arena full of diverse experiences across race, gender, class, religion and more. Add to that the broad array of differences in how people like to interact, expect to interact and are incentivized to interact, Wynn says, and all of it combined creates the community of a game or title.

“People will bring in their own experiences, perspectives and potentially challenges to the community. Even if we create virtual worlds, they still come from here, and they bring everything they experience here to them,” he says. “We can craft, through tools and through the knowledge that others have built already, experiences to change how they interact. The multiplicity and the scale are both things that studios and publishers need to keep in mind, because the world is going to come at us hard. As much as we would like to think we can build our own islands, people got here from somewhere, and they’re bringing it with them.”

What can go wrong is unique to each title: how the community experience is shaped to serve your objectives, how complex an experience you design and how invested your players become all directly shape your moderation and intervention styles. A frowny face might just mean a bad day; it could also be indicative of a larger, more insidious trend, or signal that a new layer of moderation is required.

Adding AI to the content moderation mix

Back in the day, the number of interventions available when things turned toxic was limited, both in theory and in practice. A moderator or admin could apply the banhammer if they decided behavior was unacceptable, provided they saw it at the right time or it was reported at the right time. Or certain types of words could be blocked with simple string substitution, so that an F-bomb appears as four asterisks instead. These are effective tools that get the message across, but they are a fairly blunt approach, difficult to fine-tune and virtually impossible to scale.
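To make the bluntness concrete, below is a minimal sketch of that kind of string-substitution filter; the blocklist, function name and example message are purely illustrative rather than drawn from any particular game or product.

```python
import re

# Illustrative blocklist; a real filter would be far larger and locale-aware.
BLOCKED_WORDS = ["badword", "anotherbadword"]

# One case-insensitive pattern that matches any blocked word on word boundaries.
_PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(word) for word in BLOCKED_WORDS) + r")\b",
    re.IGNORECASE,
)


def censor(message: str) -> str:
    """Replace each blocked word with asterisks of the same length."""
    return _PATTERN.sub(lambda m: "*" * len(m.group(0)), message)


print(censor("wow, what a badword play"))  # -> "wow, what a ******* play"
```

Every new euphemism, misspelling or language has to be added to the list by hand, which is exactly why this approach is hard to fine-tune and nearly impossible to scale.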

Natural language processing (NLP), AI and machine learning-based models have opened up significantly more sophisticated interventions, with classification that is far more readily available. Whether your moderation team is overworked or your usual methods keep returning false positives, these models let community owners catch problems before they start, and do it at scale.

“AI does take resources, effort and attention to train, but it’s especially resource-efficient to run, and at scale, opens up a whole new avenue of identifying the behavior we either want to minimize or amplify,” Wynn says. “It also creates new types of interventions, whether it’s through chat bots or through interesting types of augmentation that’s not just ‘if, if else’ string substitution.”

AI/ML can also analyze broader patterns, not just text but communication that includes voice transcriptions, to identify behaviors like griefing, or giving other players a hard time. It’s the type of behavior that, in synthetic environments, needs to be identified reliably so that it can be addressed or mitigated quickly.

“None of this is new. I’m sure people were figuring out how to make Pong annoying to play against when it was first released,” Wynn says. “But what you’re seeing with the new AI/ML models being developed and published is that you don’t have to be a data scientist in order to translate these large language models into something that actually works for your game, even if you’re a smaller studio, or you’re trying to make a go of it yourself. Instead, you have an API from somebody like Cohere that you can just grab and then start messing with right away to see the benefit from it.”
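As a rough illustration of what grabbing an API and "messing with it" can look like, the sketch below sends a few chat messages to Cohere's hosted classify endpoint along with a handful of labeled examples. The endpoint path, payload fields and response shape follow Cohere's public documentation at the time of writing and should be checked against the current API reference (a specific model can also be requested); the example messages and labels are invented for illustration.

```python
import os

import requests

# A rough sketch of toxicity classification against Cohere's hosted classify
# endpoint. Check the current API reference for the exact path, any required
# fields (e.g. a "model" parameter) and the response shape before relying on it.
API_URL = "https://api.cohere.ai/v1/classify"
API_KEY = os.environ["COHERE_API_KEY"]  # supply your own key

# A few labeled chat lines to steer the classifier; a real deployment would use
# many more examples, or a model fine-tuned on the community's own policies.
EXAMPLES = [
    {"text": "gg everyone, nice match", "label": "benign"},
    {"text": "thanks for the carry!", "label": "benign"},
    {"text": "uninstall the game, you are worthless", "label": "toxic"},
    {"text": "everyone report this idiot", "label": "toxic"},
]


def classify_chat(messages):
    """Return (message, predicted label, confidence) for each player message."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"inputs": messages, "examples": EXAMPLES},
        timeout=30,
    )
    resp.raise_for_status()
    return [
        (c["input"], c["prediction"], c["confidence"])
        for c in resp.json()["classifications"]
    ]


if __name__ == "__main__":
    for msg, label, conf in classify_chat(["you played great", "go touch grass, loser"]):
        print(f"{label:>6} ({conf:.2f})  {msg}")
```

In practice, a studio would feed predictions like these into its existing moderation pipeline, routing high-confidence toxic messages to automated action or human review rather than reaching straight for the banhammer.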

For more on identifying the patterns that make communities start to sour, the AI/ML solutions available to anyone, the most effective ways to implement them and more, don’t miss this VB On-Demand event.

Watch free on-demand here!

Agenda

  • Tailoring tools to your community’s unique vernacular and policies
  • Increasing the capacity to understand the nuance and context of human language
  • Using language AI that learns as toxicity evolves
  • Significantly accelerating the ability to identify toxicity at scale

Presenters

  • David Wynn, Head of Solutions Consulting, Google Cloud for Games
  • Mike Lavia, Enterprise Sales Lead, Cohere
  • Dean Takahashi, Lead Writer, GamesBeat (moderator)


Author: VB Staff
Source: VentureBeat
