GGWP has raised $12 million in seed funding to use AI and content moderation to reduce toxicity in online games.
Dennis “Thresh” Fong is still a pretty darn good gamer decades after he was one of the earliest esports celebrities. But when he played with some of his entrepreneur friends, they weren’t quite as good and often got hassled by toxic players. So Fong and his friends decided to do something about it.
He founded the San Francisco company at the end of 2020 with Crunchyroll founder Kun Gao and AI expert George Ng. Their aim is to solve the massive problem of online toxicity in games.
“We played games together and, because of my level, we were queued with higher-skilled people. And so they would get a lot of hate with flaming and toxicity thrown their way,” Fong said. “We were just saying, ‘Why does this still exist? Why is this still a problem in gaming? Why haven’t game companies figured it out yet?’ We had no intention of starting a company, but we looked into it. We wanted to provide some advice and recommendations to game companies.”
Bitkraft Ventures led the seed round, alongside Makers Fund, Griffin Gaming Partners (GGP), Sony Innovation Fund, and Riot Games, as well as game leaders such as Twitch cofounders Emmett Shear and Kevin Lin, YouTube cofounder Steve Chen, Krafton CEO CH Kim, and influencer and gamer personality Pokimane.
GGWP, a nod to “Good Game, Well Played,” was created with the mission to democratize positive play using technology, Fong said.
The big problem
Online games have a documented and massive problem: the biggest titles generate millions of toxicity-related complaints and reports every day, a volume impossible to address with human moderators alone. GGWP’s platform lets publishers customize a moderation system that can catch, contextualize, and respond to every incident reported.
“There’s a lot of research that shows bad behavior is bad for business,” Fong said. “22% of players have quit playing a game due to toxicity. With GGWP, we are modernizing game moderation. With the ability to respond at scale, we can dramatically improve game experiences, and in turn, improve game businesses.”
Also linked closely to toxicity is cheating and other kinds of hacking, where player behavior crosses lines and ruins the experience for everyone. Such problems were so huge in Call of Duty: Warzone that Activision had to roll out its new technology, Ricochet, to deal with cheaters. Riot Games did the same with anti-cheat tech for Valorant.
Fong consulted with a lot of his friends in gaming and found that no one had really solved the problem. In fact, some of them were simply unable to deal with the volume of complaints about toxicity that came in by the millions. Fong and his friends looked into how to automate responses using a combination of AI and human moderation, where the AI floated the worst cases up to the humans for faster responses. “The companies responded that they can’t build it internally because they didn’t have the kind of [AI] technical talent that you need,” Fong said. “They wanted us to build it.”
Fong was available, having sold his previous company, Raptr/Plays.tv, to Take-Two Interactive in 2019. So the founders created the company and quickly raised $2 million. They started building about a year ago, and they now have a dashboard for monitoring gamers over time based on their reputations.
They built the team up to 35 people and raised the additional funding. The team includes machine learning experts, engineers, and data scientists. In addition to the game companies, influencers like Pokimane were concerned about the problem of toxicity and how to fight it. The company has operated out of a venture capitalist’s empty office in Palo Alto, but it will eventually find space in San Francisco.
Fong won’t name names as to who has failed at dealing with toxicity. But he noted that some companies have acknowledged they are overwhelmed and deal with only a tiny percentage of hundreds of millions of player reports. At the other extreme, Facebook has thousands of human moderators policing its site in addition to AI solutions.
“It literally goes into a black hole and so there is no transparency,” Fong said.
How GGWP approaches the problem
GGWP is a smart, AI-powered moderation platform that enables publishers to easily view a game’s aggregate community health, review incidents with contextual information, and understand each player’s overall impact on the community. Getting started is as simple as calling an API, making GGWP remarkably easy for developers to integrate into their games.
“We take a 360-degree approach. Other companies deal with one part of the problem, like text chat or moderation,” Fong said. “The heart of what we believe is that players should be accountable for their actions historically. Every time your system is looking at somebody, you should be looking at the history of the player and using that data to decide whether something happened and how severe it was. We believe the first line of defense, in game moderation and in any online community, is looking at player reports.”
GGWP’s player report management system aggregates, triages, and prioritizes player reports and provides context around incidents by displaying historical and holistic data on the players involved, including their reputation scores, credibility rating, and the severity of the incident.
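To make that concrete, here is a rough sketch of the kind of contextualized report record such a system might work with. The field names, scales, and priority formula are illustrative assumptions, not GGWP’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical shape of a contextualized player report. Field names and
# scales are illustrative assumptions, not GGWP's actual schema.
@dataclass
class PlayerContext:
    player_id: str
    reputation_score: float    # long-term behavior score, assumed 0-100
    credibility_rating: float  # how often this player's own reports hold up, 0-1

@dataclass
class IncidentReport:
    reporter: PlayerContext
    reported: PlayerContext
    category: str              # e.g. "cheating", "chat_abuse", "afk"
    severity: float            # model-estimated severity of this incident, 0-1
    timestamp: datetime
    evidence: list[str] = field(default_factory=list)

def triage_priority(report: IncidentReport) -> float:
    """Rank reports using incident severity, the reported player's
    history, and the reporter's credibility."""
    history_penalty = max(0.0, 100.0 - report.reported.reputation_score) / 100.0
    return report.severity * (0.5 + history_penalty) * report.reporter.credibility_rating
```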
“Gaming has grown into an incredible force in entertainment, but it’s also become increasingly more difficult to address toxic behaviors that are often associated with many games today,” said Joseph Tou, managing director of Sony Ventures, in a statement. “Gaming should be fun for all. We are thrilled to support GGWP and its innovative AI approach to both detect bad behavior in games and encourage positive gameplay.”
GGWP’s models can automatically detect disruptive behavior like AFKing (dropping out) in a match, griefing, intentional friendly fire, feeding, and speed hacking, as well as chat-based toxicity such as identity hate, incitement to self-harm, and criticizing gameplay. Its chat models also uniquely take a comprehensive and nuanced view of a chat incident by taking context and sentiment into account.
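As a trivial stand-in for one of these behavioral detections, a rule-based AFK check might look like the sketch below, assuming access to per-player input timestamps. GGWP’s actual models are ML-based and far richer than this:

```python
from datetime import datetime, timedelta

# Naive AFK heuristic: flag a player whose last input is older than a
# threshold while the match is still running. The 90-second threshold
# is an invented placeholder.
AFK_THRESHOLD = timedelta(seconds=90)

def is_afk(last_input_at: datetime, match_active: bool, now: datetime) -> bool:
    return match_active and (now - last_input_at) > AFK_THRESHOLD
```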
Game companies are giving it early nods for being helpful.
“Riot believes that gaming is for everybody and we’re passionate about fostering a community that is welcoming for all,” said Brendan Mulligan, director of corporate development at Riot Games, in a statement. “The team at GGWP truly shares that passion and we’re excited to see them provide powerful, scalable tools for sustaining healthy communities to our peers of all sizes across the industry.”
And that’s why others invested.
“GGWP has an effective and easy-to-use platform powered by incredibly sophisticated technology,” said Jens Hilgers, Bitkraft Ventures managing director, in a statement. “The games industry desperately needs GGWP’s AI-based moderation, and if you consider the fact that we all increasingly live online, social media and other online spaces will need solutions for toxicity in our digital societies.”
GGWP gives companies different ways to understand the problem and particular gamers.
“We also show you the distribution of the reputation score within their community. And then you can sort by, for example, high-score people, low-score people, and people with the worst reputations,” Fong said.
Fong showed me an anonymized report about one player who had been reported for cheating 228 times by 204 unique people. He could see that the person played the game for a while, evidently installed a cheat program in September, then played a lot of games and saw his reputation score plummet.
“Now you can automate sanctions against someone like this, but most games don’t have a dashboard like this,” Fong said. “If you had a tool like this, even as a moderator, you could act instantly. It doesn’t take much. Our point is this: we can streamline a moderator’s tasks and make them 100 times faster and more efficient.”
Keeping data private
One thing GGWP has to watch for is false reports, as toxic players will sometimes use the reporting system against legit players as a way to take revenge for losing a match.
The overall game can get a community health score and use that to benchmark the game against other titles in its genre in terms of player toxicity. GGWP also crawls Reddit for posts about toxicity in specific games, then determines whether the sentiment is positive or negative. (Spiketrap does this kind of thing.)
“The idea is to show our partners where the toxicity hot spots are,” Fong said.
But even though they know a lot about each player, they still have a responsibility to preserve privacy. Fong’s company gets access to the player data and the reports, but the data is anonymous, so GGWP doesn’t know exactly who these players are.
“Instead of having the reports go to an email inbox, they point the player reports to our API [application programming interface]. It’s like a line of code,” Fong said.
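In practice, pointing reports at an API might look something like the sketch below. The endpoint URL, payload fields, and auth scheme are invented for illustration and are not GGWP’s documented interface:

```python
import requests

# Hypothetical: forward an in-game player report to a moderation API
# instead of an email inbox. Endpoint, auth, and payload fields are
# illustrative assumptions, not GGWP's documented interface.
def forward_report(report: dict, api_key: str) -> None:
    resp = requests.post(
        "https://api.example-moderation.com/v1/reports",
        json=report,
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=5,
    )
    resp.raise_for_status()

# Note the anonymized player IDs: the article says GGWP receives data
# without knowing exactly who the players are.
forward_report(
    {"reporter_id": "anon-123", "reported_id": "anon-456",
     "category": "cheating", "match_id": "m-789"},
    api_key="YOUR_KEY",
)
```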
Then GGWP’s AI goes to work organizing the data and triaging the worst cases. Since GGWP gets a dump of the history of all the reports, it can build player profiles and give them reputation scores. It can also identify whether you have a habit of falsely reporting people. Many players also drop out of a match when their team is losing (going AFK, or away from keyboard).
“That’s pretty toxic and our system can detect that,” Fong said. “Over time, that has a negative impact on your reputation. If it is a ranked match, that carries a different weight. Reports among the highest-level players also carry more weight.”
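A minimal sketch of how such weighting might feed a reputation score follows. The specific multipliers and the 0-100 scale are assumptions, not GGWP’s actual formula:

```python
# Minimal sketch: decrement a reputation score per confirmed incident,
# weighting ranked matches and reports from high-level players more
# heavily, as Fong describes. All weights are illustrative assumptions.
def apply_incident(reputation: float, severity: float,
                   ranked: bool, reporter_level_percentile: float) -> float:
    weight = 1.0
    if ranked:
        weight *= 1.5                          # ranked matches carry more weight
    weight *= 1.0 + reporter_level_percentile  # top players' reports count more
    return max(0.0, reputation - severity * weight)
```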
The sanctions can be differentiated as well. Players can be warned and informed of what they did wrong, temporarily banned, or permanently banned using the same tool. One of GGWP’s partners can transcribe voice chat to text.
“The reporting tool catches everything, from cheating to verbal toxicity or chat abuse,” Fong said.
The tool also uses AI-based detection for toxic in-game actions. If you abandon your teammates and let them die, that kind of activity can be reported and built into your reputation score.
“If I’m looking at something that you said, the system is looking at your entire history, so you don’t always need a human moderator to act on an incident,” he said. “Our goal is for 99% of the cases to be triaged automatically by AI. And that’s how we can make this much more efficient. Now, most companies still want human moderators involved. So our goal is not to replace the human moderator. It’s just to make them like 100 times more efficient.”
Escalation and positive feedback
If players start making death threats or use code for that, like telling players to kill themselves, then the system will give priority to such reports and involve a human moderator.
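A toy version of that escalation logic might look like this; the threshold, category names, and queue shape are invented for illustration:

```python
# Toy escalation: auto-handle routine incidents, but push the most severe
# chat (e.g. self-harm incitement, death threats) straight to a human
# moderator queue. Categories and threshold are illustrative assumptions.
ESCALATE_CATEGORIES = {"self_harm_incitement", "death_threat"}

def route_incident(category: str, severity: float,
                   human_queue: list, auto_handler) -> None:
    if category in ESCALATE_CATEGORIES or severity > 0.95:
        human_queue.append((category, severity))  # prioritized human review
    else:
        auto_handler(category, severity)          # the ~99% handled by AI
```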
“We want the industry to go to a system that is more efficient and effective and is based on a score, like your Uber driver rating or eBay feedback,” he said. “We also have a big focus on positive behavior and rewarding it. Our goal was never to build a pure punishment system. If you’re trying to pick up a teammate and revive them under adverse conditions, that should help your score.”
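The same scoring machinery can credit positive play. Here is an assumed shape of such an adjustment, with action names and bonus values that are purely hypothetical:

```python
# Illustrative only: reward prosocial actions (like a risky revive) so the
# score is not purely punitive. Action names and bonuses are assumptions.
POSITIVE_ACTIONS = {"revive_under_fire": 2.0, "commend_received": 0.5}

def apply_positive_action(reputation: float, action: str) -> float:
    return min(100.0, reputation + POSITIVE_ACTIONS.get(action, 0.0))
```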
The good thing is that less than 5% of people in a game community are bad all of the time, Fong said.
“The vast majority of reports are just average people having a bad day or bad game,” he said. “Your goal is not always to ban players. That’s like the last resort. The goal is to reform people and teach them what is right and wrong.”
Some games, like Call of Duty, may have a higher threshold for toxicity as it’s an adult game with mature themes. By contrast, a lot of kids play Roblox, so the definition of toxicity is fluid among such companies. But the key is always to respond quickly so that the player understands why any action was taken against them.
Sometimes it takes a few months to ban toxic players as the system builds up their reputation profiles, particularly when the behavior is borderline. But Fong notes that all it takes is one toxic person in a game of 100 people to ruin it for the other 99.
“There’s a cascading effect,” said Fong. “If a new player experiences toxicity in the first five games, the likelihood that they’ll quit and never come back is extra high. And games can build communities with a reputation of being very toxic, which scares away a lot of players. I think people are now realizing that it’s bad for business.”
Author: Dean Takahashi
Source: VentureBeat