Xbox has released its fourth Transparency Report, detailing its efforts to protect players and the countermeasures it’s using to stave off toxicity and harmful elements. According to Xbox, it has invested in “the responsible application of AI” to help improve detection. The report also reveals how the company’s recently launched safety tools, including the new voice reporting feature, are working for the community.
According to the report, one of Xbox’s early AI investments is Auto Labeling, which identifies words and phrases that meet certain criteria and could potentially be harmful; the company says this helps moderators sort through false reports more quickly. The other is Image Pattern Matching, which uses databases and image-matching techniques to identify potentially harmful imagery, allowing moderators to remove it more swiftly.
The Transparency Report also revealed metrics on the success of some of Xbox’s new safety features. For example, the new voice reporting feature has resulted in 138,000 voice captures. Of the voice reports that led to an Enforcement Strike, 98% of reported players didn’t display repeat behavior and received no further strikes. Speaking of which, Xbox also reported on the Enforcement Strike system, which it launched in August 2023: 88% of players who received an Enforcement Strike didn’t receive another enforcement. According to the report’s data, the vast majority of enforcements were for cheating or inauthentic accounts.
The company also discussed forward-looking innovations in its safety strategies. These include the launch of the Family Toolkit, which offers parents and caretakers guidance on how best to use Xbox’s safety and family-friendly features. Xbox is also giving users the chance to share more about their online experiences via its Global Online Safety Survey, and it’s offering a kid-friendly safety lesson within Minecraft Education called CyberSafe: Good Game.
Author: Rachel Kaser
Source: Venturebeat
Reviewed By: Editorial Team