AI & Robotics News

Bumble open-sourced its AI tool for catching unwanted nudes

Since 2019, Bumble has used machine learning to protect its users from lewd photos. Dubbed Private Detector, the feature screens images sent by matches to determine whether they depict inappropriate content. It was primarily designed to catch unsolicited nude photos, but it can also flag shirtless selfies and images of guns – neither of which is allowed on Bumble. When there’s a positive match, the app blurs the offending image, letting you decide whether to view it, block it or report the person who sent it.
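The screen-then-blur flow described above can be sketched in a few lines. This is purely illustrative: the function names, the threshold, and the fake scoring logic are assumptions for the sketch, not Bumble's actual implementation (the real Private Detector is a trained image classifier released in its GitHub repository).

```python
from dataclasses import dataclass

@dataclass
class ScreeningResult:
    is_explicit: bool
    confidence: float

def screen_image(image_bytes: bytes) -> ScreeningResult:
    # Stand-in for a real classifier like Private Detector's model;
    # here we derive a fake score from the payload size purely for demo.
    score = (len(image_bytes) % 100) / 100.0
    return ScreeningResult(is_explicit=score > 0.5, confidence=score)

def deliver(image_bytes: bytes, threshold: float = 0.5) -> str:
    """Decide how to present an incoming image to the recipient."""
    result = screen_image(image_bytes)
    if result.is_explicit and result.confidence >= threshold:
        # Blurred images remain viewable; the recipient can still
        # choose to view, block, or report the sender.
        return "blurred"
    return "clear"
```

In a real system the decision and the blur would happen server-side or on-device before display, so the recipient never sees flagged content unprompted.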

In a recent blog post, Bumble announced it was open-sourcing Private Detector, making the framework available on GitHub. “It’s our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place,” the company said, acknowledging that it is only one of many players in the online dating market.

Unwanted sexual advances are a frequent reality for many women, both online and in the real world. A 2016 study found that 57 percent of women felt they had been harassed on the dating apps they used. More recently, a 2020 study from the United Kingdom found that 76 percent of girls between the ages of 12 and 18 had been sent unsolicited nude images. The problem extends beyond dating apps, too, with platforms like Instagram working on their own solutions.


Author: I. Bonifacic
Source: Engadget
