
Bumble open-sourced its AI tool for catching unwanted nudes

Since 2019, Bumble has used machine learning to protect its users from lewd photos. Dubbed Private Detector, the feature screens images sent by matches to determine whether they depict inappropriate content. It was primarily designed to catch unsolicited nude photos, but it can also flag shirtless selfies and images of guns, neither of which is allowed on Bumble. When the detector flags an image, the app blurs it, letting you decide whether to view it, block it or report the person who sent it.
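The flow Bumble describes, scoring an incoming image with a classifier and blurring it when the score crosses a threshold, is straightforward to sketch. Below is a minimal, illustrative Python example of where such a score-and-blur step could sit in a messaging pipeline. The model path, input size, normalization, threshold and call signature are all assumptions made for illustration; they are not details taken from Bumble's release.

```python
# Minimal sketch of a Private Detector-style screening step.
# Assumes a TensorFlow SavedModel that outputs a single
# "inappropriate" probability. All paths, sizes and thresholds
# below are hypothetical, not taken from Bumble's repository.
import tensorflow as tf
from PIL import Image, ImageFilter

MODEL_DIR = "private_detector_savedmodel"  # hypothetical local path
THRESHOLD = 0.5                            # hypothetical decision threshold

model = tf.saved_model.load(MODEL_DIR)

def score_image(path: str) -> float:
    """Return the model's probability that the image is inappropriate."""
    raw = tf.io.read_file(path)
    img = tf.image.decode_image(raw, channels=3, expand_animations=False)
    img = tf.image.resize(img, (480, 480))   # assumed model input size
    img = tf.cast(img, tf.float32) / 255.0   # assumed normalization
    batch = tf.expand_dims(img, axis=0)      # add batch dimension
    prob = model(batch)                      # assumed call signature
    return float(tf.reshape(prob, [-1])[0])

def screen(path: str) -> Image.Image:
    """Blur the image if the classifier flags it, mirroring the app's UX."""
    img = Image.open(path)
    if score_image(path) >= THRESHOLD:
        # Heavy Gaussian blur stands in for Bumble's in-app obscuring;
        # the original remains available if the recipient opts to view it.
        img = img.filter(ImageFilter.GaussianBlur(radius=25))
    return img

# Example: screen("incoming.jpg").show()
```

In practice the interesting part is the trained model itself; the sketch above only shows the surrounding plumbing, with the classifier treated as a black box.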

In a recent blog post, Bumble announced that it was open-sourcing Private Detector, making the framework available on GitHub. "It's our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place," the company said, acknowledging that it is only one of many players in the online dating market.

Unwanted sexual advances are a frequent reality for many women, both online and in the real world. A 2016 study found that 57 percent of women reported being harassed on the dating apps they used. More recently, a 2020 study from the United Kingdom found that 76 percent of girls between the ages of 12 and 18 had been sent unsolicited nude images. The problem extends beyond dating apps, too, with platforms like Instagram working on their own solutions.


Author: I. Bonifacic
Source: Engadget
