
Apple Will Scan Photos Stored on iPhone, iCloud for Child Abuse: Report

Apple is reportedly planning to scan photos stored both on iPhones and in its iCloud service for child abuse imagery, a move that could aid law enforcement but may also invite increased government demands for access to user data.

According to a report by The Financial Times, summarized by The Verge, the new system will be called “neuralMatch” and will proactively alert a team of human reviewers if it believes it has detected imagery depicting violence or abuse toward children. The artificial intelligence tool has reportedly been trained on 200,000 images from the National Center for Missing and Exploited Children to help it identify problem images, and it will be rolled out first in the United States.

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” The Financial Times reports. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
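
To make the mechanism the Financial Times describes more concrete, here is a minimal Python sketch of the voucher-and-threshold idea: each upload is tagged with a voucher recording whether its perceptual hash appears in a database of known material, and review is triggered only once enough of an account's vouchers are marked suspect. The hash values, voucher format, and threshold below are illustrative assumptions, not details of Apple's actual design.

```python
# Illustrative sketch only: the hash values, voucher format, and threshold
# are assumptions for explanation, not Apple's actual design.
from dataclasses import dataclass

# Hypothetical database of known-bad perceptual hashes (64-bit integers).
KNOWN_BAD_HASHES = {0x9F3A2C41D7E8B056, 0x13D0FFAA6E29C481}
MATCH_THRESHOLD = 10  # hypothetical: review triggers only past this count


@dataclass
class SafetyVoucher:
    """Per the FT report, every uploaded photo gets a voucher
    saying whether it is suspect or not."""
    photo_id: str
    is_suspect: bool


def issue_voucher(photo_id: str, perceptual_hash: int) -> SafetyVoucher:
    """Tag an upload based on whether its hash matches the database."""
    return SafetyVoucher(photo_id, perceptual_hash in KNOWN_BAD_HASHES)


def should_escalate(vouchers: list[SafetyVoucher]) -> bool:
    """Decryption and human review happen only once a certain number
    of an account's photos have been marked suspect."""
    return sum(v.is_suspect for v in vouchers) >= MATCH_THRESHOLD
```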

As noted by The Verge, Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the implementation.

“This is a really bad idea,” he writes in a Twitter thread. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear. Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.”
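
The “perceptual hash” Green mentions is a compact fingerprint designed to stay nearly identical when an image is resized or recompressed, unlike a cryptographic hash, which changes completely on any edit. The sketch below shows one common, simple perceptual hash (a “difference hash”) matched by Hamming distance; real systems such as Microsoft's PhotoDNA, and whatever Apple has built, are more sophisticated, and this is not their algorithm.

```python
# A simple perceptual "difference hash" (dHash), shown for illustration;
# this is NOT the algorithm Apple or PhotoDNA uses.

def dhash(pixels: list[list[int]]) -> int:
    """Compute a 64-bit hash from an image already downscaled to a
    9-column by 8-row grid of 0-255 grayscale values: each bit records
    whether brightness increases between adjacent columns."""
    bits = 0
    for row in pixels:                  # 8 rows
        for x in range(8):              # 8 column comparisons per row
            bits = (bits << 1) | (1 if row[x] > row[x + 1] else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Count differing bits; a small distance means 'visually similar'."""
    return bin(a ^ b).count("1")


def matches_known_hash(photo_hash: int, known: set[int],
                       max_distance: int = 4) -> bool:
    """Flag a photo whose hash lands within a few bits of any database
    entry; this fuzziness is what catches re-encoded copies, and also
    why the contents of the unreviewable hash database matter so much."""
    return any(hamming(photo_hash, k) <= max_distance for k in known)
```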

Green acknowledges that Apple's implementation will start with photos that people have already uploaded to the cloud, so in theory the initial rollout won't hurt anyone's privacy. The Verge notes that Apple, like every other major cloud provider, already checks iCloud files against known child abuse imagery, but what the company plans here goes further and would extend scanning to photos stored locally on the iPhone.

“But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal,” he continues. “Even if you believe Apple won’t allow these tools to be misused, there’s still a lot to be concerned about. These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”

“The idea that Apple is a ‘privacy’ company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them,” Green concludes.


Image credits: Header photo licensed via Depositphotos.


Author: Jaron Schneider
Source: Petapixel
