Apple Unveils Plan to Scan All iPhones to Determine if Adult Content is Illegal

CUPERTINO, Calif. — Apple briefed academics this week about its plans to install software on iPhones sold in the U.S. to "scan for child abuse imagery."

The initiative was reported today by The Financial Times, which noted that Apple’s plans are “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The Financial Times confirmed that “the scheme will initially roll out only in the U.S.”

Security researchers told the publication that while they may be “supportive of efforts to combat child abuse,” they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

'An Absolutely Appalling Idea'

Ross Anderson, professor of security engineering at the University of Cambridge, called neuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”

The Financial Times noted that “although the system is currently trained to spot child sex abuse, it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests, say researchers. Apple’s precedent could also increase pressure on other tech companies to use similar techniques.”

Matthew Green, a security professor at Johns Hopkins University, warned about the expansive implications of such a technology.

“This will break the dam — governments will demand it from everyone,” Green noted.

The Financial Times described the intrusive nature of the new technology: “Apple’s neuralMatch algorithm will continuously scan photos that are stored on a U.S. user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as ‘hashing,’ will be compared with those on a database of known images of child sexual abuse.”
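In rough terms, that kind of hash matching can be sketched as a lookup against a database of known hashes. The snippet below is a simplified illustration only, not Apple’s actual NeuralMatch/NeuralHash implementation: the hash function (a SHA-256 digest), the hard-coded database and the sample inputs are all placeholders, and a real perceptual hash is designed to tolerate resizing and recompression, which a cryptographic digest like this does not.

```python
import hashlib

# Placeholder database of hashes for known flagged images. In a deployed
# system these would be perceptual hashes distributed by a clearinghouse,
# not values computed and hard-coded on the device like this.
known_hashes = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}


def hash_photo(photo_bytes: bytes) -> str:
    """Convert a photo into a 'string of numbers' (here, a SHA-256 digest).

    A real perceptual hash is built so that visually similar images produce
    matching hashes; a cryptographic digest only matches byte-identical
    files, so this function is purely illustrative.
    """
    return hashlib.sha256(photo_bytes).hexdigest()


def is_flagged(photo_bytes: bytes) -> bool:
    """Return True if the photo's hash appears in the known-image database."""
    return hash_photo(photo_bytes) in known_hashes


if __name__ == "__main__":
    print(is_flagged(b"example-flagged-image-bytes"))  # True
    print(is_flagged(b"an ordinary holiday photo"))    # False
```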

How the Algorithm Was Trained

Whether this private system flags an image on a user’s device as legal or illegal depends on how Apple has configured the algorithm. In this case, as reported, the system “has been trained on 200,000 sex abuse images collected by the U.S. non-profit National Center for Missing and Exploited Children.”

“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher’ saying whether it is suspect or not,” the report added. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
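The threshold logic described above — a per-photo “safety voucher” plus escalation once enough suspect vouchers accumulate — can be sketched as follows. This is a hypothetical illustration based only on the report: the class names, the voucher structure and the threshold value of 10 are assumptions, as Apple has not published those details here.

```python
from dataclasses import dataclass, field

# Hypothetical escalation threshold: the number of suspect vouchers before
# the account's flagged photos would be reviewed. The value is an assumption.
SUSPECT_THRESHOLD = 10


@dataclass
class SafetyVoucher:
    """Per-upload marker saying whether a photo is suspect or not."""
    photo_id: str
    suspect: bool


@dataclass
class Account:
    vouchers: list = field(default_factory=list)

    def add_voucher(self, voucher: SafetyVoucher) -> None:
        """Record the voucher attached to a newly uploaded photo."""
        self.vouchers.append(voucher)

    def should_escalate(self) -> bool:
        """True once suspect vouchers reach the threshold, the point at which
        (per the report) the suspect photos could be decrypted and passed to
        human reviewers."""
        suspect_count = sum(1 for v in self.vouchers if v.suspect)
        return suspect_count >= SUSPECT_THRESHOLD


if __name__ == "__main__":
    account = Account()
    for i in range(SUSPECT_THRESHOLD):
        account.add_voucher(SafetyVoucher(photo_id=f"photo-{i}", suspect=True))
    print(account.should_escalate())  # True
```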

The report does not specify what safeguards would be in place in case of a mistake or "false positive," when the algorithm identifies a piece of legal content as CSAM and law enforcement is compelled to act — or who would be liable for a life-and-reputation-destroying misidentification.
