Apple Unveils Plan to Scan All iPhones to Determine if Adult Content is Illegal

CUPERTINO, Calif. — Apple briefed academics this week about its plan to install software on iPhones sold in the U.S. to "scan for child abuse imagery."

The initiative was reported today by The Financial Times, which noted that Apple’s plans are “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected”; those reviewers “would then contact law enforcement if the material can be verified.”

The Financial Times confirmed that “the scheme will initially roll out only in the U.S.”

Security researchers told the publication that while they may be “supportive of efforts to combat child abuse,” they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

'An Absolutely Appalling Idea'

Ross Anderson, professor of security engineering at the University of Cambridge, called neuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”

Researchers point out that although the system is currently trained to spot child sex abuse, “it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests.” The report adds that “Apple’s precedent could also increase pressure on other tech companies to use similar techniques.”

Matthew Green, a security professor at Johns Hopkins University, warned about the expansive implications of such a technology.

“This will break the dam — governments will demand it from everyone,” Green noted.

The Financial Times described the intrusive nature of the new technology: “Apple’s neuralMatch algorithm will continuously scan photos that are stored on a U.S. user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as ‘hashing,’ will be compared with those on a database of known images of child sexual abuse.”
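Apple has not published neuralMatch’s internals, so the following Python sketch only illustrates the general hash-and-compare pattern the report describes. The average_hash function, the sample hash values and the distance threshold are hypothetical stand-ins, not Apple’s actual algorithm:

```python
from PIL import Image  # pip install Pillow

# Hypothetical stand-in for a database of hashes of known abuse imagery.
KNOWN_HASHES = {0x81C3E7FF7E3C1800, 0xFF00FF00FF00FF00}

def average_hash(path: str, hash_size: int = 8) -> int:
    """Toy perceptual hash: shrink to an 8x8 grayscale image, then set
    one bit per pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def flag_photo(path: str, max_distance: int = 5) -> bool:
    """Flag a photo if its hash is 'near' any hash in the database."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_distance for k in KNOWN_HASHES)
```

A production system would replace this plain set lookup with a far more robust perceptual hash and a cryptographic matching protocol, so that near-duplicate images still match while the database contents stay hidden from the device.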

How the Algorithm Was Trained

Whether this private system deems an image on a user’s device legal or illegal depends on how Apple has configured the algorithm. In this case, as reported, the system “has been trained on 200,000 sex abuse images collected by the U.S. non-profit National Center for Missing and Exploited Children.”

“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher’ saying whether it is suspect or not,” the report added. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
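Taking that description at face value, the threshold logic might look something like the sketch below. The REVIEW_THRESHOLD value and the Account class are invented for illustration; Apple has not disclosed how many flagged photos would trigger review:

```python
from dataclasses import dataclass, field

# Hypothetical: Apple has not disclosed the real threshold.
REVIEW_THRESHOLD = 30

@dataclass
class Account:
    """Tracks 'safety vouchers' attached to a user's iCloud uploads."""
    suspect_photo_ids: list[str] = field(default_factory=list)

    def attach_voucher(self, photo_id: str, is_suspect: bool) -> None:
        # Every upload gets a voucher; only suspect ones are tallied here.
        if is_suspect:
            self.suspect_photo_ids.append(photo_id)

    def review_unlocked(self) -> bool:
        # Per the report, decryption and human review happen only once
        # "a certain number of photos are marked as suspect."
        return len(self.suspect_photo_ids) >= REVIEW_THRESHOLD
```

A real system would presumably enforce this threshold cryptographically, so that even Apple could not decrypt vouchers early, rather than trusting a simple server-side counter; the report does not say how the gate would be implemented.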

The report does not specify what safeguards would be in place in case of a mistake or "false positive," when the algorithm identifies a piece of legal content as CSAM and law enforcement is compelled to act — or who would be liable for a life-and-reputation-destroying misidentification.
