Apple Unveils Plan to Scan All iPhones to Determine if Adult Content is Illegal

CUPERTINO, Calif. — Apple briefed academics this week about its plan to install software on iPhones sold in the U.S. to "scan for child abuse imagery."

The initiative was reported today by The Financial Times, which noted that Apple’s plans are “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The Financial Times also reported that “the scheme will initially roll out only in the U.S.”

Security researchers told the publication that while they may be “supportive of efforts to combat child abuse,” they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

'An Absolutely Appalling Idea'

Ross Anderson, professor of security engineering at the University of Cambridge, called neuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”

The report points out: “Although the system is currently trained to spot child sex abuse, it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests, say researchers. Apple’s precedent could also increase pressure on other tech companies to use similar techniques.”

Matthew Green, a security professor at Johns Hopkins University, warned about the expansive implications of such a technology.

“This will break the dam — governments will demand it from everyone,” Green noted.

The Financial Times described the intrusive nature of the new technology: “Apple’s neuralMatch algorithm will continuously scan photos that are stored on a U.S. user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as ‘hashing,’ will be compared with those on a database of known images of child sexual abuse.”
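The report leaves the hash design itself unspecified. In general terms, matching of this kind reduces each photo to a compact numeric fingerprint and compares it against fingerprints of known images. Below is a minimal sketch in Python of that comparison step; the fingerprint function, the 64-bit size, and the match tolerance are hypothetical stand-ins, not Apple's actual design.

```python
import hashlib

# Hypothetical tolerance: how many bits two fingerprints may differ by
# and still count as a match. Apple's real parameters are not public.
HAMMING_THRESHOLD = 4

def fingerprint(image_bytes: bytes) -> int:
    # Stand-in fingerprint: a 64-bit slice of a SHA-256 digest. A real
    # perceptual hash would stay stable under resizing and re-encoding;
    # a cryptographic hash is used here only so the sketch runs end to end.
    return int.from_bytes(hashlib.sha256(image_bytes).digest()[:8], "big")

def hamming_distance(a: int, b: int) -> int:
    # Count the differing bits between two fingerprints.
    return bin(a ^ b).count("1")

def matches_known_database(image_bytes: bytes, known_hashes: set[int]) -> bool:
    # Flag the image if its fingerprint falls within the tolerance of
    # any entry in the database of known-image fingerprints.
    h = fingerprint(image_bytes)
    return any(hamming_distance(h, known) <= HAMMING_THRESHOLD
               for known in known_hashes)

# Example: an exact copy of a known image matches at distance zero.
known = {fingerprint(b"known-image-bytes")}
print(matches_known_database(b"known-image-bytes", known))      # True
print(matches_known_database(b"unrelated-image-bytes", known))  # False
```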

How the Algorithm Was Trained

Whether this private system deems an image on a user’s device legal or illegal will depend on how Apple has configured the algorithm. In this case, as reported, the system “has been trained on 200,000 sex abuse images collected by the U.S. non-profit National Center for Missing and Exploited Children.”

“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher’ saying whether it is suspect or not,” the report added. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
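As described, the escalation step amounts to a simple threshold rule over the vouchers. A hedged sketch of that logic follows, with the voucher structure and the threshold value invented for illustration (the report does not give the actual number):

```python
from dataclasses import dataclass

# Hypothetical threshold; the report only says "a certain number."
SUSPECT_THRESHOLD = 10

@dataclass
class SafetyVoucher:
    # Per the report, every photo uploaded to iCloud carries a voucher
    # saying whether it is suspect or not.
    photo_id: str
    suspect: bool

def photos_to_escalate(vouchers: list[SafetyVoucher]) -> list[str]:
    # Nothing is surfaced until the account crosses the threshold; once
    # it does, all suspect photos become reviewable at once.
    suspect_ids = [v.photo_id for v in vouchers if v.suspect]
    return suspect_ids if len(suspect_ids) >= SUSPECT_THRESHOLD else []
```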

The report does not specify what safeguards would be in place in the event of a mistake or "false positive," when the algorithm identifies a piece of legal content as CSAM (child sexual abuse material) and law enforcement is compelled to act — or who would be liable for a life-and-reputation-destroying misidentification.
