Apple Unveils Plan to Scan All iPhones to Determine if Adult Content is Illegal

CUPERTINO, Calif. — Apple briefed academics this week about its plan to install software on iPhones sold in the U.S. to "scan for child abuse imagery."

The initiative was reported today by The Financial Times, which said Apple’s plans are “raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

Apple’s proposed system, neuralMatch, would “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The Financial Times confirmed that “the scheme will initially roll out only in the U.S.”

Security researchers told the publication that while they may be “supportive of efforts to combat child abuse,” they are nevertheless “concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.”

'An Absolutely Appalling Idea'

Ross Anderson, professor of security engineering at the University of Cambridge, called neuralMatch “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of […] our phones and laptops.”

Researchers point out that “although the system is currently trained to spot child sex abuse, it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests.” The report adds that “Apple’s precedent could also increase pressure on other tech companies to use similar techniques.”

Matthew Green, a security professor at Johns Hopkins University, warned about the expansive implications of such a technology.

“This will break the dam — governments will demand it from everyone,” Green noted.

The Financial Times described the intrusive nature of the new technology: “Apple’s neuralMatch algorithm will continuously scan photos that are stored on a U.S. user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as ‘hashing,’ will be compared with those on a database of known images of child sexual abuse.”
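The report does not describe the hashing or matching scheme in technical detail. The following is a minimal conceptual sketch of hash-matching against a database of known images, using an ordinary cryptographic file hash purely as a stand-in for whatever perceptual hash neuralMatch actually uses; the directory, file names and hash value shown are invented for illustration.

```python
# Conceptual sketch only: checks photos against a database of known hashes.
# SHA-256 is a stand-in here; Apple's system reportedly uses a perceptual
# ("neural") hash, which behaves differently. All values are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical database of hex digests of known illegal images.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_image(path: Path) -> str:
    """Convert an image file's bytes into a string of numbers (a hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_flagged(path: Path) -> bool:
    """True if the image's hash appears in the known-image database."""
    return hash_image(path) in KNOWN_HASHES

# Example: check every photo queued for an iCloud upload (hypothetical folder).
for photo in Path("uploads").glob("*.jpg"):
    if is_flagged(photo):
        print(f"{photo} matched the database and would be flagged for review")
```

The key design point the researchers object to is visible even in this toy version: the comparison runs against whatever database the operator supplies, so the same mechanism could match any set of target images, not only child abuse material.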

How the Algorithm Was Trained

Whether the system deems an image on a user’s device legal or illegal depends on how Apple has set up the algorithm. In this case, as reported, the system “has been trained on 200,000 sex abuse images collected by the U.S. non-profit National Center for Missing and Exploited Children.”

“According to people briefed on the plans, every photo uploaded to iCloud in the U.S. will be given a ‘safety voucher’ saying whether it is suspect or not,” the report added. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
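The report does not say how many suspect vouchers trigger review. The sketch below illustrates only the threshold logic described above; the threshold value, class names and fields are all hypothetical.

```python
# Sketch of the "safety voucher" threshold logic described in the report.
# REVIEW_THRESHOLD is invented; Apple has not disclosed the actual number.
from dataclasses import dataclass, field

REVIEW_THRESHOLD = 10  # hypothetical value

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # result of the on-device hash match for this photo

@dataclass
class Account:
    vouchers: list[SafetyVoucher] = field(default_factory=list)

    def add_voucher(self, voucher: SafetyVoucher) -> None:
        """Attach a voucher to each photo as it is uploaded to iCloud."""
        self.vouchers.append(voucher)

    def review_required(self) -> bool:
        """Human review is enabled only once enough suspect vouchers accumulate."""
        suspect_count = sum(1 for v in self.vouchers if v.suspect)
        return suspect_count >= REVIEW_THRESHOLD

# Example: an account crosses the hypothetical threshold.
acct = Account()
for i in range(12):
    acct.add_voucher(SafetyVoucher(photo_id=f"photo-{i}", suspect=True))
print("Review required:", acct.review_required())
```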

The report does not specify what safeguards would be in place in case of a mistake or "false positive," when the algorithm identifies a piece of legal content as CSAM and law enforcement is compelled to act — or who would be liable for a life-and-reputation-destroying misidentification.

