Apple to Expand Automated Surveillance of iPhone Message Images

CUPERTINO, Calif. — Apple’s iOS 16 update for iPhones, expected this fall, will expand worldwide a controversial “communication safety” tool that uses on-device machine learning to detect nudity in images sent through the Messages app.

The worldwide expansion of this “message analysis” feature, currently available only in the U.S. and New Zealand, will begin in September when iOS 16 is rolled out to the general public.

iPhone models older than the iPhone 8, which cannot run iOS 16, will not be affected; on Mac, the option will arrive with the macOS Ventura update.

The nudity detection feature has been touted by Apple as part of its Expanded Protections for Children initiative, but privacy advocates have raised questions about the company’s overall approach to private content surveillance.

Apple describes the feature as a tool to “warn children when receiving or sending photos that contain nudity.”

The feature, Apple notes, is not enabled by default: “If parents opt in, these warnings will be turned on for the child accounts in their Family Sharing plan.”

When content identified as nudity is received, “the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. Similar protections are available if a child attempts to send photos that contain nudity. In both cases, children are given the option to message someone they trust for help if they choose.”

The AI feature bundled with the default Messages app, the company explained, “analyzes image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages. The feature is designed so that no indication of the detection of nudity ever leaves the device.”

According to Apple, the company “does not get access to the messages, and no notifications are sent to the parent or anyone else.”

In the U.S. and New Zealand, this feature is included starting with iOS 15.2, iPadOS 15.2 and macOS 12.1.

As French news outlet RTL noted today in reporting the expansion, “a similar initiative, which would have analyzed the images hosted in the photo libraries of users’ iCloud accounts in search of possible child sexual abuse images, was strongly criticized before being shelved last year.”

As XBIZ reported, Apple announced in September 2021 that it would “pause” that initiative, which would have scanned images on users’ devices for known CSAM and reported flagged accounts to the National Center for Missing & Exploited Children.

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
