Digital Rights Groups Lambast European Commission's Online Surveillance Proposal

BRUSSELS — Digital rights and privacy advocates are raising the alarm about an EU legislative proposal unveiled yesterday by the European Commission, allegedly to address “misuse of online services” and “prevent and combat child sexual abuse online.”

In a statement, the European Commission justified its plan for addressing child sexual abuse material (CSAM) by contending that “the current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires.”

The proposed rules, the EC notes, “will oblige providers to detect, report and remove CSAM on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

The proposal creates a new continental bureaucracy, the EU Centre on Child Sexual Abuse, granted powers of surveillance to “facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analyzing reports from providers to identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action and by providing support to victims.”

A New Euro Bureaucracy, With Online Surveillance Powers

This new EU Centre bureaucracy will be the clearing house for what the proposal calls “detection orders.”

These “detection orders,” the EC explained, “will be issued by courts or independent national authorities. To minimize the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in Court.”

Under the proposal’s intricate system for content review and moderation:

  • Providers of hosting or interpersonal communication services “will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.”
  • Then, each country in the EU “will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new CSAM or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.”
  • A website or platform that receives one of these detection orders “will only be able to detect content using indicators of CSAM verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting CSAM. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.”
  • Providers that have detected CSAM online “will have to report it to the EU Centre.”
  • The new laws should enable national authorities to “issue removal orders if the CSAM is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.”
  • The new laws will also mandate that app stores “ensure that children cannot download apps that may expose them to a high risk of solicitation of children.”
  • The transnational, politically appointed EU Centre will monitor online service providers, and determine if they are indeed “complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to CSAM online, by providing indicators to detect CSAM and receiving the reports from the providers.”
  • The EU Centre will have vague competencies, interacting with “national law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law enforcement.”

The document concludes by claiming that it is part of a new European strategy “for a better internet for kids.”

'A Worrying Day' for EU Privacy

The advocacy group European Digital Rights released a statement yesterday lambasting the European Commission’s online CSAM proposal for “failing to find right solutions to tackle child sexual abuse.”

“Today is a worrying day for every person in the EU who wants to send a message privately without exposing their personal information, like chats and photos, to private companies and governments,” EDRi stated, adding that the European Commission’s proposal includes “measures which put the vital integrity of secure communications at risk.”

The group criticized Commissioner Ylva Johansson, who has been behind the continental push for increased surveillance, for spearheading a proposal “which could still force companies to turn our digital devices into potential pieces of spyware, opening the door for a vast range of authoritarian surveillance tactics.”

The proposal, the statement continued, puts “journalists, whistleblowers, civil rights defenders, lawyers, doctors and others who need to maintain the confidentiality of their communications at risk.”

Tech news site Techdirt has called the EC proposal “Europe’s own version of the EARN IT Act,” likening it to the controversial bipartisan U.S. proposal to limit Section 230 protections and establish a new state bureaucracy to monitor and surveil internet content, purportedly to battle “online harms.”

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.