Digital Rights Groups Lambast European Commission's Online Surveillance Proposal

BRUSSELS — Digital rights and privacy advocates are raising the alarm about an EU legislative proposal unveiled yesterday by the European Commission, allegedly to address “misuse of online services” and “prevent and combat child sexual abuse online.”

In a statement, the Brussels-based European Commission justified its plan for addressing the issue of CSAM by contending that “the current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires.”

The proposed rules, the EC notes, “will oblige providers to detect, report and remove CSAM on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

The proposal creates a new continental bureaucracy, the EU Centre on Child Sexual Abuse, granted powers of surveillance to “facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analyzing reports from providers to identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action and by providing support to victims.”

A New Euro Bureaucracy, With Online Surveillance Powers

This new EU Centre bureaucracy will be the clearing house for what the proposal calls “detection orders.”

These “detection orders,” the EC explained, “will be issued by courts or independent national authorities. To minimize the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in Court.”

The proposal mandates an intricate system for content review and moderation:

  • Providers of hosting or interpersonal communication services “will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.”
  • Then, each country in the EU “will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new CSAM or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.”
  • When a website or platform receives one of these detection orders, they “will only be able to detect content using indicators of CSAM verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting CSAM. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.”
  • Providers that have detected CSAM online “will have to report it to the EU Centre.”
  • The new laws should enable national authorities to “issue removal orders if the CSAM is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.”
  • The new laws will also mandate that app stores “ensure that children cannot download apps that may expose them to a high risk of solicitation of children.”
  • The transnational, politically appointed EU Centre will monitor online service providers and determine whether they are indeed “complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to CSAM online, by providing indicators to detect CSAM and receiving the reports from the providers.”
  • The EU Centre will have vague competencies, interacting with “national law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law enforcement.”

The document concludes by claiming that it is part of a new European strategy “for a better internet for kids.”

'A Worrying Day' for EU Privacy

The advocacy group European Digital Rights released a statement yesterday lambasting the European Commission’s online CSAM proposal for “failing to find right solutions to tackle child sexual abuse.”

“Today is a worrying day for every person in the EU who wants to send a message privately without exposing their personal information, like chats and photos, to private companies and governments,” EDRi stated, adding that the European Commission’s proposal includes “measures which put the vital integrity of secure communications at risk.”

The group criticized Commissioner Ylva Johansson, who has been behind the continental push for increased surveillance, for spearheading a proposal “which could still force companies to turn our digital devices into potential pieces of spyware, opening the door for a vast range of authoritarian surveillance tactics.”

The proposal, the statement continued, puts “journalists, whistleblowers, civil rights defenders, lawyers, doctors and others who need to maintain the confidentiality of their communications at risk.”

Tech news site TechDirt has called the EC proposal “Europe’s own version of the EARN IT Act,” likening it to the controversial bipartisan U.S. proposal to limit Section 230 protections and establish a new state bureaucracy to monitor and surveil internet content, purportedly to battle “online harms.”

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
