BRUSSELS — The European Commission yesterday officially identified 19 major platforms and search engines to be targeted for compliance under the controversial Digital Services Act (DSA).
As XBIZ reported, the DSA has been widely criticized over privacy concerns as the EC attempts to tackle the issue of “illegal and harmful” content, including CSAM. Language in the DSA about having to “mitigate risk” concerning “gender-based violence online and the protection of minors online and their mental health” has raised concerns among legal experts and digital rights activists.
The European Commission — which serves as the executive branch of the European Union — has justified the plan by contending that “the current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children.”
Under the new rules, “very large” providers such as those designated by yesterday’s announcement will be required to “assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”
Targeting 'Very Large' Platforms and Search Engines
Yesterday’s decision officially designated 17 “very large online platforms” (VLOPs) and two “very large online search engines” (VLOSEs), each of which, according to the EC, reaches at least 45 million monthly active users.
The VLOPs are: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and German retailer Zalando.
The two VLOSEs are Bing and Google Search.
Following their designation, an EC statement explained, these companies “will now have to comply, within four months, with the full set of new obligations under the DSA.”
Those new obligations, the EC declared, “aim at empowering and protecting users online, including minors, by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools.”
Under the subheading “Strong protection of minors,” the EC listed the following directives:
- Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors;
- Targeted advertising based on profiling towards children is no longer permitted;
- Special risk assessments including for negative effects on mental health will have to be provided to the Commission four months after designation and made public at the latest a year later;
- Platforms will have to redesign their services, including their interfaces, recommender systems, terms and conditions, to mitigate these risks.
The risk mitigation plans of designated platforms and search engines, the EC noted, “will be subject to an independent audit and oversight by the Commission.”
Industry Attorneys Monitoring Developments
According to industry attorney Corey Silverstein of Silverstein Legal, the impact of the new designations and consequent obligations “could be substantial because many of the platforms that have been designated as VLOPs and VLOSEs are frequently utilized by the adult entertainment industry.”
Assuming these platforms decide to comply with the DSA, Silverstein told XBIZ, there may be major changes coming to what these platforms allow on their services within the EU.
“This could end up leading to major content moderation and outright blocking of adult content in the EU, including the blocking of websites that display adult entertainment from being listed in search results,” Silverstein warned. “Unfortunately, there is no definitive answer as to how these platforms will react, but the industry will need to closely monitor this development.”
Free speech law expert Lawrence Walters, of the Walters Law Group, told XBIZ that the impact of the new designations on adult content creators “will depend on how the platforms and search engines implement the DSA requirements related to safety and security of minors, reporting of allegedly illegal content, recommendation systems and advertising procedures.”
The new European requirements, Walters added, are likely to cause “increased friction between adult content creator accounts and these large platforms and search engines.”
Walters advised keeping an eye on the results of the first required annual risk assessment by the designated service providers, as that will provide insight into how those providers are responding to the new compliance obligations — and the impact on free expression.
Walters also noted that as the larger adult platforms continue to grow, some may pass the EC’s benchmark of having 45 million monthly active users, and therefore “face the potential for future designation under the DSA, which could have more direct impact on their users and creators.”