WASHINGTON — The Free Speech Coalition has issued a reminder notice that the notice-and-removal requirements of the federal TAKE IT DOWN Act will go into effect on May 19, 2026.
The notice follows:
The TAKE IT DOWN Act created a federal criminal prohibition on the nonconsensual publishing of intimate images (including AI-generated “deepfakes”) and requires covered platforms to establish a notice-and-removal process that takes down such content within 48 hours of a valid request. While the criminal ban on nonconsensual imagery took effect immediately upon enactment, the notice-and-removal requirements go into effect on May 19, 2026.
Criminal Prohibition on Publishing Nonconsensual Intimate Visual Depictions (already in effect)
What’s covered: The law criminalizes two categories of content:
- authentic intimate visual depictions published without consent
- “digital forgeries” – AI-generated or otherwise computer-manipulated intimate images of an identifiable individual that a reasonable person would find indistinguishable from authentic depictions.
Who’s liable: Any person who knowingly publishes such content using an interactive computer service. This targets the individual uploader/publisher, not the platform.
Notice-and-Removal Obligations for Covered Platforms (effective May 19)
Who must comply: “Covered platforms” – websites, online services, online applications, or mobile applications that serve the public and primarily provide a forum for user-generated content (including messages, videos, images, and audio).

Process required: Covered platforms must establish a process by which an individual (or their authorized representative) can submit a removal request. The request must include a signature, identification of the content, a good faith statement that it was published without consent, and contact information.
Removal timeline: Upon receiving a valid request, a covered platform must remove the content as soon as possible, but no later than 48 hours after receipt. Platforms must also make reasonable efforts to identify and remove known identical copies.
Notice requirement: Platforms must post a clear, conspicuous, plain-language notice of their removal process and how to submit a request.
Enforcement: Failure to comply with the notice-and-removal obligations is treated as an unfair or deceptive act or practice under the FTC Act, enforced by the Federal Trade Commission.
Important Notes
- The definition of “covered platform” is broad enough to include most sites that host user-generated content. Platforms that host any user-uploaded content should assume they are covered and consult with counsel.
- Under the law, consent to create an intimate visual depiction does not equal consent to publish it.
- Covered platforms must respond to “valid” removal requests, which must be in writing and include:
  - a physical or electronic signature of the requestor (or their authorized representative)
  - identification of, and information sufficient for the platform to locate, the offending content
  - a statement of the requestor's good-faith belief that the depiction was published without consent
  - the requestor's contact information
- The law includes no provisions that address how platforms can or should deal with erroneous or fraudulent removal requests.
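For platforms building a compliance workflow, the four required elements of a valid request lend themselves to a simple intake check. The sketch below is purely illustrative; the field names are assumptions for this example, not statutory terms, and real compliance decisions should be made with counsel.

```python
# Illustrative sketch: checking a removal request against the four elements
# the TAKE IT DOWN Act requires. Field names here are hypothetical.

REQUIRED_FIELDS = (
    "signature",             # physical or electronic signature of requestor
    "content_identifier",    # info sufficient to locate the offending content
    "good_faith_statement",  # statement that publication was nonconsensual
    "contact_info",          # requestor's contact information
)

def missing_elements(request: dict) -> list[str]:
    """Return the names of any required elements absent or empty."""
    return [f for f in REQUIRED_FIELDS if not request.get(f)]

def is_valid_request(request: dict) -> bool:
    """True only if the request contains all four required elements."""
    return not missing_elements(request)
```

A platform might use a check like this to decide when the 48-hour removal clock starts, since the statute ties the deadline to receipt of a *valid* request.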
For more information, visit FreeSpeechCoalition.com.