WASHINGTON — The U.S. Senate Committee on Commerce, Science, and Transportation held a hearing Wednesday on potential changes to Section 230 of the Communications Decency Act, which protects interactive computer services — including adult platforms — from liability for user-generated content.
Three bills to completely repeal Section 230 are currently pending in Congress. However, Wednesday’s hearing did not address those proposals. Instead, the hearing — titled “Liability or Deniability? Platform Power as Section 230 Turns 30” — took a more moderate approach, focusing on avenues for reforming the law.
The current push for Section 230 repeal or reform stems from two main sources.
First, politicians on both sides of the aisle have vilified “Big Tech” for allegedly profiting from illegal and harmful content, and therefore seek to pressure platforms to moderate such content more intensively by making them liable when third parties post it. At the hearing, Sen. Marsha Blackburn stated, “Big Tech has proven they are incapable of regulating or policing themselves. They will not do it.”
Second, right-wing critics claim that social media platforms use the rule as a shield allowing them to censor conservative speech and seek to limit platforms’ right to moderate content as they see fit. During the hearing, Sen. Eric Schmitt railed against Biden administration efforts to limit the social media reach of COVID-19 conspiracy theorists and 2020 election deniers, characterizing those efforts as violations of the First Amendment.
Sen. Ted Cruz, who chairs the committee, raised both issues, calling for Congress “to prevent social media from harming Americans, especially children, while not incentivizing Big Tech censorship.”
Cruz stopped short of advocating for repeal, however.
“I’m concerned that a full repeal or sunset would lead platforms to engage in worse behavior — to engage in more censorship to protect themselves from litigation,” Cruz said. “But we should consider whether reform of Section 230 is needed.”
Sen. Brian Schatz, the committee’s ranking Democrat present, also called for reform.
“We can work together and fix the law,” Schatz said. “This idea that we can’t touch it, otherwise internet freedom incinerates, is preposterous.”
Potential Impact on Adult
Current efforts to reform Section 230 are not, for the most part, aimed specifically at online adult companies. Much of the testimony and discussion in the hearing therefore focused on issues such as minors who suffered abuse or died by suicide after encountering predators or self-harm content on social media, as well as on whether algorithmic design and AI-generated content should be covered by Section 230 protections.
However, industry attorneys and advocates have voiced strong concerns that opening up Section 230 to tinkering could easily pave the way for a variety of specific “carve-outs,” in the tradition of FOSTA/SESTA’s exemptions revoking liability protections for sites that “unlawfully promote and facilitate” prostitution or sex trafficking — and that a carve-out aimed at or including the industry would render adult sites liable for user-generated content, opening the floodgates for civil lawsuits.
While most of those lawsuits could ultimately be defended against on First Amendment grounds, Section 230 enables defendants to avoid expensive litigation. As Techdirt’s Mike Masnick has written, the law “provides a procedural advantage in getting vexatious, frivolous nuisance lawsuits shut down much faster than they would be otherwise.” Without Section 230 protections, true “Big Tech” platforms would still be able to defend their moderation choices, but smaller companies would become intensely vulnerable to such lawsuits.
Testifying at the hearing, Stanford Law School platform regulation expert Daphne Keller told the committee that a world without Section 230 “would impose legal uncertainty and expense that today’s incumbent giants could survive but their smaller rivals could not.”
“We have a lot of data to predict what happens when platforms are held liable for the speech of their users,” Keller said. “Platforms receive huge numbers of false allegations under laws like the DMCA here or the Digital Services Act in Europe, from people demanding the removal of perfectly legal speech. Governments do this, companies do this against their competitors — and platforms have strong incentives to simply comply.”
The likely applicability of that analysis to adult companies is reinforced by attitudes toward the industry within the current administration.
During the hearing, Democratic Sen. Tammy Baldwin warned against “informal, often coercive efforts by government officials to pressure private companies into moderating or removing content that they cannot legally censor directly.”
Keller was more specific, citing, in her written testimony, Federal Communications Commission chair Brendan Carr’s threats against ABC, which temporarily drove comedian Jimmy Kimmel off the air.
Carr also authored the section of Project 2025’s “Mandate for Leadership” blueprint that called for scrapping Section 230’s current approach. The same document included a call to criminalize all adult content, asserting that pornography “has no claim to First Amendment protection.”
“Pornography should be outlawed,” the document states. “The people who produce and distribute it should be imprisoned.”
The document has proved to be an accurate overall road map for the second Trump administration’s priorities. Nor is Carr the only administration voice to be affiliated with efforts to outlaw adult content. Trump advisor Russell Vought has discussed banning pornography “from the back door,” and Vice President Vance has called for banning it outright.