Instagram Addresses Confusing New Content Moderation Policy

MENLO PARK, Calif. — Instagram published a blog post today announcing some changes in the app's moderation policy and appeals process.

The announcement outlined some changes that representatives of the Facebook-owned company had mentioned last month at the unprecedented meeting between the Instagram public policy team and a group of representatives from the Adult Performers Actors Guild (APAG) at Facebook's monumental Bay Area headquarters. 

XBIZ attended the meeting and published an exclusive report the same day, chronicling the conversation between the sex workers’ advocacy group and several top-level Facebook/Instagram execs responsible for deciding which content is allowed on their platforms.

Today’s blog post described the new policy in the typically vague, imperious language common among the leadership of Facebook and other social media giants. With a combination of august arrogance, Sesame-Street-like language that is meant to appear friendly, harmless and inclusive, and a tone of finality and knowing-best, the statement uncannily mimics the public speech patterns of company figurehead Mark Zuckerberg.

“Today, we are announcing a change to our account disable [sic] policy,” reads the anonymous blog post, which an Instagram spokesperson confirmed as “official” and “quotable.” “Together with Facebook, we develop policies to ensure Instagram is a supportive place for everyone. These changes will help us quickly detect and remove accounts that repeatedly violate our policies.”

“Under our existing policy,” the statement continues, “we disable accounts that have a certain percentage of violating content. We are now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time.”

Besides this extremely vague reframing of Instagram's still-secret formula for content censorship, the company also introduced “a new notification process to help people understand if their account is at risk of being disabled.”

“This notification will also offer the opportunity to appeal content deleted,” Instagram stated.

These appeals “will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we’ll be expanding appeals in the coming months. If content is found to be removed in error, we will restore the post and remove the violation from the account’s record.”

“We’ve always given people the option to appeal disabled accounts through our Help Center, and in the next few months, we’ll bring this experience directly within Instagram,” the short blog post concluded.

Sex Workers' Concerns Remain

XBIZ contacted an Instagram spokesperson with relevant questions about the vexing new blog post.

The company says that the goal is to make Instagram “a supportive place for everyone,” so we asked the rep whether legal sex workers and producers of adult content are included in “everyone.”

“This includes everyone who uses Instagram, irrespective of their profession,” the rep answered.

What is the rationale for equating “nudity and pornography” with “bullying and harassment, hate speech, drug sales” and “terrorism”? Is Instagram implying that nudity (specified as female nipples and male and female genitalia and buttocks) and pornography (an often derogatory, stigmatizing term for adult content) are equivalent to all those other types of harmful content? (“Drug sales” is also problematic: how does Facebook define “drugs”?)

“No,” the rep clarified. “We have a set of rules that govern what you can or can’t post on Instagram — that’s not to say we equate the severity of one rule with another. Simply, this is a list of things we don’t allow. We do not allow the sales of any drug on Instagram, including illicit and pharmaceutical drugs.”

The blog post states that Instagram is “now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time.” Does Instagram intend to keep the “certain percentage,” “certain number” and “window of time” figures a secret, or will it announce what these numbers actually are?

“We don’t share these numbers as it allows bad actors to game our preventative measures,” said the rep.

What does “we’ll bring [the appeals] experience directly within Instagram” even mean?

“Currently people need to go to our Help Centre (a website) to appeal accounts that are removed from Instagram,” clarified the rep. “With this upcoming update, people can appeal directly in the app. In other words, this process will be a lot simpler for our community.”

Finally, we asked whether Instagram was committed to keeping the platform “a safe and supportive place” for adults and sex workers.

“See above,” was the Instagram spokesperson’s succinct answer.

For more background on Instagram and the ongoing War on Porn, click here for the XBIZ Explainer and here for an account of the historic meeting between APAG leadership and the Facebook/Instagram public policy team.

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
