Google's New Explicit Image Takedown Policy Unlikely to Affect Commercial Images

MOUNTAIN VIEW, Calif. — Google’s updated policies allowing individuals to remove “personal, explicit images” from Google Search results will not affect most commercial images created by a third party with appropriate contracts and releases.

As XBIZ reported, the policy, unveiled last week, was designed to target nonconsensual explicit imagery and to enable individuals "to remove from Search any of their personal, explicit images that they no longer wish to be visible in Search," Google VP for Trust Danielle Romain wrote on the company's blog.

At the time, Romain specified that the new policy “doesn’t apply to content you are currently commercializing.”

Still, questions lingered among adult companies and creators about situations involving explicit images of individuals who were under contract and/or had given full releases to third-party content producers, including studios and companies.

A Google rep told XBIZ that under the new takedown policy, individuals “can request the removal of third-party created content that features them, if it has been removed by the original publisher.”

The Google rep directed XBIZ to the full text of the new policy, which states that for the company to consider the content for removal, it must meet the following requirements:

The imagery shows you (or the individual you’re representing) nude, in a sexual act, or an intimate state.

You (or the individual you’re representing) didn’t consent to the imagery or the act and it was made publicly available, or the imagery was made available online without your consent.

You are not currently being paid for this content online or elsewhere.

For unauthorized commercial content that does not fall under those requirements, such as pirated material, Google instead recommends requesting removal under the DMCA.

Two Specific Scenarios

According to the policy, if Individual A agrees to perform in an explicit sex scene for Company B and signs a contract, release form and 2257 form, all held by Company B, but later changes their mind and wants the content removed from Search, the content can be removed only if Company B has withdrawn it from distribution.

Under the new policy, Google would also not automatically remove content if, for example, Individual A agreed to perform in an explicit sex scene for Company B, but Company B later sold the content and transferred the rights to Company C, which marketed it in a way that Individual A disapproved of, leading Individual A to request its removal from Search.

The performer might have other options, however, particularly if the third-party publisher were found to have utilized predatory means in the production of the content featuring the reporting user. A notable example of that scenario would be the GirlsDoPorn case.

Another scenario in which the performer could request removal of images from Search is if the third-party producer has relinquished its rights to the content.
