
Robot and Spider Control

Editor’s note: Search engine spiders are typically the only kind of spiders that Webmasters want to see hanging around. These robots quietly crawl their way around the World Wide Web seeking out every page they can find, and reporting their contents back to their search engine masters. This is usually a welcome operation as it often leads to more ‘free’ traffic – but occasionally robots find their way into places we wish they wouldn’t, exposing sensitive information for the world to see… Here’s how to help prevent this from happening: ~ Stephen

Before submitting your site to the search engines, consider which pages and links you want the search engine "robot" (the program that indexes your site) to "spider" (follow), and which you don’t – you may have pages with sensitive information, a ‘scrap directory’ full of "work in progress," or a protected "members area" that you would not like listed.

This goal can easily be achieved in two ways. The first way is with a robots.txt file placed in the root directory of your Website, but you must have full domain privileges in order for this to work. While this article is not meant to deal with the intricacies of the robots.txt file, a quick word of warning is in order: be careful with the Disallow directive, as a single "Disallow: /" rule tells compliant robots to stay away from your entire site. (An empty robots.txt file, by contrast, is treated as though it were not present, and robots will consider the whole site fair game.)
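As a quick illustration, a minimal robots.txt might look something like this – the directory names here are hypothetical stand-ins for the ‘scrap directory’ and ‘members area’ mentioned above, so substitute your own:

User-agent: *
Disallow: /scrap/
Disallow: /members/

The "User-agent: *" line addresses all robots, while each Disallow line names one path that compliant robots should not crawl. Pages not matched by any Disallow rule remain open for indexing.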

The other way to stop most ‘bots’ from indexing your pages is to use META exclusion tags. This is often the only way that Webmasters on virtual or free hosts without full server access can control a spider’s wanderings and reports on a page-by-page basis. The syntax is simple:

<META name="ROBOTS" content="...">

- where the content attribute is a comma-separated list of one or more of the following values: ALL, NONE, INDEX, FOLLOW, NOINDEX, NOFOLLOW.

The default value for the robots tag is "ALL" which allows the robot to index the page, then spider all links, indexing the linked pages too. "NONE" performs the opposite, disallowing the robot from either indexing the page, or spidering the links on it, in essence ignoring the page altogether.

"INDEX" indicates that robots should include this page in their search engines, while "FOLLOW" means that robots should follow (spider) the links on this page. Conversely, a value of "NOINDEX" allows links from the page to be spidered, even though the page itself is not indexed, while a value of "NOFOLLOW" allows the page to be indexed, but no links from the page are to be spidered.

Some Sample Snippets
Here are some example robot-controlling META tags, which go between your document’s <HEAD> and </HEAD> tags:

<META name="ROBOTS" content="NOINDEX">
- This will prevent the bot from indexing that page.

<META name="ROBOTS" content="NOFOLLOW">
- This allows the page to be indexed, but any hyperlinks in that page will not be spidered.

<META name="ROBOTS" content="NOINDEX,NOFOLLOW">
- This combines the two: the page will not be indexed, and its links will not be followed. This tag may also prevent some mirroring software from downloading the page.
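Putting it all together, here is a sketch of how one of these tags sits in a page’s header – the title and filename are hypothetical placeholders:

<HTML>
<HEAD>
<TITLE>Members Area - Login</TITLE>
<META name="ROBOTS" content="NOINDEX,NOFOLLOW">
</HEAD>
<BODY>
...
</BODY>
</HTML>

A compliant robot fetching this page would neither add it to the search engine’s index nor follow any of the hyperlinks inside it.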

While there are many other META tags that can be used to improve your rankings, controlling what gets indexed is the first step. After that, it’s wiser to invest your time in optimizing your description and keywords tags to boost your search engine rankings – which is the subject of my next article…

Copyright © 2026 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
