
To help Blackhole Pro trap only bad bots, it’s important to add a couple of rules to your site’s robots.txt file. This post explains the details.

Robots Rules

For Blackhole to work properly, add the provided rules to your site's robots.txt file. The rules are displayed on the plugin settings screen in the "Robots Rules" section and will look something like this (note that this is just an example):

User-agent: *
Disallow: /?blackhole

If your robots.txt is blank, add both rules. If your robots.txt already includes a User-agent: * line, add only the second rule (the Disallow line) beneath it. For more complex robots.txt configurations, consult your web developer.
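
For example, if your existing robots.txt already contains a User-agent: * block with a rule of its own (the /wp-admin/ line below is just a placeholder for whatever rules you already have), the merged file would look like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /?blackhole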

Important: proper robots.txt syntax is critical for good SEO, so make sure to validate your robots.txt rules after making any changes. For more information, check out my post, better robots rules for WordPress.
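
One quick way to sanity-check the rule is to query your live robots.txt with Python's standard-library parser. This is just a minimal sketch, assuming your site lives at the hypothetical https://example.com and the trap URL is /?blackhole; substitute your own domain and the exact rule shown on your settings screen:

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (replace example.com with your domain).
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

# A compliant bot asking to fetch the blackhole URL should be refused.
allowed = parser.can_fetch("*", "https://example.com/?blackhole")
print("Blackhole URL allowed to compliant bots:", allowed)  # expect: False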

Why is this necessary?

Why the robots.txt rules? Because you only want to trap "bad" bots, not good bots. The robots.txt rules explicitly instruct all bots NOT to crawl the blackhole link. Good bots (like Googlebot) obey the rules and never touch the link, so any bot that follows it is disobeying robots.txt and will be banned from further site access.
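
To make the mechanism concrete, here is a minimal self-contained sketch of the trap idea, written in Python rather than the plugin's actual WordPress/PHP code; the /?blackhole path and the in-memory ban list are illustrative assumptions, not Blackhole Pro's implementation:

from http.server import BaseHTTPRequestHandler, HTTPServer

banned_ips = set()  # illustrative in-memory ban list; the real plugin persists bans

class TrapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ip = self.client_address[0]
        if ip in banned_ips:
            self.send_error(403, "Banned")  # previously trapped visitor
        elif self.path == "/?blackhole":
            banned_ips.add(ip)  # the bot ignored robots.txt: trap it
            self.send_error(403, "You have been banned")
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Normal page content\n")

if __name__ == "__main__":
    HTTPServer(("", 8000), TrapHandler).serve_forever()

Any client that requests the disallowed path is refused on every subsequent request. Compliant bots never trigger the trap, because robots.txt told them to stay away.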

Resources