Blackhole Pro requires adding a few rules to your site’s robots.txt file. If you are using the WordPress-generated dynamic/virtual robots.txt file, the plugin adds the required rules for you automatically. Only if you are using an actual/physical robots.txt
on the server do you need to add the rules manually (copy/paste). This post explains more about the virtual vs. physical robots.txt
files, and explains how to convert from a virtual robots file to a physical one. The post also provides some tips for testing and validating your robots rules.
Note: If you are using the WordPress-generated virtual robots.txt file, the required robots rules are added to robots.txt automatically. So you only need to add the provided robots rules if you are using an actual/physical robots.txt file. Check out this post for more information.
Where is my robots.txt file?
By default, if your WordPress site does not have its own existing robots.txt file (i.e., an actual/physical text file named “robots.txt” does not exist on the server), WordPress will serve its own virtual robots.txt. The contents of the virtual robots file look something like this (depending on your site setup and configuration):
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/wp-sitemap.xml
You can view your site’s robots rules by requesting the following URL in a fresh browser (empty cache/cookies/history):
https://example.com/robots.txt
Where example.com
is replaced with your actual domain name. Requesting that URL when there is no physical robots.txt
file on the server will result in WordPress serving its own virtual copy that contains the rules shown above.
Further, some plugins, such as “Rank Math” and “Hide My Admin”, can be used to serve a virtual robots.txt file even if a physical copy of robots.txt exists on the server. So keep an eye out for that. If you’re getting weird/unexpected results when trying to view your robots file, it could be that some plugin is interfering somehow.
Bottom Line: there are two possible ways your site may be serving its robots file: physically or virtually. You will need to find out which is happening in order to add any rules or make any changes to robots.txt.
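One quick way to find out is to check whether a physical robots.txt file exists in your site’s root directory (via FTP or your host’s file manager). If you can run a small PHP snippet (for example via WP-CLI or a test plugin), something like this rough sketch also works:
// Minimal sketch (assumes a WordPress context): check for a physical robots.txt in the site root
if ( file_exists( ABSPATH . 'robots.txt' ) ) {
	echo 'Physical robots.txt found: edit the file directly.';
} else {
	echo 'No physical file: WordPress (or a plugin) is serving a virtual robots.txt.';
}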
Add rules to PHYSICAL robots.txt
If an actual/physical copy of robots.txt
exists on your server, then you can open it and add the required Blackhole robots.txt rules and/or make any changes as desired. This should be straightforward: open the file, add the code, save changes, done.
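For reference, the Blackhole rules look like the following (the same rules shown in the Method 2 example further below); if the rules provided in your Blackhole Pro settings differ, use those instead:
User-agent: *
Disallow: /*blackhole
Disallow: /?blackhole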
Add rules to VIRTUAL robots.txt
If an actual/physical copy of robots.txt
does NOT exist on the server, or if you have a plugin that is serving its own virtual copy of robots.txt
, then you can add the required Blackhole robots.txt rules via one of three methods, depending on how your virtual robots file is being served. Let’s have a closer look.
Note: If you are using the WordPress-generated virtual robots.txt file, the required robots rules are added to robots.txt automatically. So you only need to add the provided robots rules if you are using an actual/physical robots.txt file. Check out this post for more information. The following information remains for reference purposes only.
Method 1: add robots rules programmatically
If your site’s robots.txt file is served virtually by WordPress, or generated by a plugin, the easiest way to add the required Blackhole rules is to do it programmatically. Simply add the following function via your (child) theme’s functions.php file, or add it via a simple plugin:
// Append the Blackhole rules to the WordPress-generated virtual robots.txt
function blackhole_robots_wordpress_custom($output, $public) {
	return $output . "\n" . 'User-agent: *' . "\n" . 'Disallow: /*blackhole' . "\n" . 'Disallow: /?blackhole' . "\n";
}
add_filter('robots_txt', 'blackhole_robots_wordpress_custom', 10, 2);
No edits are required. Copy/paste, save changes, and done.
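If you prefer the “simple plugin” route mentioned above, the same filter can be dropped into a small must-use plugin. Here is a minimal sketch (the file name and plugin header are just examples), saved as wp-content/mu-plugins/blackhole-robots.php:
<?php
/*
Plugin Name: Blackhole Robots Rules (example)
Description: Appends the Blackhole rules to the WordPress virtual robots.txt.
*/
add_filter('robots_txt', function ($output, $public) {
	return $output . "\nUser-agent: *\nDisallow: /*blackhole\nDisallow: /?blackhole\n";
}, 10, 2);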
If for whatever reason it is not possible to add the above code snippet, you will need to replace the virtual WP-generated robots.txt file with an actual physical copy on the server. In that case, one of the following techniques will get you there.
Method 2: robots.txt served by WordPress
If your site’s robots.txt
file is served virtually by WordPress:
- Before doing anything, open your site’s virtual robots.txt in a browser
- Copy the contents of the file and set aside for use in a later step
- Create a new file named “robots.txt” in the public root directory of your site
- Open the file and paste in the robots rules you copied in step 2 above
- Now you can add the Blackhole rules as explained here
After completing those steps, your robots file should look similar to the example below:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/wp-sitemap.xml
User-agent: *
Disallow: /*blackhole
Disallow: /?blackhole
Again, this is just an example; your rules may look different depending on your setup, current version of WordPress, and so forth.
Finally, make sure to save the changes made to the file, and upload it to your server.
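If you would rather script steps 1 through 4, here is a rough sketch that copies the virtual output to a physical file (it assumes a WordPress context, e.g. run once via WP-CLI or a temporary snippet, and that PHP can write to your site root):
// Fetch the current (virtual) robots.txt and save it as a physical file in the site root
$response = wp_remote_get( home_url( '/robots.txt' ) );
if ( ! is_wp_error( $response ) ) {
	file_put_contents( ABSPATH . 'robots.txt', wp_remote_retrieve_body( $response ) );
}
Once the physical file exists, WordPress stops serving the virtual version, and you can add the Blackhole rules directly to the file as described above.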
Method 3: robots.txt served via plugin
Or, if your site’s robots.txt file actually exists on the server but you are using some plugin to override it and serve a virtual robots file:
- Find out how to add rules and/or make changes to the plugin’s virtual file
- Ask the plugin provider if you cannot find instructions in the plugin documentation
Testing
Regardless of which method you use to deliver your robots.txt
file, once you’ve finished making changes to your site, it is recommended to validate your rules with a free online robots.txt checker. There are several good/free tools available to help with this: