Blackhole Pro installation instructions include adding two rules to your site’s robots.txt file. Usually this is straightforward: open your site’s robots.txt file, add the two lines, and done. But what if your site is using a virtual robots.txt file? This post explains the issue and shows how to add rules to a virtual robots.txt.

Related: Optimizing robots.txt for WordPress? Check out my tutorial on Better robots.txt rules for WordPress »

Where is my robots.txt file?

By default, if your WordPress site does not have its own robots.txt file (that is, if an actual/physical text file named “robots.txt” does not exist on the server), WordPress will serve its own virtual robots.txt. The contents of the virtual robots file look like this:

User-agent: *
Disallow: /wp/wp-admin/
Allow: /wp/wp-admin/admin-ajax.php

You can test this by requesting the following URL in a fresh browser (empty cache/cookies/history):

https://example.com/robots.txt

Where example.com is replaced with your actual domain name. When there is no physical robots.txt file on the server, requesting that URL will result in WordPress serving its own virtual copy containing the rules shown above.
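If you prefer the command line, you can fetch the file with a short script and see exactly what your server returns. Here is a minimal sketch using only Python’s standard library; the example.com URL is a placeholder for your own domain:

from urllib.request import urlopen

# Placeholder URL; replace example.com with your actual domain name
url = "https://example.com/robots.txt"

# Print whatever the server returns for robots.txt, whether it is
# a physical file or a virtual one generated by WordPress
with urlopen(url) as response:
    print(response.read().decode("utf-8"))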

Further, some plugins, such as “Rank Math” and “Hide My Admin”, can serve a virtual robots.txt file even if a physical copy of robots.txt exists on the server. So keep an eye out for that: if you’re getting weird/unexpected results when trying to view your robots.txt, a plugin may be interfering.

Bottom Line: there are two possible ways your site may be serving its robots.txt file: physically or virtually. You will need to find out which is the case before adding any rules or making any changes to robots.txt.
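If you have shell access to your server, one way to find out is to check whether a physical robots.txt file exists in the document root while comparing what is actually served. Here is a minimal sketch, meant to run on the server itself; the document root path is a hypothetical example, so adjust it for your host:

import os
from urllib.request import urlopen

# Hypothetical document root; adjust for your host
docroot = "/var/www/html"

# Placeholder URL; replace example.com with your actual domain name
with urlopen("https://example.com/robots.txt") as response:
    served = response.read().decode("utf-8")

if os.path.exists(os.path.join(docroot, "robots.txt")):
    print("Physical robots.txt found on the server.")
else:
    print("No physical file found; these rules are served virtually:")
print(served)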

Add rules to physical/existing robots.txt

If an actual/physical copy of robots.txt exists on your server, then you can open it and add the required Blackhole robots.txt rules and/or make any changes as desired. This should be straightforward: open the file, add the code, save changes, done.
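If you prefer to script the edit, here is a minimal Python sketch that appends the Blackhole Disallow rule to an existing robots.txt, skipping the write if the rule is already there. The file path is a hypothetical example, and the single rule shown is the one used later in this post; use the full rules from the Blackhole installation instructions:

# Hypothetical path to your physical robots.txt; adjust for your server
path = "/var/www/html/robots.txt"
rule = "Disallow: /?blackhole"

with open(path, "r", encoding="utf-8") as f:
    contents = f.read()

# Append the rule only if it is not already present
if rule not in contents:
    with open(path, "a", encoding="utf-8") as f:
        f.write("\n" + rule + "\n")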

Add rules to VIRTUAL robots.txt

If an actual/physical copy of robots.txt does not exist on the server, or if you have a plugin that is serving its own virtual copy of robots.txt, then you can add the required Blackhole robots.txt rules via one of two methods, depending on how your virtual robots file is being served.

Method 1: robots.txt served by WordPress

If your site’s robots.txt file is served virtually by WordPress:

  1. Before doing anything, open your site’s virtual robots.txt in a browser
  2. Copy the contents of the file and set them aside for use in a later step
  3. Create a new file named robots.txt in the public root directory of your site
  4. Open the file and paste in the robots rules you copied in step 2 above
  5. Now you can add the Blackhole rules as explained here
  6. When you are finished, your robots file should look similar to the example below
User-agent: *
Disallow: /wp/wp-admin/
Allow: /wp/wp-admin/admin-ajax.php
Disallow: /?blackhole

Again, this is just an example; your rules may look different depending on your setup, current version of WordPress, and so forth.

Finally, make sure to save the changes made to the file, and upload to your server.
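For those comfortable with a script, steps 1 through 4 can be automated: fetch the virtual file, append the Blackhole rule, and save the result as a physical robots.txt ready to upload. A minimal sketch in Python, assuming your domain and the single Disallow rule from the example above:

from urllib.request import urlopen

# Placeholder URL; replace example.com with your actual domain name
url = "https://example.com/robots.txt"

# Steps 1 and 2: copy the contents of the virtual robots.txt
with urlopen(url) as response:
    rules = response.read().decode("utf-8")

# Step 5: append the Blackhole rule
rules = rules.rstrip("\n") + "\nDisallow: /?blackhole\n"

# Steps 3 and 4: write a physical robots.txt, then upload it to your site's public root
with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(rules)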

Method 2: robots.txt served via plugin

Or, if your site’s robots.txt file actually exists on the server, but you are using some plugin that overrides it and serves a virtual robots file:

  1. Find out how to add rules and/or make changes to the plugin’s virtual file
  2. Ask the plugin provider if you cannot find instructions in the plugin documentation

Testing

Regardless of which method you use to deliver your robots.txt file, once you’ve finished making changes, it is recommended to validate your rules with a free online robots.txt checker. There are several good, free tools available to help with this.
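As a quick local check, Python’s standard-library robotparser can confirm that the new rule blocks compliant crawlers from the Blackhole trigger URL while leaving normal pages crawlable. This sketch assumes your domain and the Disallow: /?blackhole rule from the example above:

from urllib.robotparser import RobotFileParser

# Placeholder URL; replace example.com with your actual domain name
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

# The Blackhole trigger URL should be disallowed for compliant crawlers
print(parser.can_fetch("*", "https://example.com/?blackhole"))  # expect: False

# Normal pages should still be crawlable
print(parser.can_fetch("*", "https://example.com/"))  # expect: True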