The Complete Guide to WordPress robots.txt

To ensure that your site ranks highly in Search Engine Results Pages (SERPs), you’ll need to make it easy for search engine ‘bots’ to explore its most important pages. Having a well-structured robots.txt file in place will help direct those bots to the pages you want them to index (and avoid the rest).

In this article, we’re going to cover:

  1. What a robots.txt file is, and why it’s important.
  2. Where the WordPress robots.txt file is located.
  3. How to create a robots.txt file.
  4. What rules to include in your WordPress robots.txt file.
  5. How to test the robots.txt file, and submit it to Google Search Console.

By the end of our discussion, you’ll have everything you need to configure a perfect robots.txt file for your WordPress website. Let’s dive in!


What a WordPress robots.txt File Is (And Why You Need One)

An example of a robots.txt for WordPress.
WordPress’ default robots.txt file is pretty basic, but you can easily replace it.

When you create a new website, search engines will send their minions (or bots) to ‘crawl’ through it and make a map of all the pages it contains. That way, they’ll know what pages to display as results when someone searches for related keywords. At a basic level, this is simple enough.

The problem is that modern websites contain a lot more elements than just pages. WordPress enables you to install plugins, for example, which often come with their own directories. You don’t want these to show up in your search engine results, however, since they’re not relevant content.

What the robots.txt file does is provide a set of instructions for search engine bots. It tells them: “Hey, you can look here, but don’t go into those rooms over there!” This file can be as detailed as you want, and it’s rather easy to create, even if you’re not a technical wizard.

In practice, search engines will still crawl your website even if you don’t have a robots.txt file set up. However, going without one is inefficient. Without this file, you’re leaving it up to the bots to index all of your content, and they’re thorough enough that they might end up surfacing parts of your website you’d rather keep out of search results.

More importantly, without a robots.txt file, you’ll have a lot of bots crawling all over your website. This can negatively impact its performance. Even if the hit is small, page speed should always be near the top of your priority list. After all, there are few things people hate as much as a slow website.

Where the WordPress robots.txt File Is Located

When you create a WordPress website, WordPress automatically sets up a virtual robots.txt file, served from your site’s root. For example, if your site is located at yourfakewebsite.com, you should be able to visit the address yourfakewebsite.com/robots.txt and see a file like this come up:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

This is an example of a very basic robots.txt file. To put it in human terms, the part right after User-agent: declares which bots the rules below apply to. An asterisk means the rules are universal and apply to all bots. In this case, the file tells those bots that they can’t go into your wp-admin and wp-includes directories. That makes a certain amount of sense since those two folders contain a lot of sensitive files.
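You can also combine the two directives: an Allow rule can carve an exception out of a disallowed directory. For instance, newer WordPress versions use this pattern in their virtual file to keep a front-end helper file reachable, roughly like this:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Here, bots are asked to stay out of wp-admin as a whole, while the more specific Allow line makes admin-ajax.php an exception.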

However, you may want to add more rules of your own. Before you can do that, you’ll need to understand that this is a virtual file. Normally, the WordPress robots.txt location would be within your root directory, which is often called public_html or www (or is named after your website):

WordPress root folder

However, the robots.txt file WordPress sets up by default doesn’t actually exist in any directory on your server. It works, but if you want to make changes to it, you’ll need to create a physical file of your own and upload it to your root folder as a replacement.

We’ll cover several ways to create a new robots.txt for WordPress in a minute. For now, though, let’s talk about how to determine what rules yours should include.

What Rules to Include in Your WordPress robots.txt File

In the last section, you saw an example of a WordPress-generated robots.txt file. It only included two short rules, but most websites set up more than that. Let’s take a look at two different robots.txt files, and talk about what they each do differently.

Here is our first WordPress robots.txt example:

User-agent: *
Allow: /
# Disallowed Sub-Directories
Disallow: /checkout/
Disallow: /images/
Disallow: /forum/

This is a generic robots.txt file for a website with a forum. Search engines will often index each thread within a forum. Depending on what your forum is for, however, you might want to disallow it. That way, Google won’t index hundreds of threads about users making small talk. You could also set up rules indicating specific sub-forums to avoid, and let search engines crawl the rest of them (we’ll sketch an example of that in a moment).

You’ll also notice a line that reads Allow: / at the top of the file. That line tells bots that they can crawl all of your website’s pages, aside from the exceptions listed below it. Likewise, you’ll notice that we set these rules to be universal (with an asterisk), just as WordPress’ virtual robots.txt file does.
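Here’s what that more targeted approach might look like. This is just a sketch, and the sub-forum paths are hypothetical placeholders:

User-agent: *
Allow: /
# Block only specific (hypothetical) sub-forums
Disallow: /forum/off-topic/
Disallow: /forum/introductions/

With rules like these, bots can still crawl the main forum and any sub-forums you don’t list explicitly.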

Now let’s check out another WordPress robots.txt example:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

User-agent: Bingbot
Disallow: /

In this file, we set up the same rules WordPress does by default. However, we also added a new set of rules that block Bing’s search bot from crawling through our website. Bingbot, as you might imagine, is the name of that bot.

You can get pretty specific about which search engine’s bots get access to your website, and which ones don’t. In practice, of course, Bingbot is pretty benign (even if it’s not as cool as Googlebot). However, there are some malicious bots out there.

The bad news is that they don’t always follow the instructions in your robots.txt file (they are rebels, after all). It’s worth keeping in mind that, while most reputable bots will respect the rules you set in this file, you aren’t forcing them to do so. You’re just asking nicely.

If you read up on the subject, you’ll find a lot of suggestions for what to allow and what to block on your WordPress website. However, in our experience, fewer rules are often better. Here’s an example of what we recommend your first robots.txt file should look like:

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/

Traditionally, WordPress has blocked access to the wp-admin and wp-includes directories, but that’s no longer considered a best practice. Plus, if you add metadata to your images for Search Engine Optimization (SEO) purposes, it doesn’t make sense to stop bots from crawling that information. Instead, the two rules above cover what most basic sites will require.
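If it helps, here’s that same recommended file again with comments (crawlers ignore lines starting with #) explaining what each rule does:

# Apply these rules to every bot
User-agent: *
# Let bots crawl uploaded media, such as images with SEO metadata
Allow: /wp-content/uploads/
# Keep plugin files out of search results
Disallow: /wp-content/plugins/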

What you include in your robots.txt file will depend on your specific site and needs, however. So feel free to do some more research on your own!

How to Create a WordPress robots.txt File (3 Methods)

Once you’ve decided what will go in your robots.txt file, all that’s left is to create one. You can edit robots.txt in WordPress either by using a plugin or manually. In this section, we’ll teach you how to use two popular plugins to get the job done and discuss how to create and upload the file on your own. Let’s get to it!

1. Use Yoast SEO

The Yoast SEO plugin.

Yoast SEO hardly needs an introduction. It’s the most popular SEO plugin for WordPress, and it enables you to optimize your posts and pages to make better use of your keywords. Aside from that, it also provides you with help when it comes to increasing your content’s readability, which means more people will be able to enjoy it.

Personally, we’re fans of Yoast SEO due to its ease of use. That applies just as much to creating a robots.txt file. Once you install and activate the plugin, navigate to the SEO > Tools tab in your dashboard, and look for the option that says File editor:

The Yoast SEO file editor.

Clicking on that link will send you to a new page, where you can edit your .htaccess file without leaving your dashboard. There’s also a handy button that reads Create robots.txt file, which does exactly what you’d expect:

Creating a robots.txt for WordPress.

Once you click on this button, the tab will display a new editor where you can modify your robots.txt file directly. Keep in mind that Yoast SEO sets its own default rules, which override your existing virtual robots.txt file.

Whenever you add or remove rules, remember to click on the Save changes to robots.txt button, so they persist:

Edit robots.txt in WordPress.

That’s easy enough! Now let’s see how another popular plugin does the same thing.

2. Use the All in One SEO Pack Plugin

The All in One SEO Pack plugin.

All in One SEO Pack is the other big name when it comes to WordPress SEO. It includes most of the features Yoast SEO does, but some people prefer it because it’s a more lightweight plugin. As far as robots.txt goes, creating the file with this plugin is just as simple.

Once you have the plugin set up, navigate to the All in One SEO > Feature Manager page in your dashboard. Inside, you’ll find an option called Robots.txt, with a conspicuous Activate button right below it. Go ahead and click on that:

Activating the plugin's robots.txt feature.

Now, a new Robots.txt tab will show up under your All in One SEO menu. If you click on it, you’ll see options to add new rules to your file, save the changes you make, or delete it altogether:

Adding new rules to your robots.txt file.

Do note that you can’t make changes to your robots.txt file directly using this plugin. The file itself is grayed out, unlike with Yoast SEO, which enables you to type in whatever you want:

A grayed out robots.txt for WordPress.

In any case, adding new rules is simple, so don’t let that small downside discourage you. More importantly, All in One SEO Pack also includes a feature that can help you block ‘bad’ bots, which you can access from your All in One SEO tab:

Using the plugin's bad bot blocking feature.

That’s all you need to do if you choose to use this method. However, let’s talk about how to create a robots.txt file manually, if you don’t want to set up an extra plugin just to take care of this task.

3. Create and Upload Your WordPress robots.txt File Via FTP

Creating a txt file couldn’t be simpler. All you have to do is open up your favorite text editor (such as Notepad or TextEdit) and type in a few lines. Then you can save the file as robots.txt, using the plain txt file type. Note that it needs that exact name once it’s on your server, or search engines won’t find it. It literally takes seconds to do this, so it makes sense that you might want to edit robots.txt in WordPress without using a plugin.

Here’s a quick example of one such file:

Creating a robots.txt file manually.
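If you’re following along, a minimal file based on the rules we recommended earlier would look like this:

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/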

For the purposes of this tutorial, we saved this file directly to our computer. Once you have your own file created and saved, you’ll need to connect to your website via FTP. If you’re not sure how to do that, we have a guide to doing this using the beginner-friendly FileZilla client.

Once you’re connected to your site, navigate to the public_html folder. Then, all you have to do is upload the robots.txt file from your computer over to your server. You can do that by right-clicking on the file in your FTP client’s local navigator and choosing the upload option, or by simply dragging and dropping it into place:

Uploading the file to your root folder.

It should only take a few seconds for the file to upload. As you can see, this method is nearly as simple as using a plugin.

How to Test Your WordPress robots.txt File and Submit It to Google Search Console

Once your WordPress robots.txt file is created and uploaded, you can use Google Search Console to test it for errors. The Search Console is a collection of tools Google offers to help you monitor how your content appears in its search results. One of these tools is a robots.txt checker, which you can use by logging into your console and navigating to the robots.txt Tester tab:

The Search Console tester feature.

Inside, you’ll find an editor field where you can add your WordPress robots.txt file code, and click on the Submit button right below. Google Search Console will ask if you want to use that new code, or pull the file from your website. Click on the option that says Ask Google to Update to submit it manually:

Submitting your file to the Search Console.

Now, the platform will check your file for errors. If there are any, it will point them out for you. However, you’ve seen more than one WordPress robots.txt example by now, so chances are high that yours is perfect!
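If you’d like a second opinion, you can also sanity-check the live file from your own computer. Here’s a minimal sketch using Python’s built-in urllib.robotparser module; swap in your own domain and the paths you care about:

from urllib.robotparser import RobotFileParser

# Point the parser at your site's live robots.txt file
parser = RobotFileParser()
parser.set_url("https://yourfakewebsite.com/robots.txt")
parser.read()  # Fetches and parses the file

# Ask whether a generic bot ("*") may crawl specific paths
print(parser.can_fetch("*", "/wp-content/uploads/"))   # Expect: True
print(parser.can_fetch("*", "/wp-content/plugins/"))   # Expect: False

This mirrors how a well-behaved crawler interprets your rules, so it’s a quick way to confirm that your Allow and Disallow lines do what you intended.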

Conclusion

In order to increase your site’s exposure, you’ll need to ensure that search engine bots are crawling the most relevant information. As we have seen, a well-configured WordPress robots.txt file will enable you to dictate exactly how those bots interact with your site. That way, they’ll be able to present searchers with more relevant and useful content.

Do you have any questions about how to edit robots.txt in WordPress? Let us know in the comments section below!

WordPress robots.txt FAQ

Here are some frequently asked questions about WordPress robots.txt.

Can I Disable robots.txt in WordPress?

No, you can’t disable robots.txt in WordPress, as it’s a standard file that tells search engines which pages they may (and may not) crawl. However, you can modify the file’s contents to control how search engines interact with your site.
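The closest equivalent to disabling it is asking every compliant bot to stay out of your entire site, like this (use with caution, since it tells search engines to skip all of your content):

User-agent: *
Disallow: /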

How Do I Optimize My WordPress Site’s robots.txt?

To optimize your WordPress site’s robots.txt, you can use a plugin such as Yoast SEO to generate the file with recommended settings. You can also edit the file manually to control which pages and directories search engines may or may not crawl.

Author
The author

Will M.

Will Morris is a staff writer at WordCandy. When he's not writing about WordPress, he likes to gig his stand-up comedy routine on the local circuit.