Block bots, boost rankings: How to master WP robots.txt

What is a WordPress robots text file?

Robots.txt is a simple text file, stored in the root directory of your web server, that tells bots which parts of a website they can and cannot visit. The standard is universal across platforms, including websites powered by WordPress.

To view the robots.txt file of any website, just add /robots.txt to the domain’s URL. For example, Google has a rather impressive robots.txt file.

The Robots Exclusion Protocol was originally developed in the early days of the web as a solution for controlling what parts of a website were off-limits to search engine crawlers and other web bots.

What is a bot anyway?

Bots (AKA web robots) are Internet software programs that visit websites. Search engine crawlers are a type of bot that visits (crawls) websites to index their content for inclusion in search engine results.

What is a robots text file used for?

Robots text files are most commonly used to restrict search engine crawler access to private or irrelevant website files, such as those in admin directories and login URLs.

Other uses include limiting the frequency with which bots are allowed to crawl your website and optimizing crawl budget so that bots focus only on the most important pages of your site.
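
For example, the non-standard Crawl-delay directive asks a crawler to wait a set number of seconds between requests; Bingbot honors it, while Google ignores it (more on that below). A minimal sketch:

User-agent: Bingbot
Crawl-delay: 10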

Where is robots.txt located in WordPress?

The robots.txt file is stored in the same location for every website, including WordPress sites: the root directory of your web server.

Example: https://sitename.com/robots.txt

Is a robots text file necessary for WordPress?

No. If a bot visits a website that has no robots text file in place, it will simply crawl and index pages as it normally would.

Does a robots text file prevent indexing by search engines?

No. A noindex meta tag (e.g., <meta name="robots" content="noindex"> in a page’s <head>) is the ONLY reliable way to keep website content out of search engine indexes. Robots.txt directives are essentially guidelines that bots can choose to ignore if they want to.

Google, for example, adheres to the majority of robots.txt directives, save for commands that manage crawl frequency, such as Crawl-delay, which tells search engine crawlers how often they are allowed to visit.

Google can also index disallowed content if it has acquired external links from other websites.

WordPress robots.txt file SEO example

Here’s an example of a simple search engine optimized robots.txt file for a WordPress-powered website.

User-agent: *
Disallow: /wp-login.php
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://sitename.com/sitemap_index.xml

What the WordPress robots text file means

It designates which bots (user-agents) the instructions apply to, what sections and pages of the website are accessible (allow/disallow), and the location of the website’s XML sitemap.

  • User-agent: * – the directives that follow apply to all web bots
  • Disallow: /wp-login.php – web bots are NOT allowed access to this URL
  • Disallow: /wp-admin/ – web bots are NOT allowed access to any URLs in this directory
  • Allow: /wp-admin/admin-ajax.php – web bots are allowed to access this URL located within a disallowed directory
  • Sitemap: https://sitename.com/sitemap_index.xml – the URL location of the website’s XML sitemap

Use robots.txt to block ChatGPT
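
OpenAI’s web crawler identifies itself with the GPTBot user agent, so a minimal robots.txt sketch for blocking it looks like this:

User-agent: GPTBot
Disallow: /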

Use robots.txt to block Google Bard (not SGE) and Vertex
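
Google documents Google-Extended as the user-agent token that controls whether your content can be used for Bard and Vertex AI; blocking it does not affect how Googlebot crawls your site for Search. A minimal sketch:

User-agent: Google-Extended
Disallow: /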

To stop Google SGE (Search Generative Experience) from using your website, you’ll have to resort to blocking Googlebot itself, which also removes your pages from regular Google Search results.
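
That trade-off looks like this in robots.txt (note that it blocks regular Search crawling too):

User-agent: Googlebot
Disallow: /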

How do I test a robots.txt file?

The best way to test your robots.txt file is with Google’s robots.txt tester, available in Google Search Console for verified websites.

Tame crawlers by keeping your WP robots.txt updated

An up-to-date robots.txt file helps search engine crawlers index the most important content on your website. Keeping this file current often requires the assistance of a web developer to identify areas of the site that should be off-limits to bots. The ongoing optimization and maintenance of a robots.txt file for WordPress is a component of a successful managed SEO strategy.

Contact us for a free robots text file evaluation.

About us

We’re an SEO consultancy that has optimized websites for hundreds of organizations located throughout the US (and beyond) since 2009.
