The SEO Starter Guide to Google Search Console: Chapter Eight

How to Generate & Test a Robots.txt File

Sometimes you don’t want Googlebot crawling every page of your site. The robots.txt file tells web crawlers which pages they may access and which they should ignore, so you can use it to keep content such as staging sites, internal search results pages, and paid search landing pages out of search engines.
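For illustration, a minimal robots.txt along those lines might look like the sketch below; the paths and sitemap URL are placeholders, not recommendations for any particular site.

```
# Rules for all crawlers
User-agent: *
Disallow: /staging/
Disallow: /search/
Disallow: /lp/paid/

# Rules that apply only to Googlebot
User-agent: Googlebot
Disallow: /internal/

# Optional: point crawlers at your sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```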

Creating and Editing Your Robots.txt File


Step 1: Log in to Search Console and select Crawl, then click robots.txt Tester in the left navigation

Step 2: You are now looking at your robots.txt file and can make any edits within the tester

Step 3: Once you’ve completed your edits and your robots.txt looks the way you want it, click Submit

[Screenshot: Submit robots.txt file in Google Search Console]

Step 4: Download the updated file and upload it to the root of your domain, so it is served at yourdomain.com/robots.txt

Step 5: After you’ve uploaded the new version to your website, return to Google Search Console and click View uploaded to verify that the correct version is live (a quick way to spot-check this outside of Search Console is sketched after these steps)

[Screenshot: View uploaded robots.txt in Google Search Console]

Step 6: Click Submit to let Google know your robots.txt has been updated
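
As a sanity check outside of Search Console, you can also fetch the live file and compare it with your local copy. The sketch below uses only Python’s standard library; the domain and local filename are placeholders.

```python
from urllib.request import urlopen

# Placeholder domain; substitute your own root domain.
ROBOTS_URL = "https://www.example.com/robots.txt"

# Fetch the robots.txt that is actually being served.
with urlopen(ROBOTS_URL, timeout=10) as response:
    live_copy = response.read().decode("utf-8")

# Compare it with the local copy you just uploaded (placeholder filename).
with open("robots.txt", encoding="utf-8") as local_file:
    local_copy = local_file.read()

print("Live file matches local file:", live_copy.strip() == local_copy.strip())
```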

Testing a Robots.txt File in Search Console

You can use the Tester at the bottom of the page to choose from Google’s user-agents (its different robots/crawlers), enter a URL, and test whether that URL is allowed or blocked for that crawler under the current robots.txt directives.

[Screenshot: Testing robots.txt in Google Search Console]
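
If you want to run a similar check outside of Search Console, Python’s built-in urllib.robotparser can evaluate a URL against your live robots.txt for a given user-agent. In the sketch below, the domain, path, and user-agent strings are placeholders; note that robotparser follows the original robots.txt standard, so its results may not match Google’s parser in every edge case.

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL; point this at your own live robots.txt.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live file

# Ask whether each crawler may fetch a given URL under the current rules.
test_url = "https://www.example.com/staging/new-page.html"
for user_agent in ("Googlebot", "Googlebot-Image"):
    allowed = parser.can_fetch(user_agent, test_url)
    print(f"{user_agent} may crawl {test_url}: {allowed}")
```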