Google releases an updated robots.txt testing tool

Google has released an updated robots.txt testing tool in Webmaster Tools. The tool can be found in the Crawl section.

new robots.txt checker tool

Google’s Asaph Amon described the tool: “To guide your way through complicated directives, it will highlight the specific one that led to the final decision. You can make changes in the file and test those too, you’ll just need to upload the new version of the file to your server afterwards to make the changes take effect. Our developers site has more about robots.txt directives and how the files are processed.”
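You can reproduce this kind of allow/disallow check locally with Python's standard-library `urllib.robotparser`. The rules below are a made-up example, not from any real site; note that Python's parser applies rules in file order, while Google uses the most specific (longest) matching rule, so this example is written so both interpretations agree.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
rules = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The Allow rule is the directive that decides this URL:
print(parser.can_fetch("Googlebot", "http://example.com/private/public-page.html"))  # True

# Everything else under /private/ falls to the Disallow rule:
print(parser.can_fetch("Googlebot", "http://example.com/private/secret.html"))  # False
```

Google's tool does the same job interactively, and additionally highlights which directive produced the final decision.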

test for robot blocks

“Additionally, you’ll be able to review older versions of your robots.txt file, and see when access issues block us from crawling,” Amon explains. “For example, if Googlebot sees a 500 server error for the robots.txt file, we’ll generally pause further crawling of the website.”
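Amon's last point is about the HTTP status of the robots.txt fetch itself, not the file's contents. A minimal sketch of that decision logic, with a hypothetical helper name and illustrative return values (the 500-range "pause crawling" case is the behavior quoted above; treating a missing file as unrestricted is Google's documented handling; the catch-all is a simplification):

```python
def robots_fetch_policy(status_code: int) -> str:
    # Hypothetical helper: the name and return values are illustrative,
    # not a Google API.
    if 200 <= status_code < 300:
        return "parse"   # robots.txt retrieved: obey its directives
    if status_code in (404, 410):
        return "crawl"   # no robots.txt: treat the site as unrestricted
    if 500 <= status_code < 600:
        return "pause"   # server error: generally pause further crawling
    return "pause"       # simplification: be conservative about anything else

print(robots_fetch_policy(500))  # pause
```

The tool's history view makes this visible: if your server returned a 500 for robots.txt on a given day, that explains a crawling gap on that day.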