Google Unable to Crawl robots.txt?

Many webmasters run into the problem of Googlebot being unable to crawl their robots.txt file. There are several possible reasons for this, such as the server blocking robot access or an improperly configured robots.txt file. To check whether Googlebot can successfully crawl your robots.txt file, use the following tools:

1- Log in to Google Webmaster Tools and check your robots.txt with the “Fetch as Google” tool located under Health.

2- Check the server’s DNS health with IntoDNS and fix any DNS issues it reports.

3- Check the server’s load time with WebPageTest (a quick local version of the DNS and load-time checks is sketched after this list).

4- Check your robots.txt for syntax errors with robotschecker (the second sketch after this list performs a similar validation).

5- Check whether the server is allowing robot access with Websniffer; the same sketch fetches robots.txt with a Googlebot User-Agent and reports the HTTP status.
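For steps 2 and 3, a rough local check is possible before reaching for the online tools. The sketch below uses only Python’s standard library and example.com as a placeholder domain; it resolves the domain and times a homepage fetch, as a quick sanity check rather than a replacement for IntoDNS or WebPageTest.

```python
import socket
import time
import urllib.request

DOMAIN = "example.com"  # placeholder; replace with your own domain

# Step 2: confirm the domain resolves at all. A failure here points
# to the kind of DNS problem IntoDNS would flag in more detail.
try:
    records = socket.getaddrinfo(DOMAIN, 80, proto=socket.IPPROTO_TCP)
    print("Resolves to:", sorted({r[4][0] for r in records}))
except socket.gaierror as err:
    print("DNS lookup failed:", err)
    raise SystemExit(1)

# Step 3: a crude load-time measurement. A server that takes several
# seconds to answer can cause Googlebot to back off, so a slow result
# here is worth investigating properly with WebPageTest.
start = time.monotonic()
with urllib.request.urlopen(f"http://{DOMAIN}/", timeout=10) as resp:
    resp.read()
print(f"Fetched homepage in {time.monotonic() - start:.2f}s")
```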
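For steps 4 and 5, the following sketch fetches robots.txt while identifying itself with Googlebot’s published User-Agent string, then parses the file with Python’s urllib.robotparser. Again, example.com is a placeholder, and this only approximates what robotschecker and Websniffer report.

```python
import urllib.error
import urllib.request
import urllib.robotparser

ROBOTS_URL = "http://example.com/robots.txt"  # placeholder URL
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

# Step 5: does the server answer a Googlebot-identified request?
# A 200 means the file is reachable; a 403 or 5xx response is the
# usual culprit when robots.txt is reported as unreachable.
req = urllib.request.Request(ROBOTS_URL, headers={"User-Agent": GOOGLEBOT_UA})
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("HTTP status:", resp.status)
        body = resp.read().decode("utf-8", errors="replace")
except urllib.error.HTTPError as err:
    print("Server refused robots.txt with HTTP", err.code)
    raise SystemExit(1)

# Step 4: parse the rules and test a sample URL, which is roughly
# what an online robots.txt checker does.
parser = urllib.robotparser.RobotFileParser()
parser.parse(body.splitlines())
print("Googlebot may fetch /:", parser.can_fetch("Googlebot", "http://example.com/"))
```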

Last but not least, have a look at this Google Webmaster video by Matt Cutts:



Also See:

Google’s Take on Sitewide Links
Google Human Quality Raters Do Not Influence A Website Ranking Directly
Google Now Cards
Google Expands Knowledge Graph
Google Disavow Links Tool
Query Highlighting on Google
How to Create Backlinks That Google Loves
Google to Take Action Against Spammy Guest Blogging
SEO Tools