3 ways to fix crawl errors using Google Search Console
Posted on: February 14, 2017
Over the last five years, a lot has changed as Google keeps refining the tools that anchor your online marketing efforts. Since Webmaster Tools was rebranded as Google Search Console, it has become one of the most important tools that every website owner, brand manager, and marketer needs to know in order to diagnose and fix website issues. Among the issues that can cripple your marketing efforts completely are website crawl errors. They make it difficult for Google's bots to crawl your pages; as a result, people cannot find you and the pages cannot be ranked.
Note that Google Search Console splits crawl errors into two groups: site errors and URL errors. Site errors can be devastating because they have the potential to compromise the usability of the entire website. URL errors, on the other hand, are page specific, which means that if one page is affected, the others can still run normally.
This post outlines how to fix the 3 main types of website crawl errors:
DNS Errors
These are Domain Name System errors, and they are the most common type. A DNS error means that Googlebot cannot connect to your domain, usually showing up as a DNS timeout or a DNS lookup issue. If Google Search Console indicates that Googlebot can still connect to your site, you do not have to worry. However, if the DNS errors are persistent or severe, it is important to act immediately.
To address the problem, use the Fetch as Google tool in Search Console. It shows how Google sees your website compared with how a user sees it. You can also use tools such as ISUP.me or Web-Sniffer.net, which show the current HTTP(S) response codes and headers. If the problem is more complex and cannot be fixed at this level, check with your DNS provider.
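If you want a quick check of your own alongside those tools, the short Python sketch below does a DNS lookup and a basic HTTP request against your domain. It is only a rough illustration: "example.com" is a placeholder, and a failed lookup here simply suggests that Googlebot is likely hitting the same wall.

```python
# A rough DNS and connectivity check. Replace "example.com" with your domain.
import socket
import urllib.request

DOMAIN = "example.com"  # placeholder for your site's hostname

# 1. DNS lookup: a socket.gaierror here means the name does not resolve,
#    which is the same problem Googlebot reports as a DNS error.
try:
    addresses = socket.getaddrinfo(DOMAIN, 443)
    print(f"{DOMAIN} resolves to:", sorted({a[4][0] for a in addresses}))
except socket.gaierror as exc:
    print(f"DNS lookup failed for {DOMAIN}: {exc}")

# 2. Basic HTTP(S) check: confirm the server actually answers.
try:
    with urllib.request.urlopen(f"https://{DOMAIN}", timeout=10) as resp:
        print("HTTP status:", resp.status)
except Exception as exc:
    print(f"Request to https://{DOMAIN} failed: {exc}")
```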
Server errors
The most common server error occurs when the server takes too long to respond and the request times out. Googlebot will only wait a short period before giving up, and if the problem happens repeatedly, it will eventually stop trying. While a DNS error means that bots cannot locate your site, a server error means that the pages can be found but the bots are having a hard time crawling them. The problem often happens because a page has more to process than the server can handle.
To address the issue, make sure your hosting provider can accommodate a sudden increase in traffic. For example, could your host keep the website up if a page went viral? You should also use Fetch as Google to establish whether Googlebot can reach the page at all.
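As a rough way to spot slow responses before Googlebot does, the sketch below times a few page requests and flags anything that fails or drags on. The URLs and the 10-second limit are placeholders chosen for illustration, not values Google publishes.

```python
# A rough response-time check for a handful of pages. The URLs and the
# timeout are placeholders; pick pages that matter on your own site.
import time
import urllib.request

URLS = [
    "https://example.com/",
    "https://example.com/blog/",
]
TIMEOUT = 10  # seconds; anything approaching this is worth investigating

for url in URLS:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT) as resp:
            elapsed = time.monotonic() - start
            print(f"{url} -> {resp.status} in {elapsed:.2f}s")
    except Exception as exc:
        elapsed = time.monotonic() - start
        print(f"{url} -> failed after {elapsed:.2f}s: {exc}")
```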
Robots failure
This error indicates that Googlebot is unable to retrieve your website's robots.txt file. Remember that this file tells crawlers which parts of your site you do not want crawled and indexed. If the file simply does not exist, the server returns a 404, Google assumes there are no restrictions, and it proceeds to crawl the rest of the site. If your site is small and changes rarely, this error should not worry you. However, sites that publish new content every day, such as blogs, need to fix it immediately.
To address the issue, go back to the robots.txt file and configure it properly. In particular, double or triple check the pages you instruct Googlebot not to crawl, and make absolutely sure that a bare 'Disallow: /' rule DOES NOT appear anywhere, because it blocks your entire site from Google's crawlers and can make it unavailable in Google searches.
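To double check the result, the sketch below fetches robots.txt, reports its HTTP status, and then uses Python's built-in robots.txt parser to confirm that important pages are still open to Googlebot. The site address and page list are placeholders for your own URLs.

```python
# A rough robots.txt check. SITE and PAGES are placeholders for your own URLs.
import urllib.error
import urllib.request
import urllib.robotparser

SITE = "https://example.com"
PAGES = ["/", "/blog/"]

robots_url = f"{SITE}/robots.txt"

# Report the HTTP status of robots.txt. A 404 is harmless (Google assumes
# there are no restrictions); a server error or timeout is what triggers
# the robots failure.
try:
    with urllib.request.urlopen(robots_url, timeout=10) as resp:
        print("robots.txt status:", resp.status)
except urllib.error.HTTPError as exc:
    print("robots.txt status:", exc.code)

# Parse the file and check each important page against the Googlebot agent.
parser = urllib.robotparser.RobotFileParser()
parser.set_url(robots_url)
parser.read()

for path in PAGES:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(f"{path}: {'allowed' if allowed else 'BLOCKED'}")
```

If any page prints BLOCKED, that is usually a sign of an overly broad Disallow rule like the 'Disallow: /' mentioned above.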