A common question is: how do you fix crawl errors in Google Search Console?

Practice preventive maintenance by making weekly spot checks for crawl errors in Google Search Console to keep your site ranking high.


How often should you check for site errors?

At a minimum, check Google Search Console every 90 days to look for site errors.

DNS Errors

DNS (Domain Name System) errors come first because they are the most fundamental: if Googlebot has DNS issues, it can’t connect with your domain at all, whether because of a DNS timeout or a DNS lookup failure.

DNS issues are critical because resolving your domain is the first step in accessing your website. Take swift action if DNS problems are preventing Google from connecting to your site in the first place.

Google recommends using its Fetch as Google tool, which lives in Search Console, to see how Googlebot crawls your page.
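If you want a quick sanity check outside of Search Console, a short script can confirm whether your domain resolves at all. This is only a rough sketch using Python's standard library; example.com is a placeholder for your own domain.

```python
# Minimal sketch: verify that the domain resolves before digging further.
# "example.com" is a placeholder for your own domain.
import socket

domain = "example.com"

try:
    ip = socket.gethostbyname(domain)
    print(f"{domain} resolves to {ip}")
except socket.gaierror as err:
    # A lookup failure here roughly mirrors the DNS errors Googlebot reports.
    print(f"DNS lookup failed for {domain}: {err}")
```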

Server Errors

A server error means your server is taking too long to respond, and the request times out. The Googlebot that’s trying to crawl your site can only wait a certain amount of time to load your website, and if it takes too long, the Googlebot stops trying.

A DNS error means Googlebot can’t look up your URL. A server error means that although Googlebot can connect to your site, it can’t load the page.

Server errors may happen if your website gets overloaded. To avoid this, make sure your hosting provider can scale up to accommodate sudden bursts of website traffic.

Google’s official direction for fixing server errors:

“Use Fetch as Google to check if Googlebot can currently crawl your site. If Fetch as Google returns the content of your homepage without problems, you can assume that Google is generally able to access your site properly.”
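Alongside Fetch as Google, you can roughly approximate what Googlebot experiences by requesting your homepage with a timeout and watching for slow or failed responses. The sketch below uses Python's standard library; the URL and the 10-second limit are placeholders, not values Google publishes.

```python
# Minimal sketch: request the homepage with a short timeout to see whether
# the server answers promptly. The URL and 10-second limit are assumptions.
import urllib.error
import urllib.request

url = "https://example.com/"

try:
    with urllib.request.urlopen(url, timeout=10) as response:
        print(f"{url} answered with HTTP {response.status}")
except (urllib.error.URLError, TimeoutError) as err:
    # A timeout or connection failure here is the kind of problem
    # Googlebot reports as a server error.
    print(f"Request to {url} failed: {err}")
```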

Robots failure

A Robots failure means Googlebot cannot retrieve your robots.txt file, located at yourdomain.com/robots.txt.

A robots.txt file is only necessary if you don’t want Google to crawl certain pages. Ensure that your robots.txt file is properly configured or use none at all.
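To double-check your configuration, you can fetch the file and test a path the same way a crawler would. This is a minimal sketch using Python's urllib.robotparser; the domain and test path are placeholders.

```python
# Minimal sketch: fetch robots.txt and check whether a given path is
# crawlable by Googlebot. The domain and path below are placeholders.
from urllib.robotparser import RobotFileParser

robots_url = "https://example.com/robots.txt"
parser = RobotFileParser(robots_url)
parser.read()  # fetches and parses the live robots.txt file

test_url = "https://example.com/blog/"
if parser.can_fetch("Googlebot", test_url):
    print(f"Googlebot is allowed to crawl {test_url}")
else:
    print(f"robots.txt blocks Googlebot from {test_url}")
```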

URL Errors

URL errors are different from site errors because they affect only specific pages on your site.

Google ranks the most important errors first and some of these errors may already be resolved.

One helpful tactic for triaging URL errors is to mark all errors as fixed and check back in a few days; the errors that reappear are the ones that genuinely need fixing.

Soft 404

A soft 404 error occurs when a page returns a 200 (found) status code when it should return a 404 (not found). If the pages listed as soft 404 errors aren’t critical pages, they aren’t urgent to fix.
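A quick way to test for soft 404 behaviour is to request a URL that definitely shouldn’t exist and confirm the server answers with a real 404 rather than a 200. The sketch below assumes a placeholder URL on your own domain.

```python
# Minimal sketch: a deliberately non-existent URL should return 404.
# If it returns 200, Google may flag the page as a soft 404.
# The URL below is a placeholder.
import urllib.error
import urllib.request

missing_url = "https://example.com/this-page-should-not-exist"

try:
    with urllib.request.urlopen(missing_url) as response:
        print(f"Got HTTP {response.status} -- a likely soft 404")
except urllib.error.HTTPError as err:
    print(f"Got HTTP {err.code} -- the expected answer for a missing page")
```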

404 Error

A 404 error means that Googlebot tried to crawl a page that doesn’t exist on your site. Googlebot finds 404 pages when other sites or pages link to that non-existent page. 404 errors are urgent if important pages on your site are showing up as 404. If the page should be live, restore it; if you don’t want that page live, 301 redirect it to the correct page.
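Once you’ve set up a 301 redirect, it’s worth confirming that the old URL really returns a 301 and points at the intended replacement. A minimal sketch, assuming placeholder old and new URLs:

```python
# Minimal sketch: check the raw redirect response for a retired URL.
# http.client does not follow redirects, so we see the 301 directly.
# Both URLs below are placeholders.
import http.client
from urllib.parse import urlparse

old_url = "https://example.com/old-page"
expected_target = "https://example.com/new-page"

parts = urlparse(old_url)
conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
conn.request("HEAD", parts.path or "/")
response = conn.getresponse()
location = response.getheader("Location", "")
conn.close()

if response.status == 301 and location == expected_target:
    print("301 redirect is in place and points at the right page")
else:
    print(f"Got HTTP {response.status}, Location: {location!r}")
```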

Access denied

Access denied means Googlebot can’t crawl the page. Unlike a 404, Googlebot is prevented from crawling the page in the first place.

As with soft 404 and 404 errors, take immediate action if the blocked pages are important for Google to crawl and index. To fix access denied errors, remove the element that’s blocking Googlebot’s access.
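You can often spot an access denied problem from the status code alone: a 401 or 403 response means crawlers (and usually visitors) are being turned away. A rough check, with a placeholder URL:

```python
# Minimal sketch: a 401 or 403 response is the kind of block that shows up
# in Search Console as "access denied". The URL below is a placeholder.
import urllib.error
import urllib.request

url = "https://example.com/members-only/"

try:
    with urllib.request.urlopen(url) as response:
        print(f"HTTP {response.status}: the page is reachable")
except urllib.error.HTTPError as err:
    if err.code in (401, 403):
        print(f"HTTP {err.code}: crawlers are being denied access")
    else:
        print(f"HTTP {err.code}: some other problem")
```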

Not followed

Not to be confused with a “nofollow” link directive, a “not followed” error means that Google couldn’t follow that particular URL.

Most often, these errors stem from Google having trouble with Flash, JavaScript, or redirects.

If you’re dealing with not followed issues on a high-priority URL, then these are important.

If your issues are stemming from old URLs that are no longer active, or from parameters that aren’t indexed, the priority level is lower.

Be sure to review how Google is currently handling your parameters and specify changes in the URL Parameters tool if you want Google to treat your parameters differently.
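When redirects are the suspected cause of not followed errors, tracing the redirect chain hop by hop can reveal loops or chains long enough for Googlebot to give up. The sketch below uses Python's http.client with a placeholder starting URL and an assumed five-hop limit.

```python
# Minimal sketch: walk a redirect chain hop by hop to spot long chains or
# loops, one common cause of "not followed" errors. The start URL is a
# placeholder; the 5-hop limit is an assumption.
import http.client
from urllib.parse import urljoin, urlparse

url = "https://example.com/start-page"
for hop in range(5):
    parts = urlparse(url)
    conn_class = (http.client.HTTPSConnection
                  if parts.scheme == "https" else http.client.HTTPConnection)
    conn = conn_class(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    response = conn.getresponse()
    conn.close()
    print(f"Hop {hop}: {url} -> HTTP {response.status}")
    if response.status not in (301, 302, 307, 308):
        break  # reached a page that doesn't redirect further
    # Location may be relative, so resolve it against the current URL.
    url = urljoin(url, response.getheader("Location", ""))
else:
    print("Still redirecting after 5 hops -- Googlebot may give up here")
```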