A couple of weeks ago, Google launched an improved Crawl Stats Report in the Search Console, helping businesses take their search engine optimization (SEO) game to a new level.
In a nutshell, the new Crawl Stats Report comes with features that allow digital marketers to “spy” on Google. For instance, you can now track changes in crawling patterns, which are often an indicator that something on a website is either improving or going wrong.
Let’s look at some of the new Crawl Stats Report’s key features and learn how they can be interpreted for maximum results.
The total number of requests grouped by response code, crawled file type, crawl purpose, and Googlebot type.
This addition enables you to see the total number of crawl requests issued for URLs on your site, whether successful or not. Why is this important? Sites that get more crawl requests from Googlebot (the generic name for Google's web crawler) are more likely to score a higher SERP ranking. If your server responds to requests more quickly, bots may be able to crawl more pages on your site.
In the new Crawl Stats Report, unsuccessful requests are now neatly organized, so you can clearly see what caused them: Domain Name System (DNS) resolution issues (suggesting network connectivity troubles), redirect loop errors (a sign that some of the site's cached information is incorrect or out of date), and so on.
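As an illustration, you can spot-check how a URL responds before Googlebot does with a short script. This is a minimal sketch using only the Python standard library; the URLs in the demo loop are placeholders, and the live check requires network access:

```python
import socket
import urllib.error
import urllib.request

def check_url(url):
    """Classify a URL roughly the way the report groups responses."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return f"OK ({resp.status})"
    except urllib.error.HTTPError as e:
        # Non-2xx responses, including redirect loops that exceed the
        # redirect limit, surface here with their status code.
        return f"HTTP error ({e.code})"
    except urllib.error.URLError as e:
        if isinstance(e.reason, socket.gaierror):
            return "DNS resolution failure"
        return f"unreachable ({e.reason})"
    except OSError as e:
        return f"unreachable ({e})"

# Placeholder URLs; the .invalid TLD is reserved and never resolves.
for url in ["https://example.com/", "https://nonexistent.invalid/"]:
    print(url, "->", check_url(url))
```

Running a check like this against a sample of the example URLs from the report can confirm whether an error Google saw is still reproducible.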
Here's how you can optimize your pages and resources for crawling:
Some positive signs to look for include:
These tables are a goldmine of information, thanks to example URLs. You simply click into any of the grouped data type entries (response, file type, purpose, Googlebot type) and see a list of example URLs of that type.
High-level & detailed information on host status issues
Host status describes whether or not your site was available for crawling over the last 90 days. Ideally, it should be accompanied by a green icon. The icon stays green even if Google encountered a significant crawl availability issue, as long as it happened more than a week ago. The Response table can shed more light on this matter, so you can decide whether you need to take any action.
If your availability status is red, Google ran into multiple crawl availability errors on your site within the last week, and you should check whether this is a recurring problem.
Over-time charts
You’ll also be able to see crawl data totals and over-time charts for total requests, total download size, and average response time.
One chart to pay attention to is average response time: how long Google took to download resources from your website during the crawl process. If your site has poor page response times, bots won’t recrawl pages as often, which negatively impacts your rankings.
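To get a rough local read on this metric, you can time a few fetches yourself. The sketch below uses only the Python standard library and loosely mirrors the report's metric by averaging full fetch times (connect plus download); the sample URL is a placeholder:

```python
import time
import urllib.request

def fetch_time_ms(url):
    """Time one full fetch (connect + download) of `url`, in milliseconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()  # include download time, not just time-to-first-byte
    return (time.perf_counter() - start) * 1000.0

def average_response_time(url, samples=3, timer=fetch_time_ms):
    """Average several fetch timings; `timer` is injectable for testing."""
    timings = [timer(url) for _ in range(samples)]
    return sum(timings) / len(timings)

# Example (requires network access; URL is a placeholder):
# print(average_response_time("https://example.com/"))
```

Comparing numbers like these before and after a performance change gives you an early signal, well before the report's own chart catches up.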
One way to solve this issue is to use robots.txt to prevent Googlebot from loading large, irrelevant resources (such as decorative images).
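For example, a robots.txt rule along these lines keeps Googlebot away from a directory of decorative images (the path is hypothetical; adjust it to your site's layout):

```
User-agent: Googlebot
Disallow: /assets/decorative-images/
```

Note that Disallow only stops those URLs from being crawled; it does not remove resources that are already indexed.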
How to interpret crawl spikes and drops
LOW CRAWL RATE
Unless you’ve just launched your site and don’t have many backlinks, low crawl activity could point to a couple of issues:
HIGH CRAWL RATE
Spikes in crawl rate are typically triggered by a significant volume of recently added fresh content, such as a large new section of your website. Another trigger is unblocking a large section of your site from crawling. Whatever the case, here are some ways to protect your website from being crawled too heavily:
For Google to consider your website for ranking, it first needs to properly crawl it. The Crawl Stats Report helps you understand how Googlebot is interacting with your website, so you can make the necessary changes for SEO success.
However, to detect all issues preventing your site from outranking competitors, specialized SEO tools might be needed.