Why Should You Care About Google's Crawl Budget?

To keep its index up-to-date and useful, Google sends so-called web crawlers out onto the World Wide Web. These bots scan the internet for new and updated information, a process called "crawling". Some sites are crawled more often and more deeply than others, and this is where the Crawl Budget comes into play. We explain what it is and how website owners can influence it.

What Is a Crawl Budget?

The Crawl Budget specifies how many pages of a website are crawled and added to the index. Google itself determines how many pages are actually crawled, based on the domain's popularity and the "trust rank" of the page. But there are crawl limits for each website.

The crawl rate is mainly determined by the website's performance. If your pages load fast - that is, if they respond quickly to your users - the crawl rate increases. And vice versa: if your website takes a long time to load, Googlebot assumes you have a weak server and the crawl rate decreases, meaning that less content is crawled and indexed. Webmasters can also limit the crawl rate in the Google Search Console.
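To get a feel for how your pages respond, you can time a request yourself. A minimal sketch using only Python's standard library; the half-second threshold below is our own rough assumption for illustration, not a number published by Google:

```python
import time
import urllib.request

def measure_load_time(url: str, timeout: float = 10.0) -> float:
    """Fetch a URL once and return the total response time in seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()  # include the body transfer in the measurement
    return time.monotonic() - start

def is_fast_enough(seconds: float, threshold: float = 0.5) -> bool:
    """Rough heuristic (assumption, not an official limit): pages answering
    well under a second are unlikely to make Googlebot throttle its crawl."""
    return seconds < threshold
```

A single measurement is only a snapshot; in practice you would average several requests at different times of day.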

[Screenshot: Search Console crawl rate site setting]

However, a high crawl limit doesn't necessarily mean that Googlebot crawls more URLs of your site. Google still decides this itself. But how exactly can you influence the crawl rate?

The Crawl Demand Is Important Too

There are two factors that increase the crawl demand for your site:

1. Popularity - Popular URLs tend to be crawled more frequently to keep them fresh in Google's index.

2. Freshness - Google doesn't like stale websites. If you work on your pages and keep their content fresh, Googlebot will take notice and come by more often.


If you combine the crawl rate and the crawl demand, you get the Crawl Budget. If this budget decreases, fewer pages of your site are crawled and your content becomes harder to find.

If you don't want your Crawl Budget to shrink, you have to avoid the following:

  • Website / server errors (404 error pages, for example)
  • Hacked pages
  • Infinite spaces - masses of auto-generated URLs with little or no content
  • Bad content or spam
  • Confusing navigation - the navigation must remain user-friendly
  • On-site duplicate content - the same content across different URLs
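The first point on the list, website and server errors, is the easiest to catch yourself. A minimal sketch of a status checker using only Python's standard library; the example URL is a placeholder, not a real page to test:

```python
import urllib.error
import urllib.request

def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for a URL; HTTP errors (4xx/5xx)
    are caught and their code returned instead of raising."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code

def is_crawl_waste(status: int) -> bool:
    """Any client or server error (status >= 400) wastes crawl budget."""
    return status >= 400

# Hypothetical usage -- replace with URLs from your own sitemap:
# broken = [u for u in ["https://www.example.com/old-page"]
#           if is_crawl_waste(fetch_status(u))]
```

Running such a check against the URLs in your sitemap surfaces the error pages before Googlebot spends budget on them.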

Summary: Why Should You Care About This?

To keep your crawl budget steady, your site needs to be optimized continuously. Broken pages and server bottlenecks are an absolute no-go, and the loading time of each URL also plays an important role. The crawl rate is not a Google ranking factor: it is not about where you land in the search results, but whether and with how many URLs you show up at all.

Googlebot determines what, how much and how often it crawls. However, you can still influence the crawl budget by generating good, fresh content that matters to your business and your users. Pages that offer users little value, as well as error pages, can be identified and excluded from crawling.
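A common way to exclude such pages from crawling is a robots.txt file in the site root. A minimal sketch; the paths and domain below are hypothetical examples, not recommendations for any specific site:

```text
# robots.txt (hypothetical example): keep crawlers away from
# low-value URLs so the crawl budget is spent on important pages.
User-agent: *
Disallow: /search     # infinite space: internal search result pages
Disallow: /tmp/       # thin or temporary content
Disallow: /print/     # duplicate content: printer-friendly copies

Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow only stops crawling, not indexing; URLs that are linked from elsewhere can still appear in the index.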

If you want to learn more about the crawl budget and how the Googlebot works, visit the Google Webmaster Central Blog for more information.

Halide
03 Feb 2017
