Tuesday, December 24, 2013

Google Fetch As Googlebot Won't Crawl When Site Is Too Slow

A Google Webmaster Help thread has one webmaster complaining that his site isn't being crawled by Google and isn't showing up in the search results. The reason: his site can't handle Googlebot crawling it.

The site is pretty static and basic, but it's on a cheap or free host that can't handle much activity. Googlebot can't crawl it without taking the site down, so it stays away until it can get through without negatively impacting the site.

The interesting thing is that if you use the Fetch As Googlebot feature when this is the case, it will fail as well. So you can actually use Fetch as Googlebot to help diagnose a major site speed issue.
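If you want a rough sense of whether your server's response times are slow enough to be a problem, a quick timing check can help. Below is a minimal sketch, assuming Python 3 and only the standard library, that times a few requests to a URL you supply (the example.com URL is a placeholder). This is not how Google measures crawl capacity; it's just a simple way to spot consistently slow responses on your own server.

```python
import time
import urllib.request

# Placeholder URL -- replace with a page on your own site.
URL = "http://example.com/"
SAMPLES = 5

times = []
for i in range(SAMPLES):
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=30) as resp:
        status = resp.status
        resp.read()  # read the full body, as a crawler would
    elapsed = time.perf_counter() - start
    times.append(elapsed)
    print(f"request {i + 1}: {elapsed:.2f}s (HTTP {status})")

print(f"average response time: {sum(times) / len(times):.2f}s")
```

If a basic static page consistently takes several seconds to respond, that's in line with the kind of high response times John Mueller describes below.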

John Mueller from Google said:

Looking at your site, I do see that we'd like to crawl more from the server, but we're holding back because we think the server might not be able to handle the load. This is the reason why the Fetch as Google requests aren't making it through. In particular, we're seeing a fairly high response-time for URLs from the server, which often signals that the server is pretty busy even without us crawling.
 by Barry Schwartz
