Googlebot Now Notifies Webmasters of Problems with Site URL Structure

Googlebot is getting smarter by the day. It will no longer patiently crawl every search result page carrying a complex URL; instead, it will send you a warning that it has found an extremely high number of URLs on your site.

Some webmasters allow search engines to index their internal search result pages, which creates a false impression that the site is huge and inflates the number of pages indexed in search engines.
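One straightforward way to keep internal search result pages out of the crawl is to disallow them in robots.txt. The lines below are only a sketch: they assume the search results live under a /search path and use a q query parameter, which will differ from site to site.

User-agent: *
# Hypothetical path used for internal search result pages
Disallow: /search
# Hypothetical query parameter used by the site search
Disallow: /*?q=

Alternatively, a noindex robots meta tag on the search result template lets Googlebot fetch those pages but keeps them out of the index.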

Recently, Google has been sending warning messages to sites that follow such practices or whose URL structure is too complex.

Subject: Googlebot found an extremely high number of URLs on your site: www.example.com

Message: Googlebot encountered problems while crawling your site http://www.example.com/.

Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.

The notification mail also includes a list of the URLs affected by the issue. Although there is no mention of a ban or a spam penalty, Google may treat this as a serious issue in the near future, and it could well stop crawling such pages altogether, since doing so consumes a great deal of bandwidth and time.
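Where many distinct URLs point to identical or similar content, for instance sorted or filtered variants of the same listing, a common remedy is a canonical link element naming the preferred URL. A minimal sketch, assuming a hypothetical listing page whose sort parameter spawns duplicate URLs:

<!-- Placed in the <head> of every variant, e.g. /widgets?sort=price -->
<link rel="canonical" href="http://www.example.com/widgets" />

This tells Google which version of the page to treat as the primary one, so the duplicate variants are consolidated under a single indexed URL.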