The other day, Google's John Mueller answered a question about whether Googlebot's crawling behavior would change if a site's regularly updated XML sitemap stopped working.
Martin MacDonald, a well-known member of the Western SEO community, asked Mueller on Twitter:
“Theoretical question: a well-established site has a sitemap that you guys hit every day, with thousands of new pages. If that sitemap stopped working, might Googlebot react by increasing crawling to find the new content? Or are those systems completely separate?”
Mueller replied that in this case the crawl rate would not change:
“No, the total crawl wouldn’t change.”
As a reminder, in January it emerged that Google may ignore sitemaps that contain invalid URLs. And earlier this month, the Sitemaps report in Search Console was updated.