"Your site is being crawled at a rate calculated as optimal by Google."
Hi guys, I’m working with a client whose pages crawled per day in Search Console have dropped to a tenth of the previous rate and flatlined for the past two months. Looking at the crawl rate in site settings, the usual option to select a crawl rate (or let Google decide) has been replaced with the message above.
We’ve improved server performance and capacity, removed unnecessary rate limiting, and the number of 5xx server errors is dropping. We’re also working on page speed to reduce the time Googlebot spends downloading each page. The crawl rate is slowly increasing - a few thousand more pages a day - but it very much feels like that would have happened anyway as Google works out the ‘optimal’ rate, and it’s still well short of its former glory.
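For anyone wanting to see the 5xx picture from Googlebot’s side, a quick pass over the access logs works. Here’s a minimal sketch in Python, assuming an nginx/Apache "combined" log format; the log path is a placeholder, and note that user-agent matching can be spoofed (verify via reverse DNS if it matters):

```python
# Sketch: tally daily Googlebot hits and 5xx share from a "combined"
# access log. The path and format are assumptions -- adjust for your setup.
import re
from collections import defaultdict
from datetime import datetime

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path

# combined format: ip - - [dd/Mon/yyyy:HH:MM:SS +0000] "REQ" status size "ref" "ua"
LINE_RE = re.compile(
    r'\[(?P<day>\d{2}/\w{3}/\d{4}):[^\]]+\]\s+"[^"]*"\s+'
    r'(?P<status>\d{3})\s+\S+\s+"[^"]*"\s+"(?P<ua>[^"]*)"'
)

hits = defaultdict(int)
errors = defaultdict(int)

with open(LOG_PATH) as f:
    for line in f:
        m = LINE_RE.search(line)
        # crude UA match; Googlebot UAs can be spoofed
        if not m or "Googlebot" not in m.group("ua"):
            continue
        day = m.group("day")
        hits[day] += 1
        if m.group("status").startswith("5"):
            errors[day] += 1

for day in sorted(hits, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    pct = 100.0 * errors[day] / hits[day]
    print(f"{day}: {hits[day]} Googlebot requests, {errors[day]} 5xx ({pct:.2f}%)")
```

If the 5xx share to Googlebot specifically is higher than your overall error rate, that points at something in the serving path (caching, rate limiting) treating the bot differently.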
Any suggestions on how we can get the crawl rate back up? It’s becoming problematic because we’re finding that new content isn’t getting indexed.
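If you want to track the indexing side at scale rather than spot-checking in the Search Console UI, the URL Inspection API can report index status per URL. A rough sketch with google-api-python-client, assuming a service account that has been added as a user on the property; the key file, property URL, and page list below are all placeholders:

```python
# Sketch: query Search Console's URL Inspection API for index status.
# Assumes a service-account JSON key granted access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical key file
)
service = build("searchconsole", "v1", credentials=creds)

SITE = "https://www.example.com/"  # your verified property
NEW_URLS = [
    "https://www.example.com/new-article-1",
    "https://www.example.com/new-article-2",
]

for url in NEW_URLS:
    resp = service.urlInspection().index().inspect(
        body={"inspectionUrl": url, "siteUrl": SITE}
    ).execute()
    status = resp["inspectionResult"]["indexStatusResult"]
    print(url, "->", status.get("coverageState"),
          "| last crawl:", status.get("lastCrawlTime", "never"))
```

Running this daily over a batch of recent URLs gives you a concrete indexed-vs-not count to watch while the crawl rate recovers.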
I did find a page that lets you report a problem with how Googlebot crawls your site (https://www.google.com/webmasters/tools/googlebot-report), but it seems to be intended for reducing the crawl rate rather than increasing it.