
Official statement

During a server migration, Googlebot automatically adjusts the crawl frequency to avoid overwhelming the new server. This process occurs automatically: Googlebot initially reduces its crawl speed and then gradually returns to an appropriate pace.
🎥 Source video

Extracted from a Google Search Central video

⏱ 2:06 💬 EN 📅 24/09/2019 ✂ 2 statements
Watch on YouTube (1:04) →
Other statements from this video (1)
  1. 0:34 Can a server migration really be invisible to Google?
TL;DR

Google states that Googlebot automatically adjusts its crawl frequency during a server migration to avoid overload: an initial slowdown, followed by a gradual return to normal. For SEO, this means there's no need to panic if crawling decreases post-migration; it's temporary. However, this 'self-regulation' doesn't exempt you from monitoring your logs and Search Console for technical issues that could slow indexing over the long term.

What you need to understand

Why would Googlebot intentionally slow down its crawl?

Google wants to avoid overloading a new server that may not support the same volume of requests as the old one. During a migration, Googlebot detects a change in infrastructure — new domain name, new IP, new response times — and adopts a cautious approach.

Specifically, the bot temporarily reduces the number of requests per second it sends to the server. This observation phase allows it to measure the responsiveness of the new environment without risking triggering 503 errors or slowing down the user experience.

How does this gradual load increase happen?

Once Googlebot confirms that the server is responding correctly (stable response times, no server errors), it gradually increases crawling. This process can take anywhere from a few days to a few weeks, depending on the size of the site and the capacity of the new server.

However, Google does not provide any precise figures. How long does this phase actually last? By what percentage does the crawl initially drop? [To be verified]: no public data allows this period to be calibrated accurately.
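Since no public data exists, any concrete model is guesswork. Purely to make the shape concrete, here is a toy Python simulation of "cut hard, ramp back gently, back off on trouble"; every constant in it is an invented assumption, not Google data:

```python
# Toy model of "slow down, then ramp back up" crawling. This is NOT
# Googlebot's real algorithm; all constants are invented for illustration.

def next_crawl_rate(rate: float, healthy: bool, baseline: float = 10.0) -> float:
    """Hypothetical next requests-per-second target."""
    if not healthy:                    # 5xx errors or slow responses observed
        return max(rate * 0.5, 0.5)    # back off multiplicatively
    return min(rate + 0.5, baseline)   # otherwise ramp up additively

rate = 10.0 * 0.3  # assumed post-migration start: 30% of the old baseline
for day, healthy in enumerate([True, True, False, True, True, True], start=1):
    rate = next_crawl_rate(rate, healthy)
    print(f"day {day}: ~{rate:.1f} req/s")
```

The takeaway is the curve's shape, not its values: recovery is gradual, and every hiccup on the new server pushes full recovery further out.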

Should we intervene manually or let it be?

According to this statement, the process is supposed to be automatic. There's no need to request an accelerated recrawl via Search Console; Googlebot adjusts its pace based on the performance it observes.

But in reality, SEOs find that monitoring server logs and manually adjusting certain parameters (robots.txt rules, an updated XML sitemap) can speed up the return to normal. Note that the manual crawl-rate setting in the old Search Console no longer exists (see the FAQ below). Letting things happen 'automatically' without monitoring can delay full indexing by several weeks.

  • Googlebot first reduces its crawl rate to avoid overwhelming the new server
  • Then it gradually returns to a normal pace based on the performance it observes
  • This process is supposed to be automatic, but no precise timeline is communicated
  • Monitoring logs and Search Console remains essential to detect any blockages
  • A high-performing server naturally accelerates the crawl ramp-up

SEO Expert opinion

Is this statement consistent with real-world observations?

Yes and no. In theory, the idea that Googlebot adapts its crawl to the server's capabilities has been borne out in practice for years. During a migration, a temporary decrease in crawl is indeed observed.

However, the devil is in the ambiguity of 'automatically'. In practice, some sites see their crawl return to normal in 48 hours, while others wait several weeks. Why this variability? Google does not specify. Site size, content quality, domain authority, user signals — all these factors probably play a role, but [To be verified] no specific variable is confirmed.

What factors actually influence the speed of recovery?

The performance of the new server, of course: low TTFB response times, absence of 5xx errors. But also the quality of the crawl budget before migration. A site that was poorly crawled (many unnecessary indexed pages, infinite pagination, duplicate content) will see its recovery slowed.

Experienced SEOs know that preparing the migration — cleaning the crawl budget, optimizing the XML sitemap, properly redirecting old URLs — significantly accelerates the return to normal. Relying solely on Google’s 'automatic' adjustment is a risky gamble.
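One of those preparation steps can even be scripted. As a hedged sketch (the property URL, sitemap path, and key file below are placeholders, and you need google-api-python-client plus credentials with Search Console access), the Search Console API lets you resubmit the updated sitemap the moment the migration goes live:

```python
# Sketch: resubmit an updated XML sitemap via the Search Console API
# right after the DNS switch. All URLs and the key file are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical key for a verified property
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
service = build("searchconsole", "v1", credentials=creds)

service.sitemaps().submit(
    siteUrl="https://www.example.com/",              # verified property
    feedpath="https://www.example.com/sitemap.xml",  # freshly updated sitemap
).execute()
print("Sitemap resubmitted.")
```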

Warning: if your crawl has not recovered 2-3 weeks post-migration, do not just wait. Check your server logs and Search Console, and look for technical blockages (robots.txt rules, canonical tags, an accidental noindex). Google's automation does not correct your configuration errors.

In which cases does this self-regulation fail?

When the new server is underpowered for the actual traffic of the site. Googlebot detects slowdowns, continues to crawl slowly, and you remain stuck in a reduced crawl for weeks.

Another trap: migrations with a domain change. The crawl doesn't decrease only because of Google's caution, but also because the new domain has not yet inherited all of the old one's authority. Again, passively waiting for Google to 'adjust automatically' can cost you dearly in visibility.

Practical impact and recommendations

What should you concretely do before and during migration?

First, size your new server properly. Don't migrate to an infrastructure less powerful than the old one and hope that Googlebot will be lenient. Test TTFB response times and verify that the server can absorb crawl spikes.
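A quick pre-migration spot check is easy to script. Here is a minimal sketch with the requests library, assuming a hypothetical staging host and an arbitrary 600 ms threshold (tune both to your own site); response.elapsed measures time to the response headers, a reasonable TTFB approximation:

```python
# Spot-check TTFB on the new infrastructure before switching DNS.
# The staging URLs and the 600 ms threshold are placeholder assumptions.
import requests

URLS = [
    "https://staging.example.com/",
    "https://staging.example.com/category/",
]

for url in URLS:
    r = requests.get(url, stream=True, timeout=10)   # stream=True: headers only
    ttfb_ms = r.elapsed.total_seconds() * 1000       # time to response headers
    verdict = "OK  " if r.status_code < 500 and ttfb_ms < 600 else "CHECK"
    print(f"{verdict} {r.status_code} {ttfb_ms:6.0f} ms  {url}")
    r.close()
```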

Next, clean up your crawl budget before migrating. Block low-value pages (filters, internal search, URL parameters), fix redirect loops, and remove orphaned pages. A clean site gets recrawled faster.
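The blocking rules themselves are worth unit-testing before the switch. Here is a minimal sketch using Python's standard library; the rules and URLs are hypothetical, and note that urllib.robotparser ignores Google's * and $ wildcard extensions, so keep the rules you test this way to plain path prefixes:

```python
# Pre-flight check: confirm must-crawl URLs stay open and low-value URLs
# are blocked. Rules and URLs are hypothetical examples.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse("""\
User-agent: Googlebot
Disallow: /search
Disallow: /cart
""".splitlines())

CHECKS = [
    ("https://www.example.com/products/red-shoes", True),   # must stay open
    ("https://www.example.com/search?q=shoes", False),      # internal search
    ("https://www.example.com/cart", False),                # no SEO value
]
for url, expected in CHECKS:
    allowed = rp.can_fetch("Googlebot", url)
    print("OK " if allowed == expected else "FIX", url, "allowed:", allowed)
```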

How to monitor the return to normal crawling?

Your best weapon is your server logs. Analyze daily the volume of Googlebot requests, the HTTP codes returned, and the pages crawled. If strategic sections have not been recrawled after 7-10 days, take action.
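As a minimal sketch of that daily check, here is a Python pass over a combined-format access log; the log path is a placeholder, and keep in mind that the user-agent string alone can be spoofed (for strict verification, Google documents a reverse-DNS check against googlebot.com):

```python
# Count Googlebot hits per HTTP status in a combined-format access log.
# LOG_PATH is a placeholder; user-agent matching can be spoofed, so use
# reverse-DNS verification if you need certainty it is really Googlebot.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"   # adjust to your setup
STATUS_RE = re.compile(r'" (\d{3}) ')    # status code right after the request

statuses = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = STATUS_RE.search(line)
        if match:
            statuses[match.group(1)] += 1

print(f"Googlebot requests: {sum(statuses.values())}")
for code, count in statuses.most_common():
    print(f"  {code}: {count}")
```

A rising share of 5xx codes, or a total that refuses to climb week over week, is exactly the signal to investigate.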

In Search Console, check the Crawl Stats report. You should see a U-shaped curve: an initial drop, then a gradual climb back. If the curve stagnates at the bottom, that's an alarm signal: a technical blockage or an underpowered server.

What mistakes should you absolutely avoid?

Do not block Googlebot via robots.txt during migration to 'relieve the server'. You would disrupt the gradual crawl recovery and delay indexing by several weeks.

Also avoid leaving temporary 302 redirects in place after the migration. Googlebot will interpret them as a temporary change and will not fully index the new URLs. Switch all redirects to permanent 301s as soon as the migration has stabilized.
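That final redirect sweep is also easy to automate. Here is a minimal sketch with requests, assuming a hypothetical old-to-new URL map; it flags anything that is not a single permanent 301 hop to the expected destination:

```python
# Verify that old URLs answer with exactly one permanent 301 hop to the
# expected new URL. The URL pairs below are hypothetical placeholders.
import requests

REDIRECTS = {
    "https://old.example.com/about": "https://www.example.com/about",
    "https://old.example.com/blog/": "https://www.example.com/blog/",
}

for old, expected in REDIRECTS.items():
    r = requests.get(old, allow_redirects=True, timeout=10)
    hops = [resp.status_code for resp in r.history]   # one 301 is ideal
    ok = hops == [301] and r.url == expected
    print("OK " if ok else "FIX", old, "->", r.url, hops)
```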

  • Size the new server to support at least the same crawl volume as the old one
  • Clean up the crawl budget before migrating: block low-value pages, fix redirects
  • Monitor server logs daily during the first 2-3 weeks
  • Check the Crawl Stats report in Search Console
  • Never block Googlebot via robots.txt to 'save' server resources
  • Switch all redirects to permanent 301s as soon as the migration is validated
A well-prepared server migration — properly sized server, cleaned crawl budget, 301 redirects in place — significantly limits the drop in crawl. Googlebot automatically adjusts its pace, but this adjustment will never replace active monitoring of your logs and the Search Console. If your site is complex (thousands of pages, high organic traffic), these technical optimizations can quickly become time-consuming. In this case, hiring a specialized SEO agency for migrations may save you costly visibility losses and accelerate the return to normal.

❓ Frequently Asked Questions

How long does the crawl drop last after a server migration?
Google gives no precise figure. In practice, a gradual recovery over 2 to 4 weeks is observed for most sites, provided the new server is fast and properly configured.
Should I request a manual recrawl via Search Console after the migration?
It isn't necessary according to Google; the adjustment is automatic. But submitting an updated XML sitemap and requesting indexing of strategic pages can speed up discovery of the new server.
What should I do if the crawl has not recovered after 3 weeks?
Check your server logs and Search Console. Look for 5xx errors, slow response times, robots.txt blocks, or accidental canonical/noindex tags that prevent indexing.
Does a more powerful server speed up crawl recovery?
Yes. Googlebot adapts its crawl rate to the performance it observes. A fast server (low TTFB, no errors) will naturally receive more requests per second than a slow one.
Can you force Googlebot to increase its crawl via Search Console?
The 'Crawl rate' setting in the old Search Console has been removed. You can no longer set a maximum crawl rate manually; Google decides on its own based on your server's capacity.
🏷 Related Topics
Crawl & Indexing · Web Performance · Redirects

