
Official statement

Google follows up to five consecutive redirects immediately and treats them as a single request. If a URL has more than five redirects, the remaining hops are treated as separate requests.
🎥 Source video

Extracted from a Google Search Central video

⏱ 996h50 💬 EN 📅 12/03/2021 ✂ 43 statements
Watch on YouTube (172:13) →
Other statements from this video (42)
  1. 42:49 Can you really use hreflang across multiple distinct domains?
  2. 48:45 Can you really use hreflang across multiple distinct domains?
  3. 58:47 Should you really avoid duplicating your content across two distinct sites?
  4. 58:47 Should you really avoid creating multiple sites for the same content?
  5. 91:16 Should you really index your site's internal search pages?
  6. 91:16 Should you block internal search pages to avoid indexing an infinite space?
  7. 125:44 Do Core Web Vitals really influence Google's crawl budget?
  8. 125:44 Does reducing page size really improve crawl budget?
  9. 152:31 Does the internal links report in Search Console really reflect the state of your internal linking?
  10. 152:31 Why does Search Console's internal links report only show a sample?
  11. 172:13 Should you really worry about redirect chains for Google's crawl?
  12. 201:37 How does Google really segment your Core Web Vitals by page groups?
  13. 201:37 How does Google really segment your Core Web Vitals by page groups?
  14. 248:11 AMP or canonical: which one really collects the SEO signals?
  15. 257:21 Does the Chrome UX Report really count your cached AMP pages?
  16. 272:10 Should you really redirect your AMP URLs during a change?
  17. 272:10 Should you really redirect your old AMP URLs to the new ones?
  18. 294:42 Is AMP really neutral for Google ranking, or does it hide an invisible visibility lever?
  19. 296:42 Is AMP really a Google ranking factor, or just a ticket of entry to certain features?
  20. 342:21 Why does copied content sometimes outrank the original despite the DMCA?
  21. 342:21 Is the DMCA really effective at protecting your duplicated content on Google?
  22. 359:44 Why does copied content outrank your original content in Google?
  23. 409:35 Why do your featured snippets disappear for no technical reason?
  24. 409:35 Do featured snippets and rich results really fluctuate at random?
  25. 455:08 Is content hidden on responsive mobile really indexed by Google?
  26. 455:08 Is content hidden via responsive CSS really indexed by Google?
  27. 563:51 Can structured data really force the display of a knowledge panel?
  28. 563:51 Is there any structured markup that guarantees a Knowledge Panel will appear?
  29. 583:50 Why do most sites never get sitelinks in Google?
  30. 583:50 Can you really force sitelinks to appear in Google?
  31. 649:39 Do 301 redirects really transfer 100% of SEO juice without loss?
  32. 649:39 Do 301 redirects really transfer 100% of PageRank and SEO signals?
  33. 722:53 Should you really delete or redirect expired content rather than keeping it indexable?
  34. 722:53 Should you really delete expired pages, or can you leave them up with an 'expired' label?
  35. 859:32 Keywords in the URL: ranking factor or just a temporary crutch?
  36. 859:32 Do words in the URL really influence Google rankings?
  37. 908:40 Should you really add structured data to embedded YouTube videos?
  38. 909:01 Should you really add video structured data when you already embed YouTube?
  39. 932:46 Do Core Web Vitals really impact desktop SEO?
  40. 932:46 Why does Google ignore desktop Core Web Vitals in its ranking algorithm?
  41. 952:49 Do the Search Console API and interface really show the same data?
  42. 963:49 Can you use different templates per language version without penalizing your international SEO?
Official statement from 12/03/2021 (5 years ago)
TL;DR

Google follows up to five consecutive redirects and treats them as a single crawl request. Beyond this threshold, each additional redirect becomes a separate request, which eats directly into your crawl budget. In practice: keep your redirect chains to a maximum of four hops to avoid wasting crawl resources.

What you need to understand

What does it actually mean to 'treat as a single request'?

When Googlebot encounters a URL that redirects to another, it automatically follows the chain of redirects to reach the final URL. Mueller clarifies here that Google follows up to five consecutive redirects without interrupting this process. These five hops are treated as a single crawl operation, counted only once against your crawl budget.

Beyond this limit, the process fragments. Each additional redirect becomes a separate request, which ties up more server resources and disproportionately consumes your crawl budget. For a site with thousands of pages, this difference is not trivial: it can delay the indexing of priority content.
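The cost model described above can be sketched in a few lines. This is a hypothetical illustration, not Google code: the assumption (stated in the section, but not in any official documentation) is that up to five hops count as one request and every hop past the threshold is billed as one extra request.

```python
# Sketch of the crawl-request rule described above.
# Assumption: hops 1-5 are folded into one request; each hop past
# the threshold is billed as one additional request. The function
# name and grouping logic are illustrative, not from Google.

MAX_HOPS_PER_REQUEST = 5

def crawl_requests(chain_length: int) -> int:
    """Estimate how many crawl requests a redirect chain of
    `chain_length` hops costs under the rule above."""
    if chain_length <= MAX_HOPS_PER_REQUEST:
        return 1  # up to five hops: a single request
    # each hop beyond the threshold becomes its own request
    return 1 + (chain_length - MAX_HOPS_PER_REQUEST)

print(crawl_requests(4))  # 1
print(crawl_requests(7))  # 3
```

Under this model, a four-hop chain costs one request while a seven-hop chain costs three, which is why trimming chains below the threshold pays off directly in crawl budget.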

Why does Google impose this limit of five redirects?

The main reason is performance. Following infinite chains of redirects would expose Googlebot to redirect loops, configuration errors, or malicious cloaking attempts. By setting the threshold at five, Google strikes a balance between technical flexibility and protection against abuse.

This limit also matches what happens in the field. In 99% of healthy configurations you never exceed three redirects, and if you hit five it is usually a sign of an architecture problem or of successive migrations that were never cleaned up. Google leaves you some leeway but does not endorse chaos.

Does this rule apply to all types of redirects?

Mueller does not say whether the distinction between 301/302/307/308 influences this treatment. In practice, the type of redirect has no impact on the counting of hops: what matters is the number of consecutive HTTP requests. Whether you chain five 301s or a mix of 302s and 307s, Google stops at the fifth hop.

However, JavaScript and meta-refresh redirects are a different story. They require client-side rendering, hence a different processing path. Mueller is speaking here only of classic server-side HTTP redirects. If your chain mixes server and JS redirects, it is even more costly in resources, and potentially not counted the same way.

  • Google follows up to 5 consecutive HTTP redirects as a single request
  • Beyond that, each additional hop becomes a separate request, wasting crawl budget
  • This limit applies to all server redirects (301, 302, 307, 308)
  • JS and meta-refresh redirects do not follow this logic: they require rendering and are heavier
  • In a clean architecture, you should never exceed 2-3 redirects
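The hop-walking behavior this section describes can be sketched as a small chain resolver. The redirect map, function name, and the use of a non-numeric status to mark a client-side (JS/meta-refresh) hop are all hypothetical test scaffolding, not anything Google exposes; the sketch only encodes the claims above (server redirects count toward the five-hop limit, client-side hops need rendering instead).

```python
# Sketch: walking a redirect chain under the five-hop rule described
# above. `redirects` maps URL -> (status, target); a non-HTTP status
# such as "meta" marks a hypothetical client-side hop.

SERVER_CODES = {301, 302, 303, 307, 308}

def resolve(start, redirects, max_hops=5):
    """Follow server-side redirects up to max_hops. Stop early on a
    client-side hop (needs rendering) or when the hop budget runs out."""
    url, hops = start, 0
    while url in redirects and hops < max_hops:
        status, target = redirects[url]
        if status not in SERVER_CODES:
            return url, hops, "needs-rendering"
        url, hops = target, hops + 1
    state = "resolved" if url not in redirects else "hop-limit"
    return url, hops, state

chain = {"a": (301, "b"), "b": (302, "c"), "c": (301, "d")}
print(resolve("a", chain))  # ('d', 3, 'resolved')
```

Note that the status code never changes the count: a 302 hop costs exactly as much as a 301 hop, which is the point made above.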

SEO Expert opinion

Is this statement consistent with on-the-ground observations?

Yes, and it is even one of the rare times Google gives a precise number. Empirical tests show that Googlebot really does follow up to five hops before treating subsequent redirects as fragmented requests. This behavior is visible in server logs: a chain of four redirects generates a single crawl entry, while a chain of six generates several.

That said, Mueller says nothing about crawl speed or lost PageRank. Even if Google 'follows' the five redirects, each hop dilutes the link equity transmitted a little more. Tests show that beyond two redirects you start to lose juice, even if technically the final URL is indexed properly.

What nuances should be added to this rule?

First point: Mueller talks about 'consecutive' redirects. If your architecture mixes server redirects and client redirects (JS, meta-refresh), the counter does not work the same way. Google must first render the JS page, which adds latency and consumes more budget. A chain of three server redirects plus one JS redirect can cost as much as a chain of six classic redirects.

Second nuance: this limit concerns crawling, not necessarily final indexing. Google can perfectly well index the destination URL even if the chain exceeds five hops, but the process will be slower, less prioritized, and the link equity diminished. [To be verified]: no official data specifies whether a chain of seven redirects completely blocks indexing or simply delays it.

In what cases does this rule not strictly apply?

Google may raise this limit for very high-authority domains; that is a plausible hypothesis, but one not confirmed by Mueller. Sites like Wikipedia or Amazon have almost unlimited crawl budgets. If their technical architecture imposes six redirects on some pages, it is likely that Google still follows them. But for 99% of sites, don't count on it.

Another edge case: temporary redirects (302, 307) are not always treated the same way as permanent ones. If you chain five 302s, Google may decide to recrawl the chain multiple times to check whether it has changed, which doubles the actual budget consumption. Mueller does not address this point, but it is consistent with behavior observed in logs.

Attention: even if Google technically follows five redirects, that does not make it good practice. Beyond two hops you lose performance, crawl budget, and PageRank. Aim for zero redirects whenever possible, one when necessary.

Practical impact and recommendations

What should you do to clean up your redirect chains?

First, audit your site with Screaming Frog or Sitebulb with the 'Redirect Chains' report enabled. These tools automatically detect chains of two or more redirects and list them for you. Export the report, sort by chain length, and focus on those exceeding two hops.
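Triaging such an export can be scripted. A minimal sketch, assuming a CSV with columns named "Address" and "Number of Redirects"; crawl tools typically export something similar, but treat the exact column names as assumptions and adapt them to your file.

```python
# Sketch: triaging a redirect-chains CSV export by chain length.
# The sample data and column names are hypothetical.
import csv
import io

EXPORT = """Address,Number of Redirects
https://ex.com/old,4
https://ex.com/blog,1
https://ex.com/promo,3
"""

def chains_to_fix(csv_text, threshold=2):
    """Return (url, hops) pairs past the threshold, longest chains first."""
    rows = csv.DictReader(io.StringIO(csv_text))
    hits = [(r["Address"], int(r["Number of Redirects"])) for r in rows]
    return sorted((h for h in hits if h[1] > threshold),
                  key=lambda h: -h[1])

print(chains_to_fix(EXPORT))
# [('https://ex.com/old', 4), ('https://ex.com/promo', 3)]
```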

Next, fix the chains by redirecting directly to the final URL. If A → B → C → D, replace it with A → D, B → D, and C → D. This reduces user latency, preserves link equity, and frees up crawl budget. In 80% of cases these chains are the residue of poorly documented successive migrations that no one ever cleaned up afterwards.
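The flattening step above (A → B → C → D becomes A → D, B → D, C → D) is mechanical and worth automating when you have hundreds of rules. A minimal sketch; the dict-of-redirects input shape is an assumption for illustration, not a specific server config format.

```python
# Sketch: collapsing redirect chains so every source points straight
# at its final destination, as recommended above.

def flatten(redirects):
    """Rewrite a {source: target} redirect map so each source maps
    directly to its final destination."""
    flat = {}
    for src in redirects:
        dst, seen = src, set()
        while dst in redirects and dst not in seen:
            seen.add(dst)  # guard against redirect loops
            dst = redirects[dst]
        flat[src] = dst
    return flat

print(flatten({"/a": "/b", "/b": "/c", "/c": "/d"}))
# {'/a': '/d', '/b': '/d', '/c': '/d'}
```

The loop guard matters: a misconfigured pair like /a → /b → /a would otherwise spin forever, and such loops are exactly the kind of error that multiple undocumented migrations produce.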

What mistakes should be avoided when managing redirects? <\/h3>

Never redirect a redirect <\/strong> — this is the golden rule. If you migrate a site that has already been migrated, don't just add another layer of redirects on top of the existing ones. Trace the full chain, identify the original source URL, and redirect it directly to the new final destination.<\/p>

Second common mistake: forgetting external backlinks <\/strong>. You can fix your internal redirects, but if thousands of inbound links point to a URL that redirects twice, you're wasting equity. Contact the most significant referring sites to update their links — or use disavow if those links are toxic anyway.<\/p>

How can I check whether my site complies with this limit?

Inspect your server logs to spot crawl patterns. If you see Googlebot making several requests to work through the same redirect chain, it is splitting the process: a sign that you are exceeding the limit. Compare with your Search Console reports: if some pages take weeks to be indexed despite being crawled, redirect chains are often to blame.
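The log check above can be partly automated. A minimal sketch that counts Googlebot requests answered with a 3xx status in a combined-format access log; the sample lines and the user-agent match are simplified assumptions (real Googlebot verification should also involve reverse-DNS checks, which this sketch omits).

```python
# Sketch: counting Googlebot hits on redirecting (3xx) URLs in an
# access log. Sample lines are made up; real log formats vary.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" (\d{3}) .*Googlebot')

def redirect_hits(log_lines):
    """Return per-URL counts of Googlebot requests that got a 3xx."""
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and m.group(2).startswith("3"):
            hits[m.group(1)] += 1
    return hits

log = [
    '66.249.66.1 - - [10/Mar/2021] "GET /old HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Mar/2021] "GET /old HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Mar/2021] "GET /new HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]
print(redirect_hits(log))  # Counter({'/old': 2})
```

A URL that keeps accumulating 3xx hits crawl after crawl is exactly the repeated-request pattern described above.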

Also use the URL Inspection tool in Search Console to manually test your problematic URLs. Google will show you the indexed final URL and the redirect path followed. More than two hops is an immediate red flag. Prioritize fixes based on the current organic traffic of those pages.

  • Audit the site with Screaming Frog to identify all redirect chains
  • Fix chains of 3+ redirects by pointing them directly at the final URL
  • Check external backlinks and update links from important referring sites
  • Inspect server logs to detect split crawl patterns
  • Test critical URLs in Search Console to validate that they are indexed without friction
  • Document each migration to avoid stacking layers of redirects over time

In practice, aim for zero or at most one redirect on your critical URLs. Beyond two hops you start losing performance, crawl budget, and link equity, even if Google technically follows up to five. These technical optimizations can be complex to implement alone, especially on sites with a history of multiple migrations. A specialized SEO agency can quickly identify problematic chains, prioritize fixes by business impact, and automate monitoring so that new chains do not appear after each overhaul.

❓ Frequently Asked Questions

What exactly happens after the fifth redirect?
Google treats each additional redirect as a separate request, which consumes more crawl budget and slows indexing. The final URL can still be indexed, but the process is fragmented and less prioritized.
Are 301 and 302 redirects counted the same way?
Yes, Google counts all server-side HTTP redirect types (301, 302, 307, 308) the same way. What matters is the number of hops, not the status code. JS and meta-refresh redirects follow a different logic.
Does a chain of four redirects hurt rankings?
Technically, Google follows it without issue. But each hop dilutes the link equity passed along and increases user latency. Even if Google indexes the final URL, you lose SEO and UX performance compared with a direct redirect.
How can I quickly identify redirect chains on my site?
Use Screaming Frog, Sitebulb, or custom crawl scripts. Enable the 'Redirect Chains' option and export the report. Prioritize chains of three or more hops, especially on pages with strong organic traffic or many backlinks.
Do internal redirects within a CDN count toward this limit?
Yes, any server-side HTTP redirect counts, whether it comes from your origin server or from a CDN. If your CDN adds a redirect to handle www/non-www versions, it adds to the total chain.

