
Official statement

Google immediately follows a small number of redirects, about five consecutive ones, and treats them as a single request. Beyond five steps, they are treated as separate requests.
172:13
🎥 Source video

Extracted from a Google Search Central video

⏱ 996:50 💬 EN 📅 12/03/2021 ✂ 43 statements
Watch on YouTube (172:13) →
Other statements from this video (42)
  1. 42:49 Can you really use hreflang across several distinct domains?
  2. 48:45 Can you really use hreflang across several distinct domains?
  3. 58:47 Should you really avoid duplicating your content across two distinct sites?
  4. 58:47 Should you really avoid creating multiple sites for the same content?
  5. 91:16 Should you really index your site's internal search pages?
  6. 91:16 Should you block internal search pages to prevent indexing of an infinite space?
  7. 125:44 Do Core Web Vitals really influence Google's crawl budget?
  8. 125:44 Does reducing page size really improve crawl budget?
  9. 152:31 Does the internal links report in Search Console really reflect the state of your internal linking?
  10. 152:31 Why does Search Console's internal links report only show a sample?
  11. 172:13 How many redirects does Google really follow before splitting the crawl?
  12. 201:37 How does Google really segment your Core Web Vitals by page groups?
  13. 201:37 How does Google really segment your Core Web Vitals by page groups?
  14. 248:11 AMP or canonical: which one really collects the SEO signals?
  15. 257:21 Does the Chrome UX Report really count your cached AMP pages?
  16. 272:10 Should you really redirect your AMP URLs during a change?
  17. 272:10 Should you really redirect your old AMP URLs to the new ones?
  18. 294:42 Is AMP really neutral for Google ranking, or does it hide an invisible visibility lever?
  19. 296:42 Is AMP really a Google ranking factor, or just an entry ticket to certain features?
  20. 342:21 Why does copied content sometimes outrank the original despite the DMCA?
  21. 342:21 Is the DMCA really effective at protecting your duplicated content on Google?
  22. 359:44 Why does copied content outrank your original content in Google?
  23. 409:35 Why do your featured snippets disappear with no technical reason?
  24. 409:35 Do featured snippets and rich results really fluctuate at random?
  25. 455:08 Is content hidden on responsive mobile really indexed by Google?
  26. 455:08 Is content hidden via responsive CSS really indexed by Google?
  27. 563:51 Can structured data really force the display of a knowledge panel?
  28. 563:51 Is there any structured markup that guarantees the appearance of a Knowledge Panel?
  29. 583:50 Why do most sites never get sitelinks in Google?
  30. 583:50 Can you really force the display of sitelinks in Google?
  31. 649:39 Do 301 redirects really transfer 100% of SEO juice without loss?
  32. 649:39 Do 301 redirects really transfer 100% of PageRank and SEO signals?
  33. 722:53 Should you really delete or redirect expired content rather than keep it indexable?
  34. 722:53 Should you really delete expired pages, or can you leave them with an 'expired' label?
  35. 859:32 Keywords in the URL: ranking factor or mere temporary crutch?
  36. 859:32 Do words in the URL really influence Google rankings?
  37. 908:40 Should you really add structured data to embedded YouTube videos?
  38. 909:01 Should you really add video structured data when you already embed YouTube?
  39. 932:46 Do Core Web Vitals really impact desktop SEO?
  40. 932:46 Why does Google ignore desktop Core Web Vitals in its ranking algorithm?
  41. 952:49 Do the Search Console API and interface really display the same data?
  42. 963:49 Can you use different templates per language version without penalizing your international SEO?
📅 Official statement from 12/03/2021 (5 years ago)
TL;DR

Google automatically follows up to five consecutive redirects and treats them as a single request, without immediate technical penalty. Beyond this threshold, each step becomes a separate request, which fragments the crawl budget and dilutes link equity. For an SEO practitioner, this means that a chain of three redirects remains manageable, but beyond four steps, the risk of inefficiency significantly increases.

What you need to understand

What constitutes a single request in the context of Google's crawl?

When Googlebot visits a URL and encounters a redirect, it must follow that instruction to reach the final destination. Traditionally, each hop consumes crawl budget, the limited resource Google allocates to your site. Mueller's statement clarifies that, up to five consecutive redirects, Google optimizes this process by grouping the steps into a single logical operation.

Concretely, if URL A points to B, which points to C, then D, then E, Google treats this journey as a single request. Instead of five distinct HTTP requests each consuming crawl budget, the whole chain counts as just one. That is a notable efficiency gain for sites with complex structures or multiple migrations behind them.

Why does Google impose a limit of five redirects?

The limit is not arbitrary: it protects against infinite redirect loops and faulty technical setups. If Google followed redirects without limit, a poorly configured site could trap the crawler in an endless chain, wasting server resources on both sides.

Beyond five steps, each additional redirect becomes a separate request. This fragments the crawl budget and slows down indexing. More seriously, each hop dilutes the link equity (the famous PageRank) that flows from the source URL to the final destination. A chain of seven redirects therefore mechanically loses more juice than a chain of three.

Does this limit apply to all types of redirects?

Mueller does not specify whether the rule concerns only 301s and 302s, or also JavaScript and meta-refresh redirects. Field experience shows that Google treats server-side redirects (301/302), which are immediate and effective, differently from client-side redirects, which require JavaScript execution and are therefore slower.

The less common 307 and 308 redirects likely follow the same logic, but no official data confirms this. In practice, it is assumed that the limit of five applies to classic HTTP redirects, those Googlebot can resolve without rendering JavaScript.

  • Up to 5 consecutive redirects: treated as a single request, minimal impact on crawl budget
  • Beyond 5 steps: each redirect becomes a distinct request, consuming more resources
  • PageRank dilution: each additional hop reduces the equity passed to the final destination
  • Types of redirects: the rule primarily applies to 301/302; uncertain status for JavaScript and meta-refresh
  • Protection against loops: the limit prevents faulty configurations from trapping Googlebot indefinitely
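The request accounting summarized above can be sketched as a simplified model. This is an assumption about how the grouping works based on the statement, not Google's actual implementation:

```python
def crawl_requests(hops: int, grouped: int = 5) -> int:
    """Requests a crawler charges for a chain of `hops` redirects,
    under the model described in the statement: one logical request
    covers the first `grouped` hops, and each hop beyond that counts
    as a separate request. hops=0 (no redirect) is still one request."""
    return 1 + max(0, hops - grouped)

for hops in (0, 3, 5, 7):
    print(f"{hops} redirect(s) -> {crawl_requests(hops)} request(s)")
```

Under this model, a chain of three or five redirects still costs one request, while a chain of seven costs three.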

SEO Expert opinion

Is this statement consistent with field observations?

Yes, broadly. Technical audits show that short redirect chains (2-3 steps) do not visibly penalize indexing, while chains of six redirects or more often correlate with pages indexed late or weakened quality signals. This aligns with the mechanics Mueller describes.

On the other hand, the exact limit of five remains fuzzy in empirical tests. Some sites with four redirects perform perfectly, while others with three already show slowdowns, likely due to other factors (server speed, content quality, crawl frequency). The rule of five is a technical threshold, not a universal guarantee.

What nuances need to be added to this claim?

Mueller does not mention the cumulative impact of redirects on PageRank transmission. Historically, each 301 redirect is thought to pass about 85 to 90% of equity, a figure never officially confirmed but widely observed. Three consecutive redirects could therefore theoretically transmit only 61 to 73% of the initial juice (0.85³ ≈ 0.61; 0.9³ ≈ 0.73). This remains to be verified, since Google has never published a precise figure.
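The compounding loss is easy to check numerically. Remember that the per-hop retention rates below are unofficial field estimates, not Google figures:

```python
def surviving_equity(hops: int, retention: float) -> float:
    """Fraction of the original link equity left after `hops` consecutive
    redirects, assuming each hop passes a constant `retention` share
    (an unofficial, field-observed estimate, not a Google figure)."""
    return retention ** hops

for retention in (0.85, 0.90):
    for hops in (1, 3, 7):
        print(f"retention {retention:.0%}, {hops} hop(s): "
              f"{surviving_equity(hops, retention):.0%} of equity remains")
```

At 85% retention per hop, three hops leave about 61% of the equity and seven hops barely 32%, which is why long chains are worth flattening.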

Another point: Mueller talks about "consecutive" redirects, but what happens with parallel redirects or branching chains? The statement remains vague on complex architectures, particularly multilingual sites with cascading geolocated redirects. In these cases, the limit of five may be reached without the SEO practitioner realizing it.

When does this rule become critical for your SEO strategy?

Sites that have undergone multiple migrations or redesigns often accumulate orphaned redirect chains: a URL from 2018 redirects to one from 2020, which redirects to one from 2022, then to the current version. Result: four steps, close to the limit. Add an HTTPS redirect or a switch to www on top and you cross the critical threshold.

Warning: audit tools (Screaming Frog, Oncrawl) rarely detect chains beyond three steps. A manual audit or a custom script is needed to map all redirect journeys. Do not rely solely on automated reports.

Another sensitive case: e-commerce sites with UTM parameters or product URL variants. A URL with tracking may redirect to a canonical version, which itself redirects to a consolidated category, then to a brand page: three redirects already. Multiply that by 10,000 products and your crawl budget explodes.

Practical impact and recommendations

How to audit redirect chains on your site?

Start with a complete crawl using Screaming Frog in "crawl all redirects" mode, then export the redirect chains report. Sort the results by number of steps: anything over three deserves investigation. Then verify manually with cURL or a Python script to confirm the actual number of hops, as some tools stop prematurely.
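The manual check can be scripted. Below is a minimal sketch of a chain tracer; the function names and URLs are illustrative, and `fetch` is injected so the same logic can run against live HTTP responses or, as here, a stubbed redirect map:

```python
from typing import Callable, Optional

def trace_redirects(url: str,
                    fetch: Callable[[str], Optional[str]],
                    max_hops: int = 10) -> list[str]:
    """Follow a redirect chain and return every URL visited, in order.

    `fetch(url)` must return the Location target of a redirect response,
    or None when the URL resolves without redirecting. Stops after
    `max_hops` hops to avoid infinite loops, mirroring crawler behavior.
    """
    chain = [url]
    while len(chain) <= max_hops:
        next_url = fetch(chain[-1])
        if next_url is None:  # final destination reached
            return chain
        chain.append(next_url)
    raise RuntimeError(f"More than {max_hops} hops starting from {url!r}")

# Stubbed example: a three-hop chain left over from successive migrations.
legacy = {
    "http://example.com/old-2018": "http://example.com/old-2020",
    "http://example.com/old-2020": "https://example.com/old-2020",
    "https://example.com/old-2020": "https://example.com/current",
}
chain = trace_redirects("http://example.com/old-2018", legacy.get)
print(len(chain) - 1, "redirects:", " -> ".join(chain))
```

In production you would replace `legacy.get` with a small wrapper that issues a non-following HTTP request and reads the Location header.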

If you manage a large site (over 50,000 pages), use server logs to identify URLs that Googlebot visits frequently and that consistently return redirects. Oncrawl or Botify let you cross-reference crawl data with redirect chains, pinpointing the critical paths that consume the most budget.

What priority corrective actions should be implemented?

First priority: fix internal redirects. If your internal linking points to URLs that redirect, you force Google to follow chains unnecessarily. Update your internal links to point directly to the final destination; this saves crawl budget and preserves link equity.

Next, eliminate redirect chains inherited from old migrations. Consolidate multiple 301s into a single direct redirect from the source URL to the current destination. This sometimes means touching the .htaccess file or Nginx rules, but the gain is immediate: reduced server response time and an improved crawl rate.
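Consolidation lends itself to automation. Here is a minimal sketch, assuming your redirect rules can be exported as a source → target mapping (the function name and data shape are illustrative, not tied to any particular server or tool):

```python
def flatten_redirects(rules: dict[str, str]) -> dict[str, str]:
    """Rewrite each redirect to point directly at its final destination.

    `rules` maps source URL -> immediate target. Chains are collapsed so
    every source jumps straight to the end of its chain. Raises on
    redirect loops, which would otherwise trap a crawler.
    """
    flat = {}
    for source in rules:
        target, seen = source, set()
        while target in rules:
            if target in seen:
                raise ValueError(f"Redirect loop involving {target!r}")
            seen.add(target)
            target = rules[target]
        flat[source] = target
    return flat

# A chain inherited from two migrations: /a -> /b -> /c
print(flatten_redirects({"/a": "/b", "/b": "/c"}))
# Both /a and /b now redirect directly to /c, in a single hop each.
```

The flattened mapping can then be translated back into your server's rewrite rules.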

Should you always reduce all chains to a single step?

No, that would be counterproductive. Some multiple redirects are legitimate and necessary: for instance, an HTTP URL redirects to HTTPS, then to the canonical www version. That's two unavoidable steps if your technical stack requires them. The goal is not absolute zero, but never exceeding three redirects for critical user journeys and high-SEO-potential pages.

Focus your efforts on pages that receive quality backlinks or significant organic traffic. A chain of four redirects on a zombie page is not worth the intervention time. However, if your flagship product page accumulates three redirects and you plan to add a layer of geolocated personalization, that is the time to clean up.

Technical optimization of redirects may seem simple on paper, but in complex architectures (multiregional sites, headless platforms, hybrid CMSs), every change affects dozens of server rules and can introduce regressions. If your site exceeds 10,000 pages or has undergone multiple successive migrations, engaging a specialized SEO agency can prevent costly mistakes and ensure a clean overhaul of the redirect chains without breaking existing indexing.

  • Audit all redirect chains with a tool configured to follow beyond three steps
  • Prioritize fixing internal links that point to redirected URLs
  • Consolidate chains inherited from migrations into direct source → final destination redirects
  • Verify that strategic pages (landing pages, flagship products) never exceed three redirects
  • Monitor server logs to spot redirect paths that consume abnormal amounts of crawl budget
  • Test each modification with cURL or a script before deploying to production

The limit of five redirects is not a permission; it is a technical threshold beyond which Google stops optimizing. In practice, aim for a maximum of three steps on critical journeys and audit your chains regularly to avoid silent accumulation. The real impact is measured in wasted crawl budget and diluted equity: two variables invisible in Google Analytics but decisive for your organic positions.

❓ Frequently Asked Questions

Does a chain of three redirects penalize my rankings?
No, not directly. Google treats up to five redirects as a single request, so three steps stay within the acceptable technical limit. The main impact lies in the dilution of the PageRank passed along, which decreases slightly with each hop.
Do 302 redirects count toward the limit of five?
Yes, all HTTP redirects (301, 302, 307, 308) consume a step in the chain. Google makes no technical distinction for the count, even though the semantic treatment (temporary vs. permanent) differs.
How can I tell whether my internal redirects create problematic chains?
Use Screaming Frog with the 'crawl all redirects' option enabled and export the chains report. Any URL showing more than three consecutive redirects needs a manual fix to point directly at the final destination.
Are JavaScript redirects included in the limit of five?
Mueller's statement does not say, but field experience suggests that JavaScript redirects are handled differently (they require rendering the page). They probably add to server-side redirects in the total count, but there is no official confirmation.
What exactly happens beyond five consecutive redirects?
Each additional redirect becomes a distinct HTTP request, consuming extra crawl budget and slowing indexing. Google may also decide to stop following the chain if it looks abnormal or loops, which prevents the final destination from being indexed.

🎥 From the same video (42)

Other SEO insights extracted from this same Google Search Central video · duration 996:50 · published on 12/03/2021

🎥 Watch the full video on YouTube →
