Official statement
Other statements from this video (42)
- 42:49 Can you really use hreflang across several distinct domains?
- 48:45 Can you really use hreflang across several distinct domains?
- 58:47 Should you really avoid duplicating your content across two distinct sites?
- 58:47 Should you really avoid creating several sites for the same content?
- 91:16 Should you really index your site's internal search pages?
- 91:16 Should you block internal search pages to avoid indexing an infinite space?
- 125:44 Do Core Web Vitals really influence Google's crawl budget?
- 125:44 Does reducing page size really improve crawl budget?
- 152:31 Does the internal links report in Search Console really reflect the state of your internal linking?
- 152:31 Why does Search Console's internal links report only show a sample?
- 172:13 How many redirects does Google really follow before splitting the crawl?
- 201:37 How does Google really segment your Core Web Vitals into page groups?
- 248:11 AMP or canonical: which one really collects the SEO signals?
- 257:21 Does the Chrome UX Report really count your cached AMP pages?
- 272:10 Should you really redirect your AMP URLs during a change?
- 272:10 Should you really redirect your old AMP URLs to the new ones?
- 294:42 Is AMP really neutral for Google ranking, or does it hide an invisible visibility lever?
- 296:42 Is AMP really a Google ranking factor, or just an entry ticket to certain features?
- 342:21 Why does copied content sometimes outrank the original despite the DMCA?
- 342:21 Is the DMCA really effective at protecting your duplicated content on Google?
- 359:44 Why does copied content outrank your original content in Google?
- 409:35 Why do your featured snippets disappear with no technical reason?
- 409:35 Do featured snippets and rich results really fluctuate at random?
- 455:08 Is content hidden on responsive mobile really indexed by Google?
- 455:08 Is content hidden with responsive CSS really indexed by Google?
- 563:51 Can structured data really force the display of a knowledge panel?
- 563:51 Is there any structured markup that guarantees the appearance of a Knowledge Panel?
- 583:50 Why do most sites never get sitelinks in Google?
- 583:50 Can you really force the display of sitelinks in Google?
- 649:39 Do 301 redirects really transfer 100% of SEO juice without loss?
- 649:39 Do 301 redirects really transfer 100% of PageRank and SEO signals?
- 722:53 Should you really delete or redirect expired content rather than keeping it indexable?
- 722:53 Should you really delete expired pages, or can you leave them with an 'expired' label?
- 859:32 Keywords in the URL: ranking factor or mere temporary crutch?
- 859:32 Do words in the URL really influence Google ranking?
- 908:40 Should you really add structured data to embedded YouTube videos?
- 909:01 Should you really add video structured data when you already embed YouTube?
- 932:46 Do Core Web Vitals really impact desktop SEO?
- 932:46 Why does Google ignore desktop Core Web Vitals in its ranking algorithm?
- 952:49 Do the Search Console API and interface really display the same data?
- 963:49 Can you use different templates per language version without penalizing your international SEO?
Google automatically follows up to five consecutive redirects and treats them as a single request, without immediate technical penalty. Beyond this threshold, each step becomes a separate request, which fragments the crawl budget and dilutes link equity. For an SEO practitioner, this means that a chain of three redirects remains manageable, but beyond four steps the risk of inefficiency increases significantly.
What you need to understand
What constitutes a single request in the context of Google's crawl?

When Googlebot visits a URL and encounters a redirect, it must follow that instruction to reach the final destination. Traditionally, each hop consumes crawl budget — the limited resource that Google allocates to your site. Mueller's statement clarifies that, for up to five consecutive redirects, Google optimizes this process by grouping the steps into a single logical operation.

Specifically, if URL A points to B, which points to C, then D, then E, Google treats this journey as a single request. Instead of five distinct HTTP requests each consuming crawl budget, the whole chain counts as just one. That is a notable efficiency gain for sites with complex structures or a history of multiple migrations.

Why does Google impose a limit of five redirects?

The limit is not arbitrary — it protects against infinite redirect loops and faulty technical setups. If Google followed redirects without limit, a poorly configured site could trap the crawler in an endless chain, wasting server resources on both sides.

Beyond five steps, each additional redirect becomes a separate request. This fragments the crawl budget and slows down indexing. More seriously, each hop dilutes the link equity (the famous PageRank) that flows from the source URL to the final destination. A chain of seven redirects therefore mechanically loses more equity than a chain of three.

Does this limit apply to all types of redirects?

Mueller does not specify whether the rule covers only 301s and 302s, or also JavaScript and meta-refresh redirects. Field experience shows that Google treats server-side redirects (301/302) — immediate and effective — differently from client-side redirects, which require JavaScript execution and are therefore slower.

The 307 and 308 redirects (less common) likely follow the same logic, but no official data confirms this. In practice, it is assumed that the limit of five applies to classic HTTP redirects — those that Googlebot can resolve without rendering JavaScript.
SEO Expert opinion
Is this statement consistent with field observations?

Yes, broadly. Technical audits show that short redirect chains (two to three steps) do not visibly penalize indexing, while chains of six redirects or more often correlate with pages indexed late or with weakened quality signals. This aligns with the mechanics Mueller describes.

On the other hand, the exact limit of five remains blurry in empirical tests. Some sites with four redirects operate perfectly, while others with three already show slowdowns — likely due to other factors (server speed, content quality, crawl frequency). The rule of five is a technical threshold, not a universal guarantee.

What nuances need to be added to this claim?

Mueller does not mention the cumulative impact of redirects on PageRank. Historically, each 301 redirect is thought to pass about 85 to 90% of equity — a figure never officially confirmed but widely observed. Three consecutive redirects could therefore theoretically transmit only 61 to 73% of the initial equity (0.85³ ≈ 0.61; 0.9³ ≈ 0.73). [To verify], because Google has never published a precise figure.

Another point: Mueller talks about "consecutive" redirects, but what happens with parallel redirects or branching chains? The statement remains vague on complex architectures, particularly multilingual sites with geolocated cascading redirects. In these cases, the limit of five may be reached without the SEO practitioner realizing it.

When does this rule become critical for your SEO strategy?

Sites that have undergone multiple migrations or redesigns often accumulate orphaned redirect chains: a URL from 2018 redirects to one from 2020, which redirects to one from 2022, then to the current version. Outcome: four steps, close to the limit. If you then add an HTTPS redirect or a switch to www, you cross the critical threshold.

Another sensitive case: e-commerce sites with UTM parameters or product URL variants. A URL with tracking may redirect to a canonical version, which itself redirects to a consolidated category, then to a brand page — four redirects already. Multiply that by 10,000 products and your crawl budget explodes.
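The dilution figures quoted in this section follow from simple compounding. A quick check, keeping in mind that the per-hop pass-through rate itself (85-90%) is an unofficial estimate, not a Google-confirmed figure:

```python
# Hypothetical equity retained after n chained 301s, assuming each hop
# passes a fixed share. The 0.85-0.90 range is a widely cited but
# unconfirmed estimate; Google has never published a figure.
def equity_retained(pass_rate, hops):
    return pass_rate ** hops

for rate in (0.85, 0.90):
    print(rate, [round(equity_retained(rate, n), 3) for n in (1, 3, 5, 7)])
# 0.85 [0.85, 0.614, 0.444, 0.321]
# 0.9 [0.9, 0.729, 0.59, 0.478]
```

Under these assumptions a seven-hop chain would retain only a third to a half of the original equity, which is why long chains are worth hunting down even if Google technically still resolves them.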
Practical impact and recommendations
How to audit redirect chains on your site?

Start with a complete crawl using Screaming Frog in "crawl all redirects" mode, then export the redirect chain report. Sort the results by number of steps: anything over three deserves investigation. Then verify manually with cURL or a Python script to confirm the actual number of hops — some tools stop prematurely.

If you manage a large site (over 50,000 pages), use server logs to identify URLs that Googlebot visits frequently and that consistently return redirects. Oncrawl or Botify let you cross-reference crawl data with redirect chains, pinpointing the critical paths that consume the most budget.

What priority corrective actions should be implemented?

First priority: fix internal redirects. If your internal linking points to URLs that redirect, you force Google to follow chains unnecessarily. Update your internal links to point directly to the final destination — this saves crawl budget and preserves link equity.

Next, eliminate the redirect chains inherited from old migrations. Consolidate multiple 301s into a single direct redirect from the source URL to the current destination. This sometimes requires touching the .htaccess file or the Nginx rules, but the gain is immediate: reduced server response time and an improved crawl rate.

Should you always reduce all chains to a single step?

No, that would be counterproductive. Some multiple redirects are legitimate and necessary: for instance, an HTTP URL redirects to HTTPS, then to the canonical www version. That is two unavoidable steps if your technical stack requires them. The goal is not absolute zero, but never exceeding three redirects on critical user journeys and pages with high SEO potential.

Focus your efforts on pages that receive quality backlinks or significant organic traffic. A chain of four redirects on a zombie page is not worth the intervention time. However, if your flagship product page accumulates three redirects and you plan to add a layer of geolocated personalization, that is the time to clean up.

Optimizing redirects may seem simple on paper, but in complex architectures (multiregional sites, headless platforms, hybrid CMSs), every change affects dozens of server rules and can introduce regressions. If your site exceeds 10,000 pages or has undergone multiple successive migrations, engaging a specialized SEO agency can prevent costly mistakes and ensure a clean redesign of the redirect chains without breaking existing indexing.
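The manual verification step mentioned above (cURL or a Python script) can be sketched as follows. The fetcher is injected so the hop-counting logic is shown offline with a stub; in a real audit you would replace `fake_fetch` with an actual HTTP request that does not auto-follow redirects (the URLs and redirect table here are hypothetical examples):

```python
def count_hops(url, fetch, limit=10):
    """Follow a redirect chain hop by hop.

    fetch(url) must return (status_code, location_or_None).
    Returns (final_url, hops); stops at `limit` to avoid loops.
    """
    hops = 0
    while hops < limit:
        status, location = fetch(url)
        if status not in (301, 302, 307, 308) or not location:
            break
        url = location
        hops += 1
    return url, hops

# Stub standing in for real HTTP requests (illustrative data only):
# an HTTP->HTTPS hop followed by an old->new URL hop, i.e. two steps.
def fake_fetch(url):
    table = {
        "http://example.com/old": (301, "https://example.com/old"),
        "https://example.com/old": (301, "https://example.com/new"),
        "https://example.com/new": (200, None),
    }
    return table[url]

print(count_hops("http://example.com/old", fake_fetch))
# ('https://example.com/new', 2)
```

Running this against the URLs flagged by your crawler confirms the real hop count, since some desktop tools stop following chains before Googlebot would.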
❓ Frequently Asked Questions
Does a chain of three redirects penalize my rankings?
Do 302 redirects count toward the limit of five?
How can I tell whether my internal redirects create problematic chains?
Are JavaScript redirects included in the limit of five?
What exactly happens beyond five consecutive redirects?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 996:50 · published on 12/03/2021
🎥 Watch the full video on YouTube →