Official statement
Other statements from this video (11)
- 1:47 Are image alt attributes really essential for SEO?
- 3:35 Should you really be wary of taglines and internal links repeated on every page?
- 5:50 Does an H1 duplicated across several pages really hurt SEO?
- 9:59 Is hreflang really enough to stop Google from merging your international versions?
- 15:07 Does partial adult content really penalize a site's SEO?
- 23:17 Have backlinks really become a secondary ranking factor?
- 37:03 Will technical SEO really remain the central pillar of search optimization?
- 38:45 Do Schema.org rich snippets really improve your CTR if Google deems them unnecessary?
- 43:25 Is user-centered quality really enough to please Google?
- 52:05 Should you really abandon m-dot sites and switch to responsive design?
- 73:31 How long should you really keep a redirect in place after a domain migration?
Google follows up to 5 consecutive redirects during a crawl session. Beyond that, the bot gives up and your page isn't crawled. In practice, each redirect costs crawl time and dilutes the PageRank signal, so keeping chains to 1 or 2 hops remains best practice.
What you need to understand
What does '5 redirects at a time' really mean?
Google sets a clear technical limit: its crawler can follow up to 5 successive redirects during a single crawl session. If one URL redirects to another that redirects to a third, and so forth, Googlebot will continue until the fifth link.
Beyond this threshold, the bot aborts the request. The final page is not crawled, and therefore not indexed. Your content effectively disappears from Google's radar, regardless of its quality or relevance.
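As a mental model, the limit behaves like the small simulation below; the redirect map and URLs are hypothetical, and only the 5-hop cutoff comes from Google's statement.

```python
MAX_HOPS = 5  # the limit Google states for one crawl session

def resolve(redirects, url, max_hops=MAX_HOPS):
    """Follow redirects; return the final URL, or None when the chain
    exceeds max_hops and the crawler gives up (page never crawled)."""
    hops = 0
    while url in redirects:
        hops += 1
        if hops > max_hops:
            return None  # crawl aborted: destination not indexed
        url = redirects[url]
    return url

# Hypothetical chain: /step0 -> /step1 -> ... -> /step6
chain = {f"/step{i}": f"/step{i + 1}" for i in range(6)}
print(resolve(chain, "/step1"))  # 5 hops -> reaches /step6
print(resolve(chain, "/step0"))  # 6 hops -> None, crawl aborted
```

A URL that resolves to None here would never be crawled, let alone indexed.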
Why limit redirect chains even below the threshold of 5?
Following 5 redirects is technically possible, but it's extremely resource-intensive. Each hop involves a new HTTP request, a new network round trip, and a new processing delay. On a high-traffic site, this accumulated latency can exhaust your crawl budget within hours.
PageRank also dilutes at every step. A chain of 4 redirects means the SEO juice travels through four successive pipes before reaching its destination, and each pipe has a small leak. A direct redirect preserves the relevance signal better.
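The leaky-pipes image can be put into numbers, with one big caveat: Google publishes no per-hop loss figure, so the 1% leak below is a purely illustrative assumption.

```python
# Purely illustrative: the 1% per-hop leak is an assumption,
# not a figure from Google.
def signal_after(hops, leak=0.01):
    """Fraction of the original signal left after `hops` redirects."""
    return (1 - leak) ** hops

for hops in (1, 2, 4):
    print(f"{hops} hop(s): {signal_after(hops):.1%} retained")
```

Whatever the real per-hop cost, the compounding shape is the point: a direct redirect pays it once, a 4-hop chain pays it four times.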
How does this problem actually manifest on a site?
You may notice orphan pages in Search Console: URLs known to Google but never crawled, or crawled with weeks of delay. Server logs may show crawl sessions abruptly stopping on certain URLs without completing the chain.
Sites that migrate several times without cleaning up their redirects accumulate these chains without noticing. A historical page redirects to an old domain, which redirects to the new one, which redirects again to a new URL structure, and each migration adds another hop. Google still follows, but a disproportionate share of your crawl budget is burned on this section of the site.
- Google follows up to 5 consecutive redirects before giving up
- Each hop consumes crawl budget and dilutes the PageRank transmitted
- Long chains slow down crawling and delay indexing
- Redirects often accumulate after multiple migrations or redesigns
- A regular audit of redirect chains helps avoid this waste
SEO expert opinion
Is this limit of 5 consistent with field observations?
Yes, and it has been documented for years in Google's technical forums. What's new here is that John Mueller explicitly restates the rule in an official context. In practice, we do observe crawl aborts beyond 4-5 hops, especially on low-authority sites where crawl budget is already rationed.
However, Google does not specify whether this limit applies per URL or per overall session. In practice, Googlebot seems to treat each URL in isolation—a chain of 5 redirects on one page does not affect the crawl of another page on the site. But if your redirects all point to long chains, you quickly saturate your available crawl quota. [To be verified]: Google provides no figures on the relative cost of a redirect compared to a direct request in terms of crawl budget.
What nuances should be added to this rule?
Google mentions redirects followed "at the same time during a crawl session." This suggests that if the bot comes back later, it may retry the chain. But there is no guarantee it will do so quickly, nor that it will do so at all if your site lacks authority.
Another point: not all redirects are created equal. A permanent 301 is supposed to transmit 100% of the PageRank according to Google, but a chain of 301s necessarily involves cumulative losses due to network processing. A temporary 302 keeps the original URL in the index and delays consolidation. If you alternate between 301 and 302, you create an algorithmic ambiguity that Google often resolves by… not indexing anything at all.
In what cases does this rule not really apply?
If you redirect a URL from a high-authority external domain—for example, a backlink from a major news site—Google will likely follow the chain to the end, even if it exceeds 5 hops, because the initial relevance signal is strong enough to justify the effort.
On the other hand, on an average site with a tight crawl budget, Google prioritizes direct URLs and abandons chains as early as the third hop. You might not see any errors in Search Console: the page is just silently ignored. This is particularly insidious on e-commerce sites with thousands of product listings migrated multiple times.
Practical impact and recommendations
What should you do to clean up redirect chains?
First step: crawl your entire site using Screaming Frog, Oncrawl, or an equivalent tool. Enable the “follow redirects” option and export all detected chains. You'll get a table with the source URL, intermediate hops, and the final destination.
Next, collapse each chain into a direct redirect from the source to the final destination. If A redirects to B, which redirects to C, point A straight at C (A → C) and keep the B → C redirect for requests that still hit B directly. You save a hop, speed up crawling, and preserve PageRank better. It's tedious work, but it's the only way to clean up properly.
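The flattening step can be sketched as follows; the {source: target} redirect map is hypothetical and would, in practice, come from your crawler export.

```python
# Collapse every chain so each source points at its final destination.
def flatten(redirects):
    flat = {}
    for src in redirects:
        seen, url = {src}, redirects[src]
        # Walk the chain, guarding against loops (url already seen).
        while url in redirects and url not in seen:
            seen.add(url)
            url = redirects[url]
        flat[src] = url
    return flat

chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(flatten(chain))  # {'/a': '/d', '/b': '/d', '/c': '/d'}
```

Note that /b and /c keep their own direct redirects to /d, so old backlinks entering mid-chain still land correctly in one hop.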
What mistakes should be avoided when fixing redirects?
Never remove an existing redirect without first checking that it no longer receives traffic or backlinks. Use Search Console to list source URLs that still generate clicks or impressions, and a backlink tool such as Ahrefs or Majestic to identify active incoming links.
Also, avoid creating redirect loops while trying to shorten chains: A → B → A is a classic mistake after poorly documented redesigns. Test each redirect with curl or an online tool before deploying it to production. Finally, don't mix 301 and 302 in the same chain—choose one type and stick to it to avoid algorithmic ambiguities.
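A loop check like the sketch below, run against your redirect map before deployment, catches the A → B → A case; the URLs are made up.

```python
# Detect redirect loops (e.g. /a -> /b -> /a) in a hypothetical map.
def find_loops(redirects):
    loops = []
    for src in redirects:
        seen, url = [src], redirects[src]
        while url in redirects:
            if url in seen:
                # Report the cycle starting from its first repetition.
                loops.append(seen[seen.index(url):] + [url])
                break
            seen.append(url)
            url = redirects[url]
    return loops

print(find_loops({"/a": "/b", "/b": "/a", "/x": "/y"}))
# [['/a', '/b', '/a'], ['/b', '/a', '/b']]
```

Each loop is reported once per entry point, which is handy when several sources feed the same cycle.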
How can I check if my site complies with this rule after correction?
Re-crawl your site after deploying the new redirect rules. Ensure that all remaining chains have a maximum of 1 hop (2 in exceptional cases). Monitor server logs for a few days: if Googlebot keeps hitting intermediate URLs, your redirects aren't being picked up correctly, and the usual culprits are a CDN cache, browser cache, or misordered .htaccess rules.
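The log check can be scripted; the log lines, bot string, and list of intermediate URLs below are made-up examples in combined log format.

```python
import re

# Hypothetical intermediate URLs that should no longer be requested.
INTERMEDIATE = {"/old-page", "/step-b"}

def stale_hits(log_lines):
    """Return intermediate URLs that Googlebot is still requesting."""
    hits = []
    for line in log_lines:
        m = re.search(r'"(?:GET|HEAD) (\S+)', line)
        if m and m.group(1) in INTERMEDIATE and "Googlebot" in line:
            hits.append(m.group(1))
    return hits

logs = [
    '66.249.66.1 - - [10/Mar/2020:06:00:01 +0000] "GET /old-page HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Mar/2020:06:00:02 +0000] "GET /new-page HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]
print(stale_hits(logs))  # ['/old-page']
```

Persistent hits on intermediate URLs usually point at a stale CDN or a misordered rewrite rule rather than a Googlebot problem.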
Use the live URL test in Search Console to validate that Google sees the direct redirect correctly. If the test still shows multiple hops, force a recrawl and wait a few hours. Finally, compare your crawl budget before and after: you should see an increase in the number of pages crawled per day if you had many chains. This cleanup can be technical and tedious; if you manage a complex site or lack internal resources, a specialized SEO agency can save you valuable time and avoid costly mistakes during implementation.
- Crawl the complete site to identify all redirect chains
- Replace each chain with a direct source → final destination redirect
- Check backlinks and traffic before modifying an existing redirect
- Avoid loops and mixing 301/302 in the same chain
- Re-crawl after deployment and monitor server logs
- Use the live URL test in Search Console to validate
❓ Frequently Asked Questions
What happens if a redirect chain exceeds 5 hops?
Does a 301 redirect really pass 100% of PageRank?
Should you remove all existing redirects?
How do you detect redirect chains on a large site?
Do JavaScript redirects count toward the limit of 5?
Source: Google Search Central video · duration 54 min · published on 06/03/2020