Official statement
Google follows redirects through a maximum of five hops before giving up. This means a URL redirected more than five times is unlikely to be indexed. For shorter chains, the real issue is not indexing but the loss of link equity and wasted crawl time, even though Google downplays the concern.
What you need to understand
What actually happens when Googlebot encounters a redirect?
When Googlebot discovers a URL, it sends an HTTP request to the server. If the server returns a status code of 301 or 302, the bot automatically follows the new destination indicated in the Location header.
This process repeats if the new URL is itself redirected. Google follows this logic up to a maximum of five successive hops. Beyond that, the bot gives up and does not attempt to reach the final URL. Consequently, the source URL is neither indexed nor properly consolidated.
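The hop-limited resolution described above can be sketched as follows. This is a minimal simulation of the documented behavior, not Googlebot's actual implementation; the redirect map and URLs are hypothetical.

```python
# Minimal sketch of hop-limited redirect resolution, mimicking the
# five-hop cap described above. The redirect map is a hypothetical
# example; Googlebot's real crawler works over HTTP, not a dict.

MAX_HOPS = 5  # Googlebot gives up after five successive redirects

def resolve(url, redirects, max_hops=MAX_HOPS):
    """Follow a redirect chain; return (final_url, hops), or
    (None, hops) if the chain exceeds the hop limit or loops."""
    hops = 0
    seen = {url}
    while url in redirects:
        if hops == max_hops:
            return None, hops          # give up: chain too long
        url = redirects[url]
        hops += 1
        if url in seen:                # redirect loop detected
            return None, hops
        seen.add(url)
    return url, hops

# A six-hop chain: /a -> /b -> /c -> /d -> /e -> /f -> /g
chain = {"/a": "/b", "/b": "/c", "/c": "/d",
         "/d": "/e", "/e": "/f", "/f": "/g"}

print(resolve("/d", chain))  # short chain resolves: ('/g', 3)
print(resolve("/a", chain))  # six hops: (None, 5) — never reaches /g
```

Starting three hops from the end, the chain resolves; starting at the head of the six-hop chain, resolution is abandoned at the fifth hop, so the final URL is never reached.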
Why limit to five redirects and not more?
The limit of five steps acts as a safeguard against infinite loops and failing server configurations. It also protects Google's crawl resources, as indefinitely following chains could slow down the exploration of billions of pages.
In practice, this constraint rarely poses issues for well-managed sites. Excessive redirect chains often result from an accumulation of poorly cleaned migrations or successive URL changes that have not been consolidated. A professional site rarely exceeds two or three hops.
What does Google mean by 'multiple redirects generally do not pose a problem'?
Mueller tempers the concern: a chain of two or three redirects does not prevent indexing. The bot follows it properly, and the final URL can be indexed without apparent issues. This wording aims to prevent unnecessary panic among webmasters who discover a few short chains.
Nonetheless, this statement leaves the impact on PageRank and crawl budget unclear. Google states 'no problem for indexing,' but does not clarify anything about the link equity passed on or the wasted crawl time. This is exactly the kind of vague wording that deserves to be explored.
- Technical Limit: Googlebot follows up to five redirects; beyond that, it gives up
- Real Risk: loss of link equity and unnecessary crawl budget consumption, even under five hops
- Frequent Origin: successive migrations, uncoordinated URL changes, poor technical management
- Google's Tolerance: two or three redirects generally do not prevent indexing
SEO Expert opinion
Does this limit of five redirects align with what is observed in the field?
Technical audits confirm the limit. When analyzing chains of six redirects or more, the final URL never appears in the Google index. Server logs show that Googlebot stops after the fifth hop.
However, the claim that 'multiple redirects do not pose a problem' needs an important caveat. Indexing may well work, but each hop consumes crawl time and potentially dilutes the link equity passed along. A site that needlessly multiplies chains of three or four redirects wastes its crawl budget.
What doesn't Google mention in this statement?
Google remains strangely silent on the transmission of PageRank through the chains. Officially, a 301 transmits link equity. But what happens when this 301 points to another 301, which itself points to a temporary 302? [To be verified]: no official documentation specifies whether each hop dilutes the signal.
The other blind spot concerns the crawl budget. Mueller says that multiple redirects 'generally do not pose a problem,' but for a site with 100,000 pages having thousands of chains, the cumulative impact can be significant. Google does not crawl infinitely. Every second wasted following redirects means one less page explored.
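The cumulative cost is easy to estimate with back-of-envelope arithmetic. All figures below (chain count, average hops, daily crawl rate) are illustrative assumptions, not Google-published numbers.

```python
# Back-of-envelope estimate of crawl budget wasted by redirect chains.
# Every number here is an illustrative assumption for a large site.

chains = 5_000               # redirected URLs on a 100,000-page site
avg_extra_hops = 2           # chains average 3 hops, i.e. 2 avoidable fetches
daily_crawl_budget = 20_000  # requests Googlebot spends on the site per day

wasted_fetches = chains * avg_extra_hops
share = wasted_fetches / daily_crawl_budget

print(f"{wasted_fetches} avoidable fetches "
      f"= {share:.0%} of a day's crawl budget")
```

Under these assumptions, flattening the chains would free up half a day's crawl budget, i.e. 10,000 requests that could fetch actual pages instead.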
In what cases could this rule pose a problem?
E-commerce sites often undergo successive migrations: platform changes, URL rewrites, switching to HTTPS, adding or removing www. Each step adds a redirect if the previous ones are not cleaned up properly. The result: chains of four or five hops approaching the limit.
Multilingual or multi-domain sites may also accumulate involuntary redirects: language detection, geographical redirects, and then structural redirects. When combined, they easily reach three or four hops. Again, Google may index, but the cost in resources is real.
Practical impact and recommendations
How can I detect and correct redirect chains on my site?
A technical crawler like Screaming Frog, OnCrawl, or Botify can help identify all chains. Configure the tool to follow redirects and generate a report listing URLs with more than one hop. Prioritize those that exceed two redirects.
Once identified, correct them by directly modifying the source redirect to point to the final URL. If A redirects to B, which redirects to C, make sure that A redirects directly to C. This operation is done in the .htaccess, Nginx configuration, or via your CMS depending on your environment.
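In an Apache .htaccess file, flattening a chain looks like this. The paths are hypothetical examples; the key point is that every source points straight at the final destination.

```apache
# Before: two chained redirects (A -> B -> C)
#   Redirect 301 /old-page /interim-page
#   Redirect 301 /interim-page /final-page

# After: each source redirects directly to the final URL (one hop each).
Redirect 301 /old-page     /final-page
Redirect 301 /interim-page /final-page
```

Keep the intermediate rule (B -> C) as long as anything still links to /interim-page; flattening means retargeting, not deleting.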
What mistakes should be avoided when cleaning up redirects?
Never delete a redirect without ensuring that it is no longer referenced anywhere. Check internal links, important external backlinks, and old marketing campaigns. A redirect deleted too quickly can generate 404s and lose traffic.
Also avoid creating new chains while fixing old ones. If you modify a redirect, ensure that the new destination is not itself redirected. A post-correction audit with your crawler confirms that the chains have been shortened or eliminated.
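That post-correction check can be automated: in a fully flattened redirect map, no target should itself be a redirect source. The map below is a hypothetical example of a crawler export.

```python
# Post-correction sanity check: flag redirects whose target is itself
# a redirect source (i.e. a residual chain). The map is a hypothetical
# example of a source -> destination export from a crawler.

def find_residual_chains(redirects):
    """Return (source, target) pairs where the target is still redirected."""
    return [(src, dst) for src, dst in redirects.items() if dst in redirects]

redirects = {
    "/old-page": "/final-page",
    "/interim-page": "/final-page",
    "/promo-2023": "/old-page",   # mistake: points at a redirected URL
}

print(find_residual_chains(redirects))  # [('/promo-2023', '/old-page')]
```

An empty result means every redirect resolves in a single hop; any pair returned is a chain that survived the cleanup.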
How should I prioritize corrections when there are hundreds of chains?
Start with URLs that receive organic traffic or quality backlinks. A redirect chain on a page with no visitors or external links is less urgent than a chain on a page generating 1000 visits per month.
Use data from Search Console and Google Analytics to identify strategic pages. Cross-reference with your crawler data to locate long chains on these URLs. This data-driven approach avoids wasting time on corrections with no impact.
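The cross-referencing step can be sketched as a simple sort: traffic first, chain length as tiebreaker. The data structures mimic a crawler export and an Analytics export; URLs and field values are assumptions.

```python
# Prioritize redirect chains by combining crawler data (hop counts)
# with traffic data (organic visits). All URLs and figures are
# hypothetical examples of exported data.

chains = {            # source URL -> number of hops (from the crawler)
    "/blog/guide": 4,
    "/about-us": 3,
    "/legacy/tool": 5,
}
monthly_visits = {    # organic visits per URL (from Analytics)
    "/blog/guide": 1_000,
    "/about-us": 40,
    "/legacy/tool": 0,
}

# Fix pages with real traffic first; break ties by chain length.
priority = sorted(chains,
                  key=lambda u: (-monthly_visits.get(u, 0), -chains[u]))
print(priority)  # ['/blog/guide', '/about-us', '/legacy/tool']
```

Here the five-hop chain on a zero-traffic page correctly lands last, while the page with 1,000 monthly visits comes first.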
If your site has complex configurations (multiple migrations, multilingual, numerous domains), auditing and correcting can quickly become time-consuming. In this case, working with a specialized SEO agency allows for accurate diagnostics and a tailored technical roadmap, without tying up your internal teams for weeks.
- Run a crawl on the site with a technical tool (Screaming Frog, OnCrawl, Botify) to list all redirects
- Identify chains with more than two hops and prioritize them based on traffic and backlinks
- Change source redirects to point directly to the final URL
- Ensure no new chains are created during the correction
- Check internal links and update those pointing to redirected URLs
- Conduct a post-correction crawl to confirm that the chains have been removed
❓ Frequently Asked Questions
Does Google lose PageRank at each redirect hop in a chain?
Does a temporary 302 redirect count toward the five-hop limit?
Should you fix chains of two redirects, or only those of three or more?
Do JavaScript or meta refresh redirects count toward this limit?
How can I quickly check whether my site has redirect chains?
Other SEO insights extracted from this same Google Search Central video · duration 57 min · published on 05/04/2018