
Official statement

Google handles multiple redirects by following up to five redirects at once. Beyond this number, processing can be segmented to slowly check later redirects.
🎥 Source video

Extracted from a Google Search Central video

⏱ 56:42 💬 EN 📅 27/06/2019 ✂ 10 statements
Watch on YouTube (36:53) →
Other statements from this video (9)
  1. 0:36 Do your site's deep pages really weigh on your overall SEO?
  2. 6:47 Do new Internet protocols really improve your SEO?
  3. 12:03 Does site speed really influence Google algorithm updates?
  4. 17:14 Why does Google only show part of your structured data in Search Console?
  5. 26:58 Should you really disavow spam links, or does Google handle them on its own?
  6. 31:53 Do authors' medical certifications really influence the ranking of health content?
  7. 48:03 How can you speed up the deindexing of useless content?
  8. 57:02 Is structured data really enough to earn rich snippets for your recipes?
  9. 65:11 Are the new result formats really accessible everywhere?
📅 Official statement from 27/06/2019 (6 years ago)
TL;DR

Google follows up to five consecutive redirects in a single crawl pass. Beyond that, processing becomes fragmented: each additional step is checked slowly during subsequent crawls. In practical terms, a chain of redirects that is too long slows down the indexing of your final page and dilutes the link equity passed.

What you need to understand

Why Does Google Limit Redirect Processing to Five Hops?

Redirects consume crawl budget. Each additional hop means another HTTP request, another response to download, another status code to check. Googlebot cannot follow redirect chains indefinitely without risking being trapped in infinite loops or wasting resources on misconfigured sites.

The limit of five redirects per pass is a compromise. It accommodates standard migrations (HTTP → HTTPS, www → non-www, old structure → new) while preventing abuse. Beyond that, the bot segments the processing: it notes the last URL reached and waits for a subsequent crawl to continue.
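The segmentation Mueller describes can be pictured as a loop: each crawl pass follows at most five hops, and later passes resume from the last URL reached. A minimal Python sketch of that behavior (the five-hop cap comes from the statement; the pass model and names are illustrative, not Googlebot's actual implementation):

```python
# Sketch: how a crawler might segment a long redirect chain, following
# at most MAX_HOPS_PER_PASS redirects per crawl pass (no loop detection,
# which a real crawler would also need).
MAX_HOPS_PER_PASS = 5  # the limit Mueller describes

def crawl_in_passes(start_url, redirects):
    """redirects: dict mapping a URL to its redirect target.
    Returns (final_url, passes_needed)."""
    url, passes = start_url, 0
    while url in redirects:
        passes += 1
        hops = 0
        while url in redirects and hops < MAX_HOPS_PER_PASS:
            url = redirects[url]   # follow one hop
            hops += 1
        # pass budget spent: if url still redirects, resume on a later crawl
    return url, max(passes, 1)

# A seven-hop chain needs two passes; five hops fit in a single pass.
seven = {f"/p{i}": f"/p{i+1}" for i in range(7)}
five = {f"/p{i}": f"/p{i+1}" for i in range(5)}
```

With the seven-hop chain, the first pass stops at the fifth URL, exactly the behavior described below: the final page is only reached on a subsequent crawl.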

What Happens After the Fifth Redirect?

Processing becomes asynchronous and fragmented. Googlebot doesn't refuse to index the final page—it simply slows down the process. During the first crawl, it reaches the fifth URL in the chain. During a subsequent crawl (possibly days or weeks later, depending on the site's priority), it resumes from where it left off.

This delay has two direct consequences. First, the freshness of content: your new page will take longer to appear in the index. Second, the loss of PageRank: each 301 or 302 redirect slightly dilutes the equity passed, and a chain of seven redirects amplifies this dilution.

Does This Rule Apply to All Types of Redirects?

Yes, all HTTP redirects count towards this limit: 301 (permanent), 302 (temporary), 307, 308. JavaScript or meta refresh redirects are treated differently—they don't count towards these five hops, but they still slow down the crawl as they require JavaScript rendering.

Server-side redirects (from Apache, Nginx, via .htaccess or configuration files) are the most reliable. CDNs like Cloudflare sometimes add invisible intermediate redirects—monitor your chains with tools like Screaming Frog or command-line curl.
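To see which hops actually count, you can audit a recorded trace of a chain, for example one exported from Screaming Frog or pieced together from curl output. An illustrative sketch: the status set is the one listed above, while the trace format and function name are invented for the example:

```python
# Sketch: auditing a recorded redirect trace. Only HTTP redirects count
# toward the five-hop limit; JS / meta refresh hops cost rendering instead.
HTTP_REDIRECTS = {301, 302, 307, 308}   # statuses named in the article
HOP_LIMIT = 5

def audit_trace(trace):
    """trace: list of (status_or_kind, url) steps, e.g. (301, "/new")
    or ("js", "/spa-target"). Returns a small report dict."""
    http_hops = sum(1 for kind, _ in trace if kind in HTTP_REDIRECTS)
    render_hops = sum(1 for kind, _ in trace if kind in ("js", "meta-refresh"))
    return {
        "http_hops": http_hops,
        "render_hops": render_hops,
        "exceeds_single_pass": http_hops > HOP_LIMIT,
    }
```

A chain like 301 → 302 → 308 → JS redirect reports three HTTP hops (well within one pass) plus one render-dependent hop that will still slow the crawl.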

  • Five redirects maximum are processed in a single crawl pass by Googlebot
  • Beyond that, processing becomes fragmented and asynchronous, slowing down indexing
  • Each additional hop dilutes the PageRank passed to the final page
  • JavaScript/meta refresh redirects do not count towards this limit, but they further slow down the crawl
  • CDNs and proxies can add invisible intermediate redirects—regularly audit your chains

SEO Expert opinion

Is This Statement Consistent with Field Observations?

Yes, and it’s even a more generous figure than what was historically observed. For years, the unofficial recommendation hovered around a maximum of three redirects. That Mueller confirms five redirects in a single pass is great news for complex migrations—multiple domains, deep restructurings, brand consolidations.

However, "slowly checks later redirects" remains intentionally vague. How much time between each resume? It depends on the crawl budget allocated to the site, the historical crawl frequency, and the domain authority. A site with a low crawl budget may wait weeks before Googlebot resumes the chain.

In What Cases Does This Limit Present a Real Problem?

Poorly planned migrations are the typical case. Imagine a business migrating HTTP → HTTPS, then changing domain, then reorganizing its structure, then switching to a new CMS—all while keeping each layer of redirects. You end up with seven or eight hops before reaching the final page.

Multilingual or multi-regional sites are also exposed. Geolocated redirect (e.g., .com → .fr), then www → non-www redirect, then language redirect (e.g., /fr/ → /fr-FR/), then category redirect after redesign… Each business logic adds another layer.

What Nuances Should Be Applied to This Rule?

First, this limit applies to crawling, not rendering. If a page redirects via JavaScript after an initial HTML load, Googlebot must first render the page, then follow the redirect—an even slower process. Server redirects are always preferable.

Next, PageRank dilution remains a vague topic. Google has claimed for years that 301 redirects pass 100% of equity, but field tests show a slight loss in long chains. [To be verified]—no official precise data on the exact percentage after five hops or more.

Warning: Multiple redirects can mask deeper issues. If your site accumulates long chains, it is often a symptom of poorly consolidated architecture or sloppy migration management. Rather than relying on Google’s tolerance, fix the source.

Practical impact and recommendations

How to Detect and Measure Redirect Chains on Your Site?

Run a complete crawl with Screaming Frog (Spider mode, limited to your domain). Enable the "Always Follow Redirects" option and check the "Redirect Chains" tab. You will see each chain, the number of hops, and the HTTP status codes at each step.

For on-the-spot checks, use curl in the command line with the -L (follow redirects) and -v (verbose) options. You will see each step, the HTTP headers, and response times. Example: curl -L -v https://yoursite.com/old-url. If you count more than five hops, you have a problem.

What to Do if You Discover Chains That Are Too Long?

First priority: shorten critical chains. Identify pages receiving external backlinks or significant organic traffic. For these URLs, redirect directly from the old to the new, without going through intermediate steps. Modify your server configuration (.htaccess, nginx.conf, Cloudflare rules) to point A → D instead of A → B → C → D.

Second step: clean up obsolete redirects. Many sites retain migration redirects that are five or ten years old. If the source URL is no longer receiving any traffic or backlinks, you can delete it (return a 410 Gone or a 404 if preferred). Focus your redirects on what still has value.
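The triage described above, keep redirects that still receive signals and retire the rest with a 410, can be sketched as a filter over your mapping table. The table and signal format here are illustrative; in practice the signals would come from your analytics and backlink tools:

```python
# Sketch: split a redirect table into entries to keep and entries to
# retire with 410 Gone, based on whether the old URL still earns signals.
def triage_redirects(table, signals):
    """table: dict old_url -> new_url.
    signals: dict old_url -> {"visits": int, "backlinks": int}.
    Returns (keep, retire_as_410)."""
    keep, retire = {}, []
    for old, new in table.items():
        s = signals.get(old, {"visits": 0, "backlinks": 0})
        if s["visits"] > 0 or s["backlinks"] > 0:
            keep[old] = new        # still earning traffic or links: keep the 301
        else:
            retire.append(old)     # no signals left: serve 410 Gone instead
    return keep, sorted(retire)
```

The zero thresholds are the simplest possible policy; you may prefer a time window (e.g. no visits in twelve months) depending on your data.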

How to Avoid Recreating This Problem in Future Migrations?

Plan your redirects in a single layer from the beginning. If you are migrating HTTP → HTTPS and www → non-www simultaneously, set up a direct redirect A → D, not A → B → C → D. Modern servers (Apache 2.4+, Nginx) handle very complex conditional rules well.

Document each wave of redirects in a mapping table (source URL, target URL, HTTP code, implementation date, reason). During the next migration, you can consolidate by directly updating old sources. Automate the verification with a script that tests your critical URLs weekly.
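Consolidating a mapping table means resolving every source directly to its final destination, so successive migrations never stack into chains. A sketch of that collapse step, with a simple loop guard (the function name and error handling are illustrative; a weekly script could run this against your exported table):

```python
# Sketch: collapse chained entries in a redirect mapping so every source
# points straight at its final destination (A -> D, not A -> B -> C -> D).
def collapse_chains(table):
    """table: dict source_url -> target_url. Returns a copy in which each
    source maps directly to the end of its chain."""
    collapsed = {}
    for source, target in table.items():
        seen = {source}
        while target in table:              # target itself redirects: keep resolving
            if target in seen:
                raise ValueError(f"redirect loop involving {target}")
            seen.add(target)
            target = table[target]
        collapsed[source] = target
    return collapsed
```

Feeding the collapsed table back into your server configuration turns every legacy URL into a single-hop redirect, regardless of how many migration waves produced it.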

  • Crawl your site with Screaming Frog to detect all existing redirect chains
  • Prioritize critical URLs (strong backlinks, organic traffic) and shorten their chains to a maximum of 1-2 hops
  • Remove or return 410/404 on obsolete redirects that no longer receive any signals
  • Configure future redirects in a single step (A → D) rather than in cascading (A → B → C → D)
  • Document each mapping in a centralized table with dates and reasons
  • Automate a weekly monitoring of critical URLs to quickly detect any new chains

Auditing and cleaning complex redirect chains requires a fine-grained understanding of your server architecture, your migration history, and your SEO priorities. If you manage a site with multiple domains, successive migrations, or an advanced CDN configuration, these optimizations can quickly become technically demanding. In that case, a specialized SEO agency can provide a complete diagnosis and an action plan tailored to your infrastructure, especially if you are planning a redesign or brand consolidation soon.

❓ Frequently Asked Questions

Do temporary 302 redirects count toward the five-redirect limit?
Yes, all HTTP redirects (301, 302, 307, 308) count toward the five-hop limit. The redirect type only affects the equity passed and the indexing signal, not Googlebot's chain processing.
Are JavaScript or meta refresh redirects included in this limit?
No, they do not count toward the five HTTP redirects. However, they slow the crawl down even more, because Googlebot must first render the JavaScript page before it can detect the redirect. Always prefer server-side 301 redirects.
How long does it take Google to process a chain of seven redirects?
There is no precise official figure. Google follows the first five in one pass, then resumes during a later crawl. The delay depends on the crawl budget allocated to your site: from a few days to several weeks for a low-authority site.
Does a long redirect chain affect the PageRank passed?
Officially, Google states that 301s pass 100% of the equity. In practice, field tests show slight dilution in long chains. It is impossible to quantify precisely without official data, but limiting chains to 1-2 hops remains good practice.
How can you quickly check whether a URL has an overly long redirect chain?
Use curl on the command line with the -L (follow redirects) and -v (verbose) options: curl -L -v https://yoursite.com/url. You will see each HTTP step. For a full audit, Screaming Frog with the "Always Follow Redirects" option lists all detected chains.