
Official statement

It's best to set up redirects to point directly to the final URL without any intermediaries. Too many steps in a redirect chain can lead to delays in crawling.
🎥 Source: Google Search Central video (statement at 30:46)

⏱ 55:16 💬 EN 📅 16/04/2019 ✂ 10 statements
Other statements from this video (9)
  1. 1:05 Does nofollow on faceted navigation really kill crawl budget?
  2. 4:17 Should you really wait before diagnosing Google indexing problems?
  3. 8:32 How can you tell the real Googlebot from fake impostor bots?
  4. 10:12 Why aren't your images getting indexed despite optimized content?
  5. 14:42 Should you really customize the structured data of every page?
  6. 20:31 Are expired domains really useless for SEO?
  7. 21:37 Should you really add self-referencing canonicals on every page?
  8. 36:34 How can you prove your expertise to Google during Core Updates?
  9. 53:04 Should you avoid domains with a spam history, or can they be recovered?
Official statement (16/04/2019)
TL;DR

Google recommends setting up redirects to point directly to the final URL without intermediate steps. A long chain can slow down crawling and delay the indexing of new pages. In practice, each cascading redirect consumes crawl budget and increases response time — two factors that can hinder the discovery of your content by bots.

What you need to understand

What is a redirect chain and why does it cause issues?

A redirect chain occurs when one URL redirects to another, which in turn redirects to a third, and so on. For example: URL A → URL B → URL C → final URL. Each intermediate step incurs an additional HTTP request, which lengthens processing time.
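The mechanics can be sketched in a few lines of Python; the URL mapping below is invented for illustration, and the hop limit is an assumption (crawlers do cap the number of redirects they follow, but the exact limit varies):

```python
# Hypothetical redirect map: each URL points to the one it redirects to.
redirects = {
    "https://example.com/a": "https://example.com/b",
    "https://example.com/b": "https://example.com/c",
    "https://example.com/c": "https://example.com/final",
}

def resolve(url, redirects, max_hops=5):
    """Follow redirects to the final URL, counting the hops taken."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:            # loop protection
            raise ValueError("redirect loop detected")
        seen.add(url)
        if hops > max_hops:        # crawlers give up on long chains
            raise ValueError("too many redirects")
    return url, hops

print(resolve("https://example.com/a", redirects))
# three extra HTTP round trips before the final URL is reached
```

Each iteration of that loop is one extra HTTP request in the real world, which is exactly the cost a direct A → final redirect avoids.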

Googlebot, like any crawler, has a limited crawl budget — the number of pages it is willing to explore on your site within a given timeframe. The more time it spends on cascading redirects, the less it can explore fresh content. On a site with thousands of pages, this friction can delay the indexing of strategic pages.

Why does Google emphasize direct final URLs?

The search engine prioritizes crawling efficiency. Each jump in a chain consumes server resources and extends the delay before content is analyzed. Mueller does not mention a direct penalty, but rather a delay in processing — an important distinction.

A single redirect (A → final URL) is instantly understood and followed. Three or four redirects in a series can trigger timeout errors, especially if the server is slow or if the site experiences a traffic spike. Googlebot may also decide to abandon the crawl of this chain to move on.

How many redirects can we tolerate before it becomes problematic?

Google has never communicated an official threshold, but field observations show that two redirects maximum are generally well tolerated. Beyond that, we enter a gray area where behavior becomes unpredictable depending on server speed and site crawl frequency.

Some historical sites accumulate redirects over time with migrations, redesigns after redesigns. A page might end up pointing to four or five intermediate URLs before reaching its final destination. This is precisely the type of configuration Mueller is addressing here: an uncontrolled stacking that harms crawling speed.

  • A redirect chain lengthens crawl time and consumes budget unnecessarily.
  • Googlebot may abandon a chain that is too long or process it later, delaying indexing.
  • No direct penalty is applied, but the impact on content discovery is real.
  • Two redirects at most seem to be the practical acceptable limit, although Google does not set an official threshold.
  • Sites that have undergone multiple migrations or structural changes are the most exposed to this issue.

SEO Expert opinion

Is this recommendation consistent with what we observe in the field?

Yes, and it's even one of the few statements from Mueller that enjoys total consensus among SEO practitioners. Technical audits regularly reveal chains of three, four, or even five redirects on sites that have undergone several migrations without cleanup. The impact on response time is measurable, and server logs show that Googlebot sometimes abandons these chains before reaching the final URL.

Tools like Screaming Frog or OnCrawl easily detect these anomalies. On a site with a large number of pages, fixing these chains can free up 10 to 20% of the crawl budget — a significant gain for speeding up the indexing of new sections or freshly published content.

Are there nuances or cases where this rule does not strictly apply?

Let's be honest: on a site with a few dozen pages and an excellent crawl budget, a chain of two redirects is unlikely to cause any visible issues. The impact becomes critical on sites with thousands or even millions of pages, where every millisecond counts and where Googlebot must prioritize its exploration.

Another special case: sites with a very high crawl frequency (news media, real-time content platforms) have less margin for error. A chain of three redirects on a fresh article page can delay its appearance in Google News by several minutes or even hours — a critical delay in this sector.

What risks exist in not correcting these chains immediately?

The main risk is not an algorithmic penalty — Google does not directly penalize redirect chains — but a gradual degradation of crawling speed. New pages take longer to be discovered, content updates are indexed late, and deep pages may remain off the radar for weeks.

On an e-commerce site with frequent product rotations, this could mean that in-stock references are not indexed in time, while out-of-stock products remain visible in the SERPs. An imbalance between reality and indexing that directly impacts the business. [To be verified]: Google has never provided numerical data on the percentage of crawl budget lost per additional redirect jump — field estimates vary from 5 to 15% per step depending on server latency.

If your site has undergone multiple migrations or domain name changes, a complete redirect audit is a priority. Chains accumulated over the years can represent an invisible but measurable hindrance to your SEO performance.

Practical impact and recommendations

How do I identify redirect chains on my site?

Use a technical crawler like Screaming Frog, Oncrawl, or Sitebulb in "redirect tracking" mode. These tools automatically detect URLs that redirect to another URL that is then redirected. Export the report with the number of jumps per chain and prioritize corrections on URLs with the most incoming links or those most strategic for your business.
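As a rough illustration of triaging such an export, here is a Python sketch that ranks redirecting URLs by chain length so the longest chains surface first. The CSV column names and URLs are assumptions for the example, not the actual export format of any particular tool:

```python
import csv
import io

# Stand-in for a crawler export: source URL -> redirect target.
export = io.StringIO("""source,target
/old-a,/old-b
/old-b,/old-c
/old-c,/final
/legacy,/final
""")

redirects = {row["source"]: row["target"] for row in csv.DictReader(export)}

def hops(url):
    """Count how many redirects are followed before reaching a non-redirecting URL."""
    n = 0
    while url in redirects and n <= 10:  # cap guards against loops
        url = redirects[url]
        n += 1
    return n

# Longest chains first: these waste the most crawl budget per visit.
ranked = sorted(redirects, key=hops, reverse=True)
for url in ranked:
    print(url, "->", hops(url), "hop(s)")
```

Cross-reference this ranking with incoming-link counts or traffic data to decide which chains to fix first.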

Server logs are also a gold mine: analyze Googlebot's requests and identify URLs where the bot returns several times without ever reaching the final URL, a sign that it abandons the chain or queues it for later. This gives you a concrete view of the impact on your real crawl budget.
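A minimal sketch of that log mining, assuming the common combined log format; the log lines below are invented, and a production version should also verify Googlebot via reverse DNS rather than trusting the user-agent string:

```python
import re
from collections import Counter

# Hypothetical access-log excerpt in combined log format.
log_lines = [
    '66.249.66.1 - - [16/Apr/2019:10:00:00 +0000] "GET /old-a HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [16/Apr/2019:10:00:01 +0000] "GET /old-b HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [16/Apr/2019:10:00:02 +0000] "GET /final HTTP/1.1" 200 0 "-" "Googlebot/2.1"',
]

pattern = re.compile(r'"GET (\S+) HTTP/[\d.]+" (\d{3})')

# Count how often Googlebot hits a URL that answers with a 3xx.
redirect_hits = Counter()
for line in log_lines:
    if "Googlebot" not in line:
        continue
    m = pattern.search(line)
    if m and m.group(2).startswith("3"):
        redirect_hits[m.group(1)] += 1

print(redirect_hits.most_common())
```

URLs that accumulate many 3xx hits over time are the ones where Googlebot keeps paying the chain tax.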

What should I actually do to fix these chains?

The solution is simple in theory: redirect directly from URL A to the final URL, bypassing all intermediate steps. In practice, this involves modifying .htaccess files, Nginx redirect rules, or redirects managed via your CMS or CDN. Beware of rule conflicts that can create infinite loops — test each modification in a staging environment before deployment.

On sites that have undergone multiple migrations, it may be necessary to completely rebuild the redirect file by mapping each old URL to its current final destination. It’s tedious but essential to clean up the architecture and regain crawling speed.
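That mapping step amounts to flattening the redirect graph: every source should point straight to its ultimate destination. A small sketch of the idea, with invented URLs:

```python
# Hypothetical redirect map accumulated over several migrations.
redirects = {
    "/shop-old": "/shop-2018",
    "/shop-2018": "/shop-2020",
    "/shop-2020": "/shop",
    "/about-old": "/about",
}

def flatten(redirects):
    """Return a map where every source redirects to its final target in one hop."""
    flat = {}
    for src in redirects:
        dst, seen = redirects[src], {src}
        while dst in redirects:
            if dst in seen:
                raise ValueError(f"redirect loop involving {dst}")
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

print(flatten(redirects))
# every legacy URL now resolves in a single redirect
```

The flattened map can then be translated back into .htaccess, Nginx, or CMS rules, with the loop check catching any conflicting entries before they ship.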

What mistakes should I avoid when correcting these chains?

Never remove an intermediate redirect without ensuring that no external or internal URL still points to it. If high-quality backlinks point to URL B that redirects to the final URL, you must first redirect B directly to the final, then update internal links to point directly to the final. Removing B before this update would create 404s.

Another classic trap: cascading redirects caused by protocol or domain changes. For example: http://example.com → https://example.com → https://www.example.com → https://www.example.com/final-page. Each layer (HTTP → HTTPS, non-www → www, old slug → new slug) must be consolidated into a single rule pointing directly to the final canonical version.
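One way to think about the consolidation is to compute the fully canonical URL in a single pass and issue one redirect to it, rather than letting each layer fire its own. A hedged Python sketch; the hostname convention and slug map are assumptions for the example:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical old-slug -> new-slug mapping.
SLUGS = {"/final-page-old": "/final-page"}

def canonical(url):
    """Collapse HTTP->HTTPS, non-www->www, and slug changes into one target."""
    parts = urlsplit(url)
    scheme = "https"
    host = parts.netloc if parts.netloc.startswith("www.") else "www." + parts.netloc
    path = SLUGS.get(parts.path, parts.path)
    return urlunsplit((scheme, host, path, parts.query, parts.fragment))

print(canonical("http://example.com/final-page-old"))
# a single 301 to the canonical version instead of three chained ones
```

Whether this logic lives in application code, an Nginx rule, or a CDN edge function, the goal is the same: one request, one 301, final URL.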

  • Crawl your site with a technical tool to detect all existing redirect chains.
  • Prioritize fixing URLs that have the most incoming links or the highest organic traffic.
  • Modify redirect rules to point directly to the final URL, testing in a staging environment.
  • Check that no internal links still point to intermediate URLs after correction.
  • Analyze server logs post-correction to measure the impact on the frequency and depth of Googlebot's crawl.
  • Document each change to avoid recreating chains during future migrations or redesigns.
Fixing redirect chains is an essential technical optimization to maximize your crawl budget and speed up the indexing of your content. It's a task that can be complex on sites with a history of multiple migrations or heterogeneous technical architecture. For high-volume sites or those that have undergone several redesigns, it is often wise to entrust this audit and these corrections to a specialized SEO agency, capable of mapping all redirects and deploying changes without risking regressions or traffic loss.

❓ Frequently Asked Questions

Is a chain of two redirects problematic for SEO?
In most cases, two redirects are tolerated without any visible impact on crawling. Beyond two hops, delays become measurable, especially on sites with a high volume of pages.
Do chained 301 redirects still pass PageRank?
Yes, Google has confirmed that 301 redirects pass PageRank, even in a chain. However, each hop lengthens crawl time, which can delay the indexing of the final page.
How can I measure the real impact of redirect chains on my crawl budget?
Analyze your server logs to spot URLs where Googlebot makes several requests without reaching the final URL. Compare crawl frequency before and after fixing the chains to quantify the gain.
Should chains on pages with the most backlinks be fixed first?
Absolutely. Pages with many incoming links are crawled more frequently. Fixing their redirect chains maximizes the impact on how quickly Googlebot discovers your content.
Do JavaScript redirects also create problematic chains?
Yes, and it's even worse: Googlebot has to execute the JavaScript to detect the redirect, which lengthens the process further. Always prefer server-side redirects (301/302) for optimal crawling.
