
Official statement

To change your URL structure, use JavaScript to create client-side redirects because server-side redirects do not work after the hash symbol, which is handled in the browser.
🎥 Source video

Extracted from a Google Search Central video

⏱ 1:38 💬 EN 📅 17/09/2019 ✂ 2 statements
TL;DR

Google recommends using JavaScript for handling client-side redirects when the URL structure includes fragments (the # symbol), as traditional server redirects do not process this part of the URL. For SEO, this means adjusting your migration strategy based on whether the old structure included fragments or not. The critical nuance: this approach applies only to specific cases of redesigns involving hash URLs, not to standard migrations.

What you need to understand

Why do server redirects fail on URL fragments?

The hash (#) symbol in a URL creates what is known as a fragment identifier. This fragment is never sent to the server during an HTTP request — it is processed exclusively on the browser side. When a user or bot accesses example.com/page#section1, the server only receives example.com/page.

The direct consequence: it's impossible to set up a 301 or 302 server-side redirect that accounts for what follows the #. Your .htaccess file or Nginx configuration simply does not see this part. If your old structure relied on URLs like site.com/#/products/shoes (typical of older single-page JavaScript applications), migrating to site.com/products/shoes requires a different approach.
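The client/server split described above can be observed directly with the standard URL API. A minimal sketch (the domain and hashbang path are illustrative):

```javascript
// The fragment is parsed client-side only: compare what the browser's
// JavaScript sees with what an HTTP request actually carries.
const visited = new URL("https://example.com/#/products/shoes");

console.log(visited.hash);      // "#/products/shoes", available to JavaScript
console.log(visited.pathname);  // "/", the only path the server receives

// The HTTP request line for this URL omits the fragment entirely:
const requestLine = `GET ${visited.pathname}${visited.search} HTTP/1.1`;
console.log(requestLine);       // "GET / HTTP/1.1"
```

This is why no `.htaccess` rule or Nginx `location` block can ever match on the fragment: by the time the request reaches the server, the `#` and everything after it are gone.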

In what context does this recommendation actually apply?

This directive from Mueller targets a specific use case: sites that have used hash-based URLs (often with first-generation AngularJS or older SPAs) and want to migrate to a clean URL architecture. Typically, we're talking about JavaScript frameworks that routed everything via the fragment before the HTML5 History API became standard.

If you have never had a # in your production URL structure, this recommendation does not concern you. The classic migration of a WordPress, Drupal, or traditional e-commerce site remains managed by 301 server redirects without any client-side JavaScript.

How does JavaScript help bypass this technical limitation?

Since the fragment is accessible to the browser, JavaScript can read it via window.location.hash and trigger a client-side redirect. The script detects the old URL format, extracts information from the fragment, and redirects to the new clean structure using window.location.replace() or history.replaceState().
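A minimal sketch of that detect-and-redirect flow, assuming the old scheme was a hashbang path (`hashToCleanPath` is a hypothetical helper, not a standard API):

```javascript
// Hypothetical sketch: convert an old hashbang fragment into a clean path.
// Assumes the old structure looked like site.com/#/segment/segment.
function hashToCleanPath(hash) {
  // "#/products/shoes" -> "/products/shoes"; anything else -> null
  if (!hash.startsWith("#/")) return null;
  return hash.slice(1); // drop the leading "#"
}

// Browser glue (a no-op outside a browser environment):
if (typeof window !== "undefined" && window.location.hash) {
  const target = hashToCleanPath(window.location.hash);
  if (target) {
    // replace() avoids leaving the old hash URL in the browser history
    window.location.replace(target);
  }
}
```

Note the guard on plain in-page anchors like `#section1`: the helper returns `null` for them, so normal fragment navigation is left untouched.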

Googlebot now executes JavaScript quite reliably — which means that a client-side JavaScript redirect is generally followed and interpreted. But be cautious: this adds a layer of complexity and potential delays in processing, unlike an instant HTTP redirect.

  • HTTP server redirects do not see what follows the # in the URL
  • Client-side JavaScript can read window.location.hash and programmatically redirect
  • This approach only concerns migrations involving existing hash URLs
  • Googlebot executes the JavaScript but with a latency higher than that of classic HTTP redirects
  • Classic migrations without fragments are still handled by server-side 301s — nothing changes for 95% of cases

SEO Expert opinion

Is this recommendation really the only technical solution available?

Mueller presents JavaScript redirection as the solution, but let's be precise: it is the solution when you are trying to preserve old URLs with fragments. In reality, the best approach is often never to have used indexable hash URLs in the first place. If you are in the design phase of a modern SPA, prefer History API mode (clean URLs without #) over hash mode.

For an existing migration, yes, JavaScript becomes necessary — but this introduces points of fragility. The script must load and execute correctly. If a bot does not execute JavaScript (some third-party crawlers still don't), the redirect fails. And even Googlebot, despite its advancements, can sometimes index the pre-redirect state if rendering is slow. Verify systematically through Search Console and rendering tests.

What practical risks does this approach actually entail for practitioners?

The first constraint: execution latency. An HTTP 301 redirect is instantaneous, processed before the page even loads. A JavaScript redirect requires downloading the HTML and the script, then parsing and executing it — several hundred additional milliseconds at minimum. For Googlebot, which manages a limited crawl budget, this delay can reduce the number of pages crawled.

The second pitfall: managing historical SEO signals. A server 301 redirect cleanly passes PageRank and ranking signals. With JavaScript, Google must understand that URL A redirects to URL B — which usually works, but with fewer formal guarantees. Monitoring changes in positions and organic traffic post-migration becomes absolutely critical.

Attention: If your migration involves thousands of URLs with fragments, test first on a smaller sample. Check in Search Console that Google is correctly following JavaScript redirects and transferring indexing to the new URLs before doing a massive deployment.

In which scenarios does this directive absolutely not apply?

Any standard migration without fragments falls under classic server-side redirects. If you’re moving from example.com/old-page to example.com/new-page, a simple 301 in the .htaccess or server config is more than sufficient. Don’t complicate things unnecessarily with JavaScript.

The same applies to domain changes, directory structure modifications, or content consolidations. Mueller's statement targets a niche technical case — not the majority of redesigns. Too many SEOs apply complex solutions to simple problems. Here, reserve JavaScript for URLs with # actually present in your logs or your Search Console.

Practical impact and recommendations

How do you concretely implement a JavaScript redirect for hash URLs?

First step: audit your indexed old URLs via Search Console to identify those containing fragments. Export the complete list and analyze the patterns. If you had a structure like site.com/#/category/product, map the correspondence to the new clean URLs site.com/category/product.

Next, integrate a redirection script in the <head> of your old pages (or globally if the entire structure was hash-based). This script should read window.location.hash, parse its content, and execute window.location.replace(newURL) — not assign window.location.href, which would create an entry in the browser history. Use replace() or, better yet, history.replaceState() for a clean transition.
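The mapping step above can be sketched with an explicit correspondence table (`FRAGMENT_MAP` and `resolveFragment` are hypothetical names; in practice the entries would come from your Search Console export):

```javascript
// Hypothetical sketch: resolve an old fragment against an explicit
// correspondence table built from the audited URL list.
const FRAGMENT_MAP = {
  "#/category/product": "/category/product",
  "#/products/shoes": "/products/shoes",
};

function resolveFragment(hash, map) {
  // Exact match only; unknown fragments return null for separate handling
  return map[hash] || null;
}

// Browser glue (a no-op outside a browser environment):
if (typeof window !== "undefined") {
  const target = resolveFragment(window.location.hash, FRAGMENT_MAP);
  if (target) {
    // replace(), not href: no extra history entry for the old URL
    window.location.replace(target);
  }
}
```

An explicit table is more verbose than a generic rewrite rule, but it doubles as the migration documentation the article recommends keeping, and it makes unmapped fragments easy to detect.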

What technical errors must absolutely be avoided during this migration?

A frequent mistake: forgetting to test Googlebot’s rendering. Use the URL inspection tool in Search Console to check that the bot properly detects the redirect and indexes the new target URL. If the rendering still shows the old URL, your script has an execution or timing problem.

Another pitfall: not managing edge cases. What happens if a user arrives at a fragment URL that doesn’t have a match in your new structure? Plan a default redirect to a relevant page (parent category, homepage) instead of a silent JavaScript error. And document each mapping — you’ll need it six months later when a question arises.
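One way to sketch that fallback logic, assuming a hypothetical `redirectTarget` helper that tries an exact match, then the parent path, then a default page:

```javascript
// Hypothetical sketch: never leave an unmapped fragment URL stranded.
// Tries: exact match -> parent path -> configurable fallback.
function redirectTarget(hash, map, fallback = "/") {
  if (map[hash]) return map[hash];
  // Try the parent path: "#/products/discontinued-item" -> "#/products"
  const parent = hash.replace(/\/[^/]*$/, "");
  if (map[parent]) return map[parent];
  return fallback;
}
```

With `{"#/products": "/products"}` as the map, a discontinued product URL like `#/products/old-item` lands on the `/products` category page, and a completely unknown fragment falls back to the homepage instead of dying in a silent script error.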

How do you check that the migration was successful without losing visibility?

Monitor daily for the first 4-6 weeks: positions of strategic keywords, organic traffic by landing page, indexing rate in Search Console. Compare the curves before/after migration while isolating seasonality. A sharp drop signals a redirect or detection issue with Google.

Also, check that the old URLs with fragments gradually disappear from the index in favor of the new ones. Use targeted site: queries to track the evolution. If after 2-3 months Google is still massively indexing the old URLs, your JavaScript redirect strategy shows a flaw — it’s time to investigate server logs and coverage reports.

  • Thoroughly map all indexed URLs with fragments in Search Console
  • Implement a redirection script using window.location.replace() or history.replaceState()
  • Test each redirect pattern using Google’s URL inspection tool
  • Gradually deploy by sections of the site if the volume is significant
  • Monitor positions, organic traffic, and indexing for at least 6 weeks post-migration
  • Plan default redirects for orphan URLs without matches
Migrations involving hash URLs require a meticulously tested client-side JavaScript approach, with close monitoring of SEO metrics. This type of technical overhaul carries specific risks — rendering delays, signal transmission, bot compatibility — that often justify the support of a specialized SEO agency versed in both JavaScript and the nuances of how Googlebot handles client-side redirects.

❓ Frequently Asked Questions

Are JavaScript redirects as effective as server-side 301s for SEO?
JavaScript redirects work for Googlebot but introduce execution latency and offer fewer formal guarantees about passing ranking signals. A server-side 301 remains the preferred approach whenever technically possible; JavaScript is only necessary for URLs with fragments (#).
Does Googlebot systematically follow redirects implemented in client-side JavaScript?
Googlebot executes modern JavaScript fairly reliably, but with variable delays depending on script complexity and server load. It is essential to test each redirect with Search Console's URL inspection tool before any mass deployment.
Can server and JavaScript redirects be combined for a migration involving fragments?
Yes, it is even recommended: use server-side 301s for all URLs without a fragment, and reserve JavaScript solely for handling old URLs containing a #. This hybrid approach minimizes dependence on JavaScript rendering.
Should the old fragment URLs be kept after the JavaScript redirects are in place?
Keep them temporarily (3-6 months) to let Google reindex and transfer signals, then retire them cleanly by returning 404 or 410 once Search Console shows complete indexing of the new URLs.
Does this JavaScript redirect technique impact Core Web Vitals and load time?
Yes, a JavaScript redirect necessarily adds delay compared with an instant HTTP redirect. Optimize the script (inline it as critical, execute it immediately) and monitor LCP and CLS to detect any degradation of the user experience.