
Official statement

URLs with AJAX fragments (hashbangs) are not ideal for SEO. Prefer a clean URL structure that works without the need for client-side scripts.
🎥 Source: extracted from a Google Search Central video (statement at 41:02)

⏱ 56:12 💬 EN 📅 30/11/2017 ✂ 13 statements
Other statements from this video (12)
  1. 2:45 Must the Google snippet always match the landing page exactly?
  2. 3:45 Does Google really detect your multilingual site's language all by itself?
  3. 10:01 Do you really need multiple domains for international SEO?
  4. 12:02 Can Google ignore your language versions if they look too similar?
  5. 12:41 Do iframes really hurt your site's SEO?
  6. 19:33 Why does Search Console report structured data errors found nowhere else?
  7. 22:11 How does hreflang really determine which version of your site Google shows?
  8. 22:25 Should you really treat your AMP pages as primary content to get them indexed?
  9. 34:12 Why does Google progressively drop pages that redirect to 403 errors?
  10. 38:24 How does Google really handle duplicate internal links on the same page?
  11. 51:10 Is page load speed really a Google penalty criterion?
  12. 61:18 Why can a double AMP/desktop canonical kill the display of your pages?
Official statement from 30/11/2017 (8 years ago)
TL;DR

Google claims that URL structures based on AJAX fragments (hashbangs) harm SEO. The recommendation is clear: favor clean URLs that work without client-side JavaScript. In practical terms, this means switching to traditional server URLs or adopting server-side rendering for your JavaScript applications.

What you need to understand

What is a hashbang and why was it used?

The hashbang (#!) was a popular technique in the 2010s for creating single-page applications (SPAs) with dynamic content. The idea was simple: use the URL fragment after the hash to load different views via JavaScript without reloading the page.

Google even proposed a translation system where example.com/#!/article became example.com/?_escaped_fragment_=/article for bots. This system has since been abandoned, making hashbangs even more problematic for indexing.
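That old translation can be sketched as a small helper. This is only an illustration of the now-deprecated scheme, not a current API, and the `toEscapedFragmentUrl` name is made up:

```javascript
// Sketch of the deprecated _escaped_fragment_ mapping (scheme abandoned by Google).
function toEscapedFragmentUrl(url) {
  const u = new URL(url);
  // Only hashbang fragments (#!...) were eligible for translation.
  if (!u.hash.startsWith('#!')) return url;
  const fragment = u.hash.slice(2); // drop the "#!"
  u.hash = '';
  // URLSearchParams percent-encodes the value, so "/" becomes "%2F".
  u.searchParams.set('_escaped_fragment_', fragment);
  return u.toString();
}

console.log(toEscapedFragmentUrl('https://example.com/#!/article'));
// https://example.com/?_escaped_fragment_=%2Farticle
```

Bots were expected to request the `_escaped_fragment_` form and receive a pre-rendered snapshot; since Google dropped the scheme, nothing consumes that form anymore.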

Why do hashbangs pose a technical problem for indexing?

URL fragments (everything following the #) are never sent to the server during an HTTP request. Only client-side JavaScript can read them and act accordingly. For Googlebot, this means extra work: executing JS, waiting for content to load, and hoping everything works.
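A quick way to see this: parsing a hashbang URL with the standard `URL` class shows that the fragment is not part of the path or query the server ever receives (the example URL is illustrative):

```javascript
// The fragment lives only in the browser; it never appears in the HTTP request line.
const u = new URL('https://example.com/#!/product/123');

console.log(u.pathname); // "/"              <- all the server sees of this "page"
console.log(u.search);   // ""               <- no query string either
console.log(u.hash);     // "#!/product/123" <- readable only by client-side JS
```

From the server's point of view, every view of such a SPA is the same single URL.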

The reality on the ground? The crawl budget is wasted, rendering time skyrockets, and indexing becomes erratic depending on the complexity of your JavaScript code. Not to mention other engines like Bing or Yandex, which have historically performed worse with JS rendering.

What does Google really mean by a “clean URL structure”?

Google refers to URLs that function without client-side scripts. This means a typical HTTP request to the URL should return the complete content, or at minimum, a version of HTML that is usable by bots.

Current technical solutions include Server-Side Rendering (SSR), Static Site Generation (SSG), or progressive hydration. The goal: ensure that each URL corresponds to an identifiable server resource with a 200 HTTP code and content in the initial HTML.

  • Avoid URLs like site.com/#!page/article or site.com/#/product/123
  • Prefer URLs like site.com/page/article or site.com/product/123
  • Ensure the main content is present in the source HTML, not just injected by JavaScript afterwards
  • Test your URLs with a simple curl or by disabling JavaScript: the essential content should be visible
  • Check in the Search Console that your pages are indexed with the correct content, not just an empty shell
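The curl test above can be approximated in code. A minimal sketch assuming Node 18+ (global `fetch`); the URL and marker string are placeholders for your own pages and critical content:

```javascript
// Fetch the raw HTML exactly as a non-rendering bot would (no JS execution)
// and check that an essential piece of content is already present.
async function contentInInitialHtml(url, marker) {
  const res = await fetch(url);
  if (res.status !== 200) return false; // the URL should answer 200 with real content
  const html = await res.text();
  return html.includes(marker);
}
```

If this returns false for a page's main heading or product name, that content only exists after client-side rendering.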

SEO Expert opinion

Is this directive still relevant with Google's advances in JavaScript?

Googlebot has indeed made progress in JavaScript rendering since 2015, but claiming it perfectly handles all modern frameworks is a myth. Real-world observations show erratic behaviors: timeouts on heavy resources, issues with asynchronous API requests, and partial indexing on sites with lots of JS.

Müller's position remains relevant. Even if Googlebot can technically index content loaded in JS, it remains resource-intensive and less reliable than a traditional server URL. This difference becomes critical for your visibility on sites with thousands of pages.

What nuances should be considered based on the type of site?

A brochure site with 20 pages can afford extensive JavaScript if crawl budget is not an issue. But an e-commerce site with 10,000 product listings? There, every millisecond of rendering matters, and hashbangs become a structural disadvantage.

Complex web applications (SaaS, platforms) often have a public section that must be indexed and a private section (dashboard) which does not. For the public section, using SSR or SSG becomes non-negotiable. For the private section behind authentication, it doesn't matter: hashbangs have no SEO impact since these pages should not be indexed.

What should you do if your site still uses hashbangs today?

Let’s be honest: migrating a complete architecture is not trivial. It often involves redesigning the routing of your application, implementing SSR, and managing 301 redirects from old URLs. The risk of temporary traffic loss exists.

Two pragmatic approaches: either a progressive migration by sections of the site (start with priority pages), or implementing a hybrid system where clean URLs co-exist temporarily with the old ones. In any case, a prior technical audit is essential to identify dependencies. [To verify]: no Google data precisely quantifies the negative SEO impact of hashbangs versus clean URLs, but case studies show indexing gains of 30 to 50% after migration.

Practical impact and recommendations

How can you check if your site is affected by this issue?

The first step: open your site and look at the address bar. Do you see #! or #/ in your navigation URLs? If yes, you are affected. Second test: disable JavaScript in your browser (DevTools > Settings > Disable JavaScript) and navigate your site.

If the main content disappears or if navigation breaks completely, that's a red flag. Also, use the URL Inspection Tool in the Search Console to compare the raw HTML and the rendered HTML: a significant gap indicates excessive dependency on JavaScript.
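The raw-versus-rendered comparison can be roughed out in code. A naive sketch: the word-set heuristic here is our own, not a Google metric, and `rawHtml` / `renderedHtml` stand in for "View Source" output and the rendered HTML from the inspector:

```javascript
// Estimate what share of the rendered content is missing from the raw HTML.
// 0.0 = everything already server-side, 1.0 = everything injected by JS.
function jsDependencyRatio(rawHtml, renderedHtml) {
  const words = h => h.replace(/<[^>]*>/g, ' ').split(/\s+/).filter(Boolean);
  const rawSet = new Set(words(rawHtml));
  const rendered = words(renderedHtml);
  if (rendered.length === 0) return 0;
  const missing = rendered.filter(w => !rawSet.has(w)).length;
  return missing / rendered.length;
}
```

A ratio close to 1 on key pages is exactly the "empty shell" situation described above.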

What concrete actions can be taken to correct the architecture?

The solution depends on your tech stack. With React, choose Next.js, which offers SSR and SSG natively. With Vue, Nuxt.js plays the same role. Angular offers Angular Universal for server-side rendering.

If you're starting from scratch or redesigning, favor Static Site Generation when possible: generate all your pages in static HTML at build time, and SEO becomes trivial. For highly dynamic content (changing prices, stock), combine SSG with client-side API requests for volatile data only.

What pitfalls should you avoid during migration?

The classic pitfall: forgetting 301 redirects from the old URLs with hashbang. The problem: hashbangs are not sent to the server, so a standard server redirect can't intercept them. Solution: manage redirects in client-side JavaScript for old URLs, then redirect to the new clean URLs.
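A minimal sketch of that client-side redirect logic (the function name and URL shapes are illustrative); in the browser you would pass the result to `window.location.replace`:

```javascript
// Fragments never reach the server, so a 301 cannot intercept them:
// compute the clean URL client-side, or null if no redirect is needed.
function legacyHashbangTarget(href) {
  const u = new URL(href);
  if (!u.hash.startsWith('#!')) return null;
  const path = u.hash.slice(2); // "#!/product/123" -> "/product/123"
  return u.origin + (path.startsWith('/') ? path : '/' + path);
}

// In the browser:
// const target = legacyHashbangTarget(window.location.href);
// if (target) window.location.replace(target);
```

Note that this client-side hop does not pass a real 301 signal; it only rescues visitors and lets bots that execute JS eventually discover the clean URL.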

Another common mistake: implementing SSR but forgetting essential meta tags (title, description, canonical) in the initial render. Verify that your SSR generates complete HTML with all SEO metadata in the first server response, not just an empty structure filled in later by JS.
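A crude check for this can be scripted. A sketch that scans the first server response for the SEO-critical tags mentioned above; the regex checks are approximate, and a real audit would use an HTML parser:

```javascript
// Scan raw server HTML for SEO-critical tags that must exist before any JS runs.
function auditSsrMetadata(html) {
  return {
    hasTitle: /<title>[^<]+<\/title>/i.test(html),
    hasDescription: /<meta[^>]+name=["']description["']/i.test(html),
    hasCanonical: /<link[^>]+rel=["']canonical["']/i.test(html),
  };
}
```

Run it against the first byte stream your server returns (not the DOM after hydration): any `false` here means the tag is being filled in later by JS.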

  • Audit all your URLs to identify those containing fragments #! or #/
  • Test visible content with JavaScript disabled on a representative sample of pages
  • Compare the source HTML (View Source) to the rendered HTML in the inspector to measure JS dependency
  • Plan a migration strategy suitable for your framework (SSR, SSG, or hybrid)
  • Set up a complete mapping of old/new URL format with 301 redirects managed client-side if necessary
  • Monitor indexing in the Search Console after migration and quickly resolve any abnormal drops

Migrating from a hashbang architecture to clean URLs is a significant technical project that touches the core of your application. Between analyzing the existing structure, choosing the appropriate rendering solution (SSR, SSG, hydration), managing redirects, and monitoring post-migration, the expertise required often exceeds internal resources. If your site generates significant traffic or has several thousand pages, support from an SEO agency specializing in JavaScript architectures can secure the transition and avoid costly visibility losses.

❓ Frequently Asked Questions

Are hashbangs (#!) completely obsolete for SEO in 2025?
Yes. Google abandoned the _escaped_fragment_ system that used to handle them. Today they only complicate indexing without providing any benefit.
Can a site with hashbangs still be indexed by Google?
Technically yes, if Googlebot manages to render the JavaScript. But it is less reliable, slower, and needlessly consumes your crawl budget. Clean URLs consistently perform better.
What is the difference between a hashbang (#!) and a plain hash (#)?
The hashbang was a specific convention for flagging dynamic AJAX content. The plain hash (#) traditionally marks in-page anchors. Neither is sent to the server, but the hashbang was meant to signal a particular indexing need, a scheme that has since been abandoned.
Is Server-Side Rendering the only way to replace hashbangs?
No. Static Site Generation (SSG) and progressive hydration also work. The goal is for every URL to return usable HTML content from the server, whatever the technique.
How do you handle 301 redirects from hashbang URLs?
Fragments (#) are not sent to the server, so classic 301 redirects cannot work. You have to handle the redirect in client-side JavaScript: detect the old URL format and redirect to the new clean URL.

