What does Google say about SEO?

Official statement

Avoid using URL fragments if you want crawlers to discover and follow your links. Fragment identifiers are not designed to point to different content and are ignored by crawlers.
🎥 Source: a Google Search Central video, published 29/04/2020 (duration 4:48, 3 statements extracted). This statement appears at 4:17 — watch on YouTube.

Other statements from this video:
  1. 1:00 — How do internal links really shape the topical relevance of your pages?
  2. 3:10 — Are your JavaScript links wrecking your crawl budget, and how can you fix it?
TL;DR

Google completely ignores URL fragments (everything after the #) during crawling and link tracking. This means if your navigation relies on anchors like example.com/page#section2, crawlers will never discover the targeted content. For SEO, this means rethinking the architecture of single-page applications (SPAs) and JavaScript applications that overuse fragment routing.

What you need to understand

What is a URL fragment and why does it exist?

A URL fragment is the portion of a web address that follows the hash symbol (#). Originally designed to allow intra-page navigation to specific HTML anchors, this mechanism has been largely hijacked by modern JavaScript frameworks (React, Angular, Vue.js in hash mode) to simulate client-side routing.

The problem? Crawlers treat fragments as client-side metadata, not as identifiers for distinct content. When Googlebot encounters example.com/blog#article-5, it only processes example.com/blog. Everything that follows the # disappears from the crawl equation.
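This normalization is easy to see with the standard WHATWG URL API (a sketch of the general behavior; Googlebot's actual pipeline is not public):

```javascript
// Sketch: how a fragment is dropped when a crawler normalizes a link.
// Uses the WHATWG URL API (built into Node.js and browsers).
const link = new URL('https://example.com/blog#article-5');

console.log(link.hash); // '#article-5' — visible to the client only

// What a crawler effectively fetches: everything except the fragment.
const crawlTarget = link.origin + link.pathname + link.search;
console.log(crawlTarget); // 'https://example.com/blog'
```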

How does this limitation actually affect indexing?

If your site uses fragments to dynamically load different content, you create ghost URLs that Google cannot follow or index. A navigation menu with links like /products#category-shoes or /services#pricing generates dead links for bots.

Old Single Page Applications (SPAs) in hash router mode are particularly vulnerable. Each internal "page" remains invisible to traditional crawling unless you implement a prerendering or Server-Side Rendering (SSR) system that turns these fragments into real URLs with server state.

What exceptions and edge cases should be noted?

Fragments remain useful and legitimate for their original purpose: anchoring to sections of the same page. An internal link to /guide#chapter-3 works perfectly well for the user experience, but it does not distribute PageRank to a separate page, since there is no separate page.

Some argue that JavaScript can compensate by manipulating the History API to turn fragments into real URLs. It can, but it requires rigorous implementation plus server-side rendering that exposes crawlable URLs from the start. Even though Googlebot renders JavaScript, it does not treat fragment changes as distinct URLs, so the crawler will not magically discover your content.

  • Fragments (#) are ignored by Googlebot during crawling and internal link tracking
  • Any content accessible only via fragment remains invisible without SSR or prerendering
  • Hash router SPAs must migrate to history mode with real server URLs
  • The legitimate use of fragments is limited to intra-page anchoring for user experience
  • Client-side JavaScript does not compensate for the lack of crawlable URLs on the server side
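The first two bullets can be demonstrated with a small sketch: a navigation menu of five fragment links collapses to just two crawlable URLs once fragments are stripped (the URLs below are hypothetical, for illustration only):

```javascript
// Sketch: what a menu of fragment links looks like from a crawler's
// perspective — every fragment collapses onto the same base URL.
function crawlableUrls(links, base) {
  const seen = new Set();
  for (const href of links) {
    const u = new URL(href, base);
    u.hash = ''; // crawlers discard the fragment
    seen.add(u.href);
  }
  return seen;
}

// A "navigation menu" with five apparent destinations...
const menu = [
  '/products#category-shoes',
  '/products#category-bags',
  '/products#category-hats',
  '/services#pricing',
  '/services#contact',
];
const discovered = crawlableUrls(menu, 'https://example.com');
console.log(discovered.size); // 2 — only /products and /services are discoverable
```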

SEO Expert opinion

Is this statement consistent with field observations?

Absolutely. Empirical tests consistently confirm that fragments disappear from crawl logs and Search Console coverage reports. No surprise here: this behavior has been documented since the early days of the web, but is regularly rediscovered by each generation of developers who reinvent client-side routing.

The real issue is that many modern JavaScript frameworks long encouraged hash routing by default because it is simple to deploy (no server configuration needed). As a result, thousands of sites believe they have 50 indexable pages when they expose only one to the crawler.

What nuances must be considered for this rule?

The essential nuance concerns the use of fragments for A/B testing or tracking. Some tools append session parameters after the # to avoid polluting the canonical URL. In that specific case, the fact that Google ignores the fragment is a blessing, not a problem.

A second nuance: fragments can coexist with crawlable content if the underlying architecture relies on real URLs. A well-structured site can have example.com/article-5 as an indexable URL AND offer example.com/article-5#comment-section for UX. The fragment enhances the experience without breaking indexing.

In what situations might this rule evolve?

[To be verified] Google has experimented in the past with the #! (hashbang) scheme to make fragments crawlable via a snapshot system. This approach was officially abandoned in 2015, but we cannot exclude the possibility of a new method emerging in response to the omnipresence of modern JavaScript.

That said, the industry has clearly moved towards cleaner solutions like Server-Side Rendering (Next.js, Nuxt.js) and Static Site Generation. The trend is not to make fragments crawlable, but to eliminate them completely from SEO-friendly architectures.

Warning: if your site still uses hash routing in production, you are likely losing organic traffic without knowing it. A quick crawl audit with Screaming Frog or Sitebulb will show you the extent of the damage: undiscovered internal URLs, broken crawl depth, and diluted internal PageRank.

Practical impact and recommendations

How can you quickly audit if your site is suffering from this issue?

First step: inspect your server logs and compare them with your internal link structure. If you see internal links with # in your HTML but no corresponding crawl trace in the logs, you have confirmation of the problem. Search Console will never show you these ghost URLs in the coverage report.

Second step: crawl your site with a tool that emulates Googlebot with JavaScript disabled (Screaming Frog in "Render: Text Only" mode). Compare the number of discovered pages with your XML sitemap or your expected inventory. A significant gap often indicates a fragment routing or excessive JavaScript dependency issue.
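As a quick first pass before a full crawl, you can flag fragment-heavy internal navigation with a simple scan of your HTML (a rough sketch; a real audit tool should parse the DOM properly rather than use a regex):

```javascript
// Rough sketch: flag internal links whose destination includes a fragment.
// Good enough as a smoke test; a production audit should parse the DOM.
function fragmentOnlyLinks(html) {
  const hrefs = [...html.matchAll(/href="([^"]+)"/g)].map(m => m[1]);
  // Keep internal (relative) links that carry a fragment.
  return hrefs.filter(h => h.includes('#') && !h.startsWith('http'));
}

const sample = `
  <a href="/products#category-shoes">Shoes</a>
  <a href="/about">About</a>
  <a href="/services#pricing">Pricing</a>`;
console.log(fragmentOnlyLinks(sample));
// ['/products#category-shoes', '/services#pricing']
```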

What technical migrations should be considered concretely?

If you are using a framework like React Router, Vue Router, or Angular Router in hash mode, migrating to "history" mode (or "HTML5 mode") should be your top priority. This requires server config to handle rewrites: all URLs must point to your application entry point which then manages the routing.
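With Nginx, for example, the usual pattern is a try_files fallback to the SPA entry point so that history-mode URLs resolve (a sketch; the server_name and paths are placeholders to adapt to your setup):

```nginx
server {
    listen 80;
    server_name example.com;   # placeholder
    root /var/www/app/dist;    # your built SPA

    location / {
        # Serve the file if it exists, otherwise fall back to the
        # entry point so history-mode routes resolve client-side.
        try_files $uri $uri/ /index.html;
    }
}
```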

For complex sites, consider a hybrid architecture with partial SSR: high-SEO stakes pages (categories, product sheets, articles) are rendered server-side, while purely UX interactions (modals, tabs, accordions) can remain client-side with fragments. You maintain a fluid experience without sacrificing indexing.

What mistakes to avoid during the redesign?

Never remove fragments outright without a redirect plan. If your URLs with # have accumulated external backlinks or social shares (rare but possible), you need to map these old URLs to the new ones. Set up server-side or JavaScript detection to redirect properly.
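One pragmatic pattern is a small shim shipped on the old entry point that maps legacy hash routes to the new real URLs (a sketch; the route table below is hypothetical and must be built from your actual migration plan):

```javascript
// Sketch: map a legacy hash route to its new real URL.
// The route table is hypothetical — derive yours from the migration mapping.
const legacyRoutes = {
  '#/products': '/products',
  '#/article-5': '/blog/article-5',
};

function migrateHashUrl(href) {
  const u = new URL(href);
  const target = legacyRoutes[u.hash];
  return target ? u.origin + target : null; // null: nothing to redirect
}

// In the browser you would then call:
//   const dest = migrateHashUrl(location.href);
//   if (dest) location.replace(dest); // replace() avoids polluting history
console.log(migrateHashUrl('https://example.com/#/article-5'));
// 'https://example.com/blog/article-5'
```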

Second trap: not testing server rendering before going live. A poorly configured SSR can generate empty content or JavaScript errors that break indexing even worse than before. Use the URL inspection tool in Search Console to validate that the rendered content matches expectations.

  • Audit server logs to identify uncrawled URLs with fragments
  • Migrate from hash routing to history mode in your JavaScript frameworks
  • Configure server rewrites (Apache, Nginx) to support HTML5 routing
  • Implement SSR or prerendering for critical SEO pages
  • Set up redirects for old fragment URLs if they have backlinks
  • Validate rendering with the Search Console inspection tool before and after migration
Eliminating URL fragments for SEO-critical content is a technical operation that affects server architecture, framework configuration, and application logic. The complexity varies significantly depending on the tech stack (Next.js makes it easier, an old Angular site may require partial overhaul). If your internal team lacks expertise on this type of migration or if you want to avoid costly visibility errors, consulting with a specialized technical SEO agency may be wise to secure the transition and measure the real impact on your rankings.

❓ Frequently Asked Questions

Do fragments (#) pass PageRank to the targeted sections?
No. Fragments do not create distinct links in Google's eyes, so no PageRank is distributed. The link points to the base URL without the fragment.
Can an XML sitemap contain URLs with fragments to force indexing?
Technically yes, but Google will ignore everything after the #. You would in effect be submitting the same base URL several times, needlessly diluting your crawl budget.
Do fragments cause problems for internal linking and crawl depth?
Absolutely. If your internal links use fragments to navigate to distinct content, you create dead ends for the crawler. Crawl depth stays stuck at the page containing the fragments.
Does dynamic prerendering (such as Prerender.io) solve the problem for good?
It compensates by generating static HTML for bots, but it is a band-aid. An architecture with natively crawlable real URLs remains superior in maintainability and cost.
Do modern frameworks (Next.js, Nuxt) automatically avoid this trap?
Largely yes, because they favor file-system-based routing with clean URLs and native SSR. But a bad configuration or the use of certain client-only components can reintroduce the problem.

