
Official statement

To optimize SEO, it is crucial that your site's content is delivered in the initial HTML sent to the client, without waiting for JavaScript execution. Googlebot goes through two waves of indexing; the first executes no JavaScript at all. The absence of initial rendering can harm your site's indexing, especially if it is large or frequently updated.
🎥 Source video

Extracted from a Google Search Central video (duration 3:10, English, published 10/04/2019).

Other statements from this video:
  1. 1:04 Are title tags and meta descriptions really decisive for your visibility in Google?
  2. 2:39 Do you need to systematically pre-render your JavaScript for Googlebot to index it correctly?
Official statement from Martin Splitt (2019)
TL;DR

Google indexes your site in two waves: the first completely ignores JavaScript. The result: any content that relies on JS rendering doesn't exist during this first pass and risks never being considered if your site is large or frequently updated. Specifically, a site that delivers its content as raw HTML has a massive indexing advantage over a competitor that relies on React or Vue to display its titles, descriptions, or internal links.

What you need to understand

What's the deal with these 'two waves' of indexing?

Googlebot crawls your page and retrieves the initial HTML — the one that comes directly from the server. At this stage, nothing is executed: no JavaScript, no front-end frameworks, no fetch API. This first wave forms the backbone of indexing.
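To make the first wave concrete, here is a minimal sketch of what a no-JS crawler can extract from two hypothetical pages: a pure SPA shell and a server-rendered page. The markup samples are invented for illustration, not taken from any real site.

```python
# Illustrative sketch: Googlebot's first wave only sees the raw HTML.
from html.parser import HTMLParser

class TextAndLinks(HTMLParser):
    """Collects visible text and href links, the way a no-JS crawler would."""
    def __init__(self):
        super().__init__()
        self.text, self.links = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]
    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

def first_wave_view(html: str):
    parser = TextAndLinks()
    parser.feed(html)
    return parser.text, parser.links

# A pure SPA shell: content only appears after JS runs.
spa_shell = '<html><body><div id="root"></div><script src="app.js"></script></body></html>'

# Server-rendered HTML: content is there from the first request.
ssr_page = '<html><body><h1>Red shoes</h1><a href="/cart">Cart</a></body></html>'

print(first_wave_view(spa_shell))  # -> ([], []): nothing to index
print(first_wave_view(ssr_page))   # -> (['Red shoes', 'Cart'], ['/cart'])
```

The SPA shell yields no text and no links during the first pass, which is exactly the problem the second wave is supposed to fix later.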

The second wave comes later — sometimes hours or days later — and executes JavaScript to complete the rendering. But here's the catch: if your critical content (titles, meta, internal linking, paragraphs) is only available after this second pass, you've already wasted precious time. And that’s where the trouble begins.

Why does this strategy pose problems for large sites?

A site with 10,000 pages that relies on JavaScript to display its product URLs or internal navigation links creates a bottleneck. Googlebot cannot re-crawl everything in JS rendering mode: it's too resource-intensive.

The result: some pages may never pass the second wave, or if they do, they arrive so late that frequent updates are never taken into account. Your site becomes a ghost for Google — technically crawled but indexed incompletely.

Does server-side rendering really solve everything?

SSR (Server-Side Rendering) and static pre-rendering become survival mechanisms in this ecosystem. They ensure that the initial HTML already contains the critical content, without waiting for the browser to execute code.

But be careful: poorly configured SSR may still send an empty skeleton to the bot, especially if you serve different versions based on user-agent. Google can't be fooled: it analyzes what comes back in the initial HTTP response, period.

  • Initial HTML = first impression: anything not included is considered secondary by Googlebot during the first wave.
  • Limited crawl budget: on a large site, the second wave may never reach certain pages or may arrive too late.
  • Frequent updates = increased risk: if your content changes rapidly (e-commerce, news), the gap between the two waves becomes a major handicap.
  • SSR/SSG mandatory: to ensure critical content is present from the first pass, server-side rendering becomes essential.
  • No JS magic: a pure SPA (Single Page Application) site without HTML fallback takes a huge risk on complete indexing.

SEO Expert opinion

Is this statement consistent with real-world observations?

Yes, and it’s even a stark realization for anyone who has migrated a site to a full-JS stack without SSR. We regularly see massive indexing drops after poorly managed React or Vue migrations — pages disappearing from the index simply because their content no longer arrives in the initial HTML.

What's interesting is that Google has been clear about this mechanism for years, yet many developers and even SEOs continue to treat JS rendering as a technical detail. Splitt sets the record straight: it’s not a detail; it’s the heart of the issue.

What nuances should we add to this statement?

The term 'two waves' is a pedagogical simplification. In reality, Googlebot manages a complex pipeline where some pages are re-crawled in JS rendering mode almost immediately, while others never are. Priority depends on crawl budget, content freshness, and page popularity.

Secondly: the 'absence of initial rendering' doesn’t necessarily mean a total blackout. However, it leads to indexing delays, a loss of context (internal links not discovered during the first pass), and a heightened risk of updates not being considered. [To be verified]: Google does not publish precise statistics on the percentage of pages that actually pass the second wave on a large site.

In what cases can this rule be circumvented?

On a small static site (fewer than 500 pages, few updates), the impact is negligible. Googlebot has the means to fully re-crawl everything. But as soon as we talk about a product catalog, a high-frequency publication blog, or a directory, the problem becomes structural.

Some have attempted to bypass this via dynamic rendering (serving pre-rendered HTML only to bots). Google tolerates this approach… but officially discourages it. And this is where Splitt's message makes perfect sense: rather than hack workaround solutions, it's better to design your architecture so that the initial HTML is rich right from the start.
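The dynamic-rendering pattern described above boils down to branching on the user-agent. A minimal sketch of the idea, with an intentionally simplified bot list and invented page strings (real bot detection lists are much longer and more fragile, which is part of why Google discourages this approach):

```python
# Sketch of dynamic rendering: serve pre-rendered HTML only to known bots.
# Google tolerates but discourages this pattern; the bot markers below are
# a simplified illustration, not a production-grade detection list.

BOT_MARKERS = ("googlebot", "bingbot")

def select_response(user_agent: str, spa_shell: str, prerendered: str) -> str:
    """Return pre-rendered HTML for bots, the JS shell for everyone else."""
    ua = user_agent.lower()
    if any(marker in ua for marker in BOT_MARKERS):
        return prerendered
    return spa_shell

shell = '<div id="root"></div>'
full = '<h1>Product page</h1>'
print(select_response("Mozilla/5.0 (compatible; Googlebot/2.1)", shell, full))
# -> <h1>Product page</h1>
```

The fragility is visible in the sketch itself: any bot not in the list, or any user-agent change, silently falls back to the empty shell.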

Practical impact and recommendations

What should you concretely do on an existing site?

First step: audit the initial HTML. Disable JavaScript in your browser (or use a tool like curl) and check what's left. If your H1 titles, paragraphs, and internal links disappear… you have a problem.
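This audit can be scripted. A minimal sketch, assuming you fetch the raw server response (like `curl` would, with no JavaScript executed) and count the critical elements; the sample URL and the exact checks are illustrative, not a Google rule:

```python
# Minimal initial-HTML audit: what survives without JavaScript?
import re
import urllib.request

def fetch_initial_html(url: str) -> str:
    """Retrieve the raw server response, like curl or Googlebot's first wave."""
    req = urllib.request.Request(url, headers={"User-Agent": "audit-script"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def audit_html(html: str) -> dict:
    """Count the SEO-critical elements present before any JS runs."""
    return {
        "title": len(re.findall(r"<title[^>]*>.+?</title>", html, re.S | re.I)),
        "h1": len(re.findall(r"<h1[^>]*>.+?</h1>", html, re.S | re.I)),
        "links": len(re.findall(r"<a\s[^>]*href=", html, re.I)),
    }

# report = audit_html(fetch_initial_html("https://example.com/some-product"))
sample = "<html><head><title>Boots</title></head><body><div id='app'></div></body></html>"
print(audit_html(sample))  # -> {'title': 1, 'h1': 0, 'links': 0}
```

A result like the sample's (title present, but no H1 and no links) is exactly the "you have a problem" case described above.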

Then, implement SSR (Server-Side Rendering) or SSG (Static Site Generation) depending on your stack. Next.js, Nuxt, Gatsby, Eleventy — it doesn’t matter what tool; the goal is the same: send complete content from the first HTTP request. If you are on WordPress with a classic theme, you’re probably good to go. If you're using a headless CMS with a React front, that’s where it gets complicated.
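Whatever the framework, the underlying principle is the same: render complete HTML from structured data before the first request is answered. A toy static-generation sketch, with an invented template and product data, just to show the principle:

```python
# Toy SSG sketch: build complete HTML from data at build time, so the
# first HTTP response already contains the content. Template and data
# are invented for illustration.

PAGE_TEMPLATE = """<html><head><title>{title}</title></head>
<body><h1>{title}</h1><p>{description}</p>
<nav>{links}</nav></body></html>"""

def render_page(title: str, description: str, related: list[str]) -> str:
    """Render one page, including internal links, as static HTML."""
    links = "".join(f'<a href="{url}">{url}</a>' for url in related)
    return PAGE_TEMPLATE.format(title=title, description=description, links=links)

html = render_page("Red running shoes", "Lightweight trail shoes.", ["/shoes/blue"])
assert "<h1>Red running shoes</h1>" in html   # content present without any JS
assert '<a href="/shoes/blue">' in html       # internal links discoverable in wave one
```

Next.js, Nuxt, Gatsby, and Eleventy all do a far more sophisticated version of this, but the guarantee they give Googlebot is the same one the assertions check.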

What mistakes should you absolutely avoid?

Don't just check the homepage. Deep pages — product sheets, blog articles, category pages — suffer the most from crawl budget deficit. That’s where the second wave never arrives.

Another classic pitfall: enabling SSR but forgetting to properly configure server-side caching. Result: every Googlebot request generates a complete rendering, your servers crash, and you disable SSR in a panic. Set up smart caching (Varnish, Cloudflare, Redis) to serve pre-rendered HTML without overloading your infrastructure.
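The caching principle is simple: render once, serve many times. In production this role is played by Varnish, Cloudflare, or Redis; the in-memory TTL cache below only illustrates the idea, with invented page paths:

```python
# Sketch of SSR caching: avoid re-rendering on every Googlebot hit.
import time

class RenderCache:
    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self._store = {}  # url -> (rendered_html, expiry_timestamp)

    def get_or_render(self, url: str, render_fn):
        """Return cached HTML if still fresh, otherwise render once and store it."""
        entry = self._store.get(url)
        now = time.monotonic()
        if entry and entry[1] > now:
            return entry[0]
        html = render_fn(url)          # the expensive SSR step
        self._store[url] = (html, now + self.ttl)
        return html

calls = []
def expensive_render(url):
    calls.append(url)
    return f"<h1>Rendered {url}</h1>"

cache = RenderCache(ttl_seconds=60)
cache.get_or_render("/p/1", expensive_render)
cache.get_or_render("/p/1", expensive_render)
print(len(calls))  # -> 1: the second request was served from cache
```

The TTL is the knob to tune: long enough to absorb crawl bursts, short enough that frequently updated pages stay fresh.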

How can you check if your site meets this requirement?

Use Google Search Console and inspect a representative URL. Compare the 'Fetched HTML' to the 'Rendered HTML'. If both are identical (or nearly so), you’re in the clear. If the rendering adds 80% of the content… you have an issue.
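The fetched-vs-rendered comparison can be roughly quantified: what fraction of the rendered page's text was already in the initial HTML? A crude word-overlap sketch, with invented snippets standing in for the HTML you would copy out of Search Console's URL inspection:

```python
# Rough estimate of how much rendered content already exists in the
# fetched HTML. Tag stripping via regex is a heuristic, good enough
# for a quick diagnostic, not for precise parsing.
import re

def visible_words(html: str) -> set[str]:
    text = re.sub(r"<[^>]+>", " ", html)   # strip tags (rough heuristic)
    return set(w.lower() for w in re.findall(r"\w+", text))

def coverage(fetched: str, rendered: str) -> float:
    """Share of rendered words already present in the fetched HTML (0..1)."""
    rendered_words = visible_words(rendered)
    if not rendered_words:
        return 1.0
    return len(rendered_words & visible_words(fetched)) / len(rendered_words)

fetched = "<html><body><div id='app'></div></body></html>"
rendered = "<html><body><h1>Blue jacket</h1><p>Warm winter jacket</p></body></html>"
print(round(coverage(fetched, rendered), 2))  # -> 0.0: everything depends on JS
```

A coverage near 1.0 means you're in the clear; a score near 0.2, matching the "rendering adds 80% of the content" scenario above, means you have an issue.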

Also, conduct a crawl with Screaming Frog in 'JavaScript rendering disabled' mode. Count the number of pages with empty content, missing links, absent title tags. It’s a brutal but reliable indicator of your exposure to incomplete indexing risk.
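The same check scales to a whole crawl: flag every page whose critical elements vanish without JavaScript. A sketch over an invented two-page crawl export; with Screaming Frog you would apply the same test to its exported HTML:

```python
# Bulk no-JS check: count pages at risk of incomplete indexing.
import re

def is_at_risk(html: str) -> bool:
    """Flag pages whose title, H1, or internal links are absent without JS."""
    has_title = bool(re.search(r"<title[^>]*>\S.*?</title>", html, re.S | re.I))
    has_h1 = bool(re.search(r"<h1[^>]*>\S.*?</h1>", html, re.S | re.I))
    has_links = bool(re.search(r"<a\s[^>]*href=", html, re.I))
    return not (has_title and has_h1 and has_links)

crawl = {
    "/": "<title>Shop</title><h1>Shop</h1><a href='/sale'>Sale</a>",
    "/product/42": "<title>Item</title><div id='root'></div>",  # body is JS-only
}
at_risk = [url for url, html in crawl.items() if is_at_risk(html)]
print(at_risk)  # -> ['/product/42']
```

The share of flagged URLs is the "brutal but reliable indicator" mentioned above: it tells you how much of your site depends on the second wave.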

  • Audit the initial HTML on a representative sample of pages (homepage, categories, products, articles)
  • Implement SSR or SSG if critical content currently depends on JavaScript
  • Check consistency between fetched HTML and rendered HTML in Google Search Console
  • Configure server-side caching to avoid overload from dynamic rendering
  • Crawl the site with JavaScript disabled to measure the extent of the problem
  • Prioritize high-SEO-value pages (top landing pages, conversion-generating pages)
The initial HTML is not a technical luxury; it’s a requirement for indexing on any site of significant size. If your critical content arrives only after JavaScript execution, you are gambling against Google’s crawl budget — and you will lose.

Migrating to server-side rendering can be complex depending on your technical stack, especially if you manage a large catalog or a microservices architecture. In that case, it may be wise to seek support from a specialized SEO agency that understands both the indexing challenges and the technical constraints of your environment.

❓ Frequently Asked Questions

Does Google still index content that arrives via JavaScript?
Yes, but with a potentially significant delay. On a large or frequently updated site, that delay can become a major handicap: some pages will never reach the second rendering wave.
Is dynamic rendering (serving pre-rendered HTML only to bots) an acceptable solution?
Google tolerates this approach but advises against it. The risks: divergence between what users and bots see, fragile configurations, complex maintenance. Universal SSR is the better option.
Is my WordPress site affected by this problem?
Probably not, if you use a classic theme that generates HTML server-side. However, if you have migrated to headless WordPress with a React/Vue front end without SSR, you are fully affected.
How can I quickly check whether my initial HTML contains the critical content?
Disable JavaScript in your browser (or use curl) and load a page. If your headings, paragraphs, and internal links are visible, you're fine. If not, you have a potential indexing problem.
Do Next.js or Nuxt solve this problem automatically?
Yes, if you configure SSR or SSG correctly. But beware: a bad configuration can still send an empty skeleton to the bot. Always check the HTML actually served by the server.

