
Official statement

Googlebot carries out indexing in two waves. The first wave deals with server-side content and the second, delayed wave handles client-side rendered content. This can delay the indexing of sites that heavily utilize JavaScript.
🎥 Source video

Extracted from a Google Search Central video

⏱ 39:17 💬 EN 📅 10/05/2018 ✂ 8 statements
Watch on YouTube (13:32) →
Other statements from this video (7)
  1. 10:06 Why does Google ignore your links without an HREF attribute?
  2. 19:57 Is hybrid rendering really the only solution for indexing your JavaScript pages?
  3. 21:40 Is dynamic rendering really the solution for indexing your JavaScript pages?
  4. 22:42 Puppeteer and Rendertron: should you really use them to make your JavaScript crawlable?
  5. 25:44 Is Googlebot really stuck on Chrome 41 for JavaScript?
  6. 30:06 Do you really need to test the mobile version of every page to avoid indexing penalties?
  7. 33:03 Does lazy loading condemn your images to invisibility on Google?
Official statement (8 years ago)
TL;DR

Googlebot performs indexing in two waves: the first captures server-side content immediately, while the second, delayed wave processes client-side JavaScript rendering. This delay can prolong indexing by several days for heavily JS-oriented sites. If your critical content relies on client-side rendering, you're losing indexing time compared to competitors who serve static HTML.

What you need to understand

How does this two-phase indexing actually work?

Googlebot starts by crawling and indexing the raw HTML returned by the server. This is the first wave. If your page loads content via client-side JavaScript (React, Vue, Angular without SSR), that content won't appear in this initial analysis.

The second wave occurs later, sometimes several days after the initial crawl. Googlebot then queues the page for rendering in a headless Chrome browser, executes the JS, and discovers dynamically generated content. In the meantime, your page may be indexed partially or with incomplete content.
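The two-wave process described above can be sketched as a toy model: wave one indexes only the text present in the server's HTML response, while a render queue later merges in content produced by client-side JavaScript. This is a deliberately simplified illustration, not Google's actual pipeline; all class and field names are invented for the example.

```python
# Toy model of two-wave indexing (illustrative only, not Google's pipeline).
from dataclasses import dataclass, field

@dataclass
class Page:
    url: str
    server_html_text: str        # text present in the raw HTML response
    js_rendered_text: str = ""   # text that only exists after JS executes

@dataclass
class Index:
    entries: dict = field(default_factory=dict)
    render_queue: list = field(default_factory=list)

    def first_wave(self, page: Page) -> None:
        # Index whatever the server returned, then queue the page for rendering.
        self.entries[page.url] = page.server_html_text
        self.render_queue.append(page)

    def second_wave(self) -> None:
        # Later (hours to days): render queued pages, merge JS-generated content.
        while self.render_queue:
            page = self.render_queue.pop(0)
            self.entries[page.url] = (
                page.server_html_text + " " + page.js_rendered_text
            ).strip()

index = Index()
csr_page = Page("/launch", server_html_text="", js_rendered_text="New product released")
index.first_wave(csr_page)
print(repr(index.entries["/launch"]))  # '' : indexed, but empty until wave 2
index.second_wave()
print(index.entries["/launch"])        # New product released
```

A pure-CSR page sits in the index with empty content between the two calls, which is exactly the partial-indexing window the statement warns about.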

What is the difference between server content and client content?

Server-side content is immediately available in the source HTML that Googlebot receives during the initial request. There's no need to execute JavaScript: the text, links, and meta tags are there.

Client-side content requires script execution to be displayed. If you do a "View Source" and your main text is missing, it is rendered on the client side. Googlebot will then have to wait for the second wave to see it, which mechanically extends the indexing delay.
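The "View Source" test can be scripted with a simple substring check. A minimal sketch, stdlib only; the two sample documents are illustrative stand-ins for an SSR page and a CSR shell.

```python
# Does a critical phrase appear in the raw HTML, before any JavaScript runs?
def visible_in_raw_html(raw_html: str, critical_text: str) -> bool:
    return critical_text.lower() in raw_html.lower()

ssr_html = "<html><body><h1>Spring Sale: 30% off</h1></body></html>"
csr_html = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'

print(visible_in_raw_html(ssr_html, "Spring Sale"))  # True  -> first-wave indexable
print(visible_in_raw_html(csr_html, "Spring Sale"))  # False -> waits for the second wave
```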

What is the actual delay between the two waves?

Google doesn't provide an official figure, but field observations indicate gaps ranging from a few hours to several days depending on the crawl budget allocated to the site. A high-authority site may have its JS rendered in just a few hours, while a small site could wait a week.

This delay becomes critical for news, product launches, or any urgent page. If your competitor serves static HTML while you rely on JS, they will appear in the SERPs before you, even if your content is published first.

  • First wave: crawling raw HTML returned by the server, immediate indexing of available content
  • Second wave: queuing, JavaScript rendering in headless Chrome, indexing of dynamically generated content
  • Variable delay: from a few hours to several days depending on crawl budget and site authority
  • Risk of partial indexing: between the two waves, the page may appear in the index with incomplete or empty content

SEO Expert opinion

Is this statement consistent with field observations?

Absolutely. Tests with modern JS frameworks without SSR consistently show this delay. If you publish a React page with pure client-side rendering, you may see it appear in Search Console as "crawled" but without the main content for days.

What Google doesn’t clearly say is that the delay varies greatly depending on the crawl budget. An authoritative site with many backlinks and high freshness will see its JS rendered quickly. A poorly interlinked small site will wait much longer. Google remains vague on the exact criteria that determine the priority of the second wave.

What nuances should be added to this claim?

Firstly, not all JavaScript sites are equal. If you're using server-side rendering (SSR) or static site generation (SSG), your initial HTML already contains the complete content. The second wave then becomes trivial: a simple check that the client render matches the server render.

Furthermore, Google can deliberately deprioritize certain JS sites if it detects suspicious patterns (cloaking, hidden content visible only after interaction). [To be validated]: Google claims to treat modern JS "like a conventional browser", but reality shows limitations (short timeouts, no infinite script execution, partial handling of aggressive lazy-loading).

In what cases does this rule not really penalize?

If your site prioritizes user experience over raw indexing speed, client-side rendering may still be viable. For example, a member area, SaaS, or web app where SEO isn't the main traffic source. The indexing delay becomes secondary.

Additionally, some sites compensate with such a generous crawl budget that the second wave arrives almost instantaneously. But let's be honest: this is the exception, not the rule. Most sites lose indexing time with poorly implemented JS.

Attention: If you rely on quick organic traffic (news, e-commerce with frequent rotations), pure CSR becomes a significant handicap against competitors using SSR or static HTML.

Practical impact and recommendations

What concrete steps should be taken to minimize this delay?

The most effective solution is to migrate to server-side rendering or static generation. Use Next.js for React, Nuxt for Vue, Angular Universal for Angular. Your initial HTML already contains the complete content, allowing Googlebot to index everything in the first wave.

If you stay on CSR, maximize your optimization: reduce JS execution time, limit blocking resources, and use pre-rendering for strategic pages. Tools like Rendertron or Prerender.io generate static HTML snapshots for crawlers, but this is a patch, not a true solution.
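The user-agent switch behind that pre-rendering patch can be sketched as follows: crawlers get a static snapshot, human visitors get the normal CSR shell. The bot list and snapshot store here are illustrative assumptions, not Rendertron's or Prerender.io's actual API.

```python
# Sketch of dynamic rendering: serve pre-rendered HTML to known crawlers only.
KNOWN_BOTS = ("googlebot", "bingbot", "duckduckbot")  # illustrative, not exhaustive

SNAPSHOTS = {
    "/pricing": "<html><body><h1>Pricing</h1><p>From $9/mo</p></body></html>",
}
CSR_SHELL = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'

def serve(path: str, user_agent: str) -> str:
    ua = user_agent.lower()
    if any(bot in ua for bot in KNOWN_BOTS) and path in SNAPSHOTS:
        return SNAPSHOTS[path]   # static HTML, indexable in the first wave
    return CSR_SHELL             # regular visitors run the JS app as usual

print("Pricing" in serve("/pricing", "Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print("Pricing" in serve("/pricing", "Mozilla/5.0 (Windows NT 10.0)"))            # False
```

Google documents dynamic rendering as a workaround rather than a long-term solution, which matches the article's "patch, not a true solution" framing.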

What mistakes should be absolutely avoided?

Never rely on aggressive lazy-loading for critical content. If your main text loads only on scroll, Googlebot may never see it in the second wave (timeout before full execution). Load immediately what matters for SEO.

Also avoid depending solely on client-side rendering for meta tags (title, description, canonical). Even if Google eventually sees them in the second wave, the delay can skew initial indexing. Always inject them server-side.
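Server-side injection of those tags can be as simple as filling a template before the response leaves the server. A framework-agnostic sketch using only the standard library; the template and page values are invented for the example.

```python
# Inject title/description/canonical server-side with a plain string template.
from string import Template
import html

PAGE = Template(
    "<!doctype html><html><head>"
    "<title>$title</title>"
    '<meta name="description" content="$description">'
    '<link rel="canonical" href="$canonical">'
    '</head><body><div id="root"></div></body></html>'
)

def render_page(title: str, description: str, canonical: str) -> str:
    # Escape values so user-supplied strings cannot break the markup.
    return PAGE.substitute(
        title=html.escape(title),
        description=html.escape(description),
        canonical=html.escape(canonical),
    )

doc = render_page(
    "Two-wave indexing explained",
    "How Googlebot indexes JavaScript sites.",
    "https://example.com/js-indexing",
)
print("<title>Two-wave indexing explained</title>" in doc)  # True
```

Because the tags are in the raw HTML, Googlebot reads them in the first wave regardless of what the JS app later does.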

How can I check if my site is properly indexed despite the JS?

Use the URL inspection tool in Search Console. Compare the "raw HTML" and the "final render": if the main content only appears in the final render, you're on pure CSR and are experiencing delays. If both versions are identical, you're good.
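That raw-versus-rendered comparison can be roughed out offline by diffing the words extracted from each version (for example, pasting the rendered HTML from Search Console's inspection view). A crude sketch with stdlib regex; a real audit would use a proper HTML parser, and both sample documents are illustrative.

```python
# Which words does Googlebot only see after JavaScript executes?
import re

def extract_words(html_doc: str) -> set:
    no_tags = re.sub(r"<[^>]+>", " ", html_doc)  # crude tag stripping
    return set(re.findall(r"[A-Za-z0-9']+", no_tags.lower()))

raw_html = '<html><body><div id="root"></div></body></html>'
rendered_html = '<html><body><div id="root"><h1>Launch day pricing</h1></div></body></html>'

only_after_js = extract_words(rendered_html) - extract_words(raw_html)
print(sorted(only_after_js))  # ['day', 'launch', 'pricing']
```

An empty difference set means your server HTML already carries the content; a large one means you are living on the second wave.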

Also monitor the discovery-to-indexing delays in your server logs. If Googlebot crawls your page on day D but it only appears in the index on D+5, that's the classic symptom of deferred JS rendering. A gap of more than 48 hours should raise alarms.
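The 48-hour alert threshold is easy to automate once you have two timestamps: Googlebot's first crawl of the URL (from your access logs) and the date you first observed the page in the index. A minimal sketch with illustrative timestamps.

```python
# Flag URLs whose crawl-to-index gap exceeds 48 hours.
from datetime import datetime

GAP_ALERT_HOURS = 48

def indexing_gap_hours(first_crawl: str, first_indexed: str) -> float:
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(first_indexed, fmt) - datetime.strptime(first_crawl, fmt)
    return delta.total_seconds() / 3600

gap = indexing_gap_hours("2024-03-01 08:00", "2024-03-06 10:00")
print(f"{gap:.0f}h gap:", "investigate JS rendering" if gap > GAP_ALERT_HOURS else "OK")
# 122h gap: investigate JS rendering
```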

  • Migrate to SSR/SSG if SEO is strategic (Next.js, Nuxt, Angular Universal)
  • Inject critical meta tags (title, description, canonical) server-side, never just in JS
  • Load the main content immediately, without waiting for scroll or user interaction
  • Regularly test with the Search Console URL inspection: compare raw HTML vs final render
  • Monitor crawl/indexing delays in logs: gap > 48h = JS issue
  • Limit blocking JS resources and optimize execution time (budgets < 500ms ideally)
These technical optimizations can quickly become complex, especially if your JS stack is already in production. If you need to balance performance, UX, and quick indexing, or if you notice suspicious indexing discrepancies, it may be wise to consult a specialized SEO agency for modern web architecture. Personalized support allows for a detailed audit of your rendering, identifying specific bottlenecks related to your framework and implementing tailored solutions without compromising user experience.

❓ Frequently Asked Questions

Is the delay between the two indexing waves the same for all sites?
No, it varies enormously with the allocated crawl budget. A high-authority site may have its JS rendered within a few hours, while a small site can wait several days or even a week.
Does server-side rendering (SSR) completely eliminate this problem?
Yes. With SSR or static generation, the initial HTML already contains all the content. Googlebot indexes everything in the first wave, and the second becomes a simple verification.
Can you force Googlebot to render JavaScript faster?
Not directly. You can improve your crawl budget (internal linking, backlinks, content freshness), but Google alone decides the priority of the rendering queue.
Are meta tags injected via JavaScript taken into account?
Yes, but with the same delay as the rest of the JS content. In the meantime, Google may index your page with incomplete or default meta tags. Better to inject them server-side.
Can a pure React site without SSR still rank well?
Yes, if your crawl budget is generous and you're not in a race for fast indexing. But you systematically lose time against competitors using static HTML or SSR, especially on fresh or urgent content.
🏷 Related Topics
Content Crawl & Indexing JavaScript & Technical SEO Links & Backlinks
