
Official statement

The concept of 'first wave' and 'second wave' was a simplification to explain the crawling and rendering process. Google no longer uses it because it is misleading. The median time between crawling and rendering is 5 seconds, and at the 90th percentile, it’s a few minutes. There is no 'rendering budget'.
🎥 Source video

Extracted from a Google Search Central video

⏱ 34:50 💬 EN 📅 27/05/2020 ✂ 13 statements
Watch on YouTube (1:03) →
Other statements from this video (12)
  1. 3:42 Is JavaScript-rendered content really indexable by Google without friction?
  2. 4:46 Is dynamic rendering with expanded accordions cloaking, according to Google?
  3. 6:56 Should you really abandon dynamic rendering in favor of server-side rendering?
  4. 12:05 Is content hidden behind an accordion or a tab really taken into account by Google?
  5. 13:07 Do JavaScript links really need to be <a> elements with an href to be crawled?
  6. 14:11 Do PWAs really receive the same SEO treatment as classic sites?
  7. 17:54 Should you stop using Google Cache to diagnose your indexing problems?
  8. 21:07 Can Google really ignore part of your site without warning?
  9. 23:14 Should you really worry about a low crawl rate?
  10. 26:52 Why does Googlebot still crawl over HTTP/1.1 rather than HTTP/2?
  11. 27:23 Should you really split your JavaScript bundles by site section for SEO?
  12. 33:47 Does Google really ignore Cache-Control headers when crawling?
TL;DR

Google officially abandons the first wave / second wave distinction to describe its crawling and rendering process. Martin Splitt specifies that the median time between crawling and rendering is 5 seconds, and that at the 90th percentile it is just a few minutes. This clarification also puts an end to the myth of the rendering budget, which simply does not exist according to Google.

What you need to understand

Why is Google abandoning this explanatory model?

The concept of first and second wave was a pedagogical shortcut to explain that Google first crawls raw HTML, then comes back later to execute JavaScript and index the rendered content. This simplification had its merits: it helped SEOs visualize a complex process.

But this model created more confusion than clarity. Many deduced that there were two distinct phases, widely spaced in time, with a significant delay between them. This interpretation fueled optimization strategies built on a false picture of the technical reality.

What is the actual rendering timeline according to Google?

Splitt's figures set the record straight: a median of 5 seconds between the crawl of the HTML and its complete rendering; at the 90th percentile, a few minutes. These delays are a far cry from the imagined 'rendering budget' or the weeks of waiting mentioned here and there.

In concrete terms, for the majority of sites, Google executes JavaScript almost immediately after retrieving HTML. The process is smooth and continuous, not segmented into successive waves with distinct queues.

Does the rendering budget really exist?

No. Splitt is categorical: there is no rendering budget in the sense that there is a crawl budget. Google does not ration the number of pages it accepts to render for a given site.

This clarification dispels a widespread belief: that one had to 'save' rendering budget by limiting JavaScript pages or prioritizing certain URLs. If an indexing problem arises with JS content, look elsewhere: loading times, execution errors, content blocked by robots.txt, but not some hypothetical rendering quota.

  • The first/second wave model was a misleading simplification now abandoned by Google
  • The median delay between crawling and rendering is 5 seconds, not days or weeks
  • The concept of rendering budget does not exist — Google does not limit the number of pages it accepts to render
  • If JavaScript content is not indexed, the problem lies elsewhere (performance, errors, blockages)
  • SEO strategies based on optimizing a supposed rendering budget are therefore obsolete

SEO Expert opinion

Is this statement consistent with field observations?

Yes and no. Splitt's figures (median at 5 seconds) align with what we observe on well-configured sites: JS content appears quickly in the index. But the 90th percentile 'a few minutes' remains vague — we all have examples of sites where rendered content takes several days or even weeks to be indexed.

The important nuance: Splitt is talking about the technical delay between crawling and rendering, not the delay between rendering and visible indexing in the SERPs. These are two different things. Rendering can be fast, but effective indexing depends on other factors: content quality, page popularity, ranking signals, overall site indexing policy.

What are the grey areas in this statement?

[To be verified] Splitt does not mention the frequency of re-rendering for pages that are already indexed. If Google re-crawls a page every 3 days, does it systematically re-render the JavaScript on each pass, or does it rely on a cached version? This question has not been clearly answered.

Another point: stating that there is no rendering budget does not mean that Google has infinite resources. If your site generates 10 million dynamic JS pages through URL parameters, Google will obviously not render all of them. The real bottleneck remains the crawl budget — if Googlebot does not visit the URL, there is nothing to render.

In what cases does this rule not apply?

Sites with heavy, slow JavaScript are particularly vulnerable. Google may well begin rendering your page 5 seconds after crawling it, but if the client-side JS takes 10 seconds to execute and the render fails or times out, the content will never appear in the index.

Single Page Applications (SPAs) with aggressively lazy-loading content or content that loads after user interaction (clicks, infinite scroll) continue to pose problems. Google does not simulate these interactions — if the content requires a click to appear in the DOM, it will not be indexed, no matter how fast the rendering is.
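The distinction matters in practice: a crawler discovers only links present in the DOM as <a> elements with an href, so interaction-gated navigation is invisible to it. A minimal sketch of a crawler-style link extractor, using Python's standard html.parser on made-up markup:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects hrefs from <a> tags -- the only links a crawler follows."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical snippet: one real link, one click-handler pseudo-link,
# one anchor without an href.
snippet = """
<a href="/category/shoes">Shoes</a>
<span onclick="loadSection('bags')">Bags</span>
<a>Anchor without href</a>
"""

parser = LinkExtractor()
parser.feed(snippet)
print(parser.links)  # → ['/category/shoes']
```

Only the `<a href>` element survives: the onclick span and the href-less anchor are simply not navigation from a crawler's point of view.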

Attention: This statement does not change the fundamentals of JavaScript SEO. Server-Side Rendering (SSR) or pre-rendering remain the most reliable approaches to ensure immediate and complete indexing of critical content.

Practical impact and recommendations

What should you practically do following this clarification?

Stop optimizing for a phantom rendering budget. Focus on what truly matters: the execution speed of JavaScript, rendering stability, and the quality of the final content. If your JS content is fast, clean, and indexable, Google will render it without problems.

Systematically test your pages using the URL inspection tool in Search Console. Compare raw HTML and rendered HTML to ensure critical content appears correctly. If important elements (titles, texts, links) are missing in the rendering, that's where you need to act — not by trying to 'save a non-existent budget'.
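This raw-versus-rendered comparison can also be scripted. A minimal sketch, assuming the raw HTML comes from a plain HTTP fetch and the rendered HTML from a headless browser such as Playwright (both hard-coded here for illustration), that flags critical elements appearing only after rendering:

```python
from html.parser import HTMLParser

class CriticalContent(HTMLParser):
    """Collects <h1> text and <a href> targets from an HTML document."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.h1s, self.hrefs = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1 and data.strip():
            self.h1s.append(data.strip())

def critical_elements(html):
    parser = CriticalContent()
    parser.feed(html)
    return {"h1": parser.h1s, "links": parser.hrefs}

# Hypothetical inputs: in a real check, raw_html would come from a plain
# HTTP fetch and rendered_html from a headless browser.
raw_html = '<h1></h1><a href="/home">Home</a>'
rendered_html = ('<h1>Product name</h1><a href="/home">Home</a>'
                 '<a href="/reviews">Reviews</a>')

raw, rendered = critical_elements(raw_html), critical_elements(rendered_html)
missing_links = set(rendered["links"]) - set(raw["links"])
print("H1 present only after rendering:", bool(rendered["h1"] and not raw["h1"]))
print("Links present only after rendering:", sorted(missing_links))  # ['/reviews']
```

Any element listed in `missing_links` depends entirely on JavaScript execution to be indexed, which is exactly the kind of discrepancy worth acting on.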

What mistakes should you avoid following this announcement?

Do not deduce from this statement that JavaScript has become risk-free for SEO. Google may render quickly, but that guarantees neither indexing nor good ranking. Content rendered in 5 seconds that takes 8 seconds to display to the end user remains a burden for Core Web Vitals.

Avoid confusing rendering speed with indexing priority. Google may render your page instantly and decide not to index it at all if it is judged low quality, duplicate, or irrelevant. Rendering is just a step — not a guarantee.

How can you check that your JavaScript implementation is SEO-compatible?

Use regular monitoring of the rendered HTML via tools like Oncrawl, Botify, or Screaming Frog in JavaScript mode. Compare results with a non-JS crawl to identify discrepancies. If strategic URLs show significant differences, that's a warning signal.

Also monitor server logs to ensure that Googlebot is not encountering 500 errors, timeouts, or blocked resources (CSS, JS) that would prevent rendering. A 200 status does not suffice — rendering must be able to execute completely without errors in the console.
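Log monitoring of this kind is easy to script. A hedged sketch that filters combined-format access log lines for 5xx responses served to a Googlebot user agent (the sample lines are made up, and a production check should also verify Googlebot via reverse DNS rather than trusting the UA string):

```python
import re

# Combined log format: IP - - [time] "METHOD path HTTP/x" status size "referer" "UA"
LOG_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_errors(lines):
    """Return (path, status) pairs where Googlebot received a 5xx response."""
    errors = []
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("ua") and m.group("status").startswith("5"):
            errors.append((m.group("path"), m.group("status")))
    return errors

# Hypothetical log lines for illustration.
sample = [
    '66.249.66.1 - - [27/May/2020:10:00:00 +0000] "GET /app.js HTTP/1.1" 500 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [27/May/2020:10:00:05 +0000] "GET /product HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [27/May/2020:10:00:07 +0000] "GET /app.js HTTP/1.1" 500 0 "-" "Mozilla/5.0"',
]
print(googlebot_errors(sample))  # → [('/app.js', '500')]
```

A 500 on a JS bundle served to Googlebot is exactly the kind of silent rendering blocker this section warns about: the page itself may return 200 while its scripts fail.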

  • Check that critical content (H1, texts, links) is visible in the rendered HTML through Search Console
  • Test the execution speed of JavaScript — target a Time to Interactive under 3 seconds
  • Monitor server logs for rendering errors on the Googlebot side
  • Regularly compare raw HTML and rendered HTML to spot discrepancies
  • Prioritize SSR or pre-rendering for strategic pages (categories, key products)
  • Audit blocked resources in robots.txt that could hinder complete rendering
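The last point, auditing robots.txt for blocked resources, can be sketched with Python's standard urllib.robotparser. The robots.txt content and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks a JS directory -- a common cause
# of incomplete rendering, since Googlebot then cannot fetch the scripts.
robots_txt = """\
User-agent: *
Disallow: /static/js/
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Resources the page depends on; in practice, collect these from the
# rendered HTML or a crawl export.
resources = [
    "https://example.com/static/js/app.js",
    "https://example.com/static/css/main.css",
    "https://example.com/product/42",
]
for url in resources:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```

Any BLOCKED line on a CSS or JS file means Google is rendering the page without that resource, which can leave critical content out of the rendered HTML.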
This clarification from Google removes a convenient excuse: we can no longer blame a hypothetical rendering budget for our JavaScript indexing problems. The real work remains the same: ensure fast, clean, and comprehensive rendering.

These optimizations can be technical and often require specific expertise. If your team lacks the resources or skills in these areas, enlisting an SEO agency specialized in JavaScript and indexing can save you valuable time and avoid costly mistakes.

❓ Frequently Asked Questions

Was the first wave / second wave model completely wrong?
No, it was a pedagogical simplification that contained some truth: Google first crawls the HTML, then executes the JavaScript. But it was misleading in suggesting two very distinct phases spread out in time, which does not match the technical reality.
If Google renders in 5 seconds, why does my JavaScript content take days to be indexed?
Splitt is talking about the delay between crawl and rendering, not between rendering and indexing. After rendering, Google still has to decide whether to index the page, which depends on content quality, ranking signals, and your site's overall indexing policy.
Does the rendering budget really not exist at all?
Not according to Splitt. Google does not arbitrarily limit the number of pages it agrees to render for a given site. The real bottleneck remains the crawl budget: if Googlebot does not visit the URL, there is nothing to render.
Should I stop using SSR now that rendering is fast?
Absolutely not. SSR remains the most reliable solution for guaranteeing immediate and complete indexing. The speed of Google's rendering does not offset SSR's advantages in terms of user performance, full compatibility, and predictability.
How can I concretely verify that my JavaScript content is properly rendered by Google?
Use the URL inspection tool in Search Console, "Test live URL" section. Compare the source HTML and the rendered HTML to check that your critical content (titles, text, links) does appear after JavaScript execution.