Official statement
Other statements from this video (12)
- 3:42 Is JavaScript-rendered content really indexable by Google without friction?
- 4:46 Is dynamic rendering with expanded accordions considered cloaking by Google?
- 6:56 Should you really abandon dynamic rendering in favor of server-side rendering?
- 12:05 Is content hidden behind an accordion or a tab really taken into account by Google?
- 13:07 Do JavaScript links really need to be <a> elements with an href to be crawled?
- 14:11 Do PWAs really receive the same SEO treatment as classic sites?
- 17:54 Should you stop using Google Cache to diagnose your indexing problems?
- 21:07 Can Google really ignore part of your site without warning?
- 23:14 Should you really worry about a low crawl rate?
- 26:52 Why does Googlebot still crawl in HTTP/1.1 and not HTTP/2?
- 27:23 Should you really split your JavaScript bundles by site section for SEO?
- 33:47 Does Google really ignore Cache-Control headers for crawling?
Google officially abandons the first wave / second wave distinction to describe its crawling and rendering process. Martin Splitt specifies that the median time between crawling and rendering is 5 seconds, and at the 90th percentile, it’s just a few minutes. This clarification also puts an end to the myth of rendering budget — which simply does not exist according to Google.
What you need to understand
Why is Google abandoning this explanatory model?
The concept of first and second wave was a pedagogical shortcut to explain that Google first crawls raw HTML, then comes back later to execute JavaScript and index the rendered content. This simplification had its merits: it helped SEOs visualize a complex process.
But this model created more confusion than anything else. Many deduced that there were two distinct and spaced phases in time, with a significant delay between them. This interpretation fueled optimization strategies based on a false representation of technical reality.
What is the actual rendering timeline according to Google?
Splitt's figures set the record straight: a median of 5 seconds between crawling the HTML and its complete rendering, and at the 90th percentile, a few minutes. These delays are a far cry from the imagined 'rendering budget' or the weeks of waiting mentioned here and there.
In concrete terms, for the majority of sites, Google executes JavaScript almost immediately after retrieving HTML. The process is smooth and continuous, not segmented into successive waves with distinct queues.
Does the rendering budget really exist?
No. Splitt is categorical: there is no rendering budget in the sense that there is a crawl budget. Google does not ration the number of pages it accepts to render for a given site.
This clarification demolishes a widespread belief: that you had to 'save' rendering budget by limiting JavaScript pages or prioritizing certain URLs. If an indexing problem arises with JS content, look elsewhere: loading times, execution errors, content blocked by robots.txt, but not some hypothetical rendering quota.
- The first/second wave model was a misleading simplification now abandoned by Google
- The median delay between crawling and rendering is 5 seconds, not days or weeks
- The concept of rendering budget does not exist — Google does not limit the number of pages it accepts to render
- If JavaScript content is not indexed, the problem lies elsewhere (performance, errors, blockages)
- SEO strategies based on optimizing a supposed rendering budget are therefore obsolete
SEO Expert opinion
Is this statement consistent with field observations?
Yes and no. Splitt's figures (median at 5 seconds) align with what we observe on well-configured sites: JS content appears quickly in the index. But the 90th percentile 'a few minutes' remains vague — we all have examples of sites where rendered content takes several days or even weeks to be indexed.
The important nuance: Splitt is talking about the technical delay between crawling and rendering, not the delay between rendering and visible indexing in the SERPs. These are two different things. Rendering can be fast, but effective indexing depends on other factors: content quality, page popularity, ranking signals, overall site indexing policy.
What are the grey areas in this statement?
[To be verified] Splitt does not mention how often already-indexed pages are re-rendered. If Google re-crawls a page every 3 days, does it re-render the JavaScript on each pass, or does it rely on a cached version? This question has not been clearly answered.
Another point: stating that there is no rendering budget does not mean that Google has infinite resources. If your site generates 10 million dynamic JS pages through URL parameters, Google will obviously not render all of them. The real bottleneck remains the crawl budget — if Googlebot does not visit the URL, there is nothing to render.
In what cases does this rule not apply?
Sites with heavy and slow JavaScript are particularly vulnerable. If your page takes 10 seconds to execute client-side JS, Google may well start rendering it 5 seconds after crawling, but if the render fails or times out, the content will never appear in the index.
Single Page Applications (SPAs) with aggressively lazy-loading content or content that loads after user interaction (clicks, infinite scroll) continue to pose problems. Google does not simulate these interactions — if the content requires a click to appear in the DOM, it will not be indexed, no matter how fast the rendering is.
Practical impact and recommendations
What should you practically do following this clarification?
Stop optimizing for a phantom rendering budget. Focus on what truly matters: the execution speed of JavaScript, rendering stability, and the quality of the final content. If your JS content is fast, clean, and indexable, Google will render it without problems.
Systematically test your pages using the URL inspection tool in Search Console. Compare raw HTML and rendered HTML to ensure critical content appears correctly. If important elements (titles, texts, links) are missing in the rendering, that's where you need to act — not by trying to 'save a non-existent budget'.
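To make that raw-vs-rendered comparison concrete, here is a minimal sketch that diffs two HTML snapshots to surface JS-injected content. The two snippets are invented placeholders, not real crawl data; in practice you would feed in the raw HTML from your server and the rendered HTML copied from the URL inspection tool.

```python
# Minimal sketch: diff a raw-HTML snapshot against a rendered-HTML snapshot
# to spot content that only exists after JavaScript execution.
# The two snippets below are illustrative placeholders, not real crawl data.
import difflib

raw_html = """<html><body>
<h1>Product page</h1>
<div id="app"></div>
</body></html>"""

rendered_html = """<html><body>
<h1>Product page</h1>
<div id="app"><p>Price: 19.99 EUR</p><a href="/reviews">Reviews</a></div>
</body></html>"""

def js_only_lines(raw: str, rendered: str) -> list[str]:
    """Return lines present only in the rendered version (JS-injected content)."""
    diff = difflib.unified_diff(raw.splitlines(), rendered.splitlines(), lineterm="")
    # Keep added lines ("+...") but skip the "+++" file header of the diff.
    return [line[1:] for line in diff
            if line.startswith("+") and not line.startswith("+++")]

for line in js_only_lines(raw_html, rendered_html):
    print(line)
```

If critical elements (price, internal links, headings) only show up in the rendered side of the diff, you know exactly which content depends on JavaScript executing successfully.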
What mistakes should you avoid following this announcement?
Do not deduce from this statement that JavaScript has become risk-free for SEO. Google may render quickly, but that guarantees neither indexing nor good ranking. Content rendered in 5 seconds that takes 8 seconds to display to the end user remains a burden for Core Web Vitals.
Avoid confusing rendering speed with indexing priority. Google may render your page instantly and decide not to index it at all if it is judged low quality, duplicate, or irrelevant. Rendering is just a step — not a guarantee.
How can you check that your JavaScript implementation is SEO-compatible?
Use regular monitoring of the rendered HTML via tools like Oncrawl, Botify, or Screaming Frog in JavaScript mode. Compare results with a non-JS crawl to identify discrepancies. If strategic URLs show significant differences, that's a warning signal.
Also monitor server logs to ensure that Googlebot is not encountering 500 errors, timeouts, or blocked resources (CSS, JS) that would prevent rendering. A 200 status does not suffice — rendering must be able to execute completely without errors in the console.
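As a sketch of that log check, the snippet below filters Googlebot requests that returned 5xx errors from access-log lines in common/combined log format. The sample lines and IPs are illustrative assumptions; point it at your real logs and adapt the regex to your log format.

```python
# Minimal sketch: scan access-log lines (combined log format) for Googlebot
# requests that returned 5xx errors. Sample lines below are invented.
import re

LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

sample_logs = [
    '66.249.66.1 - - [27/05/2020:10:00:01 +0000] "GET /app.js HTTP/1.1" 500 0 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [27/05/2020:10:00:02 +0000] "GET /produit HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [27/05/2020:10:00:03 +0000] "GET /produit HTTP/1.1" 500 0 "-" "Mozilla/5.0"',
]

def googlebot_errors(lines):
    """Return (path, status) pairs for Googlebot hits with a 5xx status code."""
    hits = []
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("ua") and m.group("status").startswith("5"):
            hits.append((m.group("path"), m.group("status")))
    return hits

print(googlebot_errors(sample_logs))
```

Note that the user-agent string alone can be spoofed; for a serious audit, confirm the hits really come from Google via reverse DNS before acting on them.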
- Check that critical content (H1, texts, links) is visible in the rendered HTML through Search Console
- Test the execution speed of JavaScript — target a Time to Interactive under 3 seconds
- Monitor server logs for rendering errors on the Googlebot side
- Regularly compare raw HTML and rendered HTML to spot discrepancies
- Prioritize SSR or pre-rendering for strategic pages (categories, key products)
- Audit blocked resources in robots.txt that could hinder complete rendering
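For the last point of the checklist, a quick way to audit blocked resources is Python's stdlib robots.txt parser. The robots.txt content and URLs below are invented for illustration; substitute your own. One caveat: Python's parser applies the first matching rule in file order, whereas Google documents longest-match precedence, so keep more specific Allow rules above broader Disallow rules if you rely on this check.

```python
# Minimal sketch: check whether critical JS/CSS resources are crawlable by
# Googlebot. The robots.txt content and URLs are illustrative assumptions.
from urllib.robotparser import RobotFileParser

robots_txt = """User-agent: Googlebot
Allow: /assets/js/critical.js
Disallow: /assets/js/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

resources = [
    "https://example.com/assets/js/critical.js",   # explicitly allowed
    "https://example.com/assets/js/tracking.js",   # caught by the Disallow
    "https://example.com/assets/css/main.css",     # no rule -> allowed
]

for url in resources:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```

Any BLOCKED line for a resource the page needs to render (a framework bundle, critical CSS) is a direct rendering risk, regardless of how fast Google's renderer is.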
❓ Frequently Asked Questions
Was the first wave / second wave model completely wrong?
If Google renders in 5 seconds, why does my JavaScript content take days to be indexed?
Does the rendering budget really not exist at all?
Should I stop using SSR now that rendering is fast?
How can I concretely verify that my JavaScript content is properly rendered by Google?
🎥 From the same video (12)
Other SEO insights extracted from this same Google Search Central video · duration 34 min · published on 27/05/2020