Official statement
Martin Splitt asserts that HTML and CSS are more reliable than JavaScript for SEO because they are "more resilient." Google recommends using JavaScript sparingly and responsibly. In practice, this means prioritizing server-side rendering for critical content and reserving JavaScript for non-essential enhancements.
What you need to understand
Why does Google still insist on HTML and CSS in 2025?
Martin Splitt's statement may seem outdated in the era of modern frameworks. However, it rests on a simple technical reality: Googlebot interprets HTML and CSS instantly, without a JavaScript rendering step, so the crawl budget is not consumed by additional rendering work.
When Google speaks of “resilience,” it refers to the ability of a piece of content to be understood even if the JS rendering fails. A server timeout, an error in the JavaScript bundle, a compatibility issue — and your content becomes invisible. With static HTML, this risk does not exist. The bot reads the structure directly.
What does “using JavaScript responsibly” mean?
The expression is intentionally vague. Google does not ban JavaScript — that would be absurd given today’s web architecture. It targets irresponsible usages: poorly configured lazy-loading that blocks essential content, client-side hydration delayed by several seconds, single-page application navigation without proper history management.
Specifically, “responsible” means: critical content must exist in the initial HTML. Interactive enrichments (filters, animations, adding to cart) can be managed in JS. But titles, main text, internal links — all of this must be present before executing any script.
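The rule can be made concrete with a minimal sketch. The product data, URL scheme, and script path below are purely illustrative; the point is that the title, body text, and internal links exist in the HTML string itself, before any script runs.

```javascript
// "Critical content first": everything Google needs to rank the page
// is in the markup. The script tag only adds interactivity on top.
function renderProductPage(product) {
  const links = product.related
    .map((slug) => `<a href="/products/${slug}">${slug}</a>`)
    .join("\n      ");
  return `<!DOCTYPE html>
<html>
  <head><title>${product.title}</title></head>
  <body>
    <h1>${product.title}</h1>
    <p>${product.description}</p>
    <nav>
      ${links}
    </nav>
    <!-- Interactive enrichment only: the page is complete without it -->
    <script src="/js/add-to-cart.js" defer></script>
  </body>
</html>`;
}

const html = renderProductPage({
  title: "Trail Running Shoes",
  description: "Lightweight shoes with reinforced soles.",
  related: ["running-socks", "hydration-pack"],
});
// Critical content is readable without executing any JavaScript.
console.log(html.includes("<h1>Trail Running Shoes</h1>")); // true
```

If a bundle fails to load, this page degrades to a fully crawlable document rather than an empty shell.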
Does this recommendation apply to all types of sites?
No. A showcase site with 20 pages has no reason to complicate things with Server-Side Rendering (SSR) if its HTML is already static. Conversely, an e-commerce site with 50,000 products in pure client-side React takes a major risk: product pages generated solely by JavaScript may be crawled late, or even ignored if the crawl budget is constrained.
SaaS applications with private content behind a login are less affected — Google does not crawl these areas. However, any public page intended to rank (blog, landing pages, product sheets) must adhere to the principle of immediate content availability.
- HTML and CSS ensure fast indexing without reliance on JavaScript rendering.
- JavaScript can delay or block access to content in case of errors or timeouts.
- Critical content (titles, text, links) must be present in the initial HTML, before script execution.
- Modern frameworks (Next.js, Nuxt, SvelteKit) enable SSR to balance modern UX with solid SEO.
- Lazy-loading and deferred hydration must be configured to never block critical crawl content.
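The SSR pattern these frameworks implement can be sketched in Next.js "pages router" style. `loadArticle` here is a hypothetical stand-in for a CMS or database query; nothing below is a real API beyond the `getServerSideProps` convention itself.

```javascript
// Hypothetical server-side data source; replace with a real CMS/DB query.
async function loadArticle(slug) {
  const articles = {
    "html-first-seo": { title: "HTML-first SEO", body: "Critical content..." },
  };
  return articles[slug] || null;
}

// Next.js calls this on the server for every request, so the article
// text is already in the HTML Googlebot receives. In a real project
// this function is exported from pages/articles/[slug].js.
async function getServerSideProps(context) {
  const article = await loadArticle(context.params.slug);
  if (!article) return { notFound: true }; // clean 404, not an empty shell
  return { props: { article } };
}
```

The page component then renders `props.article.title` into the `<h1>` server-side; client JavaScript only hydrates interactivity on top.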
SEO Expert opinion
Is this statement consistent with field observations?
Yes, but with a significant nuance: Google has been able to render JavaScript for years, and sites built with React, Vue, or Angular rank perfectly well. However, they face problems that static HTML sites do not: longer indexing delays, content missing under mobile-first indexing when mobile rendering fails, and temporary 5xx errors that block rendering altogether.
Field audits show that sites with SSR or static site generation (SSG) are indexed 30 to 50% faster than their pure client-side rendering counterparts. This is not a myth — it is measurable with Google Search Console and server logs. The problem is that Google never provides numerical metrics in its official communications. [To be verified]: No public data quantifies the exact impact of JS on crawl budget.
In what cases does this rule become secondary?
Some sites cannot do without client-side JavaScript. Real-time analytics dashboards, collaborative online editors, dynamic price comparison sites — the content changes too often to be pre-generated. In these contexts, SEO takes a backseat to UX. No one is going to recode a SaaS tool in static HTML to gain three positions in Google.
The other scenario: very large sites with a saturated crawl budget. A media site with 500,000 articles can afford to use JS everywhere if its authority compensates. Google will still crawl and index, as internal PageRank and backlinks force the bot to come back regularly. Conversely, a small niche e-commerce site without inbound links must maximize every quality signal — and here, HTML becomes a clear advantage.
What are the gray areas of this recommendation?
Google says “use JS responsibly,” but never defines the threshold. How many seconds of rendering delay are acceptable? No official answer. We know that Googlebot waits about 5 seconds before considering a page as rendered, but this timeout can vary. [To be verified]: Some SEOs report timeouts from as little as 3 seconds on low crawl budget sites, but Google confirms nothing.
Another ambiguity: JavaScript error management. If a script crashes in production, does Googlebot see the partially rendered content or nothing at all? Tests show that this depends on when the error occurs in the rendering cycle. A crash early in hydration can make the page empty for the bot. Google does not document this behavior.
Practical impact and recommendations
What should you do concretely on an existing site?
If your site is in pure client-side rendering (React, Vue, Angular without SSR), first check the real state of your indexing. Go to Google Search Console, Coverage section, and look at pages “Discovered – currently not indexed.” If this number skyrockets, it’s a clear signal that Googlebot is struggling to process your JS content.
Then, test your critical pages with the URL Inspection tool in Google Search Console. Compare the source HTML (Ctrl+U) with the rendered DOM displayed by the tool. If your H1 titles, main paragraphs, and internal links only appear in the rendered DOM, you are entirely dependent on Google's JavaScript rendering engine. Risky.
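This source-vs-rendered check can be automated with a naive audit of the view-source HTML. The regexes below are a rough heuristic, not a real parser: if these elements only show up in the rendered DOM, the page depends entirely on JavaScript rendering.

```javascript
// Checks whether critical elements exist in the raw (pre-JS) HTML.
function auditRawHtml(html) {
  return {
    hasH1: /<h1[\s>]/i.test(html),
    hasParagraphs: /<p[\s>]/i.test(html),
    hasInternalLinks: /<a\s[^>]*href="\//i.test(html),
  };
}

// A client-side-rendered shell typically fails all three checks:
const csrShell = '<div id="root"></div><script src="/bundle.js"></script>';
console.log(auditRawHtml(csrShell)); // { hasH1: false, hasParagraphs: false, hasInternalLinks: false }
```

Run it against the raw response body of each critical template; any `false` on a ranking page is a red flag.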
What mistakes should be avoided when migrating to SSR or SSG?
The most common: migrating to SSR without optimizing TTFB (Time to First Byte). A poorly configured Node.js server may return the HTML in 800 ms rather than 150 ms. You gain indexability but lose on Core Web Vitals: LCP degrades, and Google penalizes you on another criterion. Always measure before and after.
Another pitfall: keeping critical content in lazy-loading even after SSR migration. Some devs leave `loading="lazy"` on above-the-fold images or essential content iframes. The HTML is indeed present, but the content remains invisible at the first render. Google may not see it or may deprioritize it.
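A crude audit for this pitfall: flag `loading="lazy"` on the first images in the document, which are usually above the fold. The "first N in source order" cutoff is an assumption for the sketch; a real audit should use rendered layout position.

```javascript
// Returns the img tags among the first `aboveFoldCount` images that
// are lazy-loaded and therefore invisible at first render.
function findBadLazyImages(html, aboveFoldCount = 2) {
  const imgs = html.match(/<img\b[^>]*>/gi) || [];
  return imgs
    .slice(0, aboveFoldCount)
    .filter((tag) => /loading\s*=\s*["']lazy["']/i.test(tag));
}

const page =
  '<img src="/hero.jpg" loading="lazy"><p>...</p><img src="/footer.jpg" loading="lazy">';
console.log(findBadLazyImages(page, 1)); // flags the hero image only
```

The fix is usually to switch the hero image to eager loading (drop the attribute, or add `fetchpriority="high"`) while keeping lazy-loading below the fold.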
How can I check if my site meets Google's expectations?
Set up server log monitoring to isolate Googlebot requests. Analyze the User-Agent, the ratio of 200 vs 5xx, and above all, the presence or absence of requests to your JavaScript bundles. If Googlebot never downloads your scripts, it is settling for the raw HTML — and that’s precisely what Google recommends.
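The log analysis above can be sketched over combined-format access log lines. The sample lines are fabricated; the field positions follow the common Apache/Nginx combined log format, and matching "Googlebot" in the user-agent string is a simplification (a real audit should also verify the IP against Google's published ranges).

```javascript
// Filters Googlebot hits and summarizes status codes and JS bundle requests.
function analyzeGooglebot(logLines) {
  const bot = logLines.filter((l) => l.includes("Googlebot"));
  const status = (l) => (l.match(/" (\d{3}) /) || [])[1];
  return {
    total: bot.length,
    ok: bot.filter((l) => status(l) === "200").length,
    serverErrors: bot.filter((l) => /^5/.test(status(l) || "")).length,
    jsRequests: bot.filter((l) => /GET [^ ]*\.js[ ?"]/.test(l)).length,
  };
}

const sample = [
  '66.249.66.1 - - [01/Jan/2025:10:00:00 +0000] "GET /product/42 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
  '66.249.66.1 - - [01/Jan/2025:10:00:01 +0000] "GET /bundle.js HTTP/1.1" 200 90210 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
  '10.0.0.5 - - [01/Jan/2025:10:00:02 +0000] "GET /product/42 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
];
console.log(analyzeGooglebot(sample)); // { total: 2, ok: 2, serverErrors: 0, jsRequests: 1 }
```

A high `serverErrors` ratio or a `jsRequests` count of zero on a client-rendered site both warrant investigation.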
Complement this with regular rendering tests via the URL Inspection tool in Search Console (the standalone Mobile-Friendly Test has been retired). Automate these tests on your critical templates (product page, blog article, category page). A framework or CDN change can break rendering without you noticing it immediately in production.
- Audit the “Discovered – Not Indexed” pages in Google Search Console to detect JS rendering issues.
- Compare the source HTML and the rendered DOM using the URL Inspection tool in Google Search Console.
- Migrate high-traffic pages to SSR or SSG (Next.js, Nuxt, SvelteKit) if the site is currently in pure client-side rendering.
- Measure TTFB before/after migration to avoid degrading Core Web Vitals.
- Remove lazy-loading from above-the-fold content and critical indexing elements.
- Monitor server logs to analyze Googlebot's real behavior regarding JavaScript resources.
❓ Frequently Asked Questions
Does Google really index all JavaScript-generated content?
Is Server-Side Rendering (SSR) mandatory to rank in 2025?
Which JavaScript frameworks are currently the most SEO-friendly?
How can I test whether Googlebot executes my JavaScript correctly?
Does image lazy-loading block the indexing of textual content?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 8 min · published on 12/06/2019