Official statement
Martin Splitt makes it clear: Google crawls and renders JavaScript without trouble, but when it comes to performance, it’s a different story. Client-side JS negatively impacts loading speed and is less predictable than server rendering. In practical terms? If you don’t have a strong technical reason to use client-side JS, avoid it. It’s not about indexing; it’s about Core Web Vitals and user experience.
What you need to understand
Why does Google distinguish between server-side rendering and client-side rendering?
The nuance is crucial. Google indexes JavaScript perfectly well, whether it’s rendered client-side or server-side. Googlebot executes the JS, waits for the DOM to stabilize, then indexes the content. No crawling or comprehension issues.
But indexing does not mean performing. Client-side JS forces the browser to download, parse, and execute code before displaying anything. On the server side, HTML arrives ready. The difference? Several hundred milliseconds on average connections. And those milliseconds weigh heavily in Core Web Vitals.
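To make that difference concrete, here is a minimal sketch with two hypothetical Express route handlers: one returns HTML that already contains the content, the other returns an empty shell plus a bundle the browser still has to download, parse, and execute before anything appears. The routes and bundle name are illustrative only.

```typescript
import express from "express";

const app = express();

// Server-side rendering: the response already contains the visible content.
app.get("/ssr", (_req, res) => {
  res.send("<html><body><h1>Product</h1><p>Description visible immediately.</p></body></html>");
});

// Client-side rendering: an empty shell plus a script the browser must
// download, parse and execute before anything meaningful is displayed.
app.get("/csr", (_req, res) => {
  res.send('<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>');
});

app.listen(3000);
```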
What makes client-side JavaScript “less predictable”?
Google does not elaborate, and this is intentional. Less predictable means more friction points: JS errors that break rendering, external dependencies that are slow to load, race conditions between scripts, React/Vue hydration blocking interactivity.
On the server side, these risks don’t exist. The HTML arrives complete, stable, and immediately usable. Googlebot doesn’t have to wait for three JS bundles to load in the correct order to understand your page. It’s this stability that Splitt values here.
Is this recommendation contradicted by Google’s previous statements?
No, it refines them. For years, Google has repeated: “we handle JS.” That’s true for indexing, but false for performance. Splitt doesn’t say “never use JS” — he says “use it only when necessary.”
The real target of this statement? Websites that pull in React or Vue just to display static text. Blogs built as full SPAs. Landing pages that load 400 KB of JS for a single form. It’s wasteful, and Google knows it.
- Google crawls and indexes client-side JavaScript without major technical difficulties
- The issue is not indexing, but the negative performance impact (LCP, CLS, INP)
- Server-side rendering (SSR, SSG, static HTML) remains faster, more stable, and more predictable for Google and users
- Client-side JS should be reserved for truly dynamic interactions (filters, interactive maps, dashboards)
- This position is consistent with Google’s policy on Core Web Vitals and UX
SEO Expert opinion
Does this recommendation truly reflect real-world observations?
Yes, without ambiguity. Performance audits consistently show that fully client-side JS websites have disastrous PageSpeed scores compared to their SSR or static counterparts: LCP above 3 seconds, TBT through the roof, degraded INP. Core Web Vitals are unforgiving.
And contrary to what some developers believe, SSR is not just a trend. Next.js, Nuxt and Astro all exist precisely because full client-side rendering has become untenable. Sites migrating to SSR routinely cut their loading times by 30-50%. That’s not marginal.
What nuances should be added to this statement?
Splitt does not provide any figures or thresholds. How many KB of JS is acceptable? What performance delta becomes problematic? It depends on each project. An e-commerce site with dynamic filters will necessarily carry more JS than a blog, and that’s justified.
He also says “less predictable”, but less predictable than what, exactly? Less than static HTML, obviously. But what about SSR with partial hydration? Streaming SSR? Islands architecture? Google doesn’t make these distinctions, yet they matter greatly in practice.
In which cases does this rule not apply?
Complex web applications have no choice. A real-time dashboard, a SaaS tool, a collaborative platform — all of these require heavy client-side JS. Google knows this, and these sites are not penalized as long as performance remains acceptable.
The real issue lies with content sites that use JS without a valid reason. A WordPress showcase site that loads React just to animate a menu? A Gatsby blog shipping 300 KB of bundles to display Markdown articles? That’s indefensible.
Practical impact and recommendations
What should you concretely do on an existing site?
First, audit your actual usage of JavaScript. Open DevTools, Network panel, filter by JS. How many KB? How many requests? What is the blocking time before First Contentful Paint? If you exceed 150-200 KB of JS for a content site, you have a problem.
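If you prefer a script over counting requests by hand, here is a small sketch you can paste into the DevTools console. It only uses the standard Resource Timing API; note that transferSize reports 0 for cross-origin scripts served without a Timing-Allow-Origin header.

```typescript
// Rough JS weight audit for the current page, using the Resource Timing API.
const jsEntries = performance
  .getEntriesByType("resource")
  .filter(
    (e): e is PerformanceResourceTiming =>
      e instanceof PerformanceResourceTiming &&
      (e.initiatorType === "script" || e.name.split("?")[0].endsWith(".js"))
  );

// transferSize is 0 for cross-origin resources without Timing-Allow-Origin.
const totalKB = jsEntries.reduce((sum, e) => sum + e.transferSize, 0) / 1024;
console.log(`${jsEntries.length} JS requests, ~${Math.round(totalKB)} KB transferred`);
```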
Next, identify what is critical and what is not. An image carousel can be lazy-loaded. A dropdown menu can be CSS-only. Animations can use CSS transitions instead of GSAP. Every KB saved improves your metrics.
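As an illustration of the “lazy-load what isn’t critical” idea, here is a sketch that defers a carousel bundle until its container scrolls into view. The "#carousel" selector and the "./carousel.js" module are hypothetical names for this example.

```typescript
// Only load the carousel bundle when its container becomes visible.
const container = document.querySelector("#carousel");

if (container) {
  const observer = new IntersectionObserver(async (entries, obs) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      obs.disconnect(); // load once, then stop observing
      // Code-split chunk, fetched only on demand (hypothetical module).
      const { initCarousel } = await import("./carousel.js");
      initCarousel(container);
    }
  });
  observer.observe(container);
}
```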
What mistakes to avoid when migrating to less JS?
Don’t break user experience under the pretext of optimization. Removing JS without a functional alternative is worse than keeping it. If your users depend on dynamic filters, keep them — but optimize their implementation (code splitting, lazy loading, debouncing).
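For example, here is a sketch of a debounced filter input: the filtering logic only runs once the user stops typing, which keeps main-thread work and INP in check. The "#filter-query" selector and the applyFilters function are placeholders.

```typescript
// Hypothetical filter routine; a real one would re-render the product list.
function applyFilters(query: string): void {
  console.log(`Filtering products on "${query}"`);
}

// Generic debounce helper: delays fn until delayMs have passed without a new call.
function debounce<T extends (...args: any[]) => void>(fn: T, delayMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>): void => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// "#filter-query" is a placeholder selector for the filter's text input.
const input = document.querySelector<HTMLInputElement>("#filter-query");
if (input) {
  input.addEventListener("input", debounce(() => applyFilters(input.value), 250));
}
```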
Another classic pitfall: switching to SSR without understanding hydration. Poor SSR can be as slow as full client-side if hydration blocks everything. Test under real conditions, not just locally on fiber optic.
How can I check if my site complies with Google's recommendations?
The Core Web Vitals are your only reliable indicator. PageSpeed Insights, Search Console (Core Web Vitals report), and the Chrome UX Report (CrUX) give you real-world field data. If your LCP is under 2.5 seconds and your INP is under 200 ms, you’re in the clear, regardless of your JS stack.
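If you want to pull that field data programmatically, the CrUX API exposes the same metrics. A minimal sketch, assuming you have created an API key in Google Cloud; YOUR_API_KEY and the origin below are placeholders.

```typescript
// Query the CrUX API for the 75th-percentile field metrics of an origin.
const response = await fetch(
  "https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=YOUR_API_KEY",
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      origin: "https://example.com",
      formFactor: "PHONE",
      metrics: ["largest_contentful_paint", "interaction_to_next_paint"],
    }),
  }
);

const { record } = await response.json();
// The p75 values are what the Core Web Vitals thresholds apply to.
console.log("LCP p75 (ms):", record.metrics.largest_contentful_paint.percentiles.p75);
console.log("INP p75 (ms):", record.metrics.interaction_to_next_paint.percentiles.p75);
```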
Use Lighthouse with throttling enabled to simulate average connections. A score of 90+ on desktop means nothing if you drop to 40 on mobile over 3G. Test on real devices, not just in Chrome’s device emulation.
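For automated checks, Lighthouse can also be run programmatically with simulated mobile throttling. A sketch assuming the lighthouse and chrome-launcher npm packages are installed; exact option names may vary between Lighthouse versions.

```typescript
// Programmatic Lighthouse run with mobile emulation and simulated throttling.
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });

const result = await lighthouse("https://example.com", {
  port: chrome.port,
  onlyCategories: ["performance"],
  formFactor: "mobile",         // mobile device emulation
  throttlingMethod: "simulate", // Lighthouse's simulated slow-4G / CPU slowdown
});

const score = result?.lhr.categories.performance.score ?? 0;
console.log(`Mobile performance score: ${Math.round(score * 100)}`);

await chrome.kill();
```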
- Audit your current JavaScript: identify unnecessary bundles, outdated dependencies, redundant polyfills
- Prefer SSR or static generation for all non-dynamic content (product pages, articles, landing pages)
- Implement code splitting and lazy loading to defer the loading of non-critical JS
- Monitor your Core Web Vitals in Search Console and fix pages that exceed thresholds
- Test under real conditions (mobile, network throttling) before deploying major changes
- Document technical choices: each JS library must have a clear justification
❓ Frequently Asked Questions
Does Google penalize sites that use a lot of client-side JavaScript?
Is server-side rendering (SSR) mandatory to rank well?
How much client-side JavaScript is acceptable to Google?
Are frameworks like React or Vue discouraged for SEO?
How can I tell whether my JavaScript is hurting my SEO performance?