Official statement
Martin Splitt confirms that AJAX works for SEO if implemented correctly, but adds that it is not an ideal technology for search optimization. Every AJAX request introduces additional failure points: blocking by robots.txt, network errors, and timeouts. In practice, avoid using AJAX to load critical content unless a genuine UX constraint justifies it.
What you need to understand
Why does Google see AJAX as an avoidable complexity?
AJAX adds a JavaScript execution layer between the server and the final content. Unlike static HTML delivered directly, content loaded via AJAX requires Googlebot to execute the JavaScript, wait for the network request, and then parse the result.
Each step represents a potential failure point. If the JavaScript file is blocked by robots.txt, the content won't load. If the AJAX request times out after 5 seconds, Googlebot sees nothing. If the API returns a 500 error, the content disappears for Google.
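The compounding effect of these failure points can be sketched as a toy model (every name and threshold below is an illustrative assumption, not Google's actual values):

```python
# Toy model of the failure points an AJAX-only content block adds.
# GOOGLEBOT_TIMEOUT_S is the assumed render budget, per the figure
# cited above; Google does not publish an official value.
GOOGLEBOT_TIMEOUT_S = 5

def content_seen_by_googlebot(js_blocked_by_robots: bool,
                              api_status: int,
                              api_latency_s: float) -> str:
    """Return what Googlebot ends up with for an AJAX-loaded block."""
    if js_blocked_by_robots:
        return ""                     # script never runs, nothing is fetched
    if api_latency_s > GOOGLEBOT_TIMEOUT_S:
        return ""                     # request abandoned, block stays empty
    if api_status != 200:
        return ""                     # 4xx/5xx, the content disappears
    return "<h1>Product name</h1>"    # only the happy path yields content

# Each failure point independently wipes the content:
print(content_seen_by_googlebot(True, 200, 0.2))   # robots.txt block
print(content_seen_by_googlebot(False, 500, 0.2))  # API error
print(content_seen_by_googlebot(False, 200, 8.0))  # timeout
print(content_seen_by_googlebot(False, 200, 0.2))  # happy path
```

Static HTML short-circuits all three branches: the content is already in the response body.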
What makes AJAX “functional but not fantastic”?
Google has been able to crawl and index AJAX sites for years. JavaScript rendering has matured, and Googlebot handles well-built Single Page Applications properly.
The issue isn't technical; it's one of relative reliability. A site that delivers all its HTML from the server eliminates all of those risks at once. AJAX introduces dependencies: the availability of the JavaScript CDN, network speed, and Googlebot's rendering capacity at crawl time.
When is AJAX still acceptable for SEO?
Splitt doesn't say “never use AJAX.” He says: avoid it unless absolutely necessary. If your UX demands real-time interactions, partial page updates, or a seamless app-like experience, AJAX makes sense.
However, to load a critical content block — H1 title, main paragraph, internal linking — prioritize server-side rendering. Reserve AJAX for secondary elements: infinite pagination, product filters, non-essential deferred loads.
- Static HTML or SSR: zero failure points, immediate crawl, guaranteed indexing
- Well-implemented AJAX: works but adds complexity, latency, error risks
- Critical content: always server-side, never loaded in asynchronous JavaScript
- Secondary elements: AJAX acceptable if the UX truly justifies it
- Mandatory monitoring: Search Console, crawl logs, rendering tests to detect failures
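That split can be sketched as a minimal, hypothetical template function: critical content is rendered into the initial HTML, while a secondary block (customer reviews) is left as a placeholder for a client-side fetch.

```python
# Sketch of the recommended split: server-side rendering for critical
# content, AJAX only for secondary elements. render_product_page is a
# hypothetical template function; all paths and names are examples.
def render_product_page(name: str, description: str) -> str:
    """H1, main copy and internal links ship in the initial HTML;
    only the reviews block depends on client-side JavaScript."""
    return f"""<!doctype html>
<html><body>
  <h1>{name}</h1>
  <p>{description}</p>
  <nav><a href="/category/shoes">Shoes</a></nav>
  <div id="reviews"></div>  <!-- filled later via fetch('/api/reviews') -->
  <script src="/assets/js/reviews.js" defer></script>
</body></html>"""

html = render_product_page("Trail shoe X", "A lightweight trail shoe.")
# Googlebot sees the H1, copy and links even if /api/reviews fails:
print("<h1>Trail shoe X</h1>" in html)
```

If the reviews API times out, only the secondary block is lost; the indexable core of the page is unaffected.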
SEO Expert opinion
Is this position consistent with field observations?
Absolutely. For years, we have observed that SSR or static HTML indexes faster and more completely than full JavaScript SPAs. Even with high-performing Googlebot rendering, there is a measurable delta.
The classic problems: timeouts during JavaScript rendering, content loaded too late after the first paint, transient network errors that go unnoticed from the user side but block Googlebot. An e-commerce site loading its product sheets via AJAX takes an unnecessary risk if server rendering is possible.
What nuances should be added to this statement?
Splitt talks about “avoidable complexity unless absolutely necessary.” The issue is that many developers see AJAX as a default necessity when it is often a matter of technical convenience.
The real question: does your framework impose AJAX or is it an architectural choice? If you're on React/Vue/Angular in pure SPA mode, migrating to SSR (Next.js, Nuxt, etc.) requires effort. But if you're building a new site, prioritizing server-side rendering from the start avoids all these problems. [To be verified]: Google does not provide specific figures on the indexing gap between pure SSR and CSR, but field audits consistently show a delta.
When does this rule not really apply?
If you're on a closed SaaS tool or a CMS that imposes AJAX without alternatives, you have no choice. In this case, focus on the most robust implementation possible: prerendering, progressive hydration, HTML fallbacks.
Another exception: real-time interfaces (dashboards, collaborative tools) where AJAX is intrinsic to the very concept of the product. But let's be honest — 90% of corporate, e-commerce, or editorial sites have no real constraint justifying loading the main content in asynchronous JavaScript.
Practical impact and recommendations
What should you do concretely if your site uses AJAX?
Your first step: audit what is loaded via AJAX. Open DevTools, look at the Network tab, and filter for XHR/Fetch. Identify precisely which content arrives after the first HTML render.
If critical content — titles, descriptions, internal links — appears only via AJAX, you have a problem. Prioritize its migration to server-side. If it's secondary content (customer reviews, similar products), it's less urgent but keep an eye on indexing in Search Console.
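That audit can be automated with a small comparison between the raw HTML and the rendered DOM (both can be copied from DevTools or Search Console). A minimal sketch, with example markup; the markers you check would be your own critical strings:

```python
def ajax_only_content(raw_html: str, rendered_html: str,
                      critical_markers: list[str]) -> list[str]:
    """Return the critical markers that appear only after JavaScript
    ran, i.e. content whose indexing depends entirely on AJAX."""
    return [m for m in critical_markers
            if m not in raw_html and m in rendered_html]

# Example inputs: an empty SPA shell versus its rendered DOM.
raw = "<html><body><div id='app'></div></body></html>"
rendered = ("<html><body><div id='app'><h1>Trail shoe X</h1>"
            "<a href='/category/shoes'>Shoes</a></div></body></html>")

markers = ["<h1>Trail shoe X</h1>", "/category/shoes"]
print(ajax_only_content(raw, rendered, markers))
# Any marker listed here is a candidate for server-side migration.
```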
How to verify that Googlebot can see your AJAX content?
Use the URL inspection tool in Search Console. Compare the raw HTML and the rendering after JavaScript. If an entire area is missing in the rendering, you have a failure.
Also test with a Googlebot user-agent from your browser or a tool like Screaming Frog in JavaScript rendering mode. Check the server logs: does Googlebot access the AJAX endpoints? If you see 403s, 500s, or timeouts, that's where the issue lies.
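The log check can be scripted. A minimal sketch assuming a combined access-log format; the sample lines and the `/api/` prefix are made-up examples:

```python
import re
from collections import Counter

# Spot Googlebot hits (and failures) on AJAX endpoints in an access log.
LOG_RE = re.compile(
    r'"\w+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*?"(?P<ua>[^"]*)"$')

def googlebot_api_hits(lines, api_prefix="/api/"):
    """Count HTTP status codes for Googlebot requests to AJAX endpoints."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("ua") \
             and m.group("path").startswith(api_prefix):
            counts[m.group("status")] += 1
    return counts

sample = [
    '66.249.66.1 - - [25/11/2020:10:00:00 +0000] "GET /api/reviews HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [25/11/2020:10:00:05 +0000] "GET /api/reviews HTTP/1.1" 500 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.3 - - [25/11/2020:10:00:07 +0000] "GET /api/reviews HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_api_hits(sample))  # any 5xx count is a rendering risk
```

Note that Googlebot's user-agent can be spoofed; for a real audit, confirm the IPs via reverse DNS as Google documents.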
What mistakes should you absolutely avoid with AJAX in SEO?
Never block JavaScript files or AJAX endpoints in robots.txt. This is the classic mistake: blocking /api/ or /assets/js/ reflexively, and Googlebot can no longer render the page correctly.
Avoid heavy dependency chains and slow server responses. If your API takes 8 seconds to respond, Googlebot will give up. Lastly, don't rely on AJAX for above-the-fold content: Google prioritizes content that is immediately visible in the initial HTML.
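The robots.txt mistake is easy to catch programmatically with Python's standard-library `urllib.robotparser`. A sketch reproducing the classic misconfiguration (the paths are examples):

```python
from urllib.robotparser import RobotFileParser

# Verify that JS assets and AJAX endpoints needed for rendering are not
# blocked for Googlebot. These rules reproduce the classic mistake of
# reflexively disallowing /api/ and /assets/js/.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /api/",
    "Disallow: /assets/js/",
])

for url in ("https://example.com/assets/js/app.js",
            "https://example.com/api/reviews",
            "https://example.com/products/trail-shoe-x"):
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "OK" if allowed else "BLOCKED: rendering will break")
```

Here the HTML page itself is crawlable, but the script and endpoint it depends on are not, so the rendered page loses its AJAX content.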
- Precisely identify what content is loaded via AJAX (DevTools, Network)
- Migrate critical content (H1, main paragraphs, internal linking) server-side
- Test Googlebot rendering in Search Console (URL inspection)
- Check robots.txt: no blocking of necessary JS/API for rendering
- Monitor crawl logs: timeouts, 5xx errors on AJAX endpoints
- Implement SSR or prerendering if complete migration is impossible
❓ Frequently Asked Questions
Does AJAX completely prevent indexing by Google?
Should you abandon Single Page Applications for SEO?
How can you tell whether Googlebot actually executes your AJAX JavaScript?
Can you block certain JavaScript files without impacting SEO?
Is prerendering an acceptable solution for AJAX and SEO?
Other SEO insights extracted from this same Google Search Central video · duration 46 min · published on 25/11/2020