
Official statement

Google uses an evergreen version of Chrome for web page rendering. This version is updated a few weeks after each new stable release of Chrome. The system automatically manages errors and retries in case of rendering failure.
🎥 Source video

Extracted from a Google Search Central video

⏱ 32:02 💬 EN 📅 10/12/2020 ✂ 12 statements
Watch on YouTube (3:47) →
Other statements from this video (11)
  1. 4:49 Does Google really render ALL crawled pages with JavaScript?
  2. 9:01 Does Google really use ALL your structured data, even invalid markup?
  3. 11:40 Does PageRank still really work the way we think it does?
  4. 13:49 Should you really give up buying quality links for your SEO?
  5. 15:23 Does Safe Search really apply during indexing?
  6. 15:54 How does Google detect the location and language of your pages at indexing time?
  7. 17:27 Are all indexing signals really ranking signals?
  8. 21:22 Client-side JavaScript: Google indexes it, but should you really use it for SEO?
  9. 23:38 Which JavaScript errors kill your crawl budget without you even knowing?
  10. 24:41 Why should SEOs get involved from the technical architecture phase of a web project?
  11. 27:18 Do you really need SEO perfection to rank?
TL;DR

Google claims to use an evergreen version of Chrome for rendering, updated a few weeks after each stable release. Essentially, your sites are crawled with a modern browser that supports ES6, JavaScript modules, and recent APIs. The catch? This lag of a few weeks can create discrepancies between what a visitor sees and what Googlebot indexes — especially if you're deploying experimental features without polyfills.

What you need to understand

What does "Evergreen Chrome" mean in the context of Google rendering?

An evergreen browser updates automatically without user intervention. Unlike older versions of Chrome that are frozen in time, Googlebot uses a version that continuously evolves, aligned with the public releases of Chrome.

Martin Splitt specifies that this version follows Chrome's stable updates with a lag of a few weeks. If Chrome 120 is released on December 5, Googlebot will likely switch to this version by the end of December or early January. It's not instantaneous, but it's predictable.

Why does this "few weeks" lag exist?

Google doesn't want to break the indexing of millions of sites with every micro-evolution of Chrome. The lag allows for testing the stability of the rendering engine in a controlled environment before mass deployment on Search infrastructure.

This delay also covers the automatic retry mechanisms: if a page fails during rendering, Googlebot will retry several times before giving up. A failed rendering does not mean zero indexing — the system demonstrates resilience.
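Google does not document its retry logic, but the general resilience pattern is easy to sketch. The following is purely illustrative — a retry-with-backoff loop where a transient failure (server overload, network hiccup) is absorbed, while a persistent one eventually surfaces:

```javascript
// Illustrative sketch only — Google's actual retry logic is not public.
// Retries a flaky render operation a few times, backing off between attempts.
async function renderWithRetries(renderFn, maxAttempts = 3, delayMs = 100) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await renderFn(); // success: return the rendered result
    } catch (err) {
      lastError = err; // transient failure: wait, then try again
      await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
    }
  }
  throw lastError; // persistent failure: the page stays unrendered
}

// A fake renderer that fails twice (e.g. a timeout), then succeeds.
let calls = 0;
const flakyRender = async () => {
  calls++;
  if (calls < 3) throw new Error("timeout");
  return "<html>rendered</html>";
};
```

The key takeaway mirrors the statement: a retry loop rescues transient failures, but a deterministic bug fails on every attempt and the page is never rendered.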

Which JavaScript features are actually supported by Googlebot?

With an evergreen version of Chrome, all recent stable APIs are supported: ES6+, fetch, IntersectionObserver, JavaScript modules, async/await, Web Components v1, and even most modern CSS features like grid and custom properties.

On the other hand, experimental or Origin Trial APIs are generally inactive on Googlebot. If your site relies on a bleeding-edge feature still under flag, it won't run during rendering.
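In practice, the safe pattern is feature detection with a fallback. The helper below is hypothetical (the function name and return values are ours), but it shows the idea: use `IntersectionObserver` for lazy reveal when it exists, and reveal the content immediately otherwise, so it stays indexable even in an environment where the API is missing:

```javascript
// Hypothetical helper: use a recent API only when it actually exists,
// and fall back to a safe path otherwise (Googlebot's Chrome may lag behind).
function observeLazily(element, onVisible) {
  if (typeof IntersectionObserver === "function") {
    const observer = new IntersectionObserver((entries) => {
      for (const entry of entries) {
        if (entry.isIntersecting) onVisible(entry.target);
      }
    });
    observer.observe(element);
    return "observer";
  }
  // Fallback: reveal the content immediately so it is still indexable.
  onVisible(element);
  return "fallback";
}
```

With an evergreen Googlebot, the observer branch runs; in any environment lacking the API, the content is still delivered rather than silently dropped.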

  • Evergreen Chrome: Googlebot tracks stable Chrome releases with a delay of a few weeks
  • Modern JavaScript support: ES6+, modules, async/await, fetch, IntersectionObserver are fully functional
  • Automatic retry: in case of rendering failure, the system retries before giving up
  • No experimental APIs: flagged features or those in Origin Trial are not available on the bot side
  • Predictability: you can anticipate the bot's capabilities by following the stable Chrome release schedule
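Since Splitt gives no exact lag, any prediction is a heuristic. As a rough planning aid — assuming a trail of one to two major versions, which the video does not guarantee — you might reason like this:

```javascript
// Rough heuristic only: the video gives no exact lag, so this assumes
// Googlebot trails stable Chrome by up to two major versions.
function likelyGooglebotVersions(currentStable, lag = 2) {
  const versions = [];
  for (let v = currentStable - lag; v <= currentStable; v++) versions.push(v);
  return versions; // the version range worth testing your site against
}
```

If stable Chrome is at 120, the prudent test matrix is Chrome 118–120 rather than assuming 120 alone.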

SEO Expert opinion

Is this statement consistent with real-world observations?

Overall, yes. Regular tests on Search Console and through comparative crawls show that Googlebot does indeed execute modern JavaScript without a hitch. Sites using React, Vue, or Angular with standard Babel transpilation index without issue.

But beware: the "few weeks" lag is vague. Splitt does not give a precise number — 2 weeks? 6? 8? This lack of transparency complicates planning for sites that adopt new features quickly. Note also that Google does not publish a public changelog of the Chrome versions deployed on Googlebot, which would make such planning far easier.

What nuances should be added to this statement?

First nuance: evergreen does not mean instantaneous. If you deploy a SPA that leverages an API that appeared in Chrome 119 while Googlebot is still on Chrome 117, your content may not display correctly during rendering.

Second nuance: the automatic retry system is good news, but it has its limits. If your JavaScript consistently fails (syntax error, blocked resource, timeout), even after multiple attempts, Googlebot will give up. The retry does not fix bugs — it only compensates for network hiccups or server load.

In what cases does this rule not apply?

If your site uses conditional polyfills detecting the user-agent, you risk serving different code to Googlebot than to real visitors. Some frameworks detect "legacy" browsers and load appropriate code — but if Googlebot is detected as a modern Chrome, it will receive the optimized bundle, which may sometimes lack fallbacks.
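Here is a sketch of that risky pattern (the function and threshold are illustrative, not from any real framework): choosing a bundle from the user-agent string. Googlebot's UA advertises a modern Chrome version, so it receives the optimized bundle — including any code path that assumes bleeding-edge APIs are present:

```javascript
// Sketch of the risky UA-sniffing pattern described above (names are ours).
// A modern Chrome token in the UA triggers the optimized, fallback-free bundle.
function chooseBundle(userAgent) {
  const match = userAgent.match(/Chrome\/(\d+)/);
  if (match && Number(match[1]) >= 90) return "modern"; // no legacy fallbacks
  return "legacy";
}

// Googlebot's evergreen UA contains a Chrome version token.
const googlebotUA =
  "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; " +
  "Googlebot/2.1; +http://www.google.com/bot.html) Chrome/118.0.0.0 Safari/537.36";
```

Because the bot is classified as "modern", any missing fallback in that bundle becomes a rendering gap that only Googlebot experiences.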

Another edge case: sites that perform aggressive feature detection and display content only if a very recent API is available. If this API is not yet in Googlebot's Chrome version, the content remains invisible. Let's be honest: this is rare, but it happens on sites heavily oriented towards PWA or experimental WebAssembly.

Warning: never assume that Googlebot instantly supports the latest version of Chrome. Test with the URL inspection tool in Search Console before deploying code that leverages just-stabilized APIs.

Practical impact and recommendations

What concrete steps should you take to ensure optimal compatibility?

First, audit your JavaScript stack. If you use Babel or a transpiler, configure it to target at least Chrome stable -2 versions (since Googlebot lags a few weeks). Avoid transpiling to ES5 if you don’t need to support IE11 — you will unnecessarily bloat the bundles.
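A minimal Babel configuration along those lines might look like this — a sketch, not a universal recommendation: the "last 2 Chrome versions" target is our assumption matching the "few weeks" lag, and your real browserslist should also reflect your actual audience:

```javascript
// babel.config.js — minimal sketch. "last 2 Chrome versions" is an
// assumption aligned with Googlebot's lag, not an official Google value.
module.exports = {
  presets: [
    [
      "@babel/preset-env",
      {
        // Avoid ES5 output unless you must support very old browsers:
        // targeting recent Chrome keeps bundles lean.
        targets: "last 2 Chrome versions, not dead",
        bugfixes: true,
      },
    ],
  ],
};
```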

Next, systematically test with the URL inspection tool in Search Console after each major deployment. This tool shows you exactly what Googlebot managed to render, including console errors and blocked resources. If rendering fails, you’ll see why.

What mistakes should you absolutely avoid?

Never block your JavaScript or CSS resources in robots.txt. This is the classic mistake that prevents rendering even if Googlebot supports your code. Even though Google uses modern Chrome, if the JS files are forbidden, the bot cannot execute anything.
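A quick sanity check on your robots.txt: rendering resources must be crawlable. The paths below are placeholders for illustration — adapt them to your own asset structure:

```
# robots.txt — keep rendering resources crawlable (paths are examples)
User-agent: *
Allow: /assets/js/
Allow: /assets/css/
Disallow: /admin/
```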

Also avoid relying solely on bleeding-edge APIs without a fallback. If your site displays critical content via an API that's in phase 3 of TC39 but not yet in stable Chrome, have a plan B. Rendering will fail silently, and your content will disappear from the index.

How can I check if my site is compliant with Google rendering?

Use Puppeteer or Playwright in headless mode with the current stable version of Chrome -1 or -2 to simulate what Googlebot sees. If your content displays correctly, you’re probably safe. Otherwise, debug before Google discovers the issue.

Also activate JavaScript error monitoring via Sentry or an equivalent tool, including the Googlebot user-agent in your filters. If the bot generates errors that real users don’t see, you’ll know there’s a specific compatibility issue.
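Sentry exposes a real `beforeSend` hook; the tagging scheme below is our own illustration. Rather than dropping bot events, annotate them so bot-only JavaScript errors can be isolated in the dashboard:

```javascript
// Illustrative tagging hook (Sentry's `beforeSend` callback is real; the
// tagging scheme is ours). Marks events coming from Googlebot so bot-only
// JavaScript errors can be filtered and compared against real-user errors.
const GOOGLEBOT_RE = /Googlebot/i;

function beforeSend(event) {
  const ua = event.request?.headers?.["User-Agent"] || "";
  event.tags = { ...event.tags, googlebot: GOOGLEBOT_RE.test(ua) };
  return event; // keep the event; just annotate it
}
```

If the `googlebot: true` slice shows errors the `googlebot: false` slice never does, you have a bot-specific compatibility problem worth investigating.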

  • Configure Babel to target Chrome stable -2 versions minimum
  • Test each major deployment with the Search Console URL inspection tool
  • Never block JS/CSS in robots.txt — check your file today
  • Add fallbacks for any recent API not yet stabilized in Chrome
  • Implement JS error monitoring filtered on Googlebot user-agent
  • Simulate rendering with Puppeteer using Chrome stable -1 or -2 locally
Google's evergreen rendering is great news for modern sites, but the few weeks lag requires caution. Regular testing, avoiding experimental APIs without fallback, and ensuring your critical resources are accessible to the bot are essential. These technical optimizations may seem straightforward in theory, but effectively implementing them at scale on a complex site often requires thorough auditing and regular monitoring. If your team lacks the time or specific expertise in JavaScript rendering for Google, hiring a specialized SEO agency could help you avoid costly mistakes and ensure optimal indexing of your dynamic content.

❓ Frequently Asked Questions

Does Googlebot use exactly the same version of Chrome as the one installed on my computer?
No. Googlebot uses a stable version of Chrome updated a few weeks after the public release. If you are on Chrome 120, Googlebot may still be running Chrome 118 or 119.
If my JavaScript crashes during rendering, does Googlebot automatically retry?
Yes. The rendering system includes automatic retry mechanisms on failure. But if the error is systematic (a bug in the code, a blocked resource), rendering will ultimately fail even after several attempts.
Are experimental JavaScript APIs supported by Googlebot?
No. Features in Origin Trial or behind a flag in Chrome are generally not enabled on the Googlebot side. Only stable APIs shipped in stable Chrome are available.
Do I still need to transpile my JavaScript for Googlebot in 2025?
It depends. ES6+, async/await, and modules are supported natively. But if you use very recent syntax (stage 3 TC39), transpiling to Chrome stable -2 versions is still recommended to avoid unpleasant surprises.
How can I find out which Chrome version Googlebot is currently using for my site?
Google does not officially publish this information in real time. You can infer the version by inspecting console errors in the Search Console URL inspection tool, but there is no public changelog of the rendering engine.

