
Official statement

Googlebot does not operate with cookies enabled, so it will only see the content presented to new users. To expose different content (e.g., promotions for existing users), it is advisable to create separate landing pages accessible via links rather than relying on cookies.
🎥 Source video

Extracted from a Google Search Central video

⏱ 28:49 💬 EN 📅 01/07/2020 ✂ 23 statements
Watch on YouTube (1:02) →
Other statements from this video (22)
  1. 0:33 Why does Googlebot ignore your cookies, and how should you adapt your personalized-content strategy?
  2. 1:02 Can you redirect logged-in users to different URLs without an SEO penalty?
  3. 1:35 Does switching JavaScript frameworks make your Google rankings drop?
  4. 1:35 Does switching JavaScript frameworks really ruin your SEO?
  5. 4:46 Is rendered HTML really enough to guarantee JavaScript indexing?
  6. 4:46 How can you check whether your JavaScript content is actually indexable by Google?
  7. 5:48 Is content behind login really invisible to Google?
  8. 5:48 Is content behind a login really invisible to Google?
  9. 6:47 Should you really redirect Googlebot to www to work around CORB errors?
  10. 8:42 Should Googlebot be treated differently from users when handling redirects?
  11. 11:20 Should you really hide consent banners from Googlebot to improve its crawl?
  12. 11:20 Should you show consent screens to Googlebot at the risk of being penalized for cloaking?
  13. 14:00 How can you pinpoint exactly which elements degrade your Cumulative Layout Shift?
  14. 18:18 Why do your PageSpeed testing tools show contradictory LCP and FCP scores?
  15. 19:51 Why will your URLs with a hash (#) never be indexed by Google?
  16. 20:23 Do you really need to remove hashes from sports-event URLs to get them indexed?
  17. 23:32 Pre-rendering for Googlebot: should you really do without it?
  18. 24:02 Should you really disable JavaScript on your pre-rendered pages for Googlebot?
  19. 26:42 Does JSON-LD really slow down your load time?
  20. 26:42 Is FAQ Schema markup really useless for your product pages?
  21. 26:42 Does JSON-LD FAQ Schema really slow down your site?
  22. 26:42 Does FAQ Schema markup hurt your conversion rate?
Official statement (published 01/07/2020)
TL;DR

Googlebot does not use any cookies while crawling — it only sees what an anonymous visitor sees on a first visit. The direct consequence: any content displayed conditionally via cookies (user-specific promotions, geolocation variants, A/B variations) remains invisible to Google. The solution? Create separate landing pages, accessible via standard internal links, instead of relying on client-side mechanisms.

What you need to understand

Why does Googlebot ignore cookies?

Googlebot operates in a clean browsing environment — no cookies, no history, no user sessions. This technical choice ensures that the bot consistently crawls the default content, the one presented to an anonymous visitor arriving for the first time.

This approach avoids two pitfalls: on one hand, preventing Google from indexing highly personalized content that only makes sense for a sub-group of users; on the other hand, maintaining crawl consistency across different sessions. If Googlebot accepted cookies, each pass could trigger a different version of the site — a nightmare for indexing.
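The difference between a cookie-storing browser and a stateless crawler can be sketched as follows. This is a hypothetical toy model (the server, class names, and promo text are all illustrative, not any real Googlebot internals): the "browser" replays cookies and unlocks personalized content on its second visit, while the "crawler" discards every `Set-Cookie` and sees the same default content on every pass.

```python
# Toy model: a browser session that stores cookies vs a stateless,
# Googlebot-like crawler that ignores Set-Cookie entirely. All names
# and content here are illustrative.

def serve(request_cookies):
    """Fake server: greets returning visitors differently and sets a cookie."""
    if request_cookies.get("returning") == "1":
        body = "Welcome back! Here is your -20% loyalty promo."
    else:
        body = "Welcome! Default content for anonymous visitors."
    return body, {"Set-Cookie": "returning=1"}

class BrowserSession:
    def __init__(self):
        self.cookies = {}

    def fetch(self):
        body, headers = serve(self.cookies)
        # A real browser stores the cookie and replays it on the next request.
        name, value = headers["Set-Cookie"].split("=")
        self.cookies[name] = value
        return body

class StatelessCrawler:
    def fetch(self):
        # Googlebot-like behaviour: no cookie jar, so every visit is a first visit.
        body, _ignored_headers = serve({})
        return body

browser = BrowserSession()
browser.fetch()                 # first visit: default content
second_visit = browser.fetch()  # second visit: sees the loyalty promo

crawler = StatelessCrawler()
crawl_1 = crawler.fetch()
crawl_2 = crawler.fetch()       # identical to crawl_1: always the default content
```

The promo exists on the server, yet the crawler can never trigger it: that is exactly the class of content this statement warns about.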

What does this mean for client-side dynamic content?

Many sites deploy conditional content via JavaScript and cookies: promotional banners for returning visitors, personalized recommendation modules, landing page variations based on previous user journeys. All this content remains invisible to Googlebot unless a static HTML link leads to it.

Let's be honest: this statement does not revolutionize anything. Experienced practitioners have long known that Google does not execute complex browsing scenarios — it follows links, renders initial HTML and JavaScript, and then moves on to the next page.

Which site architectures are affected?

E-commerce sites that display promotions specific to loyal users, SaaS platforms with differentiated onboarding, media outlets that personalize their homepage based on reading history — all share a common risk: a substantial portion of their content remains off Google’s radar.

The same observation applies to poorly configured A/B tests. If variant B only displays after a cookie is set during the first visit, Google will never see it — unless it has a distinct URL or an unconditional rendering mechanism.

  • Googlebot crawls without cookies — it only sees the default content of an anonymous visitor.
  • Any conditional content based on cookies, sessions, or user history remains invisible.
  • The recommended solution: create separate landing pages accessible via standard HTML links.
  • A/B tests should provide exposure through parameterized URLs or server-side rendering, not just client-side.
  • This rule also applies to cookie-based geolocation, temporary banners, and personalized recommendation modules.

SEO Expert opinion

Is this statement consistent with observed practices in the field?

Absolutely — and it always has been. Crawl logs consistently show that Googlebot sends requests without a Cookie header, does not store any Set-Cookie it receives in return, and never replays a session between visits. Crawl analysis tools (Oncrawl, Botify, Screaming Frog in Googlebot mode) confirm this behavior.

What is surprising is that Google still has to reiterate this principle amid the shift towards JavaScript-heavy rendering. Many developers confuse “Googlebot executes JavaScript” with “Googlebot behaves like a modern browser with all its capabilities.” The bot remains a performance-oriented crawler, not a user-session simulator.

What nuances should be added to this rule?

First point: Googlebot can theoretically receive cookies if the server sends them — but it will not store them or return them in subsequent requests. In practice, if your backend sets an analytics tracking cookie, it does not impact indexing.

Second nuance — and this is where it gets tricky: Google never specifies how long it waits for JavaScript rendering before capturing the final DOM. If a promotion appears after 8 seconds via conditional lazy loading, it is unclear whether Googlebot waits long enough. Field tests suggest a timeout between 5 and 10 seconds depending on DOM complexity, but nothing official.

In what scenarios does this rule require a technical overhaul?

Three classic scenarios call for intervention. First, e-commerce sites with segmented content: if your best promotions only show to returning users via cookies, Google will never index them. Creating an accessible landing page such as /promo/black-friday solves the issue.

Next, SaaS platforms with personalized onboarding. If the discovery funnel changes based on the visitor's origin (cookie set via AdWords campaign, previous session), you lose the ability to index these variations. The solution: expose each journey via a parameterized URL (?onboarding=business vs ?onboarding=freelance).

Beware of poorly configured A/B tests: if your testing tool (Optimizely, VWO, Google Optimize) switches between variants solely via client cookie, Google will only see the default variant. Prefer a server-side split-testing system or exposure via URL parameter with properly managed canonical.

Practical impact and recommendations

What concrete steps should be taken to expose personalized content to Google?

The most robust method remains to create distinct URLs for each content variant you wish to index. Example: instead of displaying a “-20% loyal customers” banner via cookie, create a landing page /promo/loyalty accessible directly and linked from your menu or your sitemap.

An alternative for A/B tests: use a URL parameter combined with server-side rendering. If Googlebot crawls /page?variant=b, the server directly returns variant B without waiting for a cookie or client-side script. Add a canonical tag to the main URL to avoid duplicate content — but at least Google sees the variant.
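The server-side alternative described above can be sketched in a few lines. This is a minimal, framework-free illustration (the URLs, parameter name `variant`, and page content are all assumptions for the example): the variant is resolved from the query string on the server, and the canonical tag always points back to the main URL.

```python
# Hypothetical sketch: server-side A/B variant selection driven by a URL
# parameter, with a canonical tag pointing to the main URL. No cookie or
# client-side script is involved, so a crawler sees the variant directly.

from urllib.parse import urlparse, parse_qs

VARIANTS = {
    "a": "<p>Variant A: standard hero banner</p>",
    "b": "<p>Variant B: promo-focused hero banner</p>",
}

def render_page(url):
    """Return full HTML for the requested URL, resolving the A/B variant
    server-side from the ?variant= query parameter."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    variant = params.get("variant", ["a"])[0]
    body = VARIANTS.get(variant, VARIANTS["a"])
    canonical = f"https://example.com{parsed.path}"  # always the main URL
    return (
        "<html><head>"
        f'<link rel="canonical" href="{canonical}">'
        f"</head><body>{body}</body></html>"
    )

# A crawler requesting /page?variant=b receives variant B in the raw HTML:
html = render_page("https://example.com/page?variant=b")
```

Because the canonical tag strips the parameter, the variant URL does not compete with the main page for indexing, yet its content is fully visible in the served HTML.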

What mistakes should be avoided during migration?

First mistake: thinking that exposing content via asynchronous JavaScript is enough. If this content only appears conditionally after checking a local cookie, Googlebot will never see it — even if the JS executes correctly. Rendering does not compensate for the absence of a session.

Second trap: multiplying landing pages without a consistent internal linking strategy. Creating /promo/christmas, /promo/black-friday, /promo/loyalty is pointless if no internal link leads to them. Googlebot follows links — if these pages remain orphaned or are only accessible via forms, they will never be crawled.
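Orphan detection like this boils down to a reachability check over the internal link graph. A minimal sketch, assuming a hardcoded link graph (in practice the edges would come from a crawl export): breadth-first search from the homepage, then report every known page that was never reached.

```python
# Hypothetical sketch: find orphan pages by walking the internal link graph
# from the homepage. The graph below is illustrative; real data would come
# from a crawler export.

from collections import deque

LINK_GRAPH = {
    "/": ["/products", "/promo/black-friday"],
    "/products": ["/"],
    "/promo/black-friday": [],
    "/promo/christmas": [],   # page exists, but nothing links to it
    "/promo/loyalty": [],     # same: an orphan Googlebot will never find
}

def find_orphans(graph, start="/"):
    """Return pages in the graph that are unreachable from `start` (BFS)."""
    reachable = set()
    queue = deque([start])
    while queue:
        page = queue.popleft()
        if page in reachable:
            continue
        reachable.add(page)
        queue.extend(graph.get(page, []))
    return sorted(set(graph) - reachable)

orphans = find_orphans(LINK_GRAPH)
# → ['/promo/christmas', '/promo/loyalty']
```

Any URL this check reports needs an internal link (menu, footer, contextual link) or a sitemap entry before Googlebot can discover it.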

How can you check whether your configuration correctly exposes content to Googlebot?

Start with a Screaming Frog crawl in Googlebot mode, cookies disabled. Compare with a standard browser crawl, cookies enabled. Any URL or content block present in the second crawl but absent from the first indicates an exposure issue.
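The comparison itself is a simple set difference over the two exported URL lists. A minimal sketch with hardcoded example URLs (real inputs would be the two crawl exports): anything present only in the cookies-enabled crawl is content Googlebot never reaches.

```python
# Hypothetical sketch: diff two crawl exports. The first simulates a
# Googlebot-mode crawl (cookies disabled), the second a standard browser
# crawl (cookies enabled). URLs are illustrative.

googlebot_crawl = {
    "https://example.com/",
    "https://example.com/products",
}
browser_crawl = {
    "https://example.com/",
    "https://example.com/products",
    "https://example.com/promo/loyalty",  # only reachable once a cookie is set
}

# URLs the browser saw but the cookie-less crawl did not: exposure issues.
hidden_from_google = sorted(browser_crawl - googlebot_crawl)
```

In a real audit you would load the two URL columns from the crawl tool's CSV exports and run the same set difference.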

Then use Google Search Console's URL Inspection tool. Request a live test and check the screenshot and the rendered HTML. If your personalized promotion does not appear, it relies on a cookie-based mechanism invisible to Google.

  • Create distinct URLs for each content variant you wish to index (promotions, segmented landing pages, differentiated journeys).
  • Expose these URLs via internal links in the menu, footer, or through XML sitemaps to ensure their discovery by Googlebot.
  • Prefer server-side rendering for A/B tests, with URL parameters and canonicals pointing to the main page.
  • Regularly audit using Screaming Frog in Googlebot mode, cookies disabled, and compare with a standard user crawl.
  • Check in Search Console that strategic pages display correctly via the URL Inspection tool.
  • Avoid banners or modules displayed only via conditional JavaScript after a cookie check — prefer rendering them by default and hiding them via CSS when not relevant.
The main challenge remains to align the technical architecture with Googlebot's actual capabilities. All strategic content must have a dedicated URL, an internal link, and unconditional rendering — without relying on cookies or user sessions. These optimizations often require a partial redesign of the personalization system. If your tech stack heavily relies on conditional client-side content, it may be wise to consult a specialized SEO agency to audit the architecture, identify risky areas, and establish a progressive migration plan without disrupting the existing user experience.

❓ Frequently Asked Questions

Can Googlebot receive cookies even if it does not use them?
Yes — the server can send Set-Cookie headers in the HTTP response, but Googlebot will neither store them nor send them back on subsequent requests. This therefore has no impact on indexing.
Are client-side A/B tests compatible with Google indexing?
No, not if the test relies solely on a client cookie to switch between variants: Google will only see the default variant. Prefer server-side split testing with a URL parameter and a canonical tag.
How can you expose promos reserved for loyal customers without cookies?
Create a dedicated landing page (/promo/fidelite) accessible via an internal link or the sitemap. You can then redirect non-eligible users via JavaScript or server-side logic, but Google will index the main page.
Does JavaScript rendering compensate for the absence of cookies when displaying personalized content?
No. Googlebot executes the initial JavaScript, but if that JS checks a local cookie before displaying content, the content remains invisible. Rendering does not simulate a user session.
Should you create distinct URLs for each cookie-based geolocation variant?
Yes, if you want Google to index each regional variant. Prefer subdomains (fr.site.com) or subdirectories (/fr/) over cookie-based detection, which is invisible to the crawler.

