
Official statement

Inclusion of a site in the Chrome User Experience Report (CrUX) is automatic, based on a sample of Chrome traffic. No manual registration is required. Sites can also collect their own Core Web Vitals data using the web-vitals JavaScript library and integrate it into Analytics.
🎥 Source video

Extracted from a Google Search Central video

⏱ 56:47 💬 EN 📅 04/08/2020 ✂ 39 statements
Watch on YouTube (1:08) →
Other statements from this video (38)
  1. 1:08 How does your site end up in the Chrome User Experience Report?
  2. 2:10 How do you measure Core Web Vitals when your site isn't in CrUX?
  3. 3:14 Can negative reviews really hurt your Google ranking?
  4. 3:14 Can negative reviews really hurt your Google ranking?
  5. 7:57 Should you really separate page sitemaps from image sitemaps?
  6. 7:57 Does splitting sitemaps really affect crawling and indexing?
  7. 9:01 Why can a 304 Not Modified response block the indexing of your pages?
  8. 9:01 Is the 304 Not Modified status code really a trap for your indexing?
  9. 11:39 Does the Google cache really influence the ranking of your pages?
  10. 11:39 Is the Google cache really useless for assessing a page's SEO quality?
  11. 13:51 Why does your niche change generate no traffic despite all your SEO efforts?
  12. 14:51 Are link directories definitively dead for SEO?
  13. 17:59 Do translated pages really count as duplicate content in Google's eyes?
  14. 17:59 Are translated pages really considered unique content by Google?
  15. 20:20 Why does Google ignore your canonical tags, and how can you force separate indexing of your regional URLs?
  16. 22:15 Why does Google ignore your canonical on multi-country sites?
  17. 23:14 Why is your Search Console crawl budget exploding for no apparent reason?
  18. 23:18 Why is your Search Console crawl budget exploding for no apparent reason?
  19. 25:52 Should you really limit the crawl rate in Search Console?
  20. 26:58 Hreflang and geotargeting: can Google really ignore your international signals?
  21. 28:58 Are hreflang and canonical really reliable for geographic targeting?
  22. 34:26 Hreflang and canonical: why does Search Console show the wrong URL?
  23. 34:26 Why does Search Console show a different canonical from what appears in the SERPs for your hreflang pages?
  24. 38:38 How does Google really differentiate two sites in the same language targeting different countries?
  25. 38:42 Should you canonicalize all your country versions to a single URL?
  26. 38:42 Should you really keep each hreflang page self-canonical?
  27. 39:13 How can local signals prevent canonicalization across your multi-country pages?
  28. 43:13 Should you really abandon country variants in hreflang?
  29. 45:34 Should you really use hreflang for a multilingual site?
  30. 47:44 Do Facebook comments have an impact on your site's SEO and EAT?
  31. 48:51 Should you isolate UGC and News content on subdomains to avoid penalties?
  32. 50:58 Should you create a lightweight Googlebot version to speed up crawling?
  33. 50:58 Should you optimize your site's speed for Googlebot or for your users?
  34. 50:58 Should you serve a lightweight version of your pages to Googlebot to improve crawling?
  35. 52:33 Can you create per-city local pages without risking a doorway-pages penalty?
  36. 52:33 How do you tell a legitimate per-city page from a penalizable doorway page?
  37. 54:38 Has Google's manual action for doorway pages disappeared in favor of algorithmic handling?
  38. 54:38 Are doorway pages still manually penalized by Google?
📅 Official statement from Google Search Central (published 04/08/2020, about 5 years ago)
TL;DR

Inclusion in CrUX is automatic and relies on a sample of Chrome traffic — no manual sign-up is required. For an SEO, this means that a low-traffic site may not be featured, which limits access to the official Core Web Vitals data. The solution: collect your own metrics via the web-vitals JavaScript library and integrate them into Analytics to maintain control.

What you need to understand

What is the Chrome User Experience Report and why is it strategic?

The Chrome User Experience Report (CrUX) is a public dataset driven by real browsing data from Chrome users who opted into syncing and sending statistics. It aggregates Core Web Vitals (LCP, INP, CLS) and other user experience metrics across millions of sites. This is the official source Google uses to assess browsing experience in its search engine.

For an SEO practitioner, CrUX is the absolute benchmark: the data it contains is what Google uses for ranking in search results. If your site isn't included in CrUX, you have no visibility into the real metrics Google sees. And importantly, you can't compare your performance to that of your competitors in the user experience arena.

Why doesn't my site appear in CrUX?

Inclusion is based on a minimum traffic threshold from eligible Chrome users (those who have enabled usage statistics). Google does not share the exact threshold, but field observations suggest that a site with fewer than a few thousand monthly visitors is unlikely to be included. Niche sites, new domains, and individual low-traffic pages often remain invisible.

CrUX aggregates data at the origin (domain) level and, sometimes, at the URL level for the most popular pages. If your site is young, confidential, or in the process of migration, it might not reach the threshold. [To be verified]: Google does not publish precise documentation on the necessary volume, making planning for growing sites challenging.

How can I check if my site is included in CrUX?

Three simple methods allow you to check for your domain's presence in the dataset. First: use PageSpeed Insights by entering your URL. If CrUX has data for your origin, it will appear in a dedicated tab labeled "Field Data." Otherwise, only lab analysis (simulated data) will be available.

Second: query the CrUX data directly, either through the CrUX API, the raw dataset in BigQuery, or dashboards built on top of them (such as the CrUX Dashboard). Third: check the Core Web Vitals report in Google Search Console, which relies on CrUX. If no data appears in any of these three interfaces, your site is likely not included in the public dataset.
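To sketch the API route: the CrUX API accepts a POST to its `records:queryRecord` endpoint with a small JSON body identifying an origin or a URL. The snippet below only builds that request body; the API key, origin, and the commented-out fetch call are placeholders you would adapt to your own setup.

```javascript
// Build a request body for the CrUX API (records:queryRecord endpoint).
// Pass either an origin (domain-level data) or a url (page-level data).
function buildCruxQuery({ origin, url, formFactor } = {}) {
  const body = {};
  if (origin) body.origin = origin;             // origin-level data (whole domain)
  else if (url) body.url = url;                 // URL-level data (popular pages only)
  if (formFactor) body.formFactor = formFactor; // "PHONE" | "DESKTOP" | "TABLET"
  return body;
}

// Example: query origin-level phone data for a placeholder domain.
const query = buildCruxQuery({ origin: "https://example.com", formFactor: "PHONE" });

// POST it with fetch (requires a CrUX API key from the Google Cloud console):
// fetch(`https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(query),
// }).then((r) => r.json()).then(console.log);
```

A 404 response from this endpoint is itself informative: it means CrUX has no record for that origin or URL, which is the programmatic equivalent of the empty "Field Data" tab in PageSpeed Insights.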

  • Inclusion in CrUX is automatic and based on a sample of Chrome traffic — no manual effort can force a site’s addition.
  • An undocumented minimum traffic threshold determines eligibility. Low-traffic sites do not appear in the dataset.
  • CrUX data is what Google uses for ranking — lack of CrUX data = lack of visibility on the official metrics leveraged by the engine.
  • Self-collecting data via JavaScript web-vitals is the only alternative for non-included sites or to achieve finer granularity (by page, by user segment).
  • CrUX aggregates at the origin (domain) level and sometimes at the URL level for high-traffic pages. Niche pages often remain invisible even if the domain is included.

SEO Expert opinion

Is this statement consistent with observed practices in the field?

Yes, it is fully consistent. The automatic nature of CrUX inclusion is an established and observable fact: thousands of sites appear in the dataset without any voluntary action from their owners. No sign-up, no forms, no manual validation — everything relies on the volume of eligible Chrome traffic. It’s a model of passive collection that favors high-audience players.

However, the lack of transparency regarding the exact traffic threshold remains a pain point for practitioners. Sites with 5,000 monthly visits have been observed to be absent while others with 10,000 visits appear. Traffic composition (Chrome share, share of users with sync enabled) likely plays as significant a role as raw volume. [To be verified]: Google has never published an official figure, which forces growing sites to work in the dark.

What nuances should be added to this statement?

The first nuance concerns the granularity of the data. CrUX aggregates at the origin (domain) level, which obscures variations between pages. An ultra-high-performing homepage can mask disastrous conversion pages — and vice versa. URL-level data exists in CrUX, but only for pages with significant individual traffic. For an SEO managing optimization by template or category, this is insufficient.

The second: CrUX only captures Chrome users who opted into sending statistics. This excludes Firefox, Safari, Edge (non-recent Chromium), and privacy-first Chrome users. The sample is large but not universal. If your audience is mostly Safari (premium iOS market, for example), CrUX may under-represent your real-world situation.

In what cases does this rule not apply or has limitations?

CrUX does not cover very low-traffic sites, intranets, or sites under development or in closed beta. Nor does it capture poorly instrumented single-page applications (SPAs), where client-side route transitions do not produce new measurable data points. Sites behind strict authentication (B2B portals, members-only areas) may see their public traffic fall below the inclusion threshold.

Another limitation: the latency of CrUX data publication. Metrics are aggregated over a 28-day rolling window and published with a delay. If you fix a performance bug today, it will take several weeks for the improvement to appear in CrUX. For real-time monitoring, self-collection via JavaScript web-vitals remains essential.

Note: Do not confuse a lack of CrUX data with a lack of Core Web Vitals in ranking. Google may utilize other signals (Search Console data, internal data) even if your site does not appear in the public dataset. But you will have no visibility into these metrics.

Practical impact and recommendations

What should I do if my site is not in CrUX?

The first step is to implement your own Core Web Vitals collection using the web-vitals JavaScript library (available on npm). This library measures LCP, INP, CLS, FCP, TTFB on the client side and allows you to send metrics to your Analytics tool (Google Analytics 4, Matomo, etc.) or to a custom endpoint. You will thus retrieve granular data by page, device, and user segment — far beyond what the public CrUX offers.
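A minimal sketch of such a setup, using the web-vitals callbacks (`onLCP`, `onINP`, `onCLS`). The payload shape and the `/analytics` endpoint are illustrative choices, not requirements of any particular Analytics tool:

```javascript
// Shape a web-vitals Metric object into a generic analytics event payload.
// The event and parameter names here are illustrative, not a GA4 requirement.
function toAnalyticsEvent(metric) {
  return {
    name: metric.name.toLowerCase(), // "lcp" | "inp" | "cls" | ...
    params: {
      value: metric.name === "CLS"
        ? Math.round(metric.value * 1000) // CLS is unitless; scale it for integer storage
        : Math.round(metric.value),       // other metrics are reported in milliseconds
      metric_id: metric.id,               // unique per page load, useful for deduplication
      metric_rating: metric.rating,       // "good" | "needs-improvement" | "poor"
    },
  };
}

// Hook it up in the browser (npm package: web-vitals):
// import { onLCP, onINP, onCLS } from "web-vitals";
// const send = (m) => navigator.sendBeacon("/analytics", JSON.stringify(toAnalyticsEvent(m)));
// onLCP(send); onINP(send); onCLS(send);
```

Using `sendBeacon` rather than `fetch` matters here: LCP, INP, and CLS are often finalized as the page unloads, and `sendBeacon` is designed to survive that moment.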

The second: cross-reference this data with that from Search Console (Core Web Vitals report) to check for consistency. If Search Console reports CWV data but the public CrUX is empty, it indicates that Google is utilizing unpublished internal data. If neither reveals any data, your site is likely below the eligible traffic threshold — or your JavaScript instrumentation is faulty.

What mistakes should be avoided in interpreting CrUX data?

Never assume that the absence of your site in CrUX means Google isn't measuring your Core Web Vitals. Google may collect data via Search Console or other internal channels without publishing it in the CrUX dataset. The lack of public visibility does not equate to a lack of consideration in ranking.

Another common mistake: relying solely on origin-level CrUX data to optimize specific pages. Domain averages mask internal variations. A disastrous product page may be invisible in origin-level CrUX if the rest of the site compensates. Instrument your critical pages individually to drive optimization at the proper granularity level.
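A toy illustration of that masking effect, with invented traffic shares and LCP values (and a simplified 75th-percentile computation):

```javascript
// Naive 75th percentile by sorted index -- good enough for this illustration.
function p75(values) {
  const sorted = [...values].sort((a, b) => a - b);
  return sorted[Math.floor(sorted.length * 0.75)];
}

// Invented sample: 90% of traffic hits a fast homepage, 10% a slow product template.
const homepage = Array(90).fill(1500); // LCP 1.5s
const product = Array(10).fill(4200);  // LCP 4.2s

const originP75 = p75([...homepage, ...product]); // 1500 -> "good" at origin level
const productP75 = p75(product);                  // 4200 -> "poor" for the template
```

The origin-level p75 lands at 1.5s and looks healthy, while the product template alone sits at 4.2s, well past the "poor" LCP threshold. Only page-level instrumentation exposes the problem.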

How can I check if my site adheres to Google's Core Web Vitals recommendations?

Use PageSpeed Insights in "Field Data" mode (if available) to obtain CrUX metrics. Compare them against the official thresholds, assessed at the 75th percentile of page loads: LCP < 2.5s, INP < 200ms, CLS < 0.1. If field data is missing, fall back on lab data (Lighthouse), keeping in mind that it only reflects a simulated scenario, not the actual user experience.
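The thresholds above can be checked programmatically. The threshold values below are Google's published ones; the function names are our own:

```javascript
// Classify a single field value: "good" below goodMax, "poor" above poorMax,
// "needs-improvement" in between.
function rateMetric(value, goodMax, poorMax) {
  if (value <= goodMax) return "good";
  if (value <= poorMax) return "needs-improvement";
  return "poor";
}

// Rate the three Core Web Vitals. Inputs are 75th-percentile field values:
// lcp and inp in milliseconds, cls unitless.
function rateWebVitals({ lcp, inp, cls }) {
  return {
    lcp: rateMetric(lcp, 2500, 4000), // good <= 2.5s, poor > 4s
    inp: rateMetric(inp, 200, 500),   // good <= 200ms, poor > 500ms
    cls: rateMetric(cls, 0.1, 0.25),  // good <= 0.1, poor > 0.25
  };
}

// Example: a site passing LCP and CLS but needing work on INP.
const ratings = rateWebVitals({ lcp: 2100, inp: 320, cls: 0.05 });
// → { lcp: "good", inp: "needs-improvement", cls: "good" }
```

This is the same three-bucket classification that PageSpeed Insights and the Search Console report apply to field data, so it is handy for flagging regressions in self-collected RUM metrics.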

Supplement with RUM (Real User Monitoring) collection via web-vitals.js to obtain continuous and segmented metrics. Cross-reference with Search Console to detect discrepancies between your instrumentation and the data Google sees. If discrepancies are significant, check your collection code and eligibility of your users (Chrome share, proportion of authenticated traffic, etc.).

  • Check your site's presence in CrUX via PageSpeed Insights, the CrUX API, or Search Console (Core Web Vitals report).
  • Implement the web-vitals JavaScript library to collect LCP, INP, CLS in real-time and send the data to Analytics.
  • Individually instrument critical pages (conversion pages, SEO landing pages) to obtain URL-level granularity.
  • Cross-reference the self-collected data with that from Search Console to detect inconsistencies or blind spots.
  • Avoid relying solely on origin-level CrUX data to optimize specific pages — prioritize granular RUM collection.
  • Monitor CrUX publication latency (28-day rolling) and do not expect real-time responsiveness from this dataset.
Automatic inclusion in CrUX simplifies data collection for high-traffic sites, but leaves niche players without visibility. Implementing RUM collection via JavaScript web-vitals becomes essential to manage optimization of Core Web Vitals. These technical setups — JavaScript instrumentation, cross-device analysis, cross-referencing with Search Console — can be complex to deploy and interpret without deep expertise. To maximize ranking impact and avoid instrumentation pitfalls, partnering with an SEO agency specializing in performance and user data analysis can provide personalized support and actionable recommendations tailored to your infrastructure.

❓ Frequently Asked Questions

Can I manually request my site's inclusion in CrUX?
No. Inclusion is entirely automatic and based on a sample of Chrome traffic. No sign-up, no form, no manual request can force a site's addition to the dataset.
What minimum traffic volume is needed to appear in CrUX?
Google does not publish an official figure. Field observations suggest that a site with several thousand monthly Chrome visitors (who have enabled statistics reporting) stands a good chance of being included, but there is no documented threshold.
If my site isn't in CrUX, does that mean Google isn't measuring my Core Web Vitals?
Not necessarily. Google may collect Core Web Vitals data via Search Console or other internal channels even if your site does not appear in the public CrUX dataset. The lack of public visibility does not mean the metrics are ignored in ranking.
How can I collect my own Core Web Vitals data if I'm not in CrUX?
Use the web-vitals JavaScript library to measure LCP, INP, and CLS client-side and send the metrics to Google Analytics, a custom endpoint, or a dedicated RUM tool. This gives you far finer granularity than the public CrUX.
Does CrUX capture data from all browsers or only Chrome?
CrUX only captures data from Chrome users who have enabled usage-statistics reporting and sync. Firefox, Safari, and other browsers are not covered. The sample is large, but not universal.
🏷 Related Topics
AI & SEO JavaScript & Technical SEO Web Performance

