Official statement
Other statements from this video (19)
- 1:41 Low-quality content: why doesn't Google systematically take manual action?
- 3:43 Why do your Core Web Vitals differ so much between lab and field?
- 7:23 ccTLDs or subdirectories for international sites: is there really an SEO advantage?
- 7:37 Why does a URL restructuring cause traffic fluctuations for 1 to 2 months?
- 10:15 Should you really optimize for search intent, or is it a semantic trap?
- 11:48 Should you optimize your content for BERT, or is it a waste of time?
- 15:57 How can you test whether SafeSearch is penalizing your content in Google's results?
- 17:32 Does SafeSearch really block your rich results?
- 19:38 Do Core Web Vitals really apply everywhere in the world?
- 22:33 Does Google really treat all synonyms and keyword variations the same way?
- 26:34 Should you really redirect ALL URLs during a migration?
- 27:27 Noindex during a migration: why does Google consider that you lose all your SEO value?
- 28:43 Why do complex migrations always generate ranking fluctuations?
- 32:25 Do Web Stories really count as normal pages for Google?
- 34:58 Does infinite scroll really kill the indexing of your content on Google?
- 42:21 Why are your HTML buttons sabotaging your crawl budget?
- 46:50 Can hreflang replace internal links for your international pages?
- 48:46 Paying for links: where exactly is Google's red line?
- 50:48 Should you really implement every Schema.org type to improve your SEO?
Search Console does not calculate its own Core Web Vitals — it directly retrieves data from the Chrome User Experience Report (CrUX). If your numbers do not match your monitoring tools, that's normal: Google relies exclusively on field data from real Chrome users. To investigate a discrepancy, you need to dive into CrUX directly, not Search Console.
What you need to understand
Does Search Console create its own Core Web Vitals metrics?
No, and this is a crucial point that many SEO professionals overlook. Search Console does not perform any proprietary calculations for Core Web Vitals. It simply displays the data collected by the Chrome User Experience Report (CrUX), a public database powered by real Chrome users who consent to sharing their usage statistics.
Practically speaking, this means that the LCP, FID, and CLS scores you see in Search Console reflect the actual experience of your Chrome visitors, not a lab simulation. This is known as field data — as opposed to lab data from Lighthouse or PageSpeed Insights in test mode.
Why don't my monitoring tools match the numbers from Search Console?
Because your Real User Monitoring (RUM) tools may capture all browsers — Safari, Firefox, Edge — while CrUX only collects data from Chrome. If 40% of your traffic comes from Safari, you are comparing two different populations.
Another factor is the publication delay. CrUX aggregates data over a rolling 28-day period and publishes it with a lag. A spike in slowness yesterday will not instantaneously appear in Search Console. Thus, your real-time tools may show degradation that CrUX hasn’t yet incorporated.
What should I do if Search Console reports no data for my site?
This means your site has not reached the minimum Chrome traffic threshold required by CrUX. Google does not publish the exact number, but it is estimated that several thousand Chrome visitors per month are needed for a URL or group of URLs to enter the dataset.
In this case, Search Console displays “Insufficient data.” You can still query the CrUX API directly to check if some isolated pages have data, or use PageSpeed Insights, which sometimes displays CrUX data even when Search Console remains silent.
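As an illustration, here is a minimal Python sketch of such a direct CrUX API query, using the public records:queryRecord endpoint. The API key and example URL are placeholders; an HTTP 404 is how the API signals that a page is below the traffic threshold.

```python
# A minimal sketch of a direct CrUX API query. The API key and URL are
# placeholders; enable the Chrome UX Report API in Google Cloud to get a key.
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
CRUX_API_KEY = "YOUR_API_KEY"  # placeholder

def query_crux(url: str, form_factor: str = "PHONE") -> dict | None:
    """Return the CrUX record for a URL, or None if CrUX has no data (HTTP 404)."""
    resp = requests.post(
        CRUX_ENDPOINT,
        params={"key": CRUX_API_KEY},
        json={"url": url, "formFactor": form_factor},
        timeout=10,
    )
    if resp.status_code == 404:
        return None  # URL is below the CrUX traffic threshold
    resp.raise_for_status()
    return resp.json()["record"]

record = query_crux("https://www.example.com/some-page")
if record is None:
    print("No CrUX data: insufficient Chrome traffic for this URL.")
else:
    lcp_p75 = record["metrics"]["largest_contentful_paint"]["percentiles"]["p75"]
    print(f"Field LCP p75: {lcp_p75} ms")
```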
- Search Console = showcase of CrUX data, not a standalone measurement tool
- Field data reflects real Chrome users over a rolling 28-day period
- No CrUX data = insufficient traffic, not necessarily a performance issue
- To investigate a discrepancy, cross-reference CrUX API, PageSpeed Insights, and your own RUM tools
- Lab data (Lighthouse) and field data (CrUX) measure different things — do not confuse them
SEO Expert opinion
Is this statement consistent with what we observe in the field?
Yes, and it is one of the few assertions from Google where the mechanics are transparent and publicly verifiable. CrUX is an open data repository — anyone can query the API or download BigQuery datasets. Therefore, we can verify that Search Console displays exactly the same figures as CrUX for a given URL.
What is less clear is how Google uses this data in its ranking algorithm. We know that Core Web Vitals have been a ranking signal since the Page Experience Update, but Google does not specify whether an LCP of 2.4s is treated differently from an LCP of 2.6s. [To be verified]: the marginal impact of small variations around the "Good / Needs Improvement / Poor" thresholds.
What nuances should be applied to this statement?
The first nuance is that CrUX aggregates at both the origin and the URL level, and Search Console additionally displays data grouped by similar URLs, which can mask disparities. A very fast page can be drowned in a group where the other pages are slow.
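To check whether a specific page diverges from the origin-wide aggregate, you can query the CrUX API at both levels. A sketch, with the same placeholder API key and illustrative URLs:

```python
# Sketch: query the same page at URL level and at origin level to spot a fast
# page hidden inside slow origin-wide data. Placeholder API key and URLs.
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
CRUX_API_KEY = "YOUR_API_KEY"  # placeholder

def crux_lcp_p75(body: dict) -> int | None:
    resp = requests.post(CRUX_ENDPOINT, params={"key": CRUX_API_KEY},
                         json=body, timeout=10)
    if resp.status_code == 404:
        return None  # no data at this aggregation level
    resp.raise_for_status()
    metrics = resp.json()["record"]["metrics"]
    return metrics["largest_contentful_paint"]["percentiles"]["p75"]

page_p75 = crux_lcp_p75({"url": "https://www.example.com/fast-landing-page"})
origin_p75 = crux_lcp_p75({"origin": "https://www.example.com"})
print(f"URL-level LCP p75:    {page_p75} ms")
print(f"Origin-level LCP p75: {origin_p75} ms")
```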
The second nuance is that CrUX data is sensitive to geographical distribution and devices. If your Chrome traffic primarily comes from countries with slow connections, your CrUX scores will be poor even if your server is optimized. Google does not weight by region — it aggregates everything.
In what cases does this rule not apply?
CrUX only includes Chrome users who have opted in to sharing usage statistics. If your audience skews toward users who disable that setting, or toward managed enterprise installs of Chrome where it is turned off by policy, little or no CrUX data is sent. As a result, Search Console remains empty even though your site has traffic.
Another edge case is intranet sites or those behind strict authentication. CrUX only collects publicly accessible pages without login. If your site is a B2B SaaS with all content behind a login, you will never have CrUX data, even with millions of Chrome users.
Practical impact and recommendations
What concrete actions should I take if my Search Console data diverges from my RUM tools?
First, check the measured population. Export your RUM data and keep only Chrome desktop and mobile traffic. Then compare this sub-population with the CrUX figures via PageSpeed Insights or the CrUX API. If the discrepancy persists, it is likely a measurement-window issue: CrUX aggregates over a rolling 28-day period, while your tools may default to 7 days.
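A minimal sketch of that population alignment, assuming a hypothetical CSV export from your RUM tool:

```python
# Sketch: restrict a RUM export to CrUX's population (Chrome, last 28 days)
# before comparing p75 values. The CSV layout ("timestamp", "user_agent",
# "lcp_ms") is hypothetical; adapt it to your RUM tool's export format.
import csv
from datetime import datetime, timedelta, timezone

def chrome_lcp_p75_last_28_days(csv_path: str) -> float | None:
    cutoff = datetime.now(timezone.utc) - timedelta(days=28)
    samples = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumes ISO-8601 timestamps with an explicit UTC offset.
            ts = datetime.fromisoformat(row["timestamp"])
            ua = row["user_agent"]
            # Chromium-based Edge and Opera also contain "Chrome/" in their UA.
            if ts >= cutoff and "Chrome/" in ua and "Edg/" not in ua and "OPR/" not in ua:
                samples.append(float(row["lcp_ms"]))
    if not samples:
        return None
    samples.sort()
    # Nearest-rank 75th percentile, the summary statistic CrUX reports.
    return samples[int(0.75 * (len(samples) - 1))]

print(chrome_lcp_p75_last_28_days("rum_export.csv"))
```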
Next, query the CrUX API directly for your critical URLs. You get more granular data than in Search Console, notably the full metric histograms and the 75th percentile (p75), the value Google compares against its thresholds to classify a URL as "Good", "Needs Improvement", or "Poor". If your p75 sits just above a threshold, a small optimization can shift the entire page into the green.
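For reference, Google's published thresholds can be encoded directly. A sketch (the metric keys are illustrative; the boundaries are the documented Good / Needs Improvement limits):

```python
# Sketch: classify a p75 value against Google's published Core Web Vitals
# thresholds (Good / Needs Improvement / Poor). Metric keys are illustrative.
THRESHOLDS = {
    "lcp_ms": (2500, 4000),  # Largest Contentful Paint
    "fid_ms": (100, 300),    # First Input Delay
    "cls":    (0.10, 0.25),  # Cumulative Layout Shift
}

def classify(metric: str, p75: float) -> str:
    good, needs_improvement = THRESHOLDS[metric]
    if p75 <= good:
        return "Good"
    if p75 <= needs_improvement:
        return "Needs Improvement"
    return "Poor"

# A p75 of 2600 ms is barely over the 2500 ms line:
# shaving ~100 ms flips the whole page into "Good".
print(classify("lcp_ms", 2600))  # Needs Improvement
```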
What mistakes should I avoid when analyzing Core Web Vitals?
Mistake #1: focusing solely on Lighthouse. Lighthouse measures in a lab, under controlled conditions — often far better than the real-world situation. A Lighthouse score of 90+ does not guarantee a “Good” CrUX. What matters to Google is the actual user experience, not the synthetic test.
Mistake #2: ignoring device segmentation. CrUX separates mobile and desktop data. If your mobile LCP is poor but your desktop is excellent, Search Console may display “Needs Improvement” overall. Dive into the details to target your optimizations — often, it’s mobile that drags scores down.
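The CrUX API exposes this segmentation through its formFactor parameter. A sketch of pulling the two device segments separately, with a placeholder key and URL:

```python
# Sketch: fetch mobile and desktop CrUX records separately for one URL, so a
# poor mobile p75 is not hidden behind a good desktop one. Placeholder key/URL.
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
CRUX_API_KEY = "YOUR_API_KEY"  # placeholder

for form_factor in ("PHONE", "DESKTOP"):
    resp = requests.post(
        CRUX_ENDPOINT,
        params={"key": CRUX_API_KEY},
        json={"url": "https://www.example.com/", "formFactor": form_factor},
        timeout=10,
    )
    if resp.status_code == 404:
        print(f"{form_factor}: no CrUX data for this segment")
        continue
    resp.raise_for_status()
    p75 = resp.json()["record"]["metrics"]["largest_contentful_paint"]["percentiles"]["p75"]
    print(f"{form_factor}: LCP p75 = {p75} ms")
```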
How can I verify that my site is being tracked by CrUX?
Go to PageSpeed Insights and test a URL. If the section “Discover what your real users experience” appears with data over 28 days, your site is in CrUX. If you only see lab data (Lighthouse), it means you do not have enough Chrome traffic or your pages are not public.
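The same check can be automated with the PageSpeed Insights v5 API: when the URL is in CrUX, the response carries a loadingExperience object with field metrics; otherwise only lighthouseResult (lab data) comes back. A minimal sketch, with a placeholder API key:

```python
# Sketch: check CrUX presence through the PageSpeed Insights v5 API. When the
# URL is in CrUX, the response includes "loadingExperience" with metrics;
# otherwise only "lighthouseResult" (lab data) comes back. Placeholder key.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(
    PSI_ENDPOINT,
    params={"url": "https://www.example.com/", "strategy": "mobile",
            "key": "YOUR_API_KEY"},
    timeout=60,
)
resp.raise_for_status()
field_metrics = resp.json().get("loadingExperience", {}).get("metrics")
if field_metrics:
    print("Tracked by CrUX; field metrics available:", sorted(field_metrics))
else:
    print("No field data for this URL: lab results only.")
```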
You can also go through BigQuery for large-scale analyses: Google publishes monthly CrUX datasets there, aggregated at the origin level. For URL-level coverage at scale, batch your CrUX API calls across your URL inventory to identify which pages are tracked. This is particularly useful for large sites with thousands of URLs.
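A sketch of such a BigQuery query, assuming google-cloud-bigquery is installed and authenticated; the materialized table and column names used here (device_summary, p75_lcp, p75_cls) should be double-checked against the current CrUX schema:

```python
# Sketch: pull an origin's monthly p75 history from the public CrUX BigQuery
# dataset. Assumes google-cloud-bigquery is installed and authenticated; the
# materialized table and column names (device_summary, p75_lcp, p75_cls)
# should be verified against the current CrUX schema.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT date, device, p75_lcp, p75_cls
    FROM `chrome-ux-report.materialized.device_summary`
    WHERE origin = 'https://www.example.com'
    ORDER BY date DESC, device
    LIMIT 12
"""
for row in client.query(query).result():
    print(row.date, row.device, row.p75_lcp, row.p75_cls)
```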
- Filter your RUM data to keep only Chrome traffic and compare over a rolling 28-day period
- Query the CrUX API for your strategic URLs and analyze the P75 distribution
- Do not rely solely on Lighthouse — prioritize field data for your optimizations
- Segment your analyses by device (mobile vs desktop) to target the real friction points
- Verify via PageSpeed Insights that your main pages are indeed present in CrUX
- If you have no CrUX data, focus on growing Chrome traffic or making your content publicly accessible
❓ Frequently Asked Questions
Does Search Console measure Core Web Vitals in real time?
Why are my Lighthouse scores better than my CrUX scores?
My site has traffic but no CrUX data: why?
Can I access CrUX data even if Search Console says "Insufficient data"?
Do CrUX Core Web Vitals directly influence ranking?
🎥 From the same video (19)
Other SEO insights extracted from this same Google Search Central video · duration 1h00 · published 15/01/2021 · watch the full video on YouTube