Official statement
Google ranks sites on Core Web Vitals using only field data from the Chrome User Experience Report (CrUX), never lab scores. A site that scores 100/100 in PageSpeed Insights but feels slow to real users will be penalized. This gap between synthetic tests and real-world experience makes CrUX, not Lighthouse, the priority to monitor.
What you need to understand
Why does Google prefer real-world data over lab tests?
Lab tests (Lighthouse, WebPageTest, PageSpeed Insights in lab mode) run under controlled conditions: simulated device, stable connection, no cache, no browser extensions. These tests are reproducible and convenient for debugging, but they never reflect the diversity of real configurations.
The Chrome User Experience Report aggregates metrics from millions of Chrome users who have opted to share their browsing data. Google collects Core Web Vitals (LCP, INP, CLS) as they occur in real conditions: unstable 4G, underpowered CPUs, active extensions, full or empty caches. This field data captures the variability of the real world.
This distinction is crucial. A site may score 95/100 in the lab on a fiber connection with a MacBook Pro, then exceed 2.8 seconds of LCP on a Samsung Galaxy A over a fluctuating 3G connection. Google indexes and ranks based on what users genuinely experience, not what a bot measures in an ideal environment.
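As a reference point, the thresholds Google publishes for bucketing each metric into good / needs improvement / poor can be sketched in a few lines of Python (the threshold values below are the publicly documented ones; the function name is ours):

```python
# Bucket a p75 Core Web Vitals value using Google's published thresholds.
# LCP and INP are in milliseconds, CLS is unitless.
THRESHOLDS = {
    "lcp": (2500, 4000),   # good <= 2500 ms, poor > 4000 ms
    "inp": (200, 500),     # good <= 200 ms, poor > 500 ms
    "cls": (0.1, 0.25),    # good <= 0.1, poor > 0.25
}

def classify(metric: str, value: float) -> str:
    """Return 'good', 'needs improvement', or 'poor' for a p75 value."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

For example, the 2.8-second LCP mentioned above falls in the "needs improvement" bucket, while anything past 4 seconds is "poor".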
How does CrUX collect this real data?
CrUX relies on the explicit consent of Chrome users who enable the option “Help improve Chrome by sending usage statistics and error reports”. Only these users contribute to the data. Google then aggregates metrics by origin (entire domain) and by URL (individual pages if there is sufficient traffic).
The data is aggregated over a rolling 28-day window: what you see today in the CrUX report covers the last four weeks. This means an improvement rolled out yesterday won't be fully reflected in CrUX for about a month. Lab tests, on the other hand, capture changes instantly, but they guarantee nothing about the actual impact.
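The latency this creates is simple date arithmetic. A minimal sketch, assuming the snapshot always covers exactly the trailing 28 days:

```python
from datetime import date, timedelta

# CrUX aggregates a rolling 28-day window: a fix deployed on `deploy_date`
# only fills the entire window 28 days later.
def fully_reflected_on(deploy_date: date, window_days: int = 28) -> date:
    """First day on which the CrUX window contains only post-fix traffic."""
    return deploy_date + timedelta(days=window_days)
```

A fix shipped on 2020-10-29, for instance, would only dominate the window from 2020-11-26 onward; any snapshot read before that date still blends pre-fix traffic.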
Google filters the data to guarantee a minimum sample volume and preserve anonymity. If your page does not have enough Chrome visitors with sharing enabled, it will not appear in CrUX. In that case, Google falls back to data from the entire origin. No origin in CrUX either? Then your Core Web Vitals are simply not evaluated for ranking, but this case is rare for sites with significant traffic.
What is the difference between “origin” and “URL” in CrUX?
CrUX distinguishes between two levels of granularity. The origin aggregates all pages of a domain (e.g., https://example.com). The URL represents a specific page (e.g., https://example.com/blog/article-123). Google prefers URL data when available, but resorts to the origin if the volume is insufficient.
In practice: a site with a fast homepage but slow product pages will see average data at the origin level, which can hide localized issues. Conversely, a highly trafficked page with poor metrics can drag the origin down. Monitoring both levels is essential to identify where to act.
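The fallback described above can be mirrored when you consume CrUX data yourself: prefer the URL-level record, fall back to the origin, and treat the no-data case explicitly. A sketch with hypothetical record dicts:

```python
from typing import Optional

def pick_crux_record(url_record: Optional[dict],
                     origin_record: Optional[dict]) -> Optional[dict]:
    """Prefer page-level (URL) data; fall back to origin-level data.

    Returns None when neither level is in CrUX, the case where Core Web
    Vitals are simply not evaluated for ranking.
    """
    if url_record is not None:
        return {"granularity": "url", **url_record}
    if origin_record is not None:
        return {"granularity": "origin", **origin_record}
    return None
```

Tracking which granularity you actually received matters: origin-level numbers can mask exactly the localized issues described above.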
- CrUX reflects the real experience of millions of Chrome users, not a simulated environment.
- Data is published with a 28-day latency — fixes take time to reflect.
- Google ranks based on CrUX, not Lighthouse — a perfect lab score guarantees nothing regarding ranking.
- Origin vs. URL: Google uses URL data if available; otherwise, the entire origin.
- Not in CrUX? Your Core Web Vitals do not affect rankings, but this is still a marginal case.
SEO Expert opinion
Is this statement consistent with field observations?
Yes, and it's verifiable. Since the Page Experience Update rolled out, correlations between rankings and CrUX data have been documented, while Lighthouse scores alone predict nothing. I've seen sites scoring 98/100 in the lab stagnate in positions 8-12 while competitors at 72/100 with healthy CrUX data jump into the top 3. The field confirms it: Google isn't misleading anyone on this matter.
The nuance — and it’s a big one — is that CrUX only covers Chrome users with sharing enabled. This represents a fraction of real traffic, biased towards certain geographic areas and types of users. But Google accepts this bias: it does not claim to measure all users, just a sufficiently large sample to be representative. In practice, CrUX captures enough diversity to reflect real trends.
A rarely highlighted point: CrUX data includes visitors with browser cache enabled, script-blocking extensions, degraded connections. This is what makes them valuable — and it's also why a site may perform well in the lab (empty cache, no extensions) but poorly in production. Lab tests remain useful for debugging, but they should never serve as a proxy for ranking.
What limits should you know before taking CrUX at face value?
CrUX aggregates over a rolling 28-day window. This smooths out peaks and troughs, but it also means an optimization deployed today will take a month to be fully reflected. If you fix a catastrophic LCP on Monday, don't expect your ranking to move on Tuesday: Google is still looking at the previous four weeks. This latency confuses many practitioners who expect immediate responsiveness.
Another limit: CrUX does not publish data for all individual URLs. If a page does not have enough Chrome traffic, it does not appear. Google then resorts to origin data, which can mask localized issues. A super-slow product page with few visitors will not significantly impact global metrics — but it can still harm the experience and conversions. [To be verified]: Google has never published the volume threshold necessary for a URL to appear in CrUX.
In what cases does this rule not apply?
If your site is not in CrUX — typically a very low-traffic site, an intranet, or a new domain — then Google cannot assess your Core Web Vitals. You will neither be rewarded nor penalized on this criterion. Ranking will then rely solely on other signals (relevance, backlinks, content, etc.). This is not a disaster, but you lose an optimization lever.
Another case: sites with heavily geographically biased traffic. CrUX aggregates globally by default, but PageSpeed Insights allows filtering by country. If 95% of your traffic comes from France and your servers are optimized for Europe, your overall CrUX data may be dragged down by a few distant visitors with high latency. Analyze region-segmented data to better understand where to act.
Finally, pages behind authentication or paywalls do not generate public CrUX data. Google can collect these metrics internally (if users are logged into Chrome), but they are not exposed in public APIs. You will need to rely on your own RUM tools to monitor the real experience on these pages.
Practical impact and recommendations
What actions should you take to optimize based on CrUX rather than lab tests?
Start by continuously monitoring CrUX, not just Lighthouse. Use PageSpeed Insights to check the real-world data of your URLs and origin. Complement with the free CrUX API or BigQuery (public dataset) to track weekly progress. If your lab data is green but CrUX is red, it's CrUX that dictates your ranking — focus on that.
Identify gaps between lab and field. If Lighthouse shows a 1.2s LCP but CrUX shows 3.8s, investigate what differs for real users: misconfigured CDN cache, blocking resources in certain countries, third-party scripts that spike load time on mobile. Activate a RUM (Real User Monitoring) tool to capture metrics from 100% of your visitors, not just the Chrome sample. This gives you a finer-grained view than CrUX alone.
Optimize first for the high-traffic pages that drag down your origin metrics. A homepage visited 10,000 times a day carries more weight in CrUX than a campaign landing page with 50 visits. Fix what impacts the most people first. Use Search Console to cross-reference URLs with poor metrics and those generating the most organic traffic.
What mistakes should you absolutely avoid?
Never rely solely on Lighthouse scores to validate your optimizations. A site can jump from 68 to 94 in the lab after deferring a few scripts, but if those scripts still load on the user’s side and degrade the real experience, CrUX won’t budge — and neither will your ranking. Always validate on CrUX before declaring victory.
Another trap: deploying an optimization and checking CrUX the next day. You will still be looking at the previous 28 days. Wait at least three to four weeks to measure the real impact. In the meantime, monitor your RUM tools in real time to confirm the improvement is actually reaching users. This patience is hard to sell internally, but it prevents hasty conclusions.
Finally, don’t overlook device-segmented data (mobile vs. desktop) and by region. CrUX publishes separate metrics for each dimension. A site may be green on desktop and catastrophic on mobile. Google primarily indexes and ranks based on the mobile version — if your mobile Core Web Vitals are red, you’re losing ground even if desktop is perfect.
How can I check if my site is being tracked in CrUX?
Visit PageSpeed Insights, enter your URL, and scroll down to the real-user data section ("Discover what your real users are experiencing"). If you see CrUX data (LCP, INP, CLS with a good / needs improvement / poor breakdown), your page is tracked. If the message says real-world data is not available for this URL, Google falls back to origin data. No origin data either? Your site is not in CrUX.
Another method: query the CrUX API directly (free, though it requires an API key created in Google Cloud). You can check both URL and origin data and filter by device (mobile/desktop/tablet); country-level breakdowns are available in the public BigQuery dataset. This is more flexible than PageSpeed Insights and allows automated monitoring. If you manage hundreds of pages, it is the only scalable approach.
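For reference, a sketch of a request against the CrUX API's `records:queryRecord` endpoint, using only the Python standard library. The endpoint URL and the `origin`/`formFactor` fields are the documented ones; `API_KEY` is a placeholder for a key you would create in Google Cloud:

```python
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_request(api_key: str, origin: str,
                       form_factor: str = "PHONE") -> urllib.request.Request:
    """Build a queryRecord POST for origin-level data on one device class."""
    payload = {"origin": origin, "formFactor": form_factor}
    return urllib.request.Request(
        url=f"{CRUX_ENDPOINT}?key={api_key}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Usage (performs a live network call):
# req = build_crux_request("API_KEY", "https://example.com")
# with urllib.request.urlopen(req) as resp:
#     record = json.load(resp)["record"]  # metrics keyed by name
```

Swapping `"origin"` for `"url"` in the payload requests page-level data instead, mirroring the URL-vs-origin distinction discussed earlier.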
Set up automated alerts if your CrUX metrics degrade. A theme change, a new plugin, a misconfigured CDN can shift your Core Web Vitals from green to red in just a few days — but you won’t see it in CrUX for a month. A real-time RUM tool alerts you immediately, before Google records the degradation. This leaves a window to correct before impacting your ranking.
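The alerting idea above reduces to comparing the latest p75 values against the last known good snapshot. A minimal sketch, where the 10% tolerance and the metric dict shape are our assumptions:

```python
def crux_regressions(baseline: dict, current: dict,
                     tolerance: float = 0.10) -> list:
    """Return metrics whose p75 worsened by more than `tolerance` (10%).

    Both dicts map metric name -> p75 value; for LCP, INP, and CLS alike,
    higher means worse, so a single comparison direction works.
    """
    alerts = []
    for metric, old_p75 in baseline.items():
        new_p75 = current.get(metric)
        if new_p75 is not None and new_p75 > old_p75 * (1 + tolerance):
            alerts.append(metric)
    return alerts
```

Run this weekly against fresh CrUX or RUM snapshots and route any non-empty result to your alerting channel; the tolerance keeps ordinary sampling noise from paging you.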
- Monitor CrUX weekly via PageSpeed Insights or the CrUX API, not just Lighthouse.
- Enable a RUM (Real User Monitoring) tool to capture 100% of visitors across all browsers.
- Prioritize optimizations on high-traffic pages that degrade origin metrics.
- Wait 3-4 weeks after deployment to measure the real impact in CrUX.
- Segment data by device (mobile prioritized) and region to identify weakness areas.
- Never validate an optimization solely based on lab scores — always check CrUX systematically.
❓ Frequently Asked Questions
Why is my PageSpeed Insights score excellent while my Core Web Vitals in Search Console are poor?
How long does it take for an optimization to show up in CrUX?
My site doesn't appear in CrUX — is that a problem for my SEO?
Does CrUX measure all browsers or only Chrome?
Is CrUX data segmented by device or by region?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 53 min · published on 29/10/2020