Official statement
Google collects Core Web Vitals data through the Chrome User Experience Report over a rolling 28-day period. A technical fix validated today in Lighthouse will only become visible in Search Console about a month later. This latency demands rigorous planning of optimizations and makes any attempt to measure an improvement instantly pointless.
What you need to understand
Where do the Core Web Vitals data displayed in Search Console actually come from?
The Core Web Vitals metrics you see in Search Console do not come from an active crawl by Googlebot. They are extracted from the Chrome User Experience Report (CrUX), a public dataset that aggregates real performance measured by millions of Chrome users.
This dataset collects experiences over a rolling 28-day window. In concrete terms, the figures you see today in Search Console reflect performance measured between D-28 and D: not yesterday's figures, nor last week's, but those of the entire past month.
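You can see this window for yourself by querying the CrUX API directly. Below is a minimal sketch in TypeScript (Node 18+ for the global `fetch`); it assumes you have a Google Cloud API key with the Chrome UX Report API enabled, exposed here as a hypothetical `CRUX_API_KEY` environment variable.

```typescript
// Minimal sketch: query the CrUX API for a URL's field data and print the
// 28-day collection window it was aggregated over.
// Assumes CRUX_API_KEY is a Google Cloud key with the CrUX API enabled.
const API_KEY = process.env.CRUX_API_KEY;

async function queryCrux(url: string) {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url, formFactor: "PHONE" }),
    }
  );
  if (!res.ok) throw new Error(`CrUX API error: ${res.status}`);
  const { record } = await res.json();

  // collectionPeriod shows the rolling window the metrics cover; lastDate
  // typically trails today by a couple of days.
  const { firstDate, lastDate } = record.collectionPeriod;
  console.log(
    `Window: ${firstDate.year}-${firstDate.month}-${firstDate.day}` +
    ` -> ${lastDate.year}-${lastDate.month}-${lastDate.day}`
  );

  // p75 is the value Google uses to assess each Core Web Vital.
  const lcp = record.metrics.largest_contentful_paint;
  console.log(`LCP p75: ${lcp.percentiles.p75} ms`);
}

queryCrux("https://example.com/");
```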
Why does Lighthouse show different results from Search Console?
Lighthouse performs a synthetic test in a lab, under controlled conditions. It simulates loading on a standardized device, with a calibrated connection. This is useful for detecting a problem, but it does not reflect the actual experience of your visitors.
Search Console, on the other hand, displays field data sourced from CrUX. This data aggregates performance measured from real users, with their variable connections, diverse devices, and active Chrome extensions. A discrepancy between the two tools is therefore perfectly normal, and it is the CrUX data that matters for ranking.
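The gap is easy to observe because a single PageSpeed Insights API call returns both datasets: the Lighthouse lab run (`lighthouseResult`) and the CrUX field data (`loadingExperience`). A minimal sketch, with `example.com` as a placeholder URL:

```typescript
// Sketch: one PageSpeed Insights API call returns both the lab result and
// the CrUX field data, so the gap between the two can be observed directly.
async function compareLabAndField(url: string) {
  const endpoint = new URL(
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
  );
  endpoint.searchParams.set("url", url);
  endpoint.searchParams.set("strategy", "mobile");

  const data = await (await fetch(endpoint)).json();

  // Lab: a single synthetic run under throttled, standardized conditions.
  const labLcp =
    data.lighthouseResult.audits["largest-contentful-paint"].numericValue;

  // Field: 75th percentile of real Chrome users over the 28-day window.
  const fieldLcp =
    data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS?.percentile;

  console.log(`Lab LCP (Lighthouse): ${Math.round(labLcp)} ms`);
  console.log(`Field LCP p75 (CrUX): ${fieldLcp ?? "no CrUX data"} ms`);
}

compareLabAndField("https://example.com/");
```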
What does this 28-day delay really mean for a practitioner?
If you deploy an optimization today (say, removing a blocking script or caching a font), you will only be able to validate its real impact about a month later. In the meantime, the previous performance data continues to weigh on the averages displayed in Search Console.
This delay requires discipline: it is impossible to implement rapid test-and-learn on Core Web Vitals as one would on a content A/B test. Each modification requires long-term tracking, and urgent fixes produce no immediate visible effect in Google reports.
- CrUX aggregates field data over a rolling 28 days, not in real-time
- Lighthouse tests in a lab, CrUX measures the actual experience of users
- A technical deployment today will appear in Search Console about 4 weeks later
- Urgent fixes do not allow for immediate validation in Google’s official reports
- Optimizations must be planned considering this uncompressible latency
SEO Expert opinion
Is this statement consistent with field observations?
Yes, and it's even one of the few communications from Google that is free of ambiguity. The 28-day delay corresponds exactly to the documented sampling window of CrUX. In practice, practitioners who monitor their metrics via the PageSpeed Insights API or directly through BigQuery consistently observe this delay.
What is less often specified: this latency varies slightly depending on your site's Chrome traffic volume. A site with low traffic may take longer than 28 days to see its data stabilize, or may never appear in CrUX at all if the minimum threshold of Chrome visitors is not reached. [To verify]: Google does not officially publish this threshold, but observations converge towards a few thousand monthly Chrome visits.
What nuances should be applied to this 28-day rule?
The 28-day delay is an average collection time, not a guarantee. An optimization deployed today will start influencing the data as early as tomorrow — but it will only represent 1/28th of the aggregated dataset. You must wait for the entire rolling window to be renewed to see the complete impact.
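The dilution is easy to model. The sketch below uses a simple mean rather than the p75 aggregation CrUX actually performs, so the real trajectory differs in practice, but the principle of the rolling window renewing itself one day at a time is the same:

```typescript
// Toy model of the rolling 28-day window: a fix ships on day 0 and improves
// LCP from 4000 ms to 2000 ms. We track the blended window value day by day.
// Uses a simple mean for illustration; CrUX actually reports a p75.
const WINDOW = 28;
const before = 4000; // ms, pre-fix LCP
const after = 2000;  // ms, post-fix LCP

for (const day of [1, 7, 14, 21, 28]) {
  const fresh = Math.min(day, WINDOW); // days of post-fix data in the window
  const stale = WINDOW - fresh;        // days of pre-fix data remaining
  const blended = (fresh * after + stale * before) / WINDOW;
  console.log(`Day ${day}: window shows ~${Math.round(blended)} ms`);
}
// Day 1:  ~3929 ms (1/28th of the window is post-fix)
// Day 14: ~3000 ms (halfway there)
// Day 28: ~2000 ms (window fully renewed)
```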
Another nuance: if you fix a problem on a specific URL but your site has thousands of URLs, the overall impact in Search Console will remain diluted. CWV reports often aggregate by groups of similar URLs, not URL by URL. A localized fix will not immediately turn an entire group green.
In which cases does this rule not apply?
If your site does not receive enough Chrome traffic to appear in CrUX, this rule becomes irrelevant: you will never have field data in Search Console, regardless of the delay. In that case, Google resorts to origin-level data or nothing at all — and Core Web Vitals will play no role in your ranking due to lack of measurement.
Another exception: sites undergoing a drastic technical migration (CDN change, complete redesign) may see their CrUX data partially reset. The 28-day delay remains valid, but after such a reset the old data may stop weighing on the reports sooner than expected. [To verify]: no official documentation specifies how Google handles technical discontinuities in CrUX.
Practical impact and recommendations
How to plan your optimizations considering this delay?
Work in minimum 6-week sprints: 2 weeks for deployment, 4 weeks for observation. Document each technical change with its exact date so you can correlate metric variations a month later. Without this rigor, you will never know which change produced which effect.
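A deploy journal can be as simple as a small script. The sketch below is illustrative (entries and dates are invented): it records each change and computes the date by which its full effect should be reflected in the CrUX window.

```typescript
// Minimal deploy journal: record each technical change with its exact date,
// then compute when its full effect should be visible in the CrUX window.
// Entries are illustrative.
interface Deploy {
  date: string;   // ISO date of the deployment
  change: string; // what was changed
}

const DAY_MS = 24 * 60 * 60 * 1000;

const deploys: Deploy[] = [
  { date: "2021-04-12", change: "Removed render-blocking analytics script" },
  { date: "2021-05-17", change: "Self-hosted web fonts with font-display: swap" },
];

for (const d of deploys) {
  const fullEffect = new Date(new Date(d.date).getTime() + 28 * DAY_MS);
  console.log(
    `${d.date} | ${d.change} -> fully reflected in CrUX around ` +
    fullEffect.toISOString().slice(0, 10)
  );
}
```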
Use third-party tools like Cloudflare RUM, SpeedCurve, or Sentry Performance to get real-time field data. These tools will never replace CrUX for ranking, but they allow you to validate that an optimization is working before waiting a month. This is the only way to quickly detect an accidental regression.
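These tools are built on the browser's performance APIs, which you can also tap directly with Google's open-source `web-vitals` library. A browser-side sketch (the `/vitals` endpoint is a placeholder for your own collection backend; the exact metric set exposed depends on the library version):

```typescript
// Browser-side sketch using the open-source `web-vitals` library to collect
// field metrics from your own users in real time, without waiting for the
// CrUX window. The /vitals endpoint is a placeholder.
import { onLCP, onCLS, onINP } from "web-vitals";

function report(metric: { name: string; value: number; id: string }) {
  // sendBeacon survives page unloads, which is when CLS/INP are finalized.
  navigator.sendBeacon("/vitals", JSON.stringify({
    name: metric.name,
    value: metric.value,
    id: metric.id,
    url: location.pathname,
  }));
}

onLCP(report);
onCLS(report);
onINP(report);
```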
What mistakes to avoid when tracking Core Web Vitals?
Do not panic if your Lighthouse scores remain green while Search Console shows red. As long as the CrUX data has not refreshed, the old state still pollutes the reports. Conversely, do not rest on your laurels if Lighthouse shows 100/100: only real-user performance matters.
Avoid making a flurry of changes. If you deploy three optimizations in one week, you will not be able to isolate which one worked when the CrUX data finally updates. Proceed with sequential iterations, spaced at least 28 days apart, to maintain a clear causal link.
How to verify that your optimizations have the expected effect?
Query the PageSpeed Insights API or the CrUX API directly to obtain the field data for your critical URLs. These APIs provide the same metrics as Search Console, but with higher granularity. You will thus be able to track the week-to-week evolution, even if Search Console only displays an aggregated view.
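For week-to-week tracking, the CrUX History API is the relevant endpoint: it returns a weekly time series in which each point is itself a rolling 28-day window. A sketch reusing the same hypothetical API key as above (field names follow the public API documentation):

```typescript
// Sketch: the CrUX History API returns a weekly time series where each point
// is itself a 28-day rolling window, which makes the slow renewal of the
// window directly observable.
async function cruxHistory(url: string) {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryHistoryRecord?key=${process.env.CRUX_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url, formFactor: "PHONE" }),
    }
  );
  if (!res.ok) throw new Error(`CrUX History API error: ${res.status}`);
  const { record } = await res.json();

  // One p75 value per weekly collection period, oldest first.
  const p75s: (number | null)[] =
    record.metrics.largest_contentful_paint.percentilesTimeseries.p75s;
  const periods = record.collectionPeriods;

  p75s.forEach((p75, i) => {
    const end = periods[i].lastDate;
    console.log(`${end.year}-${end.month}-${end.day}: LCP p75 = ${p75 ?? "n/a"} ms`);
  });
}

cruxHistory("https://example.com/");
```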
Set up continuous synthetic monitoring (Lighthouse CI, WebPageTest automation) to detect regressions at the time of deployment. Even if these tests do not reflect the actual experience, they immediately alert you if a commit breaks performance — without waiting 28 days to discover it in Search Console.
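As an illustration, a standard Lighthouse CI configuration file (`lighthouserc.js`) can make every build fail on a synthetic regression; the staging URL and thresholds below are invented and should be replaced with your own performance budget:

```typescript
// lighthouserc.js: example Lighthouse CI configuration that runs a few
// synthetic audits per build and fails CI when performance regresses.
module.exports = {
  ci: {
    collect: {
      url: ["https://staging.example.com/"],
      numberOfRuns: 3, // take the median of 3 runs to smooth out variance
    },
    assert: {
      assertions: {
        "categories:performance": ["error", { minScore: 0.9 }],
        "largest-contentful-paint": ["error", { maxNumericValue: 2500 }],
        "cumulative-layout-shift": ["error", { maxNumericValue: 0.1 }],
      },
    },
    upload: { target: "temporary-public-storage" },
  },
};
```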
- Plan your optimizations in minimum 6-week sprints to isolate effects
- Document each change with its exact deployment date
- Use third-party RUM tools to validate fixes in real time
- Regularly query the CrUX and PageSpeed Insights APIs for granular tracking
- Avoid making a flurry of changes: proceed with sequential iterations
- Automate synthetic monitoring to detect regressions at commit
❓ Frequently Asked Questions
Why are my Lighthouse scores excellent while Search Console shows slow URLs?
Can I force Google to refresh my CrUX data faster?
How much Chrome traffic does it take to appear in CrUX?
Is CrUX data aggregated per URL or per group of URLs?
If I roll out an optimization progressively, when does the 28-day countdown start?
Source: Google Search Central video, published on 09/04/2021.