
Official statement

The Chrome User Experience Report data used for Core Web Vitals in Search Console has a collection delay of about 28 days. A technical change tested today in Lighthouse will only be visible in Search Console approximately one month later.
🎥 Source video

Extracted from a Google Search Central video

💬 EN 📅 09/04/2021 ✂ 15 statements
Watch on YouTube →
Other statements from this video (14)
  1. Why won't the Page Experience update be instantaneous?
  2. Is AMP really enough to guarantee good Core Web Vitals?
  3. Does referral traffic really influence Google rankings?
  4. Why does your Lighthouse data never reflect your users' reality?
  5. Why does your visitors' geolocation impact your Core Web Vitals?
  6. How can a small site really compete with the SEO giants?
  7. Does the product review update apply only to specialized review sites?
  8. Do spammy comments drag down the ranking of the whole page?
  9. Do you really need separate XML sitemaps per country for multilingual sites?
  10. Should you really worry if the homepage doesn't appear first in a site: query?
  11. Does Google really compute an EAT score for your site?
  12. Does noindex really block crawling of your pages?
  13. Does robots.txt really block indexing of your pages?
  14. Do Core Web Vitals really only serve as a tiebreaker between otherwise equal results?
TL;DR

Google collects Core Web Vitals data through the Chrome User Experience Report over a rolling 28-day period. A technical fix validated today in Lighthouse will only be visible in Search Console about a month later. This latency requires rigorous planning of optimizations and makes any attempt at instant measurement of improvements irrelevant.

What you need to understand

Where do the Core Web Vitals data displayed in Search Console actually come from?

The Core Web Vitals metrics you see in Search Console do not come from an active crawl by Googlebot. They are extracted from the Chrome User Experience Report (CrUX), a public dataset that aggregates real performance measured by millions of Chrome users.

This dataset collects experiences over a rolling 28-day window. Specifically: the figures you see today in Search Console reflect performance measured between D-28 and D. Not yesterday's figures, nor those from last week — those from the complete last month.

Why does Lighthouse show different results from Search Console?

Lighthouse performs a synthetic test in a lab, under controlled conditions. It simulates loading on a standardized device, with a calibrated connection. This is useful for detecting a problem, but it does not reflect the actual experience of your visitors.

Search Console, on the other hand, displays field data sourced from CrUX. This data aggregates performance measured from real users, with their variable connections, diverse devices, and active Chrome extensions. A discrepancy between the two tools is therefore perfectly normal, and it is the CrUX data that matters for ranking.

What does this 28-day delay really mean for a practitioner?

If you deploy an optimization today, say, removing a blocking script or caching a font, you will only be able to validate its real impact about a month later. In the meantime, the old performance data continues to pollute the averages displayed in Search Console.

This delay requires discipline: it is impossible to implement rapid test-and-learn on Core Web Vitals as one would on a content A/B test. Each modification requires long-term tracking, and urgent fixes produce no immediate visible effect in Google reports.

  • CrUX aggregates field data over a rolling 28 days, not in real-time
  • Lighthouse tests in a lab, CrUX measures the actual experience of users
  • A technical deployment today will appear in Search Console about 4 weeks later
  • Urgent fixes do not allow for immediate validation in Google’s official reports
  • Optimizations must be planned considering this uncompressible latency

SEO Expert opinion

Is this statement consistent with field observations?

Yes, and it's even one of the few communications from Google that is free of ambiguity. The 28-day delay corresponds exactly to the documented sampling window of CrUX. In practice, practitioners who monitor their metrics via the PageSpeed Insights API or directly through BigQuery consistently observe this delay.

What is less often specified: this latency varies slightly depending on your site's Chrome traffic volume. A site with low traffic may take longer than 28 days to see its data stabilize — or may never appear in CrUX if the minimal threshold of Chrome visitors is not reached. [To verify]: Google does not officially publish this threshold, but observations converge towards a few thousand monthly Chrome visits.

What nuances should be applied to this 28-day rule?

The 28-day delay is an average collection time, not a guarantee. An optimization deployed today will start influencing the data as early as tomorrow — but it will only represent 1/28th of the aggregated dataset. You must wait for the entire rolling window to be renewed to see the complete impact.
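The rolling-window arithmetic above can be sketched in a few lines. This is a simplified model, assuming each day contributes uniformly to the 28-day window; real CrUX aggregation weights by actual traffic, so treat the output as an approximation only.

```python
from datetime import date

def new_data_fraction(deploy_date: date, today: date, window_days: int = 28) -> float:
    """Approximate share of the rolling CrUX window covered by post-deployment data.

    Simplified model: assumes every day contributes equally to the window.
    Real CrUX aggregation is traffic-weighted, so this is only illustrative.
    """
    days_elapsed = (today - deploy_date).days
    return max(0.0, min(1.0, days_elapsed / window_days))

# One day after deployment, only ~1/28 of the window reflects the fix.
print(round(new_data_fraction(date(2021, 4, 9), date(2021, 4, 10)), 3))  # 0.036
```

Under this model, the fix reaches 50% of the window after two weeks and only fully dominates the reported figures once all 28 days have rolled over.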

Another nuance: if you fix a problem on a specific URL but your site has thousands of URLs, the overall impact in Search Console will remain diluted. CWV reports often aggregate by groups of similar URLs, not URL by URL. A localized fix will not immediately turn an entire group green.

In which cases does this rule not apply?

If your site does not receive enough Chrome traffic to appear in CrUX, this rule becomes irrelevant: you will never have field data in Search Console, regardless of the delay. In that case, Google resorts to origin-level data or nothing at all — and Core Web Vitals will play no role in your ranking due to lack of measurement.

Another exception: sites undergoing a drastic technical migration (CDN change, complete redesign) may see their CrUX data partially reset. The 28-day delay remains valid, but the old data may wash out of the reports faster than expected. [To verify]: no official documentation specifies how Google handles technical discontinuities in CrUX.

Warning: never rely solely on Lighthouse to validate optimizations aimed at improving your ranking. The CrUX data is the only one that counts for ranking, and it comes with 28 days of uncompressible latency.

Practical impact and recommendations

How to plan your optimizations considering this delay?

Work in minimum 6-week sprints: 2 weeks for deployment, 4 weeks for observation. Document each technical change with its exact date so you can correlate metric variations a month later. Without this rigor, you will never know which change produced which effect.
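The change log described above can be as simple as a dated list with a computed "compare from" date. A minimal sketch, assuming the 28-day window as the only delay (the `Change` class and its field names are illustrative, not any standard tool):

```python
from dataclasses import dataclass
from datetime import date, timedelta

WINDOW_DAYS = 28  # rolling CrUX collection window

@dataclass
class Change:
    description: str
    deployed: date

    def crux_stable_from(self) -> date:
        # Earliest date by which the rolling window is fully renewed,
        # i.e. when CrUX data no longer mixes in pre-deployment traffic.
        return self.deployed + timedelta(days=WINDOW_DAYS)

changes = [
    Change("Removed render-blocking script", date(2021, 4, 9)),
    Change("Self-hosted web font with caching", date(2021, 5, 10)),
]
for c in changes:
    print(f"{c.description}: compare CrUX data from {c.crux_stable_from()}")
```

Keeping the entries at least 28 days apart, as recommended above, is what preserves a clear causal link between each change and the metric movement that follows it.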

Use third-party tools like Cloudflare RUM, SpeedCurve, or Sentry Performance to get real-time field data. These tools will never replace CrUX for ranking, but they allow you to validate that an optimization is working before waiting a month. This is the only way to quickly detect an accidental regression.

What mistakes to avoid when tracking Core Web Vitals?

Do not panic if your Lighthouse scores remain green while Search Console shows red. As long as the CrUX data has not refreshed, the old state pollutes the reports. Conversely, do not rest on your laurels if Lighthouse shows 100/100: only real-user performance matters.

Avoid making a flurry of changes. If you deploy three optimizations in one week, you will not be able to isolate which one worked when the CrUX data finally updates. Proceed with sequential iterations, spaced at least 28 days apart, to maintain a clear causal link.

How to verify that your optimizations have the expected effect?

Query the PageSpeed Insights API or the CrUX API directly to obtain the field data for your critical URLs. These APIs provide the same metrics as Search Console, but with higher granularity. You will thus be able to track the week-to-week evolution, even if Search Console only displays an aggregated view.

Set up continuous synthetic monitoring (Lighthouse CI, WebPageTest automation) to detect regressions at the time of deployment. Even if these tests do not reflect the actual experience, they immediately alert you if a commit breaks performance — without waiting 28 days to discover it in Search Console.

  • Plan your optimizations in minimum 6-week sprints to isolate effects
  • Document each change with its exact deployment date
  • Use third-party RUM tools to validate fixes in real time
  • Regularly query the CrUX and PageSpeed Insights APIs for granular tracking
  • Avoid making a flurry of changes: proceed with sequential iterations
  • Automate synthetic monitoring to detect regressions at commit

The 28-day delay turns Core Web Vitals optimization into an exercise in patience and methodological rigor. Without strict planning, it becomes impossible to correlate cause and effect. These optimizations require sharp technical expertise and long-term monitoring, two skills that not all internal teams master. If you lack the resources, or if your deployments follow one another without visible improvement, consider partnering with a specialized SEO agency to accelerate diagnosis and avoid months of costly trial and error.

❓ Frequently Asked Questions

Why are my Lighthouse scores excellent while Search Console shows slow URLs?
Lighthouse tests in a lab under ideal conditions. Search Console shows the real performance measured from your Chrome visitors over 28 days. Slow connections, old devices, and browser extensions pollute the field data; this is normal and expected.
Can I force Google to refresh my CrUX data faster?
No. The 28-day collection window is fixed and automatic. Nothing you do can speed up the update of the data displayed in Search Console.
How much Chrome traffic is needed to appear in CrUX?
Google does not publish an official threshold, but field observations suggest several thousand monthly Chrome visits. A very low-traffic site will never have CWV data in Search Console.
Is CrUX data aggregated per URL or per group of URLs?
Search Console often aggregates by groups of similar URLs (same template, same page type). An optimization applied to a single URL will not flip the whole group instantly.
If I roll out an optimization gradually, when does the 28-day countdown start?
CrUX collects continuously. Each day after deployment replaces 1/28th of the old data with new data. The full impact is only visible once the window has completely renewed, about 4 weeks later.

