
Official statement

The Core Web Vitals data displayed in Search Console is based on the Chrome User Experience Report, aggregated over 28 days. This delay is due to the method of data collection and aggregation, not a slow processing time from Search Console.
🎥 Source video

Extracted from a Google Search Central video

⏱ 912h44 💬 EN 📅 05/03/2021 ✂ 20 statements
Watch on YouTube (27:21) →
Other statements from this video (19)
  1. 36:39 Do you really need to lab-test your Core Web Vitals to avoid regressions?
  2. 98:33 Do CSS animations really hurt your Core Web Vitals?
  3. 121:49 Will Core Web Vitals change again, and how can you anticipate the next updates?
  4. 146:15 Are per-city pages really all doorway pages condemned by Google?
  5. 185:36 Does crawl budget really depend on your server's speed?
  6. 203:58 Do you really need to start small to unlock your crawl budget?
  7. 228:24 Do you really need to regenerate your sitemaps to remove obsolete URLs?
  8. 259:19 Why does Google refuse to provide Voice Search data in Search Console?
  9. 295:52 How can you force Google to refresh your JavaScript and CSS files during rendering?
  10. 317:32 How do you map URLs and check redirects during a migration so you don't lose rankings?
  11. 353:48 Do you really need to include dates in structured data?
  12. 390:26 Do you really need to change an article's date with every update?
  13. 432:21 Do you really need to limit the number of H1 tags on a page?
  14. 450:30 Are headings really as important as Google thinks?
  15. 555:58 Are LSI keywords really useful for Google rankings?
  16. 585:16 How many links per page do you need to optimize internal PageRank?
  17. 674:32 Do JSON requests really eat into your crawl budget?
  18. 717:14 Do you really need to block JSON files in your robots.txt?
  19. 789:13 Can Google tell a URL is a duplicate without even crawling it?
Official statement (5 years ago)
TL;DR

The Core Web Vitals data displayed in Google Search Console is aggregated over a rolling 28-day period, not in real time. This delay is inherent to the data collection method used by the Chrome User Experience Report (CrUX), which aggregates data from millions of real users. Concretely, if you fix a performance issue today, you won't see its full impact in Search Console until four weeks later, which forces you to anticipate and plan your optimizations differently.

What you need to understand

Where does this 28-day delay come from exactly?

Google uses the Chrome User Experience Report (CrUX) as the data source for Core Web Vitals in Search Console. This report aggregates real browsing data collected from millions of Chrome users who have opted in to share their usage statistics.

The 28-day delay is not a bug or a sign that Search Console is slow. It is a deliberate aggregation window: Google compiles the metrics over a rolling 28-day period to obtain a statistically significant sample and to filter out sudden fluctuations (traffic spikes, temporary technical incidents, a progressive rollout).

What does "rolling 28-day aggregation" actually mean?

Each day, the data displayed in Search Console represents the average of the previous 28 days. If you check your Core Web Vitals on April 15, you see aggregated data from March 18 to April 14.

This rolling window shifts daily: the next day (April 16), the period covers March 19 to April 15. The oldest data gradually drops out while the newest data trickles in. That is why even a drastic improvement never produces a sudden change in Search Console: it diffuses gradually over 28 days.
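The window arithmetic above can be sketched in a few lines of Python. This is only an illustration of the article's description (window ends the day before the report date and spans 28 days); the exact boundary Google uses is not publicly specified.

```python
from datetime import date, timedelta

def crux_window(report_date):
    """Return (start, end) of the rolling 28-day window that the
    CrUX-based report in Search Console covers on `report_date`:
    the window ends the day before and spans 28 days inclusive."""
    end = report_date - timedelta(days=1)
    start = end - timedelta(days=27)  # 28 days inclusive
    return start, end

# Checking the report on April 15 shows data from March 18 to April 14.
start, end = crux_window(date(2024, 4, 15))
```

Shifting `report_date` forward one day shifts both ends of the window forward one day, which is exactly why a fix "diffuses" rather than appearing all at once.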

Why doesn’t Google display real-time data?

Web performance metrics are highly volatile. A user on a slow 4G connection gets an LCP of 6 seconds, while another on fiber gets 1.2 seconds for the same page. A sudden traffic spike can stress your servers and temporarily degrade your metrics.

By aggregating over 28 days, Google smooths out these variations and obtains a representative measure of your visitors' actual experience. It is this aggregate, not the instantaneous performance of any given day, that feeds the ranking signal.
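To see why a single bad day barely moves the aggregate, here is a toy sketch of a 28-day rolling mean. The real CrUX aggregation tracks full metric distributions (and Core Web Vitals assessments look at the 75th percentile, not the mean), so this only illustrates the smoothing effect; all numbers are made up.

```python
def rolling_mean(values, window=28):
    """Mean of the last `window` values: a toy stand-in for the
    28-day aggregation (CrUX really tracks distributions)."""
    recent = values[-window:]
    return sum(recent) / len(recent)

# 27 days with a 2.0 s LCP, then one bad day at 6.0 s:
lcp_days = [2.0] * 27 + [6.0]
smoothed = rolling_mean(lcp_days)  # (27 * 2.0 + 6.0) / 28, about 2.14 s
```

One outlier day shifts the aggregate by only a fraction of a second, which is precisely the stability Google wants from a ranking signal.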

  • Search Console data always lags your actual optimizations by up to 28 days.
  • A gradual improvement will take about 4 weeks to be fully reflected in the interface.
  • Daily variations are not visible; only sustained trends emerge.
  • CrUX is based on real Chrome users, not on synthetic tests (Lighthouse, PageSpeed Insights).
  • Only URLs with sufficient traffic appear in CrUX (the threshold is undocumented).

SEO Expert opinion

Is this statement consistent with field observations?

Yes, and it can be observed directly in the public CrUX dataset on BigQuery. CrUX datasets are published monthly with a lag of about 2 weeks, and each row aggregates 28 days of data. When you deploy a major optimization (lazy-loading, a CDN, a critical-CSS redesign), you do experience an incompressible delay before seeing the impact in Search Console.

The practical problem: SEO clients often expect immediate results after a performance project. You need to build this 4-week latency into your project timelines and reports; otherwise you create unnecessary frustration.

What nuances need to be added to this explanation?

Mueller says the delay is due to the "method of collection and aggregation", but he omits a critical point: the minimum traffic threshold. A URL receiving 10 visits per day will never appear in CrUX, regardless of its performance. Google does not document this threshold precisely; verify it on your own sites via the CrUX API.
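One way to check whether a URL clears the threshold is to query the CrUX API directly: the `records:queryRecord` endpoint returns HTTP 404 when there is not enough data for a record. A minimal sketch using only the standard library; the endpoint and `formFactor` values are real, but `API_KEY` and the example URL are placeholders you must supply yourself.

```python
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(url, form_factor="PHONE"):
    """Request body for the CrUX API's queryRecord method
    (form_factor is one of PHONE, DESKTOP, TABLET)."""
    return {"url": url, "formFactor": form_factor}

def query_crux(url, api_key):
    """POST the query; an HTTP 404 response means the URL has too
    little traffic to appear in CrUX (the undocumented threshold)."""
    body = json.dumps(build_crux_query(url)).encode()
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# query_crux("https://example.com/", "API_KEY")  # requires a real key
```

Probing a sample of your URLs this way tells you which pages even have field data before you try to interpret the Search Console report.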

Another nuance: CrUX data is collected only from Chrome on desktop and mobile, from users who have opted in to share statistics. This is not a perfectly representative sample of your actual audience if you have significant traffic from Firefox, Safari, or Edge (Edge is Chromium-based, but Google has not confirmed that non-Chrome Chromium browsers feed CrUX).

In what cases does this 28-day aggregation pose a problem?

For seasonal sites or product launches, it is a nightmare. Imagine an e-commerce site launching a capsule collection on April 1: even if the site is perfectly optimized, Search Console will keep displaying data polluted by March traffic until April 28. You cannot assess the real SEO impact of your launch until a full month has passed.

The same goes for news or event sites: a traffic spike related to breaking news can degrade your Core Web Vitals for 28 days, even if you fix the problem within 48 hours. Aggregation works against you when volatility is intrinsic to your model.

Warning: if you work on a site with highly fluctuating traffic (flash sales, one-off events), do not rely solely on Search Console to guide your CWV optimizations. Use real-time tools (custom RUM, Cloudflare Analytics) alongside it to detect degradations immediately.

Practical impact and recommendations

How to anticipate this delay in your SEO roadmaps?

Plan your performance projects at least 6 weeks before a critical deadline (product launch, seasonal peak, marketing campaign). Allow 2 weeks for implementation and testing, then 4 weeks for CrUX data to stabilize in Search Console.
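The 6-week rule of thumb above (2 weeks of implementation plus 28 days of CrUX stabilization) is just a back-planning calculation. A sketch, where the durations are the article's assumptions rather than fixed constants:

```python
from datetime import date, timedelta

IMPLEMENTATION = timedelta(weeks=2)      # build + test (assumption from the text)
CRUX_STABILIZATION = timedelta(days=28)  # rolling-window refresh

def latest_project_start(deadline):
    """Latest date a CWV project can start and still show stable
    CrUX data in Search Console by the deadline."""
    return deadline - (IMPLEMENTATION + CRUX_STABILIZATION)

# A June 1 launch means starting no later than April 20.
```

Adjust `IMPLEMENTATION` to your own delivery pace; only the 28-day stabilization window is structural.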

If you work in agile with short sprints, build in a validation delay: do not sign off on a CWV objective immediately after deployment, but in a following sprint (2-3 weeks later). Communicate this timeline clearly to stakeholders to avoid misunderstandings about ROI timelines.

What tools should you use in addition to Search Console?

Search Console is useful for long-term monitoring, but too slow for day-to-day operations. Set up a Real User Monitoring (RUM) tool like SpeedCurve or Cloudflare Web Analytics, or a custom script that sends CWV metrics to your own database.

These tools give you daily, even hourly, feedback on your actual performance. You detect an LCP regression the same day, not 3 weeks later. This is essential for iterating quickly and fixing regressions before they pollute your CrUX data for an entire month.
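A bare-bones version of that daily feedback: collect each day's field samples and compare the 75th percentile (the statistic Core Web Vitals assessments use) against the documented "good" LCP limit of 2.5 seconds. The sample values are illustrative; a real RUM pipeline would also segment by device and page template.

```python
import math

LCP_GOOD_SECONDS = 2.5  # Google's documented "good" LCP threshold

def p75(samples):
    """75th percentile (nearest-rank method): the statistic that
    Core Web Vitals assessments are based on."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))
    return ordered[rank - 1]

def lcp_regressed(day_samples):
    """Flag a day whose field p75 LCP exceeds the 'good' limit."""
    return p75(day_samples) > LCP_GOOD_SECONDS
```

Run this on each day's samples and you catch a regression within 24 hours, instead of watching it slowly contaminate a 28-day window.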

Should you wait 28 days before validating an optimization?

No, but qualify your validation. Use Lighthouse and PageSpeed Insights in lab mode for immediate technical validation: if your LCP drops from 4 s to 1.5 s under controlled conditions, the optimization works. But do not declare victory until real CrUX data confirms it.

Document your deployments with precise dates in a dashboard. When you check Search Console 4 weeks later, you can correlate the improvement with the actions taken. Without this traceability, it is impossible to know whether a score increase comes from your CSS redesign or a coincidental traffic drop.

  • Plan CWV projects at least 6 weeks before critical deadlines.
  • Set up a RUM tool for daily steering alongside Search Console.
  • Document each deployment with a date and description in an SEO changelog.
  • Test in the lab (Lighthouse) for technical validation, but wait 28 days for CrUX confirmation.
  • Communicate this structural delay to clients and stakeholders from project kickoff.
  • Monitor anomalous traffic spikes that can pollute your metrics for a month.

The 28-day aggregation is not a bug; it is a feature of CrUX. Adapt your working methods accordingly: anticipate, instrument, document. If you manage a complex site with significant business stakes, these optimizations require web performance expertise and rigorous orchestration. Working with a specialized SEO agency can help you avoid costly mistakes and significantly speed up your ramp-up on these technical topics.

❓ Frequently Asked Questions

Can you speed up the Core Web Vitals update in Search Console?
No, it is impossible. The 28-day delay is structural, tied to CrUX's aggregation method. No action on your side can shorten this window.
Is the CrUX data used by Search Console exactly the same as PageSpeed Insights'?
Yes, both tools use the same source (CrUX), but PageSpeed Insights also displays origin-level data (the whole domain), whereas Search Console breaks it down by URL or URL group.
My site has low traffic: why don't I see any CWV data in Search Console?
CrUX requires a minimum level of Chrome traffic before a URL appears in its reports. Google does not document this threshold precisely, but empirically it takes several hundred monthly visits per URL.
If I fix an LCP problem today, when will I see the full impact in Search Console?
The improvement will start to appear gradually from tomorrow (new data enters daily), but the full impact will only be visible after 28 days, once all the old data has left the rolling window.
Why do Core Web Vitals measured locally with Lighthouse differ from CrUX's?
Lighthouse simulates a controlled environment (lab data), whereas CrUX measures real users with varied connections, devices, and contexts (field data). The two are complementary, not interchangeable.

