Official statement
Other statements from this video (19)
- 36:39 Do you really need to test your Core Web Vitals in the lab to avoid regressions?
- 98:33 Do CSS animations really hurt your Core Web Vitals?
- 121:49 Will the Core Web Vitals change again, and how can you anticipate the next updates?
- 146:15 Are city pages really all doorway pages doomed by Google?
- 185:36 Does crawl budget really depend on the speed of your server?
- 203:58 Do you really need to start small to unlock your crawl budget?
- 228:24 Do you really need to regenerate your sitemaps to remove obsolete URLs?
- 259:19 Why does Google refuse to provide Voice Search data in Search Console?
- 295:52 How do you force Google to refresh your JavaScript and CSS files during rendering?
- 317:32 How do you map URLs and check redirects during a migration so you don't lose rankings?
- 353:48 Do you really need to include dates in structured data?
- 390:26 Do you really need to change an article's date with every update?
- 432:21 Do you really need to limit the number of H1 tags on a page?
- 450:30 Are headings really as important as Google thinks?
- 555:58 Are LSI keywords really useful for ranking on Google?
- 585:16 How many links per page do you need to optimize internal PageRank?
- 674:32 Do JSON requests really eat into your crawl budget?
- 717:14 Should you really block JSON files in your robots.txt?
- 789:13 Can Google guess that a URL is a duplicate without even crawling it?
The Core Web Vitals data displayed in Google Search Console is aggregated over a rolling 28-day period, not in real time. This delay is inherent to the data collection method of the Chrome User Experience Report (CrUX), which aggregates measurements from millions of real users. Concretely, if you fix a performance issue today, you won't see its full impact in Search Console until roughly four weeks later, which forces you to plan and schedule your optimizations differently.
What you need to understand
Where does this 28-day delay come from exactly?
Google uses the Chrome User Experience Report (CrUX) as the data source for Core Web Vitals in Search Console. This report aggregates real browsing data collected from millions of Chrome users who have opted in to share their usage statistics.

The 28-day delay is not a bug or a technical limitation of Search Console. It is a deliberate aggregation window: Google compiles the metrics over a rolling 28-day period to obtain a statistically significant sample and to filter out sudden fluctuations (traffic spikes, temporary technical incidents, progressive rollouts).

What does "rolling 28-day aggregation" actually mean?

Each day, the data displayed in Search Console represents the aggregate of the previous 28 days. If you check your Core Web Vitals on April 15, you are looking at data collected from March 18 to April 14.

This rolling window shifts daily: the next day (April 16), the period covered becomes March 19 to April 15. The oldest data gradually drops out while the newest trickles in. That is why even a drastic improvement never produces a sudden jump in Search Console: it diffuses gradually over 28 days.

Why doesn't Google display real-time data?

Web performance metrics are highly volatile. A user on a slow 4G connection may get an LCP of 6 seconds while another on fiber gets 1.2 seconds for the same page. A sudden traffic spike can stress your servers and temporarily degrade your metrics.

By aggregating over 28 days, Google smooths out these variations and obtains a representative measure of your visitors' actual experience. It is this aggregate, not the instantaneous performance of any given day, that feeds the ranking signal.
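The rolling-window arithmetic described above is easy to sketch. A minimal Python example (the 28-day window and the April dates come from the article; the helper names are ours):

```python
from datetime import date, timedelta

WINDOW_DAYS = 28  # CrUX aggregation window used by Search Console

def crux_window(check_date: date) -> tuple[date, date]:
    """Return the (start, end) dates covered by the rolling 28-day window
    when you look at Search Console on `check_date`."""
    end = check_date - timedelta(days=1)           # yesterday is the freshest day included
    start = end - timedelta(days=WINDOW_DAYS - 1)  # 28 days in total
    return start, end

# Checking on April 15 shows data aggregated from March 18 to April 14,
# matching the example above.
start, end = crux_window(date(2024, 4, 15))
print(start, end)  # 2024-03-18 2024-04-14
```

Shifting `check_date` by one day shifts both bounds by one day, which is exactly why an improvement only fully surfaces once 28 post-fix days have entered the window.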
SEO Expert opinion
Is this statement consistent with field observations?
Yes, and you can observe it directly in the public BigQuery CrUX dataset. CrUX datasets are published monthly with a lag of about two weeks, and each row aggregates 28 days of data. When you ship a major optimization (lazy-loading, a CDN, a critical-CSS redesign), there really is an incompressible delay before the impact shows up in Search Console.

The practical problem: SEO clients often expect immediate results after a performance project. You need to build this four-week latency into your project timelines and reports, otherwise you create needless frustration.

What nuances need to be added to this explanation?

Mueller attributes the delay to the "method of collection and aggregation", but he omits a critical point: the minimum traffic threshold. A URL receiving 10 visits per day will never appear in CrUX, regardless of its performance. Google does not document this threshold precisely; verify it on your own sites via the CrUX API.

Another nuance: CrUX data is collected only from Chrome on desktop and mobile, from users who have opted in to share statistics. This is not a perfectly representative sample of your actual audience if a significant share of your traffic comes from Firefox, Safari, or Edge (even though Chromium-based browsers may share some data).

In what cases does this 28-day aggregation pose a problem?

For seasonal sites or product launches, it is a nightmare. Imagine an e-commerce site launching a capsule collection on April 1: even if the site is perfectly optimized, Search Console will keep displaying data polluted by March traffic until April 28. You cannot assess the real SEO impact of your launch until a full month has passed.

The same goes for news or event sites: a traffic spike tied to breaking news can degrade your Core Web Vitals for 28 days, even if you fix the problem within 48 hours. Aggregation works against you when volatility is intrinsic to your model.
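To check whether a given URL clears the CrUX traffic threshold, you can query the public CrUX API. A minimal sketch using only the standard library (the API key is a placeholder; the endpoint, request body, and the fact that a 404 means "no field data recorded for this page" come from the public CrUX API documentation):

```python
import json
import urllib.request
from urllib.error import HTTPError

API_KEY = "YOUR_API_KEY"  # placeholder: create one in Google Cloud Console
ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_query(url: str, form_factor: str = "PHONE") -> bytes:
    """Build the JSON body for a page-level CrUX query."""
    return json.dumps({"url": url, "formFactor": form_factor}).encode()

def has_crux_data(url: str) -> bool:
    """True if CrUX holds field data for this URL; a 404 response means
    the page stays below the (undocumented) traffic threshold."""
    req = urllib.request.Request(
        f"{ENDPOINT}?key={API_KEY}",
        data=build_query(url),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            record = json.load(resp)
        # p75 is the value Google uses to assess a Core Web Vital
        lcp = record["record"]["metrics"]["largest_contentful_paint"]["percentiles"]["p75"]
        print(f"{url}: p75 LCP = {lcp} ms")
        return True
    except HTTPError as err:
        if err.code == 404:
            return False
        raise
```

Probing a sample of your templates this way tells you which page types will never surface CWV data in Search Console, whatever you optimize.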
Practical impact and recommendations
How to anticipate this delay in your SEO roadmaps?
Plan your performance projects at least six weeks before a critical deadline (product launch, seasonal peak, marketing campaign). Allow two weeks for implementation and testing, then four weeks for the CrUX data to stabilize in Search Console.

If you work in short agile sprints, build in a validation delay: do not sign off a CWV objective immediately after deployment, but in a following sprint (2-3 weeks later). Communicate this timeline clearly to stakeholders to avoid misunderstandings about when the ROI will show.

What tools should you use in addition to Search Console?

Search Console is useful for long-term monitoring but too slow for day-to-day operations. Set up a Real User Monitoring (RUM) tool such as SpeedCurve or Cloudflare Web Analytics, or a custom script that sends CWV metrics to your own database.

These tools give you daily, even hourly, feedback on real-user performance. You detect an LCP regression the same day, not three weeks later. That is essential for iterating quickly and fixing bugs before they pollute your CrUX data for an entire month.

Should you wait 28 days before validating an optimization?

No, but stage your validation. Use Lighthouse and PageSpeed Insights in lab mode for immediate technical validation: if your LCP drops from 4 s to 1.5 s under controlled conditions, the optimization works. But do not declare victory until real CrUX field data confirms it.

Document your deployments with precise dates in a dashboard. When you check Search Console four weeks later, you can then correlate the improvement with the actions taken. Without this traceability, it is impossible to know whether a score increase comes from your CSS redesign or from a coincidental traffic shift.
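The staged validation above can be turned into a small checkpoint generator. A sketch in Python (the milestone names and offsets reflect the recommendations in this section, not any official Google schedule):

```python
from datetime import date, timedelta

def validation_schedule(deploy_date: date) -> dict[str, date]:
    """Checkpoints for validating a performance fix: lab check immediately,
    RUM check the next day, and CrUX fully converged once the 28-day
    window no longer contains any pre-fix days."""
    return {
        "lab check (Lighthouse / PageSpeed Insights)": deploy_date,
        "RUM check (daily real-user data)": deploy_date + timedelta(days=1),
        "CrUX fully converged in Search Console": deploy_date + timedelta(days=28),
    }

for milestone, when in validation_schedule(date(2024, 4, 1)).items():
    print(f"{when}  {milestone}")
```

Logging the deploy date alongside these checkpoints gives you the traceability needed to attribute a Search Console improvement to the right change.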
❓ Frequently Asked Questions
Can you speed up the Core Web Vitals update in Search Console?
Is the CrUX data used by Search Console exactly the same as the data in PageSpeed Insights?
My site has low traffic: why don't I see any CWV data in Search Console?
If I fix an LCP issue today, when will I see the full impact in Search Console?
Why do the Core Web Vitals measured locally with Lighthouse differ from the CrUX data?
🎥 From the same video (19)
Other SEO insights extracted from this same Google Search Central video · duration 912:44 · published on 05/03/2021
🎥 Watch the full video on YouTube →