Official statement
Google has made its decision: only the Core Web Vitals (LCP, FID, CLS) will count towards the Page Experience ranking factor. Goodbye to Lighthouse scores and other lab metrics. The assessment relies solely on real-world data from the Chrome User Experience Report, with a distinction between desktop and mobile. Specifically, optimizing your Lighthouse score is no longer enough — you must aim for the actual performance perceived by Chrome users.
What you need to understand
Google has clarified a gray area that was causing confusion among many practitioners: only the three Core Web Vitals (LCP, FID, CLS) will be used to evaluate Page Experience in the ranking algorithm. No overall Lighthouse score, no Time to Interactive, no Speed Index.
This statement puts an end to speculation about the use of complementary metrics. Google's intent is clear: to focus on what directly impacts user experience rather than on technical abstractions.
What's the difference between real-world data and lab data?
Lab data (Lighthouse, PageSpeed Insights in lab mode) simulates loading under controlled conditions — throttled network, emulated device, cleared cache. It’s reproducible, but completely disconnected from the reality of your actual visitors.
The Chrome User Experience Report (CrUX), on the other hand, aggregates real performance measured from Chrome users worldwide. This is the dataset Google uses to calculate your official Core Web Vitals — the ones that matter for ranking. If your actual traffic mainly comes from 4G mobile users in rural areas, your CWV will reflect that reality, not a fiber optic test in Paris.
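For sites with enough traffic, this field data can be pulled programmatically. The sketch below builds a query for the public CrUX API using only Python's standard library; the endpoint and field names come from Google's CrUX API documentation, while the origin and API key are placeholders you would substitute with your own.

```python
import json
import urllib.request

# Public endpoint of the CrUX API (requires an API key from Google Cloud).
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_payload(origin: str, form_factor: str = "PHONE") -> dict:
    """Build the JSON body for a CrUX queryRecord request.

    "origin" aggregates field data over the whole domain; swap the key
    for "url" to request page-level data where it exists. form_factor
    is PHONE or DESKTOP — the two are reported separately.
    """
    return {"origin": origin, "formFactor": form_factor}

def query_crux(api_key: str, origin: str, form_factor: str = "PHONE") -> dict:
    """POST the query and return the parsed JSON record (network call)."""
    body = json.dumps(build_crux_payload(origin, form_factor)).encode("utf-8")
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# In the response, the value Google looks at sits under
# record.metrics.largest_contentful_paint.percentiles.p75 (and the
# equivalent paths for the other metrics).
```

Querying with `form_factor="PHONE"` and then `"DESKTOP"` makes the mobile/desktop split discussed below directly visible in the numbers.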
Why are desktop and mobile evaluated separately?
Because performance is rarely identical between the two contexts. A site might show an excellent LCP on desktop (fast server, stable connection, powerful device) but flop on mobile (slow network, limited CPU, unoptimized images).
Google has evaluated mobile and desktop separately since Mobile-First Indexing. It made sense for the Core Web Vitals to follow the same logic. In practice, if you are targeting primarily mobile traffic, it’s your mobile CrUX score that counts — and that’s often where the issue lies.
Does the CrUX Report always contain data for my site?
No, and this is a critical point. For a site to appear in CrUX, it needs a minimum volume of Chrome traffic — Google does not disclose the exact threshold, but it is estimated that several thousand Chrome visitors per month are required.
If your site is too small or too new, you won’t have origin-level CrUX data. Google may then fall back on broader aggregations, or simply lack enough data to apply the Page Experience signal at all. In this case, no penalty… but no bonus either.
- Only the three Core Web Vitals (LCP, FID, CLS) matter for Page Experience, not Lighthouse or other lab metrics
- The CrUX data (real-world) is used, not simulations in a controlled environment
- Desktop and mobile are evaluated independently — optimizing one does not guarantee anything for the other
- No CrUX data = no Page Experience signal applied (neither bonus nor penalty)
- The Lighthouse score remains useful for diagnosis, but does not predict your official CWV
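The pass/fail logic behind these points can be made concrete. The sketch below applies Google’s documented "good"/"poor" thresholds for the three Core Web Vitals to p75 values; the metric names and the `page_experience_status` helper are illustrative, but the threshold numbers are the official ones.

```python
# Official "good" / "poor" thresholds, applied at the 75th percentile.
THRESHOLDS = {
    "lcp_ms": (2500, 4000),   # Largest Contentful Paint
    "fid_ms": (100, 300),     # First Input Delay
    "cls":    (0.10, 0.25),   # Cumulative Layout Shift (unitless)
}

def rate(metric: str, p75: float) -> str:
    """Rate a single p75 value: good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if p75 <= good:
        return "good"
    if p75 <= poor:
        return "needs improvement"
    return "poor"

def page_experience_status(p75_values: dict) -> str:
    """A URL passes the CWV assessment only if all three metrics are good."""
    ratings = (rate(m, v) for m, v in p75_values.items())
    return "passes" if all(r == "good" for r in ratings) else "fails"

# Example: good LCP and CLS, but a 180 ms FID drags the page down.
print(page_experience_status({"lcp_ms": 2300, "fid_ms": 180, "cls": 0.05}))  # fails
```

Note that the assessment is all-or-nothing per metric set: one metric stuck at "needs improvement" is enough to fail the whole URL.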
SEO Expert opinion
Is this statement consistent with observed practices in the field?
Yes, and it finally resolves a debate that has persisted since the initial announcement of Page Experience. We regularly saw sites with catastrophic Lighthouse scores but excellent CrUX CWV — and vice versa. The former did not suffer any negative impact, while the latter enjoyed the boost.
A typical case: an e-commerce site loaded with third-party scripts that degrade Time to Interactive and Speed Index in the lab, but whose real users (mostly returning visitors with warm caches and fast connections) see an LCP under 2.5s and a clean CLS. Google does not penalize this site because the real experience is good. That is rational.
What nuances should be applied to this claim?
First nuance: Google says “only the Core Web Vitals,” but that doesn’t prevent other indirect signals from playing a role. A site with a catastrophic bounce rate because it takes 8 seconds to load won’t be saved by clean CWV — user behavior will send a distinct negative signal.
Second nuance: the Core Web Vitals are just a light tie-breaker. John Mueller has repeated several times: content and relevance remain a priority. A competitor with mediocre content but perfect CWV will not surpass you if your content is significantly better. But at equal relevance, CWV makes the difference.
In what cases does this rule not apply?
If you don’t have enough Chrome traffic to be included in CrUX, Google cannot apply the signal. This is the case for many niche B2B sites, new sites, or those with very localized traffic outside of Chrome (Safari dominating in certain countries).
Another edge case: very low-traffic pages on a site. CrUX aggregates at the origin level (entire domain), but Google can also use URL-level data when they exist. An orphan page without traffic will not generate specific CrUX data — it will inherit the overall origin score. [To be verified] whether Google applies different weighting based on the granularity of the available data.
Practical impact and recommendations
What should you do concretely to optimize your Core Web Vitals?
First step: identify where you really stand with CrUX data. The Core Web Vitals report in Search Console indicates which URLs are “good,” “need improvement,” or are “poor” according to the official thresholds. If you don’t have data, test with the CrUX API or wait until you have enough traffic.
Then, prioritize actions based on impact. For LCP, look at the weight and lazy-loading of your hero image, server response (TTFB), and preloading of critical resources. For CLS, track images without dimensions, ads that push content, and fonts that load late. For FID, reduce heavy JavaScript on the main thread — split, defer, trim.
What mistakes should absolutely be avoided?
Don’t push to raise Lighthouse from 60 to 95 if your CrUX CWV are already green. You will waste time on lab optimizations that will not impact either ranking or the real experience. Lighthouse remains a diagnostic tool, not an end in itself.
Another classic mistake: optimizing only desktop because “it’s simpler.” If your traffic is primarily mobile (check Analytics), you should focus on mobile as a priority. A desktop LCP of 1.8s and a mobile LCP of 4.2s is a failure — Google ranks mobile first.
How can I check if my site meets Google’s expectations?
The Core Web Vitals report in Search Console is your absolute reference. Google displays aggregated CrUX data there, grouped by sets of similar URLs. If all your URLs are green (the 75th percentile of visits meets the “good” threshold), you are fine for the Page Experience signal.
Complement this with the field-data section of PageSpeed Insights (the CrUX figures, not the lab simulation). You will see the 75th-percentile (P75) values — this is the threshold Google uses for ranking. A P75 LCP of 2.8s is still orange, so there’s room for improvement. Monitor these metrics monthly, especially after deployments that touch the frontend.
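The P75 figure is worth internalizing: a site can have a fast median and still fail because of its slow tail. The sketch below computes a nearest-rank 75th percentile over simulated LCP samples — the sample values are invented for illustration, and real field tools may use a slightly different percentile method.

```python
import math

def p75(samples: list) -> float:
    """75th percentile, nearest-rank method (an approximation of how
    field tools summarize a distribution of real-user measurements)."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# 20 simulated LCP samples in seconds: most visits are fast,
# but a slow tail pushes the P75 above the 2.5 s "good" threshold.
lcp_samples = [1.8, 1.9, 2.0, 2.1, 2.1, 2.2, 2.2, 2.3, 2.3, 2.4,
               2.4, 2.5, 2.6, 2.7, 2.9, 3.1, 3.4, 3.8, 4.2, 5.0]
print(p75(lcp_samples))  # 2.9 — orange, despite a median of 2.4 s
```

This is why optimizing only the average or median experience is not enough: the 25% slowest visits decide whether you clear the threshold.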
- Consult the Core Web Vitals report in Search Console to know your real CrUX status
- Prioritize mobile if it's your dominant audience — desktop alone is no longer sufficient
- Optimize LCP (hero image, TTFB, preload), CLS (dimensions, stable layout), and FID (light JS)
- Do not confuse Lighthouse score and Core Web Vitals CrUX — only the latter counts for ranking
- Monitor monthly evolution via CrUX API or PageSpeed Insights in real mode
- Test deployments in staging with RUM (Real User Monitoring) tools before production
❓ Frequently Asked Questions
My Lighthouse score is 95, so why are my CrUX Core Web Vitals poor?
What happens if my site has no CrUX data?
Are desktop and mobile Core Web Vitals weighted differently?
Should I optimize all my pages or only the most visited ones?
Can Core Web Vitals compensate for weak content?
Source: Google Search Central video · duration 53 min · published on 29/10/2020