
Official statement

The differences in results across Core Web Vitals measurement tools may stem from the distinction between lab data and field data. The PageSpeed Insights tool, for instance, uses field data, which can differ from simulated data.
🎥 Source video

Extracted from a Google Search Central video

⏱ 59:42 💬 EN 📅 03/09/2020 ✂ 10 statements
Watch on YouTube (11:12) →
Other statements from this video (9)
  1. 2:20 Why does Google refuse to index your pages despite content you consider relevant?
  2. 5:48 Why do the site: operator and Search Console data never match?
  3. 8:04 Should you really abandon AMP in your SEO strategy?
  4. 17:40 How does Google really handle phishing pages in its search results?
  5. 31:32 Should you really exclude mobile URLs from XML sitemaps?
  6. 33:06 Why does Google detect coverage gaps between mobile and desktop in Search Console?
  7. 41:04 Should you really use the picture tag to serve your WebP images?
  8. 47:58 Does structured data really improve your rankings in Google?
  9. 54:20 Does Google really penalize sites with multiple URLs on the first results page?
TL;DR

Google distinguishes between two types of Core Web Vitals data: lab data (simulated) and field data (real users). PageSpeed Insights relies primarily on field data, which explains the sometimes significant discrepancies with other tools. For SEOs, this means it is essential to cross-reference multiple sources before prioritizing optimizations, as an isolated test can be misleading.

What you need to understand

What is the fundamental difference between lab data and field data?

Lab data comes from controlled tests conducted under standardized conditions. A tool like Lighthouse simulates a page load with a predefined network profile, a typical device, and a browser configured identically for each test. This approach ensures reproducibility but may not accurately reflect the real experience of your visitors.

Field data aggregates metrics collected from real users through the Chrome User Experience Report (CrUX). This data captures the real diversity: unstable 4G connections, older smartphones, varied browsers, and multiple geolocations. This is the data that Google uses for ranking, not lab scores.
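This field data can also be queried programmatically. Here is a minimal sketch against the public CrUX API, assuming you have an API key from the Google Cloud console; the endpoint and response shape below follow the published API, but verify them against the current documentation:

```python
import json
import urllib.request

# Public CrUX API endpoint; requires an API key (placeholder assumption).
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_query(origin: str, form_factor: str = "PHONE") -> dict:
    """Request body for a CrUX query: field data aggregated over 28 days."""
    return {"origin": origin, "formFactor": form_factor}

def p75(record: dict, metric: str) -> float:
    """75th-percentile value, the number Google classifies against its thresholds."""
    return float(record["metrics"][metric]["percentiles"]["p75"])

def fetch_record(origin: str, api_key: str) -> dict:
    """Perform the actual API call (requires network access and a valid key)."""
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=json.dumps(build_query(origin)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["record"]

# Illustrative response fragment in the documented CrUX shape:
sample_record = {
    "metrics": {
        "largest_contentful_paint": {"percentiles": {"p75": 2100}},
        "cumulative_layout_shift": {"percentiles": {"p75": "0.05"}},
    }
}
print(p75(sample_record, "largest_contentful_paint"))  # 2100.0
```

The p75 value is the one to watch: Google classifies each metric by its 75th percentile, not its average.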

Why does PageSpeed Insights show different results from other tools?

PageSpeed Insights combines two approaches: it runs a Lighthouse test (lab) and simultaneously retrieves CrUX data (field) if your site has enough Chrome traffic. The score displayed at the top comes from the lab test, but the "Discover Real Performance" section shows field data.

The trap? Many SEOs focus on the green/orange/red score from the Lighthouse test without checking the CrUX data. However, a site can score 95/100 in the lab and be classified as "slow" in CrUX if real users experience degraded network conditions. This is where the confusion arises.
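Both numbers can be surfaced side by side from a single PageSpeed Insights API (v5) response. A sketch under the assumption that the response carries `lighthouseResult` (lab) and `loadingExperience` (CrUX field) as documented; re-check the field names against the current API reference:

```python
def lab_vs_field(psi: dict) -> tuple[float, str]:
    """Compare the lab score with the field verdict from one PSI API response."""
    # Lab: Lighthouse performance score, 0..1 in the API, shown as 0..100 in the UI.
    lab = psi["lighthouseResult"]["categories"]["performance"]["score"] * 100
    # Field: CrUX overall category; absent when the site lacks Chrome traffic.
    field = psi.get("loadingExperience", {}).get("overall_category", "NO FIELD DATA")
    return lab, field

# Illustrative response fragment: the "trap" case described above —
# a near-perfect lab score paired with a SLOW field classification.
sample = {
    "lighthouseResult": {"categories": {"performance": {"score": 0.95}}},
    "loadingExperience": {"overall_category": "SLOW"},
}
lab, field = lab_vs_field(sample)
print(f"lab {lab:.0f}/100, field {field}")
```

Logging both values together makes the lab/field divergence visible instead of hiding it behind a single green score.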

What factors create these measurement discrepancies?

Lab tests typically use fixed network throttling (for example, simulated 4G) and a standard device (Moto G4). In real conditions, your users browse on variable connections, on devices ranging from an iPhone 15 to a Samsung Galaxy A10, with ad blockers or extensions, sometimes in the background with twenty tabs open.

Browser caches also behave differently. A lab test loads the page cold, whereas CrUX data include return visits with resources already cached. If your caching strategy is aggressive, your actual users may experience a much better experience than what is measured in the lab.

  • Lab data: reproducible, useful for debugging, but do not reflect real traffic
  • Field data (CrUX): aggregated over a rolling 28-day period, represent real Chrome visitors, used for ranking
  • PageSpeed Insights: combines both approaches, but only CrUX data impact ranking
  • Frequent discrepancies: a high Lighthouse score does not guarantee good Core Web Vitals in production
  • Action rule: always prioritize CrUX data for strategic decisions, use lab data for diagnosis

SEO Expert opinion

Is this distinction consistent with field observations?

Absolutely. SEO audits regularly reveal sites that celebrate a Lighthouse score of 98/100 while suffering from poor CWV in production. A classic case? An e-commerce site tests its empty homepage, while the actual product pages load ten times more third-party scripts (customer reviews, chat, recommendations). The lab test overlooks this reality.

Conversely, some heavy WordPress sites score 40/100 in Lighthouse but display correct CrUX data thanks to a high-performance CDN and a majority of users on fiber. The nuance is critical: Google ranks based on real experience, not a synthetic test.

What gray areas remain in this statement?

Google remains surprisingly vague on several points. First, what proportion of Chrome users is necessary to generate usable CrUX data? [To be verified] The documentation mentions a threshold of visits without ever quantifying it precisely. Low-traffic sites operate blindly.

Next, how does Google weight the different CrUX metrics (origin vs. specific URL, desktop vs. mobile) in the ranking algorithm? [To be verified] We know that mobile takes precedence, but to what extent exactly? This opacity complicates prioritization of optimizations for limited budgets.

Warning: CrUX data only cover Chrome users who have opted in to usage-statistics reporting. Your Safari/Firefox/Edge audience does not appear anywhere, which can create measurement bias if your demographics differ from the average.

When should lab data still be monitored?

Lab tests remain essential for technical diagnosis. When your CrUX data deteriorate, it's impossible to identify the regression without a reproducible test. Lighthouse points out blocking scripts, unoptimized images, and the offending layout shift. Field data tell you "there's a problem," while the lab tells you "here's where."

For low-traffic sites without CrUX data, lab tests become the only available compass. Better to optimize on imperfect data than to navigate blindly. In this case, test multiple profiles (desktop, mobile, 3G, 4G) to compensate for the absence of real data.

Practical impact and recommendations

How can I identify which data source to prioritize for my site?

Start by checking if your site has CrUX data. Open PageSpeed Insights and look at the "Discover Real Performance" section. If it displays metrics over 28 days, you’re in luck: focus on these numbers for your strategic decisions. Temporarily ignore the Lighthouse score displayed at the top.

If no field data appears (a "No data available" message), you have no choice: use lab tests across multiple profiles (Lighthouse, WebPageTest with several locations, mobile and desktop tests). Set up weekly tests to detect regressions before they impact users.
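Those recurring tests are easy to script. This sketch builds one PageSpeed Insights API request per page-and-strategy pair; the v5 endpoint and its `strategy` parameter are documented, while the key and URL list are placeholders:

```python
from urllib.parse import urlencode

# Documented PSI API v5 endpoint; the key passed in below is a placeholder.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_requests(urls: list[str], api_key: str) -> list[str]:
    """One request URL per (page, strategy) pair, ready for a weekly cron job."""
    return [
        f"{PSI_ENDPOINT}?{urlencode({'url': u, 'strategy': s, 'key': api_key})}"
        for u in urls
        for s in ("mobile", "desktop")  # test multiple profiles, as advised above
    ]

reqs = build_requests(["https://example.com/", "https://example.com/product"], "KEY")
print(len(reqs))  # 4: two pages x two strategies
```

Fetching each URL weekly and storing the responses gives you the regression history that one-off manual tests cannot.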

What interpretative errors must be avoided at all costs?

The number one error? Celebrating a green Lighthouse score without ever checking the real CrUX data. Many clients come in proud of a 95/100 PageSpeed, while their product pages show as orange in Search Console. The lab test was conducted on an empty homepage, not on the URLs generating traffic.

The second pitfall: comparing non-comparable measurements. A Lighthouse audit run from Paris on a fiber connection will never reveal what your Indian mobile users experience on 3G. If your traffic is international, use WebPageTest with multiple locations, or better yet, analyze CrUX data by country in BigQuery.

What concrete monitoring strategy should be implemented?

For sites with CrUX data: export metrics monthly from Search Console (Experience section > Core Web Vitals) and cross-reference them with your Analytics data. Identify segments of problematic URLs (product categories, ad landing pages) and prioritize optimizations based on traffic impact.

For sites without CrUX: automate weekly Lighthouse tests via Lighthouse CI or the WebPageTest API. Store historical results to detect regressions. Set alerts if LCP exceeds 2.5 s or CLS exceeds 0.1 under controlled conditions. These technical optimizations can quickly become time-consuming and require sharp expertise. If internal resources are lacking, engaging a specialized SEO agency can provide a precise diagnosis and a personalized action plan without overloading your in-house teams.
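The alert thresholds just mentioned (LCP 2.5 s, CLS 0.1) match Google's published "good" boundaries for Core Web Vitals and can be encoded directly. A minimal sketch; the metric names used here are illustrative:

```python
# Google's published "good" thresholds for Core Web Vitals.
THRESHOLDS = {"lcp_ms": 2500, "cls": 0.1, "inp_ms": 200}

def alerts(measured: dict) -> list[str]:
    """List the metrics that crossed their 'good' threshold in a lab run."""
    return [
        f"{name} = {value} exceeds {THRESHOLDS[name]}"
        for name, value in measured.items()
        if name in THRESHOLDS and value > THRESHOLDS[name]
    ]

# Example weekly run: LCP regressed past 2.5 s, CLS still fine.
print(alerts({"lcp_ms": 3100, "cls": 0.08}))  # ['lcp_ms = 3100 exceeds 2500']
```

Wiring this check into the weekly test job turns a silent regression into an immediate notification.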

  • Check for CrUX data in PageSpeed Insights for each key section of the site
  • Export Core Web Vitals from Search Console monthly and cross-reference with actual traffic
  • Set up automated weekly Lighthouse tests for sites without field data
  • Test multiple profiles (mobile/desktop, 3G/4G, various geographic locations)
  • Never make an optimization decision based on a single isolated lab test
  • Document discrepancies between lab and field to identify critical environment variables
The lab/field distinction explains why two tools display conflicting results for the same page. Google uses only field data (CrUX) for ranking, making lab tests useful for diagnosis but insufficient for decision-making. Effective monitoring combines both: CrUX for strategy, Lighthouse for debugging. For sites without sufficient Chrome traffic, lab tests remain the only option but must be multiplied and contextualized.

❓ Frequently Asked Questions

Why is my PageSpeed Insights score different from my local Lighthouse score?
PageSpeed Insights uses Google servers with a standardized network configuration and device profile, while your local test depends on your current connection and machine. Both are lab tests, but in different environments, hence the discrepancies.
Does CrUX data include users connected through a VPN?
Yes, as long as they use Chrome with statistics syncing enabled. A VPN may affect the recorded geolocation but does not exclude the user from CrUX data.
What should you do if your site has no CrUX data available?
Focus on lab tests across multiple profiles (mobile, desktop, several locations via WebPageTest). Optimize for good synthetic scores while building enough Chrome traffic to obtain field data.
Does CrUX data take Safari and Firefox visitors into account?
No, only Chrome users who have agreed to share usage statistics. This can create a bias if your audience heavily uses other browsers.
How long does it take for an optimization to appear in CrUX data?
CrUX data is aggregated over a rolling 28-day window. An improvement deployed today will start affecting the metrics progressively, with the full effect visible after about four weeks.

