Official statement
Other statements from this video
- 3:17 Is mobile speed really a game-changing ranking factor?
- 3:50 Why does PageSpeed Insights now include real user data alongside simulated scores?
- 12:33 Should you noindex the empty cart pages of your e-commerce site?
- 14:35 Should you really mark up each customer review individually with structured data?
- 35:10 Can canonical tags block the indexing of your strategic pages?
- 65:00 How does Google really judge the quality of a multilingual site?
- 71:20 Can DMCA complaints really make your pages disappear from Google?
- 73:20 Google Search Console: why do 16 months of data really change the game for your SEO?
- 75:39 Do irrelevant comments really harm the ranking of your pages?
PageSpeed Insights relies on the Chrome User Experience Report, powered by real user data from individuals who opted to share their browsing experience. This approach yields on-the-ground metrics rather than lab simulations. For SEO, this means the scores reflect what your visitors actually experience, but only for those who use Chrome and have enabled data sharing.
What you need to understand
Where do the data in PageSpeed Insights actually come from?
PageSpeed Insights combines two distinct types of data. Lab data comes from Lighthouse and simulates loading under standardized conditions. Field data comes from the Chrome User Experience Report (CrUX), which aggregates actual performance measured from millions of Chrome users.
CrUX does not collect data from all visitors. Only users who have explicitly enabled usage statistic sharing in Chrome contribute to this dataset. This data is anonymized and aggregated by origin (entire domain) or by specific URL if traffic is sufficient.
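As a concrete illustration, the same CrUX dataset can be queried directly through the Chrome UX Report API (`https://chromeuxreport.googleapis.com/v1/records:queryRecord`). A minimal sketch in Python; the API key and origin are placeholders, and the exact response fields should be verified against the official API documentation:

```python
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(origin=None, url=None, form_factor="PHONE"):
    """Build the request body for a CrUX API queryRecord call.

    Exactly one of `origin` (whole-domain data) or `url`
    (page-level data, only available above a traffic threshold)
    must be provided.
    """
    if (origin is None) == (url is None):
        raise ValueError("Provide exactly one of origin or url")
    body = {"formFactor": form_factor}
    if origin:
        body["origin"] = origin
    else:
        body["url"] = url
    return body

def query_crux(api_key, **kwargs):
    """POST the query to the CrUX API (network call, requires a real key)."""
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=json.dumps(build_crux_query(**kwargs)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Hypothetical origin, for illustration only.
    print(build_crux_query(origin="https://example.com"))
```

If a URL-level query returns no record, retrying with the `origin` parameter reflects the origin-level fallback described above.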
What’s the difference between real measurement and simulation?
Lighthouse's simulated metrics test your site in a controlled environment: a simulated 4G connection, a throttled processor, a browser without extensions. This is reproducible but artificial. Real metrics from CrUX reflect the diversity of field conditions: variable connections, heterogeneous devices, active browser extensions.
This distinction is crucial for diagnosing problems. An excellent Lighthouse score can coexist with poor CrUX data if your actual visitors mainly have slow connections or older devices. Conversely, good field data with a mediocre Lighthouse score indicates that your technical optimizations are already benefiting your users.
What limitations should you be aware of regarding this data?
CrUX has several structural biases. It only captures Chrome desktop and mobile users who have enabled data sharing, which excludes Safari, Firefox, and all browsers without telemetry. If your audience primarily uses alternative browsers, CrUX data represents only a fraction of your actual traffic.
Low-traffic pages do not have CrUX data at the URL level. You will only get aggregated metrics at the origin level, masking performance variations between pages. The exact threshold is not publicly documented, but it is generally observed that several thousand monthly visits are needed to obtain URL-level data.
- CrUX data reflects the real experience of Chrome users who opted for sharing, not a simulation
- Two types of data coexist in PageSpeed Insights: lab (Lighthouse) and field (CrUX)
- CrUX metrics do not cover all browsers or all Chrome users (opt-in required)
- Low-traffic pages only have aggregated data at the domain level
- A gap between lab and field data often reveals a disconnect between your testing conditions and user reality
SEO Expert opinion
Does this opt-in approach bias the results?
Let’s be honest: yes, the bias exists. Users who enable data sharing in Chrome are probably not representative of the entire web. One can assume an overrepresentation of tech-savvy users, likely better equipped in hardware and connectivity. Are users on low-end devices or in low-bandwidth areas proportionally represented in CrUX? [To be confirmed]
Google does not publish detailed statistics on CrUX demographics or the real opt-in rate. This opacity makes it challenging to assess representativeness. For broad B2C sites, the bias is likely limited. For specific audiences (emerging countries, older populations, less tech-savvy sectors), the gap can be substantial.
Are CrUX data consistent with other measurement tools?
In most cases, the trends observed in CrUX correspond to internal measurements from Google Analytics or RUM (Real User Monitoring) solutions. Absolute values often differ, but relative variations remain consistent: a regression detected in your RUM tools will generally also appear in CrUX, with a lag of a few days.
CrUX aggregates data over a rolling 28-day window, which smooths out sharp variations but delays the detection of changes. If you deploy a major optimization, expect to see the impact in PageSpeed Insights with a latency of 7 to 14 days, once the older data is sufficiently diluted.
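The smoothing effect of the rolling window can be reasoned about with simple arithmetic: d days after a deployment, roughly d/28 of the aggregated window reflects the new version. A back-of-the-envelope sketch, assuming uniform daily traffic (which real sites rarely have):

```python
def window_share(days_since_deploy, window_days=28):
    """Approximate share of the rolling CrUX window that was measured
    after a deployment, assuming uniform daily traffic."""
    return min(max(days_since_deploy, 0), window_days) / window_days

for d in (7, 14, 28):
    print(f"day {d}: ~{window_share(d):.0%} of the window is post-deploy")
```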
Should you prioritize lab or field data for optimization?
Both inform different aspects. Lab data identifies technical opportunities: blocking resources, unoptimized JavaScript, oversized images. It is reproducible and lets you measure the impact of a change in isolation.
Field data validates the real impact on your users. A site can have a Lighthouse score of 95 and poor Core Web Vitals in production if the server infrastructure does not keep up under real load. Conversely, perfect lab optimizations will have no impact if your visitors primarily use 3G connections where the bottleneck lies elsewhere. Specifically: optimize first what the field data shows, validate with the lab.
Practical impact and recommendations
How can you effectively leverage CrUX data to enhance your SEO?
Start by checking if your pages have CrUX data at the URL level in PageSpeed Insights. If only origin data is available, your traffic is insufficient for fine granularity. In this case, focus on the overall domain optimization rather than on specific pages.
Analyze the gap between lab and field data. A high Lighthouse score with mediocre field Core Web Vitals indicates a problem with the infrastructure (slow server, failing CDN) or an audience on constrained hardware. An inverse gap suggests that your testing conditions do not accurately reflect the real experience: test on devices and connections representative of your actual traffic.
Which metrics should be prioritized in the field data?
Google uses three Core Web Vitals as direct ranking signals: LCP (Largest Contentful Paint), FID (First Input Delay, soon to be replaced by INP), and CLS (Cumulative Layout Shift). These three metrics must meet the “good” thresholds for at least 75% of your visitors measured in CrUX.
LCP remains the most impactful metric for SEO: it measures the time before the largest visible element appears. Prioritize optimizing LCP if you need to choose. While CLS is less directly related to ranking, it greatly impacts conversion rates and engagement: an unstable layout frustrates users and increases bounce rates.
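The "good" thresholds Google documents for these metrics can be encoded directly. A sketch that classifies p75 values the way PageSpeed Insights buckets them, using the publicly documented thresholds (worth re-checking, as they evolve with metric changes such as FID to INP):

```python
# (good_max, poor_min) thresholds per Core Web Vital, applied to p75 values.
# LCP, FID, and INP are in milliseconds; CLS is unitless.
THRESHOLDS = {
    "LCP": (2500, 4000),
    "FID": (100, 300),
    "INP": (200, 500),
    "CLS": (0.1, 0.25),
}

def classify(metric, p75):
    """Bucket a p75 value into good / needs improvement / poor."""
    good_max, poor_min = THRESHOLDS[metric]
    if p75 <= good_max:
        return "good"
    if p75 <= poor_min:
        return "needs improvement"
    return "poor"

print(classify("LCP", 2300))  # good
print(classify("CLS", 0.18))  # needs improvement
```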
What to do if CrUX data is missing or inconsistent?
New or low-traffic sites do not appear in CrUX for several weeks. Use lab data as a temporary proxy while waiting to accumulate sufficient Chrome traffic. Meanwhile, implement your own RUM (Real User Monitoring) solution to capture real performance across all browsers, not just Chrome.
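Since CrUX reports the 75th percentile, your own RUM aggregation should do the same to stay comparable. A minimal stdlib sketch using the nearest-rank method; in production you would stream samples from a beacon endpoint rather than a hard-coded list (the values below are illustrative):

```python
import math

def p75(samples):
    """75th percentile (nearest-rank method) of raw RUM samples."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Illustrative LCP samples in milliseconds, collected from all browsers.
lcp_ms = [1200, 1500, 1800, 2100, 2600, 3100, 4200, 900]
print(p75(lcp_ms))  # 2600
```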
If your field data significantly diverges from your internal measurements, check the geographic distribution and mobile/desktop split in CrUX through the public BigQuery dataset. A discrepancy may reveal that your actual audience significantly differs from the opt-in Chrome sample, limiting the relevance of CrUX for guiding your optimizations.
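The public BigQuery dataset mentioned above exposes monthly tables (`chrome-ux-report.all.YYYYMM`). A sketch that builds a device-split query by summing histogram densities per form factor; the table layout and column names are taken from the dataset's documented schema and should be verified before use, and the resulting SQL would be run with a BigQuery client:

```python
def device_split_sql(origin, yyyymm):
    """Build a BigQuery SQL string estimating the desktop/phone/tablet
    split for an origin in one monthly CrUX table.

    Densities for a single metric's histogram sum to ~1 per origin,
    so grouping by form factor approximates the device share.
    """
    table = f"chrome-ux-report.all.{yyyymm}"
    return f"""
        SELECT
          form_factor.name AS device,
          SUM(bin.density) AS density
        FROM `{table}`,
          UNNEST(first_contentful_paint.histogram.bin) AS bin
        WHERE origin = '{origin}'
        GROUP BY device
        ORDER BY density DESC
    """

# Hypothetical origin and month, for illustration only.
print(device_split_sql("https://example.com", "202401"))
```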
- Check the availability of CrUX data at the URL level for your strategic pages
- Consistently compare lab and field metrics to identify gaps
- Prioritize LCP optimization if resources are limited
- Implement a proprietary RUM solution to cover all browsers
- Monitor Core Web Vitals changes over 28 days to detect regressions
- Test optimizations on devices and connections that represent your real audience
❓ Frequently Asked Questions
Does CrUX data include users browsing in incognito mode?
How long does it take for an optimization to show up in PageSpeed Insights?
Why does my site have CrUX data at the origin level but not at the URL level?
Does CrUX data differ between mobile and desktop?
Can you access CrUX data outside of PageSpeed Insights?
Other SEO insights were extracted from this same Google Search Central video · duration 1h04 · published on 26/01/2018