Official statement
Google confirms that the Core Web Vitals data in Search Console comes exclusively from the Chrome User Experience Report (CrUX), which means it represents a subset of Chrome users who have enabled certain settings. No extrapolation is done for Safari, Firefox, or other browsers. Google considers this data to be sufficiently representative as performance varies little between browsers.
What you need to understand
Where exactly does the Core Web Vitals data displayed in Search Console come from?
The Page Experience and Core Web Vitals metrics visible in Search Console rely exclusively on the Chrome User Experience Report (CrUX). This dataset compiles field data from real Chrome users who have opted in to sharing usage statistics and syncing their browsing history.
In concrete terms, only Chrome users with sync enabled and usage statistics reporting allowed contribute to these metrics. This effectively excludes all Safari, Firefox, and legacy (non-Chromium) Edge users, as well as any Chrome users who have disabled these settings.
Why does Google make no extrapolation for other browsers?
Google assumes that a website's performance remains relatively similar from one browser to another. A site that is fast on Chrome is likely to be fast on Safari and vice versa — even if the rendering engines differ.
This assumption significantly simplifies data collection but raises legitimate questions. On mobile, Safari dominates in certain markets (United States, Western Europe), while Chrome reigns on Android. User behaviors and network conditions can significantly differ between these two ecosystems.
What are the key takeaways for an SEO practitioner?
- Single source: All CWV data in Search Console = CrUX = opt-in Chrome users only
- No extrapolation: Data from Safari, Firefox, or other browsers are never included, even by estimation
- Assumed representativeness: Google considers that Chrome performance sufficiently reflects the overall experience
- Subset of users: Even among Chrome users, only those who have enabled certain settings contribute to the data
- Confirmed SEO impact: This data (partial or not) remains the official basis for assessing Page Experience signals in ranking
SEO Expert opinion
Is this methodology truly representative of the real user experience?
Let’s be honest: limiting data collection to opt-in Chrome users creates a significant methodological bias. On iOS, Safari often represents 60-70% of mobile traffic depending on the sector. Ignoring these users entirely in the CWV equation poses an obvious representativeness problem.
iPhone Safari users generally behave differently: varying 4G/5G connections, different processing power, distinct memory management. A site might display excellent CrUX metrics (Chrome desktop + Android) while offering a poor experience on Safari iOS — and Google will have no direct visibility into this.
[To be verified] The claim that “browsers have similar performance” should be supported by concrete data. Field tests regularly show significant discrepancies between Chrome and Safari, especially on Cumulative Layout Shift and input responsiveness (First Input Delay, since replaced by INP).
When does this limitation become problematic?
This methodology particularly penalizes sites with a predominantly iOS/Safari audience: high-end e-commerce, luxury sector, certain lifestyle media. For these sites, the Search Console data may significantly diverge from the actual experience faced by most visitors.
A second critical point: users who disable Chrome tracking — often the most tech-savvy and demanding — are entirely absent from the statistics. Thus, we are primarily measuring the experience of “average” users or those less aware of privacy.
Should you still prioritize Chrome optimization for SEO?
Yes, pragmatically speaking. Since Google exclusively uses CrUX for its ranking signal, optimizing for Chrome remains the strict SEO priority. It is the only browser whose performance will directly impact your ranking.
But — and this is crucial — this does not absolve you from optimizing the overall experience. A site that is fast on Chrome but slow on Safari will lose conversions, increase its bounce rate, and degrade its business metrics. SEO is just one component of performance.
Practical impact and recommendations
What concrete steps should be taken to optimize the Core Web Vitals measured by Google?
The first step: focus your optimization efforts on Chrome, since it is the only browser that directly influences your ranking. Always test your changes on Chrome desktop and Chrome mobile (Android) before deploying.
Use PageSpeed Insights and Search Console as reference tools to identify problem pages. Their field-data sections draw on CrUX and therefore reflect exactly what Google sees of your site.
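The same CrUX field data behind these tools is also available programmatically through the Chrome UX Report API. As a minimal sketch, here is what a `queryRecord` request body looks like; the endpoint and metric names are real, but sending the request requires an API key created in the Google Cloud console, and `https://example.com` is a placeholder origin:

```python
import json

# Real endpoint of the Chrome UX Report API; calling it requires an API key
# appended as ?key=YOUR_KEY (not shown here).
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(origin: str, form_factor: str = "PHONE") -> dict:
    """Build the JSON body for a CrUX queryRecord request.

    CrUX aggregates field data by origin (or URL) and form factor;
    the response contains histograms and percentiles for each metric.
    """
    return {
        "origin": origin,
        "formFactor": form_factor,  # PHONE, DESKTOP, or TABLET
        "metrics": [
            "largest_contentful_paint",
            "cumulative_layout_shift",
            "interaction_to_next_paint",
        ],
    }

# Request body for the mobile field data Google surfaces in Search Console
body = build_crux_query("https://example.com")
print(json.dumps(body, indent=2))
```

Querying `"formFactor": "PHONE"` is the closest match to the mobile Core Web Vitals report in Search Console, which is usually the stricter of the two.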
How can you avoid neglecting user experience on other browsers?
Implement a cross-browser monitoring strategy alongside your SEO optimization. Use RUM (Real User Monitoring) tools like SpeedCurve, Cloudflare Analytics, or New Relic to capture real performance on Safari, Firefox, and Edge.
Manually test your critical pages on Safari iOS (iPhone) and Safari macOS. These manual tests often reveal layout shift issues or image loading discrepancies that Chrome handles differently.
Segment your Google Analytics analysis by browser and device to identify correlations between performance (as measured by your RUM tools) and business metrics (conversion rate, time on site). A site that is “green” in Search Console may show a catastrophic bounce rate on Safari.
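The browser segmentation described above can be sketched with plain Python. The sample values below are hypothetical RUM beacons (as a tool like SpeedCurve might export them), but the 75th-percentile aggregation mirrors how CrUX scores each metric:

```python
import math
from collections import defaultdict

# Hypothetical RUM samples: (browser, LCP in milliseconds).
samples = [
    ("Chrome", 1800), ("Chrome", 2100), ("Chrome", 2400), ("Chrome", 2300),
    ("Safari", 2900), ("Safari", 3400), ("Safari", 4100), ("Safari", 3800),
]

def p75(values):
    """75th percentile, nearest-rank method (CrUX reports the p75 of each metric)."""
    ordered = sorted(values)
    return ordered[math.ceil(0.75 * len(ordered)) - 1]

# Group LCP samples by browser.
by_browser = defaultdict(list)
for browser, lcp in samples:
    by_browser[browser].append(lcp)

LCP_GOOD_MS = 2500  # Google's "good" threshold for LCP
for browser, values in sorted(by_browser.items()):
    score = p75(values)
    status = "good" if score <= LCP_GOOD_MS else "needs work"
    print(f"{browser}: p75 LCP = {score} ms ({status})")
```

With these sample numbers, the site would look “green” to Google (Chrome p75 under 2.5 s) while Safari users experience a clearly degraded LCP, which is exactly the blind spot the article describes.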
What checklist should you apply to maximize the representativeness of your optimizations?
- Prioritize Chrome optimization for SEO ranking (LCP, CLS, INP measured via CrUX)
- Install a cross-browser RUM tool to capture performance on Safari, Firefox, Edge
- Manually test each major change on Chrome Android, Chrome Desktop, Safari iOS, and Safari macOS
- Segment Google Analytics by browser to detect user behavior gaps
- Never rely solely on Search Console data — consistently compare with your own real-world metrics
- Specifically audit high-traffic Safari pages if your iOS audience exceeds 40%
- Monitor Core Web Vitals after every deployment for at least 28 days (CrUX collection period)
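To apply the checklist consistently, it helps to encode Google's published thresholds for each Core Web Vital. A small sketch, using the good/needs-improvement/poor boundaries Google documents (LCP and INP in milliseconds, CLS unitless):

```python
# Official CWV rating boundaries: (upper bound of "good", upper bound of
# "needs improvement"); anything above the second value rates "poor".
THRESHOLDS = {
    "LCP": (2500, 4000),
    "INP": (200, 500),
    "CLS": (0.1, 0.25),
}

def rate(metric: str, value: float) -> str:
    """Classify a p75 field value the way PageSpeed Insights labels it."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

for metric, value in [("LCP", 2300), ("INP", 350), ("CLS", 0.31)]:
    print(f"{metric} = {value} -> {rate(metric, value)}")
```

Running this against the p75 values from your own RUM tool, per browser, gives you the cross-browser view that Search Console cannot.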
❓ Frequently Asked Questions
Do the Core Web Vitals data in Search Console include Safari users?
If my traffic is mostly on Safari, is the Search Console data reliable?
Will Google ever extrapolate data from other browsers?
Can a site be green in Search Console yet slow on Safari?
Should I optimize only for Chrome, since that is what Google measures?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · published on 31/12/2021