Official statement
Other statements from this video
- 1:02 Does Google really render all JavaScript pages, regardless of their architecture?
- 1:02 Does Google really render ALL JavaScript, even without initial server-side content?
- 2:05 How can you verify that Googlebot is really crawling your site?
- 2:05 How can you verify that Googlebot really is Googlebot and not an impostor?
- 2:36 Does Google really limit CPU time during JavaScript rendering?
- 3:09 Should you stop optimizing for bots and focus solely on the user?
- 5:17 Does the CSS content-visibility property affect rendering in Google?
- 11:00 How long does Google really wait before giving up on JavaScript rendering?
- 11:00 How long does Googlebot really wait for JavaScript rendering?
- 20:07 Why does Google show blank pages even though your JavaScript site works perfectly?
- 20:07 AJAX works for SEO, but should you really use it?
- 21:10 Can blocking JavaScript really prevent Google from indexing all of your pages' content?
- 24:48 Has dynamic prerendering become a trap for indexing?
- 26:25 Why can removed resources destroy your indexing when prerendering?
- 26:47 What does Google really do with your initial HTML before JavaScript rendering?
- 27:28 Does Google really analyze everything in the initial HTML before rendering?
- 27:59 Why does Google skip JavaScript rendering when your noindex tag appears in the initial HTML?
- 27:59 Why can a JavaScript-driven 404 page get your whole site deindexed?
- 28:30 Why does Google refuse to render JavaScript if the initial HTML contains a meta noindex?
- 30:00 Does Google really compare the initial AND rendered HTML for canonicalization?
- 30:01 Does Google really detect duplicate content after JavaScript rendering?
- 31:36 Are GET APIs really cached by Google like other resources?
- 31:36 Does Google really cache POST requests during JavaScript rendering?
- 34:47 Does Google really index every page after JavaScript rendering?
- 35:19 Does Google really render 100% of JavaScript pages before indexing?
- 36:51 Why do failing APIs sabotage your Google indexing?
- 37:12 Is structured data on noindex pages really lost to Google?
Martin Splitt recommends using browser developer tools to collect lab data on Firefox and Safari, which do not natively support the Core Web Vitals APIs. The focus should be on monitoring frame timings and aiming for 60 fps; a drop below that rate is a reliable indicator of GPU or main-thread blocking. This pragmatic approach helps fill the data gap for roughly 30% of global browser traffic.
What you need to understand
Why do some browsers still refuse to ship the Core Web Vitals APIs?
Firefox and Safari have yet to natively implement the APIs that measure LCP, FID, and CLS on the client side. The stance is less technical than political: both vendors consider these metrics to favor the Google ecosystem.
In concrete terms, if you rely solely on traditional JavaScript RUM (Real User Monitoring), you miss out on 25 to 35% of your actual user data depending on your audience. This represents a significant analytical black hole considering that Safari dominates mobile in the US with nearly 50% market share.
What exactly are frame timings and why 60 fps?
Frame timings measure how long the browser takes to compute and paint each frame. The 60 fps target corresponds to one frame every 16.67 ms (1000 ms / 60), the budget under which motion appears perfectly smooth to the human eye.
When your main thread is blocked by heavy JavaScript or the GPU struggles with complex CSS animations, the frame rate drops. This signal is much more direct than attempting to reconstruct an equivalent CLS or FID on these browsers. You measure the direct impact felt by the user.
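To make this concrete, frame rate can be approximated in any browser with a plain requestAnimationFrame loop, with no CWV API involved. The following sketch is illustrative and not taken from the video:

```typescript
// Minimal sketch: approximate FPS by counting requestAnimationFrame
// callbacks, which fire once per rendered frame. Works on Firefox and
// Safari since no Core Web Vitals API is involved.
function sampleFps(windowMs = 1000): Promise<number> {
  return new Promise((resolve) => {
    let frames = 0;
    const start = performance.now();
    const tick = () => {
      frames++;
      const elapsed = performance.now() - start;
      if (elapsed < windowMs) {
        requestAnimationFrame(tick);
      } else {
        resolve((frames * 1000) / elapsed); // frames per second
      }
    };
    requestAnimationFrame(tick);
  });
}

// Example: warn when the page drops well below the 60 fps target.
sampleFps().then((fps) => {
  if (fps < 50) console.warn(`Perceived smoothness at risk: ${fps.toFixed(1)} fps`);
});
```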
Are lab data reliable enough to replace RUM?
No, and Martin Splitt is not claiming they are. Lab data (DevTools, Lighthouse) provide a technical baseline: does your site hold up under controlled conditions? But they do not reproduce real-world conditions: unstable 3G connections, low-end mobile CPUs, browser extensions.
The recommended approach here is a tactical compromise: since it is not possible to measure native CWV on Firefox/Safari in RUM, use DevTools to identify structural bottlenecks. Then cross-reference with your Chrome RUM data to extrapolate. It’s not ideal, but it’s better than nothing.
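For the Chrome-side RUM half of that cross-referencing, a minimal sketch using the open-source web-vitals package could look like the snippet below (the package is not mentioned in the video, and the /rum endpoint is a hypothetical collector):

```typescript
// Sketch of a Chrome RUM baseline with the `web-vitals` package
// (npm i web-vitals). On Firefox/Safari these callbacks simply never
// fire because the underlying APIs are missing; that is exactly the
// gap frame-timing measurements are meant to fill.
import { onLCP, onCLS } from 'web-vitals';

function report(metric: { name: string; value: number }) {
  // `/rum` is a hypothetical collection endpoint.
  navigator.sendBeacon('/rum', JSON.stringify({ name: metric.name, value: metric.value }));
}

onLCP(report);
onCLS(report); // FID/INP handlers are wired up the same way
```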
- Firefox and Safari represent ~30% of global web traffic, which cannot be ignored in SEO
- Frame timings provide a reliable proxy for perceived smoothness, even without CWV APIs
- Lab data complement RUM but do not replace it — they can identify structural issues, not actual user friction
- Systematically monitor performance across all browsers, not just Chrome, to avoid critical blind spots
SEO Expert opinion
Is this recommendation truly actionable for an SEO practitioner?
Let's be honest: Martin Splitt is primarily addressing front-end developers here, not SEOs. Opening Firefox's DevTools to analyze frame timings is not a daily task for most SEO professionals, even senior ones.
The advice remains valid in principle — frame timings are a great indicator of perceived performance — but it lacks a layer of tooling. How many SEOs know how to interpret a timeline profiler, or can tell that a drop to 45 fps stems from CSS reflow rather than a third-party script? [To be verified]: the lack of guidance on automation tooling (Selenium scripts, CI/CD with Lighthouse CI) limits practical applicability.
Is targeting 60 fps consistent with Core Web Vitals thresholds?
Not directly. The official CWV metrics do not measure frame rate: LCP targets 2.5 s, FID 100 ms, CLS 0.1. Aiming for 60 fps is a heuristic for overall smoothness; it is relevant to user experience but does not mechanically translate into a good CWV score.
For instance, you could have a site consistently at 60 fps but a catastrophic LCP of 4s because your hero image is 3 MB. Conversely, a site with excellent LCP can experience micro-stutters at 45 fps while scrolling, which can negatively affect user perception without impacting official metrics. The two approaches are complementary, not interchangeable.
What is the limit of this approach on mobile Safari?
Mobile Safari imposes specific constraints: aggressive background CPU throttling, strict memory management, historical CSS bugs (notably on transform animations). Measuring in a lab on a MacBook Pro does not reflect what happens on an iPhone 12 in real-world conditions.
Moreover, Safari blocks many profiling APIs for privacy reasons (a limited User Timing API, no Long Tasks API). As a result, even if you want to apply Splitt's recommendation, you run into hard technical limits. The only real solution remains testing on real devices via BrowserStack or an equivalent service, which adds complexity and cost.
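To see those limits concretely, you can feature-detect which performance entry types a browser exposes with the standard PerformanceObserver API. This snippet is illustrative; the entry types checked are the ones CWV-style RUM scripts depend on:

```typescript
// Sketch: feature-detect performance entry types before relying on them.
// On Safari, 'longtask' (and historically 'largest-contentful-paint')
// is absent from this list, so a CWV-style RUM script has nothing to observe.
const supported: readonly string[] = PerformanceObserver.supportedEntryTypes ?? [];

for (const type of ['longtask', 'largest-contentful-paint', 'layout-shift']) {
  console.log(`${type}: ${supported.includes(type) ? 'supported' : 'not supported'}`);
}
```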
Practical impact and recommendations
How can you effectively collect frame timings without native APIs?
The manual method via DevTools works for occasional debugging, not for continuous monitoring. Open the Performance tab in Firefox or Safari DevTools, record a typical browsing session (scrolling, clicks, interactions), then scan the flame chart for frames that exceed 16.67 ms.
To automate, note that Lighthouse itself only runs on Chromium; to cover Firefox, drive the browser with Playwright (or Puppeteer, which has shipped Firefox support) and collect frame timings yourself in CI/CD tests that run across multiple rendering engines. Tools like SpeedCurve or Calibre offer cross-browser monitoring, but at a non-negligible monthly cost.
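A minimal sketch of such a CI check, assuming Playwright is installed (npm i -D playwright) and reusing the requestAnimationFrame-counting idea from the earlier snippet; the URL and FPS threshold are placeholders:

```typescript
// Launch Firefox headlessly, load a page, and sample the achieved
// frame rate for five seconds while the page idles.
import { firefox } from 'playwright';

async function checkFps(url: string, minFps = 50): Promise<void> {
  const browser = await firefox.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'load' });

  const fps = await page.evaluate(
    () =>
      new Promise<number>((resolve) => {
        let frames = 0;
        const start = performance.now();
        const tick = () => {
          frames++;
          const elapsed = performance.now() - start;
          if (elapsed < 5000) {
            requestAnimationFrame(tick);
          } else {
            resolve((frames * 1000) / elapsed);
          }
        };
        requestAnimationFrame(tick);
      })
  );

  await browser.close();
  if (fps < minFps) throw new Error(`FPS below threshold: ${fps.toFixed(1)} < ${minFps}`);
  console.log(`OK: ${fps.toFixed(1)} fps`);
}

checkFps('https://example.com').catch((err) => {
  console.error(err);
  process.exit(1); // fail the CI job
});
```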
What are the typical bottlenecks that interrupt 60 fps?
Forced reflows (layout thrashing) top the list: reading then modifying a geometric property in a JavaScript loop forces the browser to recalculate the layout on every iteration. A classic example: animating an element's height with JavaScript instead of using transform: scaleY().
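A hypothetical before/after of this pattern (the .item selector and the height adjustment are illustrative):

```typescript
// Layout thrashing: each iteration reads a geometric property right
// after a style write, forcing a synchronous reflow every time.
const items = Array.from(document.querySelectorAll<HTMLElement>('.item'));

// Bad: interleaved read/write, one forced reflow per iteration.
for (const el of items) {
  const h = el.offsetHeight;         // read: forces layout if styles are dirty
  el.style.height = `${h + 10}px`;   // write: invalidates layout again
}

// Better: batch all reads, then all writes, for a single reflow.
const heights = items.map((el) => el.offsetHeight);
items.forEach((el, i) => {
  el.style.height = `${heights[i] + 10}px`;
});
```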
Non-composited CSS animations are another pitfall: animating width, height, top, or left triggers costly repaints. Stick to transform and opacity, the only properties reliably GPU-accelerated across all browsers (a compositor-friendly sketch follows below). Finally, third-party scripts (analytics, ads, chat) frequently block the main thread; audit them ruthlessly.
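Here is that sketch, using the standard Web Animations API (the .box selector and keyframes are illustrative):

```typescript
// Animating only transform and opacity lets the compositor run the
// animation off the main thread, keeping the frame rate stable even
// when JavaScript is busy.
const box = document.querySelector<HTMLElement>('.box');

box?.animate(
  [
    { transform: 'scaleY(0)', opacity: 0 },
    { transform: 'scaleY(1)', opacity: 1 },
  ],
  { duration: 300, easing: 'ease-out', fill: 'forwards' }
);
```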
Is it really necessary to optimize for Firefox and Safari if Google does not crawl with them?
Google crawls and indexes using Chromium/Blink; that's a fact. But the Core Web Vitals used for ranking come from the Chrome User Experience Report (CrUX), which aggregates real user data from Chrome only. So technically, a site that is slow on Safari does not directly hurt your SEO.
However, and this is where it becomes critical, a sluggish site on Safari leads to bounces, lower engagement, and fewer conversions. These indirect behavioral signals influence your ranking through other mechanisms (organic CTR, return-to-SERP rates, potentially dwell time). Ignoring 30% of your audience to squeeze out 2 extra CWV points on Chrome is a short-sighted strategy.
- Set up automated performance tests on Firefox and WebKit (Safari's engine) using Playwright in your CI/CD pipeline; Lighthouse itself only audits Chromium
- Systematically audit CSS animations: use only transform and opacity to ensure cross-browser smoothness
- Deploy paid cross-browser monitoring (SpeedCurve, Calibre, WebPageTest) if your Safari/Firefox traffic exceeds 20%
- Manually measure frame timings via DevTools on real devices (iPhone, iPad) at least quarterly
- Cross-reference your CrUX data (Chrome RUM) with your behavioral analytics (GA4) to identify cross-browser performance gaps
- Prioritize optimizations that benefit all engines: image compression, lazy loading, JS reduction, aggressive caching
❓ Frequently Asked Questions
Can you measure LCP and CLS on Firefox with custom JavaScript?
Do frame timings directly influence Google rankings?
Should you optimize differently for Safari on mobile vs desktop?
At what share of Firefox/Safari traffic does this become critical?
Do PageSpeed Insights data include Firefox and Safari?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 46 min · published on 25/11/2020