Official statement
Google recommends three methods to track down the elements responsible for a poor CLS: manually blocking requests in Chrome DevTools, automating tests with Puppeteer, or consulting the Lighthouse and PageSpeed Insights reports, which now display the offending elements directly. For SEOs, this means we can finally move from a global diagnosis to surgical fixes for visual-stability issues. The manual elimination approach remains time-consuming, but automation via Puppeteer is the real path to auditing sites at scale.
What you need to understand
Why does CLS pose a complex diagnostic problem?
The Cumulative Layout Shift (CLS) measures the visual instability of a page, but its calculation aggregates all shifts without pointing to a specific culprit. A global score of 0.25 or 0.30 says nothing about the cause: a lazy-loaded image without dimensions, a late ad banner, or a web font swapping in and abruptly reflowing the text (FOUT/FOIT).
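For intuition, each individual shift is scored as impact fraction × distance fraction (the published web.dev definition), and the page-level CLS aggregates those scores. A minimal illustrative helper — the impact region is simplified here, since browsers actually compute it as the union of the element's before and after positions:

```javascript
// Illustrative only: the score of a single layout shift, following the
// published definition (impact fraction × distance fraction). The impacted
// area is passed in directly rather than derived from element rects.
function shiftScore(impactedArea, viewportW, viewportH, moveDistance) {
  // Share of the viewport affected by the shift.
  const impactFraction = impactedArea / (viewportW * viewportH);
  // Distance moved, relative to the viewport's largest dimension.
  const distanceFraction = moveDistance / Math.max(viewportW, viewportH);
  return impactFraction * distanceFraction;
}

// A block covering the whole 412×640 viewport that moves 64px down:
// impact fraction = 1, distance fraction = 64/640 = 0.1 → score 0.1
```

This is why a global score is opaque: a 0.25 total can come from one large shift or from a dozen small ones spread across the load.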
This opacity makes correction haphazard. We optimize by guesswork, correct an element, rerun a test, and find the score barely budges. Martin Splitt implicitly acknowledges this problem by proposing isolation methods through elimination.
What exactly does blocking requests in Chrome DevTools provide?
The Network tab of Chrome DevTools allows for manual blocking of resources — third-party scripts, fonts, images — and re-running Core Web Vitals metrics via the Performance tab or directly through Lighthouse. By blocking suspicious requests one by one, we can observe how CLS evolves.
Concretely? If you block your ad network script and your CLS drops from 0.28 to 0.05, you’ve found your culprit. It’s artisanal, but it works. The problem is that this approach does not scale: auditing 50 different templates with 15 types of requests per page becomes a logistical nightmare.
Can Puppeteer really automate this diagnosis at scale?
Puppeteer, the Node.js library maintained by the Chrome team for driving headless Chrome, lets you automate this process: load a page, programmatically block resources via the Chrome DevTools Protocol, rerun the metrics, and log the results. A script can test every combination in a single overnight run.
Splitt mentions this avenue without detailing — typical. For a technical SEO, this assumes proficiency in Node.js, the Chrome DevTools protocol, and collecting CrUX or Lighthouse metrics via API. It’s powerful, but it requires a real investment in development. Agencies proficient in Puppeteer have a distinct competitive advantage in Core Web Vitals audits.
- The CLS aggregates all visual shifts without pointing to responsible elements
- Manual request blocking in Chrome DevTools allows for isolating culprits through elimination
- Puppeteer offers powerful automation for testing resource combinations at scale
- Lighthouse and PageSpeed Insights now directly display the affected elements, reducing the need for manual methods
- The approach remains time-consuming without automated tooling
SEO Expert opinion
Does this statement truly reflect the state of the art in CLS diagnostics?
Let’s be honest: by presenting manual request blocking as a viable solution, Splitt describes a band-aid method that every technical SEO has been practicing for months. It’s not a revelation; it’s a post facto validation of field practices. The real addition is the mention of Puppeteer — but without a script example, without documentation, without a guide. Typical of Google: pointing a direction without providing the map.
The official tools — Lighthouse, PageSpeed Insights — have indeed evolved to display the offending elements, but their accuracy varies with the loading context. A test on a Fast 4G connection won't surface the same culprits as a Slow 3G test, and it remains to be verified how well these reports capture edge cases where CLS only spikes under certain network or device conditions.
What limitations does this approach present in real production?
Blocking requests assumes that you already know the suspects. If your CLS comes from poorly configured lazy-loading on images dynamically injected by a JS framework, you won’t see anything in the Network tab until the script executes. And if the problem arises from a layout recalculation triggered by a client-side resize event? No requests to block.
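For precisely those request-less shifts, the Layout Instability API can still name the culprit in the browser: each layout-shift entry exposes a sources array pointing at the DOM nodes that moved. A minimal sketch — the observer function is meant to run in the page (for instance pasted into the DevTools console), while the aggregation helper is plain JavaScript:

```javascript
// Pure aggregation, testable anywhere: CLS-style sum of shifts that were
// not caused by recent user input.
function totalCls(entries) {
  return entries
    .filter((e) => !e.hadRecentInput)
    .reduce((sum, e) => sum + e.value, 0);
}

// Browser-side wiring: log each shift with the elements that caused it,
// whether or not a network request was involved. Run this in the page.
function observeShifts() {
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      const nodes = (entry.sources || []).map((s) => s.node);
      console.log('layout-shift', entry.value, nodes);
    }
  }).observe({ type: 'layout-shift', buffered: true });
}
```

The `buffered: true` flag matters: it replays shifts that happened before the observer was registered, so you also catch early load-time movement.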
Puppeteer solves part of the problem but introduces a technical complexity that excludes the majority of SEOs. And that's where it fails: Google encourages a systems engineering approach to solve a marketing problem. Agencies without dev resources in-house are left stuck with manual audits or paid third-party tools like DebugBear or Treo.
In what cases is this method insufficient?
Shifts caused by user interactions — a cookie banner that deploys on scroll, a sticky header that retracts — will not be captured by a standard Lighthouse audit that simulates an initial load. You need to script interaction scenarios in Puppeteer, which multiplies the complexity tenfold.
And what about SPA sites where CLS degrades after a client-side navigation? Metrics come in via CrUX on actual traffic, but reproducing those conditions in automated audits requires fine orchestration of Puppeteer with complete user flows. Splitt mentions none of this — because the on-the-ground reality is much messier than the slide deck.
Practical impact and recommendations
What concrete steps should be taken to diagnose high CLS?
Start with an audit using PageSpeed Insights or Lighthouse on your critical templates: homepage, product pages, articles. Note the elements flagged as responsible — often images, ad iframes, or content blocks injected by JavaScript. Make sure these elements have explicit dimensions in HTML or CSS before they load.
If the tool does not point anything clear, switch to manual mode: open Chrome DevTools, Network tab, block third-party requests one by one (Google Ads, Analytics, Google Fonts), rerun a Lighthouse audit. Time-consuming, but it often reveals the culprit in 10-15 minutes.
How to automate this diagnosis using Puppeteer?
If you have Node.js skills, set up a Puppeteer script that loads your page, programmatically blocks categories of resources (third-party scripts, images, fonts), collects metrics via Chrome DevTools Protocol, and logs the results in a CSV. You can then correlate the CLS delta with each type of blocked resource.
Specifically, use page.setRequestInterception(true) to intercept requests, filter by domain or MIME type, block with request.abort(), and then collect metrics using the Performance.getMetrics() method. It’s technical, but it allows auditing 500 URLs overnight. If you lack this skill in-house, calling in a specialized SEO agency that masters these tools can save you weeks of trial and error and ensure a comprehensive diagnosis.
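A minimal sketch of that audit loop, assuming Puppeteer is installed (`npm i puppeteer`); the CLS reading uses a PerformanceObserver injected into the page, and the URL and blocklist patterns in the usage example are placeholders to adapt:

```javascript
// Pure helper: decide whether a request matches the category being blocked.
function shouldBlock(url, patterns) {
  return patterns.some((p) => url.includes(p));
}

// Load a page with some resources blocked and read its CLS via a
// PerformanceObserver injected into the page.
async function auditCls(url, blockedPatterns) {
  const puppeteer = require('puppeteer'); // loaded lazily; npm i puppeteer
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.setRequestInterception(true);
    page.on('request', (req) =>
      shouldBlock(req.url(), blockedPatterns) ? req.abort() : req.continue()
    );
    await page.goto(url, { waitUntil: 'networkidle0' });
    return await page.evaluate(
      () =>
        new Promise((resolve) => {
          let cls = 0;
          new PerformanceObserver((list) => {
            for (const entry of list.getEntries()) {
              if (!entry.hadRecentInput) cls += entry.value;
            }
          }).observe({ type: 'layout-shift', buffered: true });
          // Leave a short window for late shifts before reporting.
          setTimeout(() => resolve(cls), 3000);
        })
    );
  } finally {
    await browser.close();
  }
}

// Usage (placeholder URL and patterns): compare baseline vs. ads-blocked CLS.
// const base = await auditCls('https://example.com/', []);
// const noAds = await auditCls('https://example.com/', ['doubleclick', 'googlesyndication']);
// console.log('CLS delta attributable to ads:', base - noAds);
```

Loop this over your URL list and resource categories, write each delta to a CSV, and the overnight batch run described above falls out naturally.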
What mistakes should be avoided in correcting CLS?
Don’t correct CLS by masking the symptoms: forcing a min-height on all blocks to reserve space does not solve anything if the final element is larger or smaller. You create unnecessary gaps or equally burdensome layout recalculations. Size each element precisely according to its actual size.
Another pitfall: correcting CLS in lab (Lighthouse) without checking field impact (CrUX). A site may show a CLS of 0.05 under simulated conditions and explode to 0.30 under real traffic due to variations in dynamic content or non-deterministic ads. Always cross-check lab and field before validating a correction.
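For the field side of that cross-check, the public CrUX API exposes the p75 CLS measured on real Chrome users. A sketch, assuming Node 18+ for the global fetch; the API key is a placeholder and the response shape follows the CrUX queryRecord format:

```javascript
// Pure helper: extract the field p75 CLS from a CrUX queryRecord response.
// CrUX returns CLS percentiles as strings, e.g. "0.30".
function fieldClsP75(cruxResponse) {
  const metric = cruxResponse.record.metrics.cumulative_layout_shift;
  return parseFloat(metric.percentiles.p75);
}

// Fetch wiring (Node 18+ global fetch; YOUR_API_KEY is a placeholder).
async function queryCrux(url, apiKey) {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${apiKey}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ url, metrics: ['cumulative_layout_shift'] }),
    }
  );
  return res.json();
}

// Usage (placeholders):
// const data = await queryCrux('https://example.com/', 'YOUR_API_KEY');
// console.log('field p75 CLS:', fieldClsP75(data));
```

Comparing this field p75 against your lab score before and after a fix is what actually validates the correction.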
- Audit critical templates with PageSpeed Insights and note displayed responsible elements
- Ensure all visual elements have explicit dimensions (width/height in HTML or aspect-ratio in CSS)
- Manually block third-party requests in Chrome DevTools to isolate culprits
- Automate diagnosis with Puppeteer if you have the necessary dev resources
- Validate corrections in field (CrUX) and not just in lab (Lighthouse)
- Monitor CLS post-deployment for at least 4 weeks to capture seasonal variations
❓ Frequently Asked Questions
Is blocking requests in Chrome DevTools enough to diagnose every CLS problem?
Is Puppeteer essential for auditing CLS at scale?
Do Lighthouse and PageSpeed Insights always flag the right elements responsible for CLS?
Can CLS be fixed simply by reserving space with min-height on every block?
Does lab-measured CLS always reflect the real CLS experienced by users?
Source: Google Search Central video · duration 28 min · published on 01/07/2020