Official statement
Other statements from this video
- 0:33 Does JavaScript pagination really pose a problem for Google?
- 1:36 Should you really fix every 404 error reported in Search Console?
- 4:04 Is server-side rendering really the miracle solution for JavaScript SEO?
- 5:16 Do JavaScript charts create duplicate content on your pages?
- 5:49 Should you really bundle your JavaScript files to preserve your crawl budget?
- 5:49 Why can fixing the CSS dimensions of your charts save your Core Web Vitals?
- 7:00 Can geolocated JavaScript redirects really be crawled without risk?
- 11:30 Should you really worry about corrupted titles in the site: operator?
- 12:35 Should you really server-side render your metadata?
- 14:42 Should you really avoid CDNs for your API calls?
- 16:50 Should you really limit the number of client-side API calls to improve SEO?
- 30:33 Should you really treat Googlebot as a user with accessibility needs?
- 31:59 Should SEO visibility be treated as a technical requirement on a par with performance?
Google states that third-party scripts like Analytics slow down pages and should be loaded last with the defer attribute, after critical JavaScript. This recommendation implies accepting a potential loss of tracking data for the sake of performance. Ultimately, you are trading off the quality of your analytics metrics against user experience — a dilemma that few professionals correctly anticipate.
What you need to understand
Why does Google specifically target third-party scripts?
Third-party scripts represent an external computational burden that you do not control. Unlike the JavaScript you develop in-house, these scripts — Google Analytics, Yandex Metrica, Facebook Pixel, Hotjar — come from remote servers, add extra network calls, and execute code that may block the main rendering.
The issue is not so much file size as the timing of execution. An analytics script that fires before the visible content is displayed delays user interaction. Google measures this through Core Web Vitals, particularly LCP and FID, two metrics directly penalized by the premature execution of non-essential scripts.
What does it really mean to load a script 'as late as possible'?
Martin Splitt is referring here to execution prioritization. The defer attribute allows the browser to download the script in parallel without blocking HTML parsing, then only execute it after the DOM is fully constructed. This is different from async, which executes as soon as the download is complete — potentially in the middle of rendering.
But Splitt goes further: he suggests loading these scripts after the vital JavaScript, which means the JavaScript that generates the visible content or allows interaction. Technically, this may involve dynamic loading via JavaScript — injecting the script tag only after an event like DOMContentLoaded or window.load.
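The two mechanisms described above can be sketched as follows. The script URL is a placeholder, and the choice between DOMContentLoaded and window.load depends on how late you can afford to run the tracker:

```html
<!-- defer: downloads in parallel, executes only after HTML parsing is done -->
<script src="https://example.com/analytics.js" defer></script>

<!-- Later still: inject the tag only once the full page (including images) has loaded -->
<script>
  window.addEventListener('load', function () {
    var s = document.createElement('script');
    s.src = 'https://example.com/analytics.js';
    document.body.appendChild(s);
  });
</script>
```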
What is the trade-off that Google describes?
Google explicitly admits that this strategy results in a potential loss of data. If a user leaves the page before the analytics script initializes, their visit will not be recorded. If someone clicks on an external link before tracking is active, that event disappears.
This is a conscious trade-off: losing a few percentage points of analytics accuracy in exchange for gaining tenths of a second in loading time. For Google, user experience outweighs the completeness of tracking — a position that may clash with marketing teams accustomed to comprehensive dashboards.
- Third-party scripts slow down rendering and interactivity, not just raw loading time
- The defer attribute is a minimum — deferred loading post-DOMContentLoaded is preferable for non-critical scripts
- Accepting a loss of 1 to 5% of tracking data is the price to pay for optimizing Core Web Vitals
- This recommendation applies to all third-party scripts, not just Analytics: ad pixels, chatbots, social widgets
- Google prioritizes measurable user experience over the quality of internal metrics
SEO Expert opinion
Is this statement consistent with what we observe in the field?
Yes, and it’s actually one of the rare occasions where Google's public position aligns perfectly with observations. PageSpeed Insights audits consistently point to third-party scripts as responsible for degrading Total Blocking Time and FID. Sites that have delayed loading Analytics or Tag Manager see measurable gains in LCP — often between 0.3 and 0.8 seconds.
What's interesting is that Splitt articulates a compromise that Google does not always make explicit: performance comes at a cost in terms of data. It is rare for a Googler to openly admit that an SEO recommendation can degrade something else — here, the quality of marketing tracking.
What nuances should be added to this recommendation?
Not all third-party scripts are created equal. An analytics script that passively records pageviews does not have the same critical impact as a content personalization tool or an A/B testing system that modifies the DOM. If your third-party JavaScript affects visible content, pushing it back after rendering can create a flash of unpersonalized content — which degrades the user experience.
Another point: data loss is not uniform across audiences. On mobile with unstable connections, a user might abandon the page before the script loads. On desktop with fiber, the impact is nearly zero. [To verify]: Google does not provide any metrics on the actual percentage of loss — is it 1%, 5%, 10%? That depends on traffic profile, but the lack of precise numbers makes the decision tricky for sites with high analytical stakes.
In which cases does this rule not apply?
If your business model relies on the accuracy of advertising tracking — for instance, an affiliate site where every click must be measured for commission — sacrificing data could cost more than gaining a few performance points. Similarly, fraud detection tools or legal compliance (GDPR, consent) cannot be deferred without legal risk.
E-commerce sites with critical remarketing pixels must weigh the pros and cons. Losing 3% of Facebook tracking could mean losing 3% of the conversions attributed to retargeting. In these cases, a hybrid approach, loading the pixel with defer but manually triggering critical events, may be necessary.
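A hedged sketch of that hybrid approach: buffer critical events while the pixel script is still deferred, then replay them once it initializes. This mirrors the stub pattern that fbq and gtag use; the function names (trackCritical, onPixelLoaded) are illustrative, not a real pixel API:

```javascript
let pixelQueue = [];
let sendEvent = null; // set once the deferred pixel script has loaded

function trackCritical(event, data) {
  if (sendEvent) {
    sendEvent(event, data);          // pixel is live: send immediately
  } else {
    pixelQueue.push([event, data]);  // pixel not yet loaded: buffer the event
  }
}

// The deferred pixel script calls this when it finishes initializing.
function onPixelLoaded(realSend) {
  sendEvent = realSend;
  pixelQueue.forEach(([event, data]) => sendEvent(event, data)); // replay buffer
  pixelQueue = [];
}
```

With this pattern, a click that happens before the pixel loads is not lost; it is simply delivered a few hundred milliseconds later.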
Practical impact and recommendations
What concrete steps should you take to apply this recommendation?
Start by identifying all your third-party scripts: open DevTools, go to the Network tab, filter by JS, and locate the external domains (google-analytics.com, facebook.net, yandex.ru, etc.). Classify them by criticality: does the script affect visible content, or does it only measure events?
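The same inventory can be automated. The sketch below separates first-party from third-party script hosts; in the browser you would feed it the URLs returned by performance.getEntriesByType('resource'), and the function name is illustrative:

```javascript
// Classify resource URLs as third-party relative to your own host.
function thirdPartyDomains(resourceUrls, ownHost) {
  const domains = new Set();
  for (const url of resourceUrls) {
    const host = new URL(url).hostname;
    // Treat the site's own host and its subdomains as first-party
    if (host !== ownHost && !host.endsWith('.' + ownHost)) {
      domains.add(host);
    }
  }
  return [...domains].sort();
}
```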
For purely analytics scripts, add the defer attribute to the script tag. Better yet, inject them dynamically after the DOMContentLoaded or window.load event using JavaScript. Basic example: document.addEventListener('DOMContentLoaded', function() { var script = document.createElement('script'); script.src = 'analytics.js'; document.body.appendChild(script); }).
What errors should be avoided during implementation?
Do not confuse defer and async. Async loads and executes as quickly as possible, without respecting the order of scripts. If your Tag Manager relies on a library loaded before it, async may break dependencies. Defer ensures execution in the order they appear in the HTML.
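A minimal illustration of the ordering difference, with placeholder paths:

```html
<!-- defer preserves document order: library.js always runs before tag-manager.js -->
<script src="/js/library.js" defer></script>
<script src="/js/tag-manager.js" defer></script>

<!-- async executes in download-completion order: tag-manager.js may run first
     and fail if it depends on library.js -->
<script src="/js/library.js" async></script>
<script src="/js/tag-manager.js" async></script>
```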
Another trap: deferring a script that initializes a GDPR consent manager may put you in violation. If your cookie banner needs to block other scripts before consent, it must load early — even at the cost of slightly degraded performance. The same goes for anti-fraud scripts on payment forms.
How can you verify that the optimization is effective without breaking tracking?
Use PageSpeed Insights or Lighthouse to measure Total Blocking Time before and after changes. A deferred analytics script should reduce TBT by 100 to 300 ms on mobile. Also check the LCP: if a third-party script was blocking the rendering of a hero image, deferring it can improve LCP by several tenths of a second.
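Alongside lab tools, you can observe LCP candidates directly in the browser. This uses the standard PerformanceObserver API; the console logging is only a sketch of what you might record:

```html
<script>
  // Log each LCP candidate so you can compare values before and after
  // deferring third-party scripts (browser-only API).
  new PerformanceObserver(function (list) {
    var entries = list.getEntries();
    var last = entries[entries.length - 1];
    console.log('LCP candidate:', last.startTime, 'ms');
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>
```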
On the tracking side, compare the Analytics volumes over a week before/after. A decrease of 1-2% in sessions is acceptable and consistent with Splitt's statement. Beyond 5%, there's likely a problem with implementation — ensure that the script loads properly, even later, and that no ad blocker is preventing it from executing.
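The before/after check described above can be expressed as a small helper. The thresholds follow the article's rule of thumb (1-2% acceptable, above 5% suspect); the function and status names are illustrative:

```javascript
// Flag an abnormal session drop after deferring analytics scripts.
function trackingLossStatus(sessionsBefore, sessionsAfter) {
  const lossPct = ((sessionsBefore - sessionsAfter) / sessionsBefore) * 100;
  if (lossPct <= 2) return { lossPct, status: 'acceptable' };   // expected trade-off
  if (lossPct <= 5) return { lossPct, status: 'monitor' };      // watch closely
  return { lossPct, status: 'investigate' };                    // likely a bug
}
```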
- List all third-party scripts loaded on your key pages (home, category, product sheet)
- Add defer to pure analytics scripts (Google Analytics, Yandex Metrica, Matomo)
- Dynamically inject ad pixels after DOMContentLoaded if possible
- Exclude critical scripts from this optimization: GDPR consent, anti-fraud, content personalization
- Measure TBT and LCP before/after using Lighthouse or WebPageTest
- Monitor analytics metrics for 7-10 days to detect any abnormal data loss
❓ Frequently Asked Questions
Is the defer attribute enough, or should scripts be injected dynamically after load?
What percentage of analytics data loss is acceptable according to Google?
Can deferring Google Tag Manager break e-commerce tracking?
Should GDPR consent scripts also be deferred?
Does this optimization directly improve rankings in the SERPs?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 36 min · published on 30/10/2020