Official statement
Other statements from this video (11)
- 1:01 Do you really need to contact the AdSense team to solve your PageSpeed performance issues?
- 1:01 Should you really delay AdSense JavaScript to boost your SEO?
- 2:35 Why does Google refuse to disclose the dimensions of Googlebot's viewport?
- 3:07 How does Googlebot actually handle content at the bottom of the page?
- 3:38 Should you abandon infinite scroll to be indexed correctly by Google?
- 6:24 Why does Googlebot use a 10,000-pixel viewport?
- 9:23 Why does Google refuse to index viewport-dependent content?
- 10:11 Why does Google set its crawler's viewport width to 1024 pixels?
- 12:38 Do no-archive meta tags in JavaScript actually work?
- 14:24 Does Google really analyze meta tags both before AND after JavaScript rendering?
- 15:27 Should you render meta tags server-side, or accept that JavaScript modifies them?
Google confirms that the Intersection Observer works with Googlebot and triggers the loading of lazy-loaded content, within unspecified limits. For SEO, this means you can use this JavaScript API for lazy loading without fear of indexing issues. Just ensure that your implementation actually generates new content under Google's crawling conditions—and test regularly.
What you need to understand
Why does this statement change the game for lazy loading?
For years, JavaScript lazy loading was considered a major SEO risk. Content loaded after user interaction or scrolling often remained invisible to Googlebot, which does not scroll or wait indefinitely for the full rendering. The Intersection Observer API, introduced in 2016, allows content to load when an element enters the viewport—typically for images or text blocks.
Martin Splitt here claims that Googlebot triggers all intersection observers that generate new content, within certain limits. What does that mean in practice? The bot simulates infinite scrolling until no new content appears, or until it reaches a time or resource limit. This is a tactical shift: the Intersection Observer becomes a recommended approach, not a workaround to avoid.
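A minimal lazy-loading pattern with the Intersection Observer might look like the sketch below. The callback logic is kept separate from the browser wiring so it is self-contained here; the class name `.lazy-block` and the load function are illustrative, not part of any Google guideline.

```javascript
// Sketch: lazy-load content blocks once, when they enter the viewport.
// The load function is injected; in a real page it would fetch and
// inject an HTML fragment.
function makeLazyLoader(loadFn) {
  const loaded = new Set(); // remember which blocks are already loaded
  return function onIntersect(entries, observer) {
    for (const entry of entries) {
      if (entry.isIntersecting && !loaded.has(entry.target)) {
        loaded.add(entry.target);
        loadFn(entry.target);                      // inject the content
        if (observer) observer.unobserve(entry.target); // never reload it
      }
    }
    return loaded.size; // how many blocks have been loaded so far
  };
}

// Browser wiring (needs a DOM, so it is shown as comments here):
// const observer = new IntersectionObserver(
//   makeLazyLoader(el => { el.innerHTML = '...fetched content...'; }),
//   { rootMargin: '200px' } // start loading slightly before visibility
// );
// document.querySelectorAll('.lazy-block').forEach(el => observer.observe(el));
```

Loading each block exactly once and unobserving it keeps the pattern flat and predictable, which is exactly the behavior that plays well with a rendering budget.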
What are these 'certain limits' Google is talking about?
Google does not elaborate. We know that Googlebot uses a rendering budget: it won’t wait 30 seconds for your site to load 500 lazy content blocks. The limits likely pertain to JavaScript execution time, number of network requests, and depth of simulated scrolling. If your lazy loading creates an infinite cascade or relies on complex interactions (clicks, hovers), there’s no guarantee Googlebot will see it all.
The phrasing is intentionally vague. Google does not publish an official figure ("we render up to X seconds") because that would invite sites to optimize right up to the limit. Field experience, however, suggests that the first 10-15 lazy-loaded blocks generally get through; beyond that, results become unpredictable. Verify this on your strategic pages via Google Search Console and rendering tests.
Should you abandon server-side lazy loading because of this?
No. Just because the Intersection Observer works with Googlebot doesn’t mean it works perfectly all the time. Sites with a high editorial content load (news, e-commerce with hundreds of products) still benefit from prioritizing server-side rendering for critical content. Lazy-loading images, videos, or ancillary widgets? Perfect. Lazy-loading your first 3 paragraphs or your first 20 category products? Risky.
The recommended approach remains progressive hydration: the basic HTML is served server-side, while the Intersection Observer enriches the user experience. Googlebot sees the initial content without relying on JavaScript—and then benefits from lazy-loaded content if the rendering budget allows it.
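The hybrid approach can be sketched as a simple planning step: critical HTML is assumed to be server-rendered already, and only ancillary blocks are deferred, with an eager fallback when the Intersection Observer is unavailable. The helper name and injected flag are hypothetical; in a page you would pass `typeof IntersectionObserver !== 'undefined'`.

```javascript
// Sketch of graceful degradation for the hybrid (server + lazy) approach.
// `supportsObserver` is injected so the logic runs outside a browser too.
function planLoading(blocks, supportsObserver) {
  if (!supportsObserver) {
    // No observer support: load everything immediately rather than
    // leaving content hidden behind JavaScript that will never run.
    return blocks.map(b => ({ block: b, mode: 'eager' }));
  }
  // Observer available: ancillary blocks can safely be deferred.
  return blocks.map(b => ({ block: b, mode: 'lazy' }));
}
```

The point of the eager branch is that a crawler or browser without the API still receives everything, which is the spirit of progressive enhancement described above.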
- The Intersection Observer is compatible with Googlebot and triggers the loading of lazy content
- Google imposes undocumented limits on the time and resources allocated to JavaScript rendering
- Critical content must remain accessible without JavaScript or in the first lazy-loaded blocks
- Testing rendering through Google Search Console (URL Inspection) is essential to validate indexing
- The hybrid approach (server + lazy loading) remains the safest for high-SEO-stakes sites
SEO Expert opinion
Is this statement consistent with field observations?
Overall, yes. Rendering tests show that Googlebot has been triggering the Intersection Observer effectively for several years, at least since 2019, when Googlebot became evergreen and started rendering with recent Chromium versions. On pages with 5-10 lazy-loaded blocks, close to 100% of the content is typically crawled. The bot simulates scrolling, waits for the observers to fire, captures the new DOM, and repeats.
Where it gets tricky: sites with complex infinite scrolls, nested Ajax dependencies, or long timeouts. I have seen cases where Googlebot stopped after 8-10 blocks when 50 more remained. The "limit" Splitt mentions is not a fixed number: it depends on server response time, JavaScript complexity, and probably the crawl budget allocated to the domain. Verify under real conditions, page by page.
What are the gray areas in this statement?
Splitt says "within certain limits" without specifying what they are. This is typical of Google: confirming a technical capability without providing a contractual guarantee. We do not know whether these limits are measured in number of blocks, rendering time, DOM size, or network requests. Nor do we know whether they vary by site: does a domain with high PageRank get more rendering budget? Probably, but nothing official.
Another gray area: Splitt talks about the Intersection Observer that “generates new content.” What counts as “new”? A text block added to the DOM, clearly. But a simple CSS class change? A lazy load of images without text? Google doesn’t clarify. Experience shows that text and links are well crawled; lazy-loaded images also go through, but with indexing delays that can be longer.
In what cases does this approach fall short?
When your business model depends on fast, exhaustive indexing of thousands of pages or blocks. An e-commerce site with 500 lazy-loaded products? You will lose indexed listings. A news site with an infinite stream of briefs? Googlebot will only see a fraction of it. In these cases, it is better to paginate cleanly with distinct URLs or to deliver the content as static HTML.
The Intersection Observer works well for user experience (progressive loading, Core Web Vitals performance) but remains a gamble for SEO. If your site generates 80% of its organic traffic through 10 strategic pages, you can afford it—and test. If you depend on the long tail and thousands of indexed pages, don’t put all your chips on JavaScript lazy loading.
Practical impact and recommendations
How can I check if my lazy loading is being crawled by Google?
Use the URL Inspection tool in Google Search Console. Paste the URL of a page with lazy loading, click on “Test live URL,” then “View crawled page” > “HTML version.” Scroll through the source code: if your lazy-loaded blocks appear in the DOM, that’s a good sign. If you only see the first blocks or empty placeholders, Googlebot did not load everything.
Compare with a Screaming Frog crawl in JavaScript rendering mode. Set a timeout of 10-15 seconds and check that the crawled word count matches what you see in a standard browser. A discrepancy of more than 20% signals a rendering problem. Also test with Puppeteer or Playwright simulating infinite scrolling; that will give you a realistic picture of how a bot behaves.
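The scroll-until-stable loop such a test script runs can be sketched as follows. The height and scroll functions are injected so the loop itself runs anywhere; in Puppeteer or Playwright you would execute equivalent logic inside the page context, and a real script would also wait for pending network requests between steps.

```javascript
// Sketch: scroll until the page stops growing, with a hard cap, roughly
// what a bot-simulation script does. In a browser, getHeight would read
// document.body.scrollHeight and scrollStep would call window.scrollBy().
function autoScroll({ getHeight, scrollStep, maxSteps = 50 }) {
  let steps = 0;
  let lastHeight = getHeight();
  while (steps < maxSteps) {
    scrollStep();
    steps++;
    const h = getHeight();
    if (h === lastHeight) break; // page stopped growing: no more lazy blocks
    lastHeight = h;
  }
  return { steps, finalHeight: lastHeight };
}
```

The `maxSteps` cap mirrors the idea of a rendering budget: even your own test harness should not scroll forever.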
What mistakes should be avoided when implementing the Intersection Observer?
Never lazy-load content above the fold. Googlebot doesn’t scroll by default: if your first 3 paragraphs or main title are lazy-loaded, they may not be indexed immediately. Use the Intersection Observer for blocks below the first viewport—typically starting from the 2nd or 3rd scrolling screen.
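One simple way to enforce this rule is to split blocks by position before wiring any observer, so the first screen is always rendered in the initial HTML. The helper and the `eagerCount` cutoff below are illustrative tuning choices, not a Google guideline.

```javascript
// Sketch: keep the first N blocks eager (rendered in the initial HTML)
// and mark only the rest as candidates for Intersection Observer loading.
function splitEagerLazy(blocks, eagerCount = 3) {
  return {
    eager: blocks.slice(0, eagerCount), // above the fold: never deferred
    lazy: blocks.slice(eagerCount),     // safe to defer below the fold
  };
}
```

With this split, the H1, the opening paragraphs, and strategic internal links stay out of the lazy path by construction.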
Avoid cascading dependencies: an observer triggering a fetch which triggers another observer. The longer the chain, the higher the risk that Googlebot abandons midway. Favor a flat and predictable lazy loading logic: each block loads independently when it enters the viewport, without waiting for the previous one to finish.
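The difference between the two wirings can be made explicit as loading plans. In the flat plan every block is observable up front, so a stalled fetch blocks nothing; in the cascade, block N waits on block N-1, which is the fragile anti-pattern described above. Both helper names are illustrative.

```javascript
// Sketch contrasting flat vs cascaded lazy loading as dependency plans.
function flatPlan(blockCount) {
  // Every block is observed immediately; no block waits on another.
  return Array.from({ length: blockCount }, (_, i) => ({ block: i, waitsOn: null }));
}
function cascadePlan(blockCount) {
  // Each block is only observed after the previous one loads: one slow
  // or failed fetch stalls everything behind it.
  return Array.from({ length: blockCount }, (_, i) => ({
    block: i,
    waitsOn: i === 0 ? null : i - 1,
  }));
}
```

If Googlebot abandons rendering midway, the flat plan loses at most the blocks it never reached, while the cascade can lose everything past the first stall.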
Should I change my current implementation if it’s already working?
If your pages are correctly indexed and you are not seeing any organic traffic loss, leave it as is. Splitt’s statement confirms that the Intersection Observer is supported, but not mandatory. If you are using another approach (server-side rendering, classic pagination, lazy loading via scroll events) and it’s working, continue.
However, if you had avoided lazy loading due to indexing concerns, you might want to reconsider this strategy. The Intersection Observer becomes a viable option to improve Core Web Vitals (LCP, CLS) while preserving crawlability. But test it first on secondary pages before deploying on your strategic pages—and monitor indexing for 2-3 weeks post-deployment.
- Test each strategic page via Google Search Console > URL Inspection and check the crawled DOM
- Set up a Screaming Frog crawl with JavaScript mode with sufficient timeout (minimum 10-15s)
- Never lazy-load content above the fold or critical elements (H1, first paragraphs, strategic internal links)
- Avoid nested observer cascades—favor a flat and parallel loading logic
- Monitor indexing in Search Console for 2-3 weeks after deployment to detect any deficits
- Implement a fallback: if JavaScript fails, critical content must remain accessible as static HTML
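The fallback point in the checklist above can be satisfied with a `noscript` element served in the initial HTML; the class name, attribute, and URL below are illustrative placeholders for your own markup.

```html
<!-- Sketch: lazy-loaded widget with a static fallback. Crawlers and
     no-JS users still get a crawlable link to the content. -->
<div class="lazy-block" data-fragment-url="/fragments/related-posts">
  <noscript>
    <a href="/related-posts">See related posts</a>
  </noscript>
</div>
```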
❓ Frequently Asked Questions
Does the Intersection Observer slow down Googlebot's crawl?
Should I lazy-load my images with the Intersection Observer for SEO?
How many lazy-loaded blocks can Googlebot crawl on a page?
Should you add a noscript fallback for lazy-loaded content?
Does the Intersection Observer work with other search engines (Bing, Yandex)?
🎥 From the same video (11)
Other SEO insights extracted from this same Google Search Central video · duration 18 min · published on 10/12/2020
🎥 Watch the full video on YouTube →