What does Google say about SEO?

Official statement

Full-page 'hero' images do not impact indexing if the content is in the DOM without requiring scrolling.
🎥 Source: Google Search Central video (EN, duration 57:45, published 29/04/2020), statement extracted at 19:35.
Official statement from Martin Splitt (April 2020)
TL;DR

Google claims that full-page hero images do not affect indexing as long as the textual content is present in the DOM without needing to scroll. The key takeaway? Googlebot processes the page's complete DOM, not just what is visible on screen above the fold. In practice, an immersive hero only poses a problem if the main content requires user interaction to load or be revealed in the DOM.

What you need to understand

What’s behind this statement about hero images today?

Full-screen hero images have become a standard in modern web design, especially on showcase sites, portfolios, and landing pages. Many SEO practitioners rightfully worry: if the majority of the initial viewport is taken up by an image, does Googlebot see the textual content located further down the page?

Martin Splitt addresses this concern directly. The key point to understand: Googlebot does not think in terms of “above the fold” like a human user. It analyzes the complete DOM of the page. If your textual content is present in the HTML source code without the need for virtual scrolling or a JavaScript action to be injected, Googlebot will read it.

What does ‘content in the DOM without scrolling’ exactly mean?

This phrasing deserves clarification. “Without needing to scroll” does not mean that the content must be visible without scrolling for the user. It means that the content must be present in the initial or rendered HTML code without any scrolling action needed to trigger its loading.

In other words: if your hero takes up 100vh and your main text begins just below it in the normal HTML flow, it is perfectly indexable. The problem only arises if you use scroll-dependent JavaScript lazy loading that injects the content only when the user scrolls — which can prevent Googlebot from seeing this content, since the injection never runs.
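As an illustrative sketch of the safe pattern (class names and text are hypothetical): the hero fills the viewport, and the main content sits immediately after it in the normal document flow, so it is already in the DOM on initial render even though a human must scroll to see it.

```html
<!-- Indexable pattern: the hero fills the viewport, but the main
     content is already present in the DOM, just further down the flow. -->
<section class="hero" style="height: 100vh;">
  <img src="/img/hero.jpg" alt="Studio workspace">
</section>

<!-- Googlebot reads this even though a user must scroll to reach it. -->
<main>
  <h1>Creative studio in Lyon</h1>
  <p>We design immersive brand experiences…</p>
</main>
```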

What are the implications for highly visual sites?

This statement reassures sites that rely on immersive visual experiences: creative agencies, photography portfolios, luxury product sites. You can retain your stunning hero images without sacrificing indexing, as long as you adhere to the fundamental rule.

However, be cautious not to confuse indexing with semantic relevance. A page with 90% hero image and 10% text will be indexed, sure, but it will likely have less semantic depth than a balanced page. Google will understand what the page is about, but with less contextual signal to rank for competitive queries.

  • Googlebot reads the complete DOM, not just what is visible above the fold
  • Full-screen hero images do not block indexing as long as the HTML content is present below
  • The real risk: scroll-dependent lazy loading that injects content via JavaScript
  • Crucial distinction: indexing ≠ ranking. Thin text content will be indexed but will perform poorly in search visibility
  • Always test your rendering with the URL inspection tool in Search Console to see what Googlebot actually receives

SEO Expert opinion

Is this statement consistent with real-world observations?

Overall, yes. Controlled environment testing confirms that Googlebot easily indexes pages with full-height heroes as long as the content is in the initial HTML or in the final JavaScript rendering that is accessible without interaction. I have audited dozens of agency/portfolio sites with this pattern: indexing was never the issue.

The real trap lies elsewhere: some developers implement "progressive reveal" scripts that only load text sections on user scroll. If these scripts are not detected and executed by Googlebot (or if the JS execution fails silently), the content will never be visible to the bot. The result: an indexed page with almost empty content. Check for this regularly with the URL inspection tool in Search Console.
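A minimal sketch of this "progressive reveal" anti-pattern (the element ID and API endpoint are hypothetical): the main text only enters the DOM after a scroll event, which Googlebot never fires.

```html
<div id="content"></div>
<script>
  // Anti-pattern: the main text is injected only after the user scrolls.
  // Googlebot does not trigger scroll events during rendering, so this
  // content never enters the DOM that gets indexed.
  window.addEventListener('scroll', () => {
    fetch('/api/page-content')          // hypothetical endpoint
      .then((res) => res.text())
      .then((html) => {
        document.getElementById('content').innerHTML = html;
      });
  }, { once: true });
</script>
```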

What nuances should be added to this statement?

Martin Splitt remains deliberately vague about the definition of “without needing to scroll”. He doesn’t specify how Googlebot handles scripts that listen for scroll events. Based on my observations, Googlebot does not trigger scroll events during JavaScript rendering — so any content that depends on these events risks remaining invisible.

Another point: the statement addresses only indexing, not ranking. A page with a hero of 2000px height and 150 words of text will be indexed, but it will likely have a poor semantic relevance score compared to competitors who provide more developed content. The visual/text balance remains strategic for organic performance.

In which cases does this rule not fully apply?

Let’s be honest: this rule assumes a functional JavaScript rendering. If your site uses a modern JS framework (React, Vue, Angular) and the content exists only in client-side rendering, you are entirely dependent on Googlebot’s ability to execute your JavaScript without error. Error in the bundle? Render timeout? External dependency that fails? Your content disappears.

Second problematic case: sites with multiple conditional hero variants (A/B testing, geolocation-based personalization) that inject different content depending on user context. Googlebot will see only one variant, potentially not the one you wish to index. Systematically test the variant served to the bot.

Warning: don't confuse "Googlebot can index" with "it's optimal for ranking". A massive hero pushes textual content far down in the DOM, which can dilute the strong semantic signals Google looks for at the top of the page to establish main topical relevance. User experience matters too: an overly heavy hero can degrade your Core Web Vitals (LCP, CLS) and indirectly impact your visibility.

Practical impact and recommendations

What should you do to ensure indexing with a full-screen hero?

First action: check that your main textual content is indeed present in the HTML source code, not just injected via scroll-dependent JavaScript. Inspect the raw source code (Ctrl+U / Cmd+Option+U): if you see your paragraphs, headers, and semantic elements, that's a good sign. If everything is in empty div tags filled later by JS, test Googlebot’s rendering.

Use the URL inspection tool in Search Console to request a live test of your page. Analyze the “Rendered HTML” tab: does Googlebot see the complete textual content? If not, identify the responsible script and correct it. Sometimes, a simple timeout in JavaScript execution can render the content invisible to the bot.

What mistakes should you absolutely avoid with hero images?

Number one mistake: aggressive lazy loading that gates the content's appearance behind an onScroll event. Googlebot does not scroll during rendering, so this content will remain invisible. If you need to lazy-load, do it only for non-critical resources (footer images, secondary widgets), never for the main textual content.
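If you do need lazy loading, the native `loading="lazy"` attribute on non-critical images is the safe variant: the image requests are deferred by the browser, but the surrounding text stays in the HTML and requires no scroll event to exist in the DOM. A sketch (file paths are illustrative):

```html
<!-- Main text: always in the initial HTML, never behind a scroll event. -->
<main>
  <h1>Our services</h1>
  <p>Full description of what the page is about…</p>
</main>

<!-- Non-critical images lower on the page: safe to lazy-load natively. -->
<footer>
  <img src="/img/partner-logo.png" alt="Partner logo" loading="lazy">
  <img src="/img/team-photo.jpg" alt="Our team" loading="lazy">
</footer>
```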

Another common mistake: placing textual content in absolute or fixed position above the hero, creating reading issues for Googlebot. Favor a natural and predictable HTML flow: hero first, main content afterwards in logical document order. CSS can rearrange visually if needed, but HTML must remain semantically coherent.
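The point about keeping HTML order logical while letting CSS handle presentation can be sketched like this (selectors and content are illustrative): the document order stays hero-then-content, and any visual layering happens in the stylesheet.

```html
<style>
  /* Visual layering is done in CSS; the DOM order stays semantic. */
  .hero { position: relative; height: 100vh; }
  .hero-caption { position: absolute; bottom: 2rem; left: 2rem; }
</style>

<section class="hero">
  <img src="/img/hero.jpg" alt="Mountain landscape">
  <p class="hero-caption">Overlay text, positioned by CSS</p>
</section>

<main>
  <h1>Main content, next in logical document order</h1>
</main>
```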

How can I check that my implementation is compliant and optimal?

Beyond the Search Console inspection, crawl the site with Screaming Frog with JavaScript rendering enabled and then disabled. Compare the two renders: if content disappears with JavaScript disabled and does not reappear in Googlebot's JS rendering, you have a problem. Also check the server logs to confirm that Googlebot can access the JavaScript resources needed for rendering (no 403s, 404s, or timeouts).

Analyze your Core Web Vitals specifically on pages with hero images. A poorly optimized hero (uncompressed image, unsuitable format, blocking synchronous loading) can degrade your LCP and indirectly impact your ranking. Use WebP or AVIF, size images appropriately, and serve responsive variants for mobile viewports.
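Serving modern formats with a fallback can be done with a standard `<picture>` element, as sketched below (paths and dimensions are illustrative):

```html
<picture>
  <!-- The browser picks the first format it supports, top to bottom. -->
  <source srcset="/img/hero-800.avif 800w, /img/hero-1600.avif 1600w"
          type="image/avif" sizes="100vw">
  <source srcset="/img/hero-800.webp 800w, /img/hero-1600.webp 1600w"
          type="image/webp" sizes="100vw">
  <!-- JPEG fallback; width/height reserve space and prevent CLS. -->
  <img src="/img/hero-1600.jpg" alt="Product hero"
       width="1600" height="900" fetchpriority="high">
</picture>
```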

  • Check that the textual content is present in the HTML source, visible in Googlebot’s rendering (Search Console)
  • Avoid scroll-dependent lazy loading for main content — only for non-critical resources
  • Test JavaScript rendering with Screaming Frog (JS enabled/disabled mode) and compare results
  • Optimize hero images for Core Web Vitals: modern formats (WebP/AVIF), compression, appropriate dimensions
  • Maintain sufficient textual depth for semantic relevance; don't limit yourself to 100-150 words under a massive hero
  • Regularly audit logs to detect potential JavaScript rendering failures on Googlebot’s side

A full-screen hero is not an obstacle to indexing if your technical architecture is clean and the content remains accessible in the DOM without user interaction. However, these optimizations (JavaScript rendering, strategic lazy loading, semantic balance, Core Web Vitals) can quickly become complex to orchestrate without deep technical expertise. If you notice inconsistencies between what you see and what Googlebot indexes, or if you want to maximize your organic performance without compromising the visual experience, the support of a specialized SEO agency can help you avoid technical pitfalls and fully exploit the potential of your pages.

❓ Frequently Asked Questions

Does a 100vh hero image prevent Googlebot from indexing the content below it?
No, as long as the textual content is present in the initial DOM or the JavaScript-rendered DOM without requiring a scroll interaction to be loaded. Googlebot analyzes the complete DOM, not just what is visible above the fold.
Does scroll-conditional lazy loading cause indexing problems?
Yes, if the main content is only injected into the DOM after a user scroll event. Googlebot does not trigger these events during rendering, so the content will remain invisible. Reserve scroll-based lazy loading for non-critical resources only.
How can I verify that Googlebot sees my content despite the hero?
Use the URL inspection tool in Search Console and request a live test. Review the rendered HTML tab to confirm that all of your textual content appears in Googlebot's final render.
Can a massive hero impact my ranking even if indexing works?
Yes, indirectly. An oversized hero with little textual content reduces the page's semantic depth and can degrade Core Web Vitals (notably LCP), two factors that influence ranking. Indexing does not guarantee organic performance.
Should you favor server-side rendering (SSR) to secure indexing with a hero?
It is not strictly necessary if your client-side rendering works correctly for Googlebot. However, SSR eliminates the risk of JavaScript failure and guarantees that the content is immediately present in the initial HTML, which remains the most robust solution.

