
Official statement

It is not recommended to serve bots pages without webfonts to gain performance. The added complexity (bot detection, extra code, potential bugs) far exceeds the minimal benefit. It is better to block webfonts via robots.txt or use font-display:swap.
🎥 Source video

Extracted from a Google Search Central video

⏱ 38:29 💬 EN 📅 18/05/2020 ✂ 10 statements
Watch on YouTube (5:46) →
Other statements from this video (9)
  1. 1:06 Is dynamic rendering really risk-free for SEO?
  2. 1:38 Does dynamic rendering really slow down your server, or does it improve crawl budget?
  3. 2:39 Why does Google treat JavaScript redirects as 302s and not 301s?
  4. 2:39 Does Google really distinguish between 301 and 302 redirects for SEO?
  5. 3:42 Can Googlebot really crawl links hidden in a hamburger menu?
  6. 7:01 How do you handle 404 errors correctly in an SPA without risking deindexation?
  7. 14:57 Why does Googlebot miss your content loaded by Web Workers?
  8. 30:51 Is content hidden in accordions really indexed by Google?
  9. 31:49 Should you really abandon manual structured data implementation?
Official statement (5 years ago)
TL;DR

Google strongly discourages serving bots versions of pages without webfonts for performance gains. Bot detection, extra code, and potential bugs create complexity that far outweighs the marginal speed benefits. Two simpler and less risky alternatives exist: block webfonts via robots.txt, or use font-display:swap for asynchronous font loading.

What you need to understand

What Makes This Practice Appealing to Some SEOs?

The idea of serving lightweight pages to bots stems from a legitimate observation: Googlebot and other crawlers consume server time and crawl budget. Removing webfonts—which can weigh anywhere from a few dozen to several hundred kilobytes—seems like a logical optimization to speed up rendering and reduce load.

In practice, some high-volume sites or those with a constrained technical infrastructure have been tempted by this approach. The reasoning: why serve a custom font to a bot that doesn't "see" the design? Why not send it a streamlined version, faster to load, to maximize crawled pages?

What Is Google’s Concern?

Martin Splitt points out three major weaknesses in this strategy. First, bot detection is never 100% reliable—spoofed user agents, third-party crawlers, SEO tools masquerading as Googlebot. You risk serving the wrong version to the wrong visitor.
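Google's documented mitigation for spoofed user agents is a two-step DNS check: a reverse DNS lookup on the visiting IP (the hostname must end in googlebot.com or google.com), then a forward lookup to confirm the hostname resolves back to that IP. A minimal Python sketch of that logic (the function name is ours, and the lookup functions are injectable so the check can be exercised without network access):

```python
import socket

# Hostname suffixes Google documents for genuine Googlebot crawlers
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_verified_googlebot(ip, reverse_lookup=None, forward_lookup=None):
    """Verify a claimed Googlebot IP via reverse DNS plus forward confirmation.

    Lookup callables default to real DNS queries but can be stubbed in tests.
    """
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not host.endswith(GOOGLE_SUFFIXES):
        return False  # PTR record does not belong to Google: likely spoofed
    try:
        # Forward-confirm: a faked PTR record will not resolve back to the IP
        return forward_lookup(host) == ip
    except OSError:
        return False
```

Even this, as the article notes, adds code and moving parts; it verifies identity but does nothing to justify serving a different page.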

Next, the additional code necessary to manage this conditional logic introduces fragility. Each layer of detection is an opportunity for bugs: misconfiguration errors, improperly invalidated caches, server rules that apply too broadly. And when it breaks, Google might index a degraded version of your site.

Finally, the actual gain is minimal compared to the risk. Modern webfonts are compressed, often cached, and their impact on overall rendering time remains marginal compared to JavaScript, images, or multiple network requests. In other words: you're complicating your technical stack to gain a few milliseconds.

What Alternatives Does Google Recommend?

Google recommends two simple approaches that carry no cloaking risk. First option: block webfonts via robots.txt. Googlebot will not download them but will continue to crawl and index your content normally. No bot detection needed, no conditional logic on the server side.
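In practice, this first option comes down to a few Disallow rules targeting font files only (the /fonts/ path and extensions below are placeholders for your own setup; Googlebot supports the * and $ wildcards used here):

```
# robots.txt — block font files only, never the CSS that references them
User-agent: Googlebot
Disallow: /fonts/
Disallow: /*.woff$
Disallow: /*.woff2$
Disallow: /*.ttf$
```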

Second approach: use font-display:swap in your @font-face declarations. The browser immediately displays the text with a system font, then swaps to the webfont as soon as it’s loaded. Googlebot gets a quick rendering, users do too, and you avoid any unnecessary technical complexity.
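A minimal sketch of this second approach (the font name "Inter" and the file path are placeholders):

```css
@font-face {
  font-family: "Inter";
  src: url("/fonts/inter.woff2") format("woff2");
  font-weight: 400;
  font-display: swap; /* paint text with the fallback now, swap when loaded */
}

body {
  /* Arial paints immediately; the webfont replaces it once downloaded */
  font-family: "Inter", Arial, sans-serif;
}
```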

  • Bot Detection: never 100% reliable, risk of serving the wrong version
  • Additional Code: each layer of conditional logic = opportunity for bugs
  • Marginal Gains: webfonts are minor compared to JS, images, network requests
  • Alternative 1: block webfonts via robots.txt (simple, low-risk)
  • Alternative 2: font-display:swap for asynchronous loading

SEO Expert opinion

Is This Statement Consistent with Ground Observations?

Yes, and it aligns with a consistent principle at Google: avoid any form of cloaking, even if well-intentioned. Serving different content to bots remains a gray area—technically allowed in certain cases (dynamic serving of mobile vs desktop versions, for example), but closely monitored by the algorithm. Here, the difference concerns aesthetic resources, but the mechanism remains the same: detection + variation = risk.

In practice, sites that have tried this approach often encounter unpredictable indexing issues. Misconfigured CDN caches serving the bot version to users, overly broad server rules blocking legitimate crawlers, or Google detecting manipulation and applying penalties. The risk is not worth the potential gain.

In What Scenarios Could This Rule Be Nuanced?

It's hard to imagine a legitimate scenario. Even on sites with an ultra-constrained crawl budget (millions of pages, daily updates), blocking webfonts via robots.txt suffices. No need for bot detection, no conditional code, same result.

The only edge case: a site with extremely large webfonts (multiple megabytes, unoptimized custom fonts) where download time genuinely hampers crawling. But then, the real problem lies upstream—you need to optimize or replace those fonts, not circumvent the issue through cloaking. [To be verified] whether Google allows a documented exception in this context; no official communication supports one.

What Is Googlebot's Real Priority Regarding Performance?

Reducing Time to First Byte (TTFB), limiting redirect chains, and improving server response speed. Webfonts rank much lower in the hierarchy of optimizations. A server that takes 800 ms to respond penalizes crawling far more than a 50 KB webfont does.

And for user-side Core Web Vitals, font-display:swap elegantly solves layout shift and rendering time issues. You benefit on both fronts—bot and human—without technical compromises. This is exactly what Martin Splitt recommends and aligns with the "single version, optimized for all" approach that Google has advocated for years.

Practical impact and recommendations

What Should You Do If You Are Already Serving Different Pages to Bots?

First step: audit your server configuration to identify all bot detection rules. Look for conditions on user-agent, variations in CSS/fonts according to visitor, CDN caches with keys based on user-agent. Document each divergence point between bot version and user version.
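As a concrete illustration of what this audit should flag, here is a hypothetical nginx rule of the kind to hunt down—conditional content selection keyed on the user-agent (the paths and server block are invented for the example, not taken from the video):

```nginx
# ANTI-PATTERN — the sort of rule the audit should surface and remove
map $http_user_agent $asset_root {
    default          /var/www/full;     # users get the page with webfonts
    ~*googlebot      /var/www/nofonts;  # bots get a stripped variant: cloaking risk
}

server {
    listen 80;
    root $asset_root;
}
```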

Next, test the real impact of this optimization. Measure the crawl time before and after removing webfonts for bots. If the gain is less than 100 ms per page, you are within measurement noise—better to simplify your stack. If the gain exceeds 200 ms, ask yourself: why are your webfonts so heavy?

How to Shift to a Compliant Approach Without Losing Performance?

Option 1: add your webfont files to robots.txt with Disallow. Googlebot will no longer download them but will continue to crawl your HTML/CSS pages normally. No risk of cloaking, no additional code. Test in Search Console (URL Inspection tool) to verify that rendering remains correct.

Option 2: implement font-display:swap in your @font-face declarations. Text displays instantly with a system font, the webfont loads in the background and swaps as soon as available. You improve LCP and CLS at the same time, and Googlebot gets a quick rendering without downloading the font. This is the solution recommended by Google, and it also benefits users on slow connections.

What Mistakes to Avoid During This Transition?

Do not block your entire CSS via robots.txt thinking you’ll save time—Google needs styles to understand layout and detect main content. Only block .woff, .woff2, .ttf, .eot files if you opt for this approach.

Also, avoid keeping dormant bot detection in your code, even if disabled. If a developer mistakenly reactivates it during a deployment, the problem returns silently. Completely remove the conditional logic to eliminate the risk at the source.

  • Audit all bot detection rules (server, CDN, application)
  • Measure the real performance gain before deciding
  • Block webfonts via robots.txt OR implement font-display:swap
  • Test bot rendering with the URL Inspection tool in Search Console
  • Completely eliminate bot detection logic from the source code
  • Document the new configuration to avoid future regressions
In summary: serving different pages to bots creates more problems than it solves. The two alternatives proposed by Google—blocking via robots.txt or using font-display:swap—are simple, low-risk, and equally effective. If you want to finely optimize your crawl budget while avoiding technical pitfalls, consulting a specialized SEO agency can be wise for an in-depth audit and tailored support suited to your infrastructure.

❓ Frequently Asked Questions

Does blocking webfonts via robots.txt affect rendering on Google's side?
No. Googlebot crawls and indexes your HTML and CSS content normally; it simply renders the text with a default system font. The content stays identical and readable.
Does font-display:swap create a layout shift that hurts Core Web Vitals?
If the fallback font's metrics are close to those of the final webfont, the shift is minimal. Optimize by using a similar system font (e.g. Arial for a sans-serif) and adjust size-adjust if needed.
Can you serve an AMP version to bots and the regular version to users?
No, that is cloaking. AMP must be offered as a declared alternative (rel=amphtml tag) or as the single version—never served conditionally by user-agent without users being able to reach it directly.
Is detecting Googlebot via user-agent reliable?
Not 100%. Third-party crawlers, SEO tools, or users can spoof the user-agent. Google recommends verifying via reverse DNS, but this adds complexity and remains imperfect.
Are there other resources worth blocking via robots.txt to save crawl budget?
Yes: heavy decorative images, JS files not critical for rendering, internal PDFs not meant for indexing. But always test the impact on rendering with the URL Inspection tool before blocking.
