
Official statement

Crawl speed measures only the time to fetch a URL from your server, not the rendering in the browser. This is different from Core Web Vitals, which include JavaScript, external resources, and complete rendering.
🎥 Source video

Extracted from a Google Search Central video

💬 EN 📅 18/02/2022 ✂ 24 statements
Watch on YouTube →
Other statements from this video (23)
  1. Does Google really count all the links shown in Search Console?
  2. Should you really concentrate your content on fewer pages to rank?
  3. Do Google's product review criteria apply even if your site isn't classified as a review site?
  4. Does Google's Indexing API really work for all types of content?
  5. Does E-A-T really influence Google rankings, or is it just a myth?
  6. Do unlinked brand mentions have an impact on your SEO?
  7. Do user comments really improve rankings in Google?
  8. Do premium SSL certificates really influence Google rankings?
  9. PDF and HTML with the same content: should you fear cannibalization in the SERPs?
  10. Can you really control PDF indexing via HTTP headers?
  11. Should you still use rel=next and rel=prev for pagination?
  12. Can Googlebot really index your infinite-scroll content?
  13. Should you really index every page of your site?
  14. Should you worry about the referring page shown in Google Search Console?
  15. Should you really 301-redirect the old sitemap, or just submit the new one directly?
  16. Why is a 97% crawl refresh rate a positive signal for your site?
  17. How does Google actually determine your site's crawl speed?
  18. Why does Google slow down its crawl after a hosting change?
  19. Is the crawl rate setting really a ceiling rather than a target?
  20. Can CTR really penalize the rest of your site?
  21. Is internal linking really the most decisive factor for SEO?
  22. Does internal linking really take effect instantly after a recrawl?
  23. Should you worry if Google doesn't crawl all your pages?
📅 Official statement from 18/02/2022 (4 years ago)
TL;DR

Google clearly distinguishes between crawl speed and Core Web Vitals. The former measures only the time to fetch a URL from the server, without JavaScript or rendering. CWV incorporates the complete user experience: rendering, external resources, interactivity. Confusing the two means missing what really matters.

What you need to understand

What does Google actually mean by "crawl speed"?

Crawl speed measures the time Googlebot needs to fetch raw HTML code from your server. Period. No rendering, no JavaScript execution, no external resource loading.

This is a purely technical metric that concerns infrastructure performance: server response time (TTFB), network latency, resource availability. If your server takes 3 seconds to deliver HTML, your crawl speed is catastrophic — even if the page displays instantly on the user's end.
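To see what that metric does and does not cover, here is a minimal sketch (Python standard library, placeholder URL) that times nothing but the raw HTML download, which is roughly the scope of crawl speed: no JavaScript, no CSS, no images.

```python
import time
import urllib.request

def fetch_time(url: str, user_agent: str = "Mozilla/5.0 (compatible; crawl-test)") -> float:
    """Time a single raw HTML fetch: DNS + TCP/TLS + server response + body download.
    No JavaScript, no sub-resources -- roughly what crawl 'download time' measures."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    start = time.perf_counter()
    with urllib.request.urlopen(req, timeout=10) as resp:
        resp.read()  # download only the raw HTML
    return time.perf_counter() - start

if __name__ == "__main__":
    url = "https://example.com/"  # placeholder URL, swap in your own page
    samples = sorted(fetch_time(url) for _ in range(5))
    print(f"median raw fetch time: {samples[len(samples) // 2] * 1000:.0f} ms")
```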

Do Core Web Vitals measure the same thing?

No. Core Web Vitals evaluate the actual user experience in the browser. They include complete rendering: time to display main content (LCP), responsiveness to interactions (INP), visual stability (CLS).

These metrics integrate JavaScript, CSS, images, external fonts, lazy loading, React/Vue hydration — everything that makes up the perceived experience. A site can have excellent crawl speed (HTML delivered in 200ms) but disastrous CWV (LCP at 5s due to a bloated JS bundle).

Why does this distinction change the game for SEO?

Because Google uses these two metrics for different objectives. Crawl speed impacts crawl budget: the faster your pages are to fetch server-side, the more Googlebot can explore in its allocated time.

CWV influences ranking via the Page Experience signal. Optimizing one without the other is like walking on one leg. A slow server-side site will be crawled infrequently. A fast server-side but poor user experience site will be well-crawled but poorly ranked.

  • Crawl speed: server metric, impacts crawl budget and content discovery efficiency
  • Core Web Vitals: user metrics, integrate complete rendering and JavaScript, influence rankings
  • Optimizing only TTFB doesn't guarantee good CWV if the front-end is poorly built
  • Conversely, an ultra-lightweight front-end doesn't compensate for a sluggish server

SEO Expert opinion

Is this distinction new, or is Google just stating the obvious?

Let's be honest: for seasoned SEO professionals, this clarification isn't a revelation. We've known for years that TTFB and LCP don't play in the same league. But Mueller is addressing a recurring confusion here, common among beginners and even some developers who think a CDN plus caching solves everything.

What this statement confirms is that Google measures two distinct pipelines. One for exploration efficiency (how many URLs can I crawl per second), another for experience quality (is the user suffering or not). Confusing them means optimizing in the wrong direction.

What nuances should we add to this statement?

Mueller simplifies — intentionally. He says crawl speed "only measures fetch time," but Googlebot doesn't crawl in a vacuum. It accounts for HTTP responses, redirects, 5xx errors, timeouts. A server throwing random 503 errors has catastrophic "speed" even if HTML arrives fast when it works.

Another point: Mueller speaks of "complete rendering" for CWV, but be careful: CWV are measured from real users (CrUX field data), not by Googlebot. The bot can crawl without executing JS (standard fetch) or with rendering (WRS), but neither affects CWV; those numbers come from Chrome users, not from the crawl. [To verify] whether Google also uses Lighthouse synthetic data in certain cases (new sites without Chrome traffic).

In what cases doesn't this rule fully apply?

For sites with tight crawl budgets (large e-commerce, aggregators, news sites), crawl speed becomes critical even with green CWV. If Google takes 2 seconds to fetch each URL, a fixed crawl budget only covers roughly 43,000 pages per day (86,400 seconds in a day divided by 2 seconds per fetch). Cutting fetch time to 500 ms multiplies the explored surface by four.
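As a rough illustration (a simplified model that assumes sequential fetches over a 24-hour window, not Google's actual scheduler), the arithmetic looks like this:

```python
# Illustrative crawl-capacity math: pages per day if URLs were fetched
# back to back for 24 hours. Numbers are examples, not Google's real model.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

for fetch_seconds in (2.0, 1.0, 0.5):
    pages_per_day = SECONDS_PER_DAY / fetch_seconds
    print(f"{fetch_seconds:>4} s per fetch -> ~{pages_per_day:,.0f} pages/day")
# 2.0 s -> ~43,200 | 1.0 s -> ~86,400 | 0.5 s -> ~172,800
```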

Conversely, for a 20-page brochure site, crawl speed is a non-issue: Google will crawl everything anyway. However, poor CWV can hurt CTR and inflate bounce rate, and thus indirectly impact SEO through behavioral signals.

Warning: Never sacrifice user experience (CWV) to optimize crawl speed. Google always prioritizes UX in its ranking algorithm. A site that is ultra-fast server-side but unusable on the front end won't rank.

Practical impact and recommendations

What should you prioritize first: crawl speed or Core Web Vitals?

It depends on your context. If you have millions of pages and Search Console shows uncrawled URLs or saturated crawl budget, server speed is the priority. Reduce TTFB, enable Brotli compression, optimize database queries, scale infrastructure.
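If you're unsure whether compression is actually being served, a quick header check tells you. A minimal sketch using the standard library and a placeholder URL; note that some servers only compress specific content types:

```python
import urllib.request

def compression_used(url: str) -> str:
    """Ask for Brotli/gzip and report which encoding the server actually returns."""
    req = urllib.request.Request(
        url,
        headers={
            "Accept-Encoding": "br, gzip",
            "User-Agent": "Mozilla/5.0 (compatible; compression-check)",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        # Headers only -- no need to decode the (possibly compressed) body.
        return resp.headers.get("Content-Encoding", "none")

print(compression_used("https://example.com/"))  # e.g. 'br', 'gzip' or 'none'
```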

If you have a typical site (hundreds to tens of thousands of pages) and your CWV are orange or red, focus on the front-end. LCP, INP, CLS — that's what impacts your ranking and conversion rate. A TTFB of 300ms vs 150ms makes no difference if your LCP is 4 seconds.

How do you precisely measure these two metrics?

For crawl speed, check Search Console > Settings > Crawl stats. Google gives you the average download time in milliseconds. Compare it with your actual TTFB (curl, WebPageTest, New Relic). If the gap is significant, Googlebot is seeing something different from your users, most likely a server geolocation issue or a user-agent handling problem.
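One quick, if imperfect, check is to compare fetch times under a browser User-Agent and a Googlebot-like one. This won't reproduce Googlebot's network location or verified reverse DNS, so treat it only as a hint about user-agent-specific handling (bot-specific caching or throttling). A sketch with a placeholder URL:

```python
import time
import urllib.request

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot-like": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def timed_fetch(url: str, user_agent: str) -> float:
    """Time one raw HTML download with a given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    start = time.perf_counter()
    with urllib.request.urlopen(req, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

url = "https://example.com/"  # placeholder URL
for label, ua in USER_AGENTS.items():
    times = sorted(timed_fetch(url, ua) for _ in range(3))
    print(f"{label:15s} median: {times[1] * 1000:.0f} ms")
```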

For Core Web Vitals, rely on real-world CrUX data (PageSpeed Insights, Search Console > Experience > Core Web Vitals). Lighthouse is useful for diagnostics, but synthetic scores don't always reflect reality: a site can score 95/100 in the lab and still fall apart in the field because of real traffic conditions, A/B tests, and third-party ads.
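If you want to pull the field data programmatically rather than through the PageSpeed Insights UI, the CrUX API exposes the same dataset. The endpoint, request body, and field names below reflect my reading of the public CrUX API documentation; verify them before relying on the output, and the API key is a placeholder.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder -- create one in the Google Cloud console
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

def crux_p75(origin: str) -> dict:
    """Fetch 75th-percentile field metrics (real Chrome users) for an origin."""
    body = json.dumps({"origin": origin, "formFactor": "PHONE"}).encode()
    req = urllib.request.Request(
        ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        metrics = json.load(resp)["record"]["metrics"]
    # Keep only metrics that expose a p75 percentile (LCP, INP, CLS, TTFB, ...)
    return {
        name: data["percentiles"]["p75"]
        for name, data in metrics.items()
        if "percentiles" in data
    }

print(crux_p75("https://example.com"))  # p75 values, if CrUX has data for the origin
```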

What mistakes should you absolutely avoid?

  • Blocking CSS/JS in robots.txt to save crawl budget: Google needs these resources to render your pages properly for indexing (a quick spot-check is sketched after this list)
  • Optimizing only the initial HTML while forgetting the critical resources (fonts, hero images, navigation JS) that tank LCP
  • Confusing TTFB and FCP: a fast server doesn't guarantee a fast First Contentful Paint if the browser is waiting on 50 external requests
  • Ignoring crawl budget on large sites: even with perfect CWV, if Google doesn't crawl your new pages, they'll never rank
  • Sacrificing UX to shave 50 ms off TTFB: for 99% of sites, the ranking impact of CWV outweighs crawl speed
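For the first point, Python's built-in robot parser makes it easy to spot-check that rendering-critical assets aren't disallowed for Googlebot. The paths below are hypothetical examples; substitute your own CSS, JS, and image URLs.

```python
import urllib.robotparser

# Quick check that rendering-critical resources aren't blocked for Googlebot.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder domain
rp.read()

for path in ("/static/app.js", "/static/styles.css", "/images/hero.jpg"):
    allowed = rp.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'BLOCKED for Googlebot'}")
```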
Summary: Crawl speed and Core Web Vitals measure two distinct things. The first impacts exploration efficiency, the second ranking and user experience. Both deserve attention, but in proportions that depend on your site type: for large catalogs with limited crawl budget, prioritize server infrastructure first; for typical sites, focus on CWV. These optimizations often require specialized technical expertise and tricky tradeoffs between SEO, development, and infrastructure; if you lack internal resources, working with a specialized SEO agency can significantly accelerate the work and help avoid costly mistakes.

❓ Frequently Asked Questions

Does a good TTFB guarantee good Core Web Vitals?
No. TTFB only measures how quickly the server returns the initial HTML. CWV include all client-side rendering: JavaScript, images, fonts, layout shifts. A 100ms TTFB doesn't prevent a 4s LCP if your JS bundle weighs 2 MB.
Does crawl speed directly impact my ranking?
Not directly. It influences crawl budget, and therefore Google's ability to discover and index your new pages quickly. Indirectly, a page that isn't crawled can't rank. But it isn't a ranking factor as such, unlike CWV.
Does Googlebot execute JavaScript to measure crawl speed?
No. Crawl speed measures the time to fetch the raw HTML, without rendering. The WRS (Web Rendering Service) can execute JS to index content, but that's a separate process that doesn't affect the crawl speed metric.
Should I optimize differently for Googlebot and for users?
Ideally, no. Optimize your server infrastructure (TTFB, caching, compression) for crawling AND your front-end (JS, images, rendering) for CWV. Both benefit SEO. Cloaking for Googlebot is risky and generally counterproductive.
How do I know if my crawl budget is saturated?
In Search Console, go to Settings > Crawl stats. If you see a plateau in pages crawled per day while you're regularly publishing new content, that's a sign. Also compare discovered vs. crawled URLs in the coverage report.
🏷 Related Topics
Crawl & Indexing · JavaScript & Technical SEO · Domain Name · Web Performance

