
Official statement

The switch to HTTP/2 crawling by Googlebot has no impact on user-visible speed metrics (Core Web Vitals). These performance reports remain unchanged because they measure real users' experience, not what Googlebot sees.
🎥 Source video

Extracted from a Google Search Central video (statement at 6:27)

⏱ 1h01 · 💬 EN · 📅 15/01/2021 · ✂ 27 statements
Watch on YouTube (6:27) →
Other statements from this video (26)
  1. 2:11 How does a link's position in the site hierarchy really influence crawl frequency?
  2. 2:11 Do links from the homepage really increase crawl frequency?
  3. 2:43 Why does Google ignore your title tags and meta descriptions?
  4. 3:13 Why does Google rewrite your titles and meta descriptions despite your optimizations?
  5. 4:47 Should you really care about Google's HTTP/2 crawling?
  6. 4:47 Should you really worry about Googlebot's switch to HTTP/2 crawling?
  7. 5:21 Does HTTP/2 really boost crawl budget, or does it just overload your servers?
  8. 6:21 Does HTTP/2 really improve your site's Core Web Vitals?
  9. 8:32 Does the URL removal tool really stop Google from crawling your pages?
  10. 9:02 Why doesn't Google's URL removal tool actually remove your pages from the index?
  11. 13:13 Should you really add nofollow to every link on a noindex page?
  12. 13:38 Do noindex pages really block the transfer of link value?
  13. 16:37 Canonical or 301 redirect: how to cleanly handle content migration across multiple sites?
  14. 26:00 Why is x-default mandatory on a homepage with language redirection?
  15. 28:34 Should you fear an SEO penalty for appearing in Google News?
  16. 31:57 Should you really delete your old content, or improve it for SEO?
  17. 32:08 Should you really delete your old low-quality content to improve your SEO?
  18. 33:22 Does the URL removal tool really remove your pages from Google's index?
  19. 35:37 Do hyphens really break exact-match keyword matching?
  20. 35:37 Do hyphens in URLs and content really hurt SEO?
  21. 38:48 Does Google's Natural Language API really reflect how Search works?
  22. 41:49 Why does Google refuse to index images without a parent HTML page?
  23. 42:56 Should you really submit HTML pages in an image sitemap rather than JPG files?
  24. 45:08 Does technical duplicate content really hurt your site's rankings?
  25. 45:41 Does technical duplicate content really penalize your site?
  26. 53:02 Should you detail every URL in a reconsideration request after a manual penalty?
📅 Official statement from 15/01/2021
TL;DR

Google claims that Googlebot's HTTP/2 crawling does not affect Core Web Vitals or any user speed metric. Performance reports measure the real experience of visitors, not that of the bot. For SEO professionals, this means one thing: enabling HTTP/2 on your server for Googlebot's sake won't change your CWV scores; you need to work on the user experience itself.

What you need to understand

Google has officially confirmed that its bot has been crawling over HTTP/2 for several years. Many professionals have wondered whether this technical change affects Core Web Vitals, the performance metrics that have factored into rankings since the Page Experience update.

John Mueller's response is unequivocal: no impact. The speed metrics visible in Search Console or via CrUX measure what your real visitors experience, not what Googlebot sees.

Why does this confusion exist?

The HTTP/2 protocol does genuinely improve loading speed at the transport level: multiplexing, header compression (HPACK), and stream prioritization. When a site switches from HTTP/1.1 to HTTP/2, the gain can be measurable for users, especially on pages with many resources.

Some have therefore jumped to a conclusion: if Googlebot crawls faster over HTTP/2, maybe Google will take that into account in its performance reports. It won't. Core Web Vitals come from the Chrome User Experience Report (CrUX), which aggregates real browsing data from Chrome users. Googlebot never contributes to this dataset.

What does this mean for crawl budget?

Crawling over HTTP/2 can theoretically improve Googlebot's crawl efficiency, especially on high-volume sites: the bot retrieves pages faster, consumes less bandwidth, and can handle more simultaneous requests.

But, and this is a crucial "but", Google has always said that crawl budget is only an issue for very large sites. For the vast majority of projects, even those with thousands of pages, it is not a limiting factor. So don't count on server-side HTTP/2 to solve an indexing problem you probably don't have.

What exactly do Core Web Vitals measure?

Core Web Vitals consist of three metrics: LCP (Largest Contentful Paint), FID (First Input Delay, since replaced by INP), and CLS (Cumulative Layout Shift). They measure main-content loading time, responsiveness to interactions, and visual stability.
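
To make the field-measurement point concrete, here is a minimal TypeScript sketch of how a browser can observe LCP and CLS with the standard PerformanceObserver API. It is a simplified illustration, not production code: official CLS scoring groups shifts into session windows (the web-vitals library handles this), while this version keeps a plain running sum.

```typescript
// Minimal sketch: observing LCP and CLS in the browser, the same kind of
// real-user signal that CrUX aggregates. Chrome-specific entry types;
// run in a page context, not in Node.

// lib.dom.d.ts has no LayoutShift type yet, so declare the fields we read.
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;
  hadRecentInput: boolean;
}

// Largest Contentful Paint: the last candidate entry is the current LCP.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const last = entries[entries.length - 1];
  console.log(`LCP candidate: ${last.startTime.toFixed(0)} ms`);
}).observe({ type: "largest-contentful-paint", buffered: true });

// Cumulative Layout Shift: sum shifts not triggered by user input.
// (Simplified: the official metric uses session windows, not a raw sum.)
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as LayoutShiftEntry[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log(`CLS so far: ${cls.toFixed(3)}`);
}).observe({ type: "layout-shift", buffered: true });
```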

These measurements come solely from real users' browsers, via CrUX, which collects data from Chrome across millions of sites. Even though Googlebot does render pages (including JavaScript) during indexing, its fetches never feed into CWV reports.

  • The HTTP/2 crawl of Googlebot has no effect on Core Web Vitals or any user performance reports.
  • The CWVs come exclusively from the Chrome User Experience Report (CrUX), based on real browsing data.
  • Enabling HTTP/2 on the server can help crawl efficiency on very large sites, but it won't change your speed scores.
  • To improve CWVs, you must optimize the real user experience: server response time, resource size, client-side rendering.
  • Googlebot in HTTP/2 can crawl faster, but this does not translate into a speed-related ranking advantage.

SEO Expert opinion

Is this statement consistent with what we observe in the field?

Yes, completely. Sites that migrated to HTTP/2 on the server side without touching the front end have seen no variation in their Core Web Vitals reports. The performance gains of HTTP/2 are real, but they benefit users only when their browser supports the protocol, and all modern browsers do.

Googlebot benefits from the protocol to crawl faster, but this remains invisible in public metrics. If your LCP stagnates at 3.5 seconds, switching your server to HTTP/2 won't bring it down to 2.0 seconds. You need to optimize images, reduce render-blocking JavaScript, improve caching, and speed up TTFB.

What nuances should be added?

There is a nuance rarely mentioned: if your site is not yet on HTTPS, you cannot take advantage of HTTP/2 — and then, yes, you have a problem. Not only is HTTPS a direct ranking factor, but you miss out on all the performance gains of the modern protocol.

Another point: HTTP/2 speeds up the loading of many resources in parallel (CSS, JS, images). If your site makes 50 requests over HTTP/1.1, switching to HTTP/2 may shave a few hundred milliseconds off the total time, which might be enough to bring your LCP below the 2.5-second threshold. But it's the user who benefits, not Googlebot in its reports.

[To be verified]: Google has never released specific figures on crawl budget improvement related to HTTP/2. We know that the bot crawls faster, but it's impossible to quantify how many additional pages are indexed due to this gain. For most sites, the effect remains marginal.

In what cases does this rule not apply?

This rule applies in all cases: Core Web Vitals always measure the user experience, never Googlebot's. Site size, industry, and server configuration make no difference.

However, there are situations where HTTP/2 has an indirect impact on SEO. If your HTTP/2 server is misconfigured (for example, with incorrect resource prioritization), you may degrade the user experience, and thus your CWVs. Likewise, aggressive use of HTTP/2 Server Push could overload the browser and slow down rendering (Chrome has since removed support for Server Push entirely). In such cases, it's the HTTP/2 setup itself that causes the problem, not its lack of effect on Googlebot.

Warning: Never confuse crawling and indexing with user experience. Googlebot can crawl your site in 200 milliseconds and your users may take 5 seconds to see the content. These are two completely distinct pathways. Optimize for the user first — the crawling will follow.

Practical impact and recommendations

What should you actually do to improve Core Web Vitals?

Focus on the real user experience, not on Googlebot. That means: reduce server response time (TTFB), optimize images (WebP, lazy loading), eliminate render-blocking JavaScript, and stabilize the layout (reserve space for ads and images).
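
As one concrete illustration of the lazy-loading advice above, here is a minimal TypeScript sketch using IntersectionObserver. The data-src attribute convention is illustrative, not a standard; in many cases the native loading="lazy" attribute is enough without any script.

```typescript
// Minimal sketch: lazy-load below-the-fold images with IntersectionObserver.
// Assumes markup like <img data-src="..."> (illustrative convention).

const io = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src ?? "";  // swap in the real source
    img.removeAttribute("data-src");
    observer.unobserve(img);          // each image only loads once
  }
}, { rootMargin: "200px" });          // start loading shortly before visibility

document.querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => io.observe(img));
```

One caution: never lazy-load the hero image that is your LCP element, or you will worsen the very metric you are trying to improve.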

Use field-data tools first: PageSpeed Insights and, above all, the Core Web Vitals report in Search Console, which flags problematic URLs based on real CrUX data. Lighthouse is useful for lab diagnostics, but don't rely solely on lab tests; it's the real browsing data that counts for Google.
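
If you want to pull that field data programmatically, Google's public Chrome UX Report API exposes it. Below is a minimal TypeScript sketch, assuming you have a CrUX API key; the key and origin are placeholders, and the exact response shape should be checked against the CrUX API documentation.

```typescript
// Minimal sketch: query the CrUX API for an origin's field LCP data.
const CRUX_API_KEY = "YOUR_KEY_HERE"; // placeholder, issued via Google Cloud

async function queryCrux(origin: string): Promise<void> {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ origin, metrics: ["largest_contentful_paint"] }),
    },
  );
  if (!res.ok) throw new Error(`CrUX API error: ${res.status}`);
  const data = await res.json();
  // The p75 value is what Google uses to classify an origin as good or poor.
  console.log(JSON.stringify(data.record?.metrics, null, 2));
}

queryCrux("https://example.com").catch(console.error);
```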

What mistakes to avoid?

The first mistake: believing that enabling HTTP/2 on your server will automatically improve your CWVs. No. HTTP/2 is a modern prerequisite, but it is not a magic solution. If your LCP is poor, it's likely due to an unoptimized 2MB hero image, not an outdated HTTP protocol.

The second mistake: neglecting server-side caching and a CDN. HTTP/2 speeds up transfer, but if every request hits your database, you remain slow. A good cache (Varnish, Redis, Cloudflare) can cut your TTFB by a factor of 10, far more than any HTTP/2 gain.
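
To verify that a caching or CDN change actually moved the needle, you can measure TTFB directly in the browser with the Navigation Timing API. A minimal sketch:

```typescript
// Minimal sketch: read TTFB for the current page load.
const [nav] = performance.getEntriesByType(
  "navigation",
) as PerformanceNavigationTiming[];

if (nav) {
  // responseStart marks the arrival of the first response byte;
  // startTime is 0 for the navigation entry, so the difference is TTFB.
  const ttfb = nav.responseStart - nav.startTime;
  console.log(`TTFB: ${ttfb.toFixed(0)} ms`);
}
```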

The third mistake: optimizing for Googlebot instead of optimizing for the user. Googlebot has no speed issues. Your visitors do. Focus on them.

How to check that your site is well configured?

First, make sure your server serves HTTPS and that HTTP/2 is enabled. You can test this with a tool like HTTP/2 Test or simply by inspecting the Protocol column in Chrome DevTools (Network tab).
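
You can also audit this from the page itself: the Resource Timing API exposes the negotiated protocol per resource ("h2" for HTTP/2, "h3" for HTTP/3, "http/1.1" otherwise). A minimal sketch to paste into the DevTools console (drop the type annotations if you paste it raw):

```typescript
// Minimal sketch: count resources by the protocol they were fetched over.
const counts = new Map<string, number>();
const resources = performance.getEntriesByType(
  "resource",
) as PerformanceResourceTiming[];

for (const entry of resources) {
  // nextHopProtocol is empty for cross-origin resources that don't send
  // a Timing-Allow-Origin header.
  const proto = entry.nextHopProtocol || "unknown";
  counts.set(proto, (counts.get(proto) ?? 0) + 1);
}
console.table(Object.fromEntries(counts));
```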

Next, analyze your Core Web Vitals in Search Console. If you don't have enough traffic for CrUX data, use PageSpeed Insights and Lighthouse to identify bottlenecks. Focus on the URLs receiving the most organic traffic — that's where SEO impact will be strongest.

Finally, monitor crawl activity in Search Console's Crawl Stats report. If you see thousands of pages left uncrawled for months, that's a signal, but HTTP/2 probably won't be the solution. Look instead at your internal link structure, your sitemap, and your robots.txt directives.

  • Enable HTTPS and HTTP/2 on your server (nginx, Apache, or via a CDN like Cloudflare).
  • Optimize images: WebP format, compression, lazy loading, and appropriate sizes.
  • Reduce render-blocking JavaScript: defer, async, and eliminate unnecessary scripts (see the sketch after this list).
  • Improve TTFB: server caching, CDN, optimize database.
  • Stabilize the layout: reserve space for images, ads, and dynamic elements.
  • Monitor Core Web Vitals in Search Console and PageSpeed Insights.
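
As promised in the list above, here is a minimal TypeScript sketch for keeping a non-critical third-party script out of the critical rendering path. The tracker URL is a placeholder; the pattern applies to any script that doesn't need to run before first paint.

```typescript
// Minimal sketch: load a non-critical script only when the browser is idle,
// so it cannot compete with first render.
function loadWhenIdle(src: string): void {
  const load = () => {
    const s = document.createElement("script");
    s.src = src;
    s.async = true; // dynamically injected scripts don't block parsing anyway
    document.head.appendChild(s);
  };
  if ("requestIdleCallback" in window) {
    requestIdleCallback(load);
  } else {
    setTimeout(load, 2000); // fallback where requestIdleCallback is missing
  }
}

loadWhenIdle("https://analytics.example.com/tracker.js"); // placeholder URL
```
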
These technical optimizations can quickly become complex, especially if your site relies on a heavy technology stack or if you're managing thousands of pages. In this case, personalized support from a specialized SEO agency can help you diagnose blockages and deploy the right optimizations without breaking your site. Sometimes, outsourcing this part can save you months — and avoid costly mistakes.

❓ Frequently Asked Questions

Does Googlebot's switch to HTTP/2 improve my ranking?
No, not directly. Googlebot crawls faster over HTTP/2, but that affects neither your Core Web Vitals nor your user speed score. Ranking depends on your visitors' real experience, not the bot's.
Should I enable HTTP/2 on my server for SEO?
Yes, but not for Googlebot. HTTP/2 improves loading speed for your real users, which can improve your Core Web Vitals and therefore your ranking. It's a modern prerequisite, not an optimization for the bot.
Are Core Web Vitals measured by Googlebot?
No. Core Web Vitals come from the Chrome User Experience Report (CrUX), which aggregates real browsing data from Chrome users. Googlebot never contributes to these metrics.
Can HTTP/2 solve my crawl budget problems?
Perhaps, but only on very large sites. HTTP/2 lets Googlebot crawl faster, but for most sites crawl budget is not a limiting factor. Focus first on your internal link structure and your sitemap.
What is the difference between HTTP/2 and HTTP/3 for SEO?
HTTP/3 (based on QUIC) further improves speed and reliability, especially over unstable mobile connections. Googlebot supports HTTP/3, but as with HTTP/2, this does not affect your Core Web Vitals. The SEO impact remains indirect, through the improvement of the real user experience.
🏷 Related Topics
Domain Age & History · Crawl & Indexing · HTTPS & Security · Web Performance · Search Console

