Official statement
Other statements from this video
- 0:34 Are HSTS 307 redirects really invisible to SEO?
- 0:34 Does Googlebot really ignore your forced HTTPS redirects?
- 1:05 Does Googlebot follow HTTP-to-HTTPS redirects the way a regular browser does?
- 1:05 Can HSTS 307 redirects harm your site's SEO?
Googlebot does not follow the 307 redirects generated by HSTS because they are not true server-side redirects but internal browser behavior (Chrome synthesizes them client-side). For SEO, this means relying solely on HSTS to redirect HTTP to HTTPS is insufficient: Googlebot will never see the HTTPS version without a traditional 301/302 server-side redirect. Configure your HTTPS redirects at the server level, not just through HSTS.
What you need to understand
What is HSTS and why does it generate 307 redirects?
HSTS (HTTP Strict Transport Security) is a security mechanism that forces browsers to always use HTTPS for a given domain. When a user attempts to access the HTTP version of a site configured with HSTS, Chrome automatically generates a 307 internal redirect to switch to HTTPS.
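As a minimal sketch (assuming Nginx and the placeholder domain example.com), the server's only role in HSTS is to announce the policy via a response header on the HTTPS site; the 307 itself is generated later, inside the browser, and never appears in any server configuration or log:

```nginx
server {
    listen 443 ssl;
    server_name example.com;  # placeholder domain

    # The server merely *announces* the HSTS policy; max-age is in
    # seconds (here about six months). Chrome remembers this and
    # synthesizes an internal 307 for any future HTTP attempt.
    add_header Strict-Transport-Security "max-age=15552000" always;
}
```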
This redirect is never sent by the server: it is created locally by the browser even before the HTTP request is made. The server never sees this HTTP attempt, and no trace of the 307 redirect appears in the server logs. It is a purely client-side behavior.
Why doesn't Googlebot follow these 307 redirects?
Googlebot is not a typical web browser. It does not implement the same client-side security mechanisms as Chrome, including HSTS handling. When Googlebot crawls an HTTP URL on a domain with HSTS enabled, it never sees the 307 redirect generated by Chrome.
In practical terms: if your only method of redirecting HTTP to HTTPS is HSTS, Googlebot will continue to crawl the HTTP version without ever being redirected to HTTPS. From Googlebot's perspective, the HTTP version is the only one that exists. This is a major issue for indexing and duplicate content.
Is HSTS compatible with a clean HTTPS SEO strategy?
HSTS is an excellent security complement, but it never replaces a traditional server redirect. For Google to properly index your site in HTTPS, you need to set up a permanent 301 redirect at the server level (Apache, Nginx, CDN) to force HTTP to HTTPS.
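A minimal sketch of such a server-level redirect, assuming Nginx and the placeholder domain example.com: the HTTP server block answers every request with a real 301 that Googlebot can see and follow, independently of any HSTS header.

```nginx
server {
    listen 80;
    server_name example.com www.example.com;  # placeholder domains

    # Permanent redirect sent by the server itself, visible to every
    # crawler and recorded in the access logs (unlike an HSTS 307).
    return 301 https://example.com$request_uri;
}
```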
HSTS then acts as an additional layer for browsers: once the server redirect is done and the HSTS header is received, Chrome stores the information and will never attempt HTTP again. But this security only concerns human users, not bots.
- Googlebot does not follow HSTS 307 redirects because they are browser redirects, not server redirects
- HSTS is a client-side security mechanism, invisible to crawlers
- A 301 server redirect from HTTP → HTTPS remains mandatory for SEO
- HSTS complements an HTTPS strategy, but never replaces it
- Server logs never show traces of HSTS 307 redirects
SEO Expert opinion
Is this statement consistent with real-world observations?
Absolutely. In practice, we regularly see sites enabling HSTS thinking that it is enough to migrate to HTTPS, only to end up with massive duplicate content between HTTP and HTTPS. Google indexes both versions, sometimes favoring the legacy HTTP version from the site's history.
What is less known is that some third-party crawlers (SEMrush, Ahrefs) also only follow server redirects. Relying on HSTS to manage HTTP → HTTPS redirection exposes you to inconsistencies in SEO audits and ranking tracking tools. [To be confirmed]: some modern crawlers might start interpreting HSTS, but this is not officially documented.
What nuances should be applied to this rule?
The main nuance concerns sites that have already migrated to HTTPS with server redirects in place. If your Apache/Nginx configuration properly redirects HTTP to HTTPS with a permanent 301, HSTS has no negative SEO impact. On the contrary, it provides welcome security for users.
The problem arises when developers add HSTS via a Strict-Transport-Security header, thinking it exempts them from configuring the server redirect. Or worse, when a site is listed on Chrome's HSTS preload list without having implemented real server redirects. In this case, Chrome users see HTTPS, but Googlebot crawls HTTP.
In what cases could this rule evolve?
Google could theoretically evolve Googlebot to interpret HSTS like modern browsers do. However, this is unlikely in the short term, as Googlebot is designed to crawl the web as it is served by servers, not as it is transformed by browsers.
A more realistic evolution would be for Google to automatically detect sites with active HSTS but without server redirects, and apply a ranking penalty or partial deindexing of the HTTP version. But for now, Google merely crawls what it is served. The responsibility remains with the webmaster.
Practical impact and recommendations
What concrete steps should be taken to avoid SEO issues?
The first action: audit your redirect configuration. Manually test the HTTP URL of your homepage with curl or a tool like Redirect Checker. Make sure the server correctly returns a 301 (permanent) or 302 (temporary, but less recommended) pointing to the HTTPS version.
If you see a 200 OK on the HTTP version, you have no server redirect, and HSTS will not save your indexing. Configure a redirect immediately in Apache (.htaccess), in your Nginx configuration, or via your CDN. Never rely on HSTS alone.
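For the Apache case, a hedged .htaccess sketch (mod_rewrite enabled is assumed, and example.com is a placeholder) that forces the permanent 301 from HTTP to HTTPS at the server level, before HSTS ever comes into play:

```apache
RewriteEngine On
# Only rewrite requests that arrived over plain HTTP
RewriteCond %{HTTPS} off
# R=301 makes the redirect permanent, which is what Google expects
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```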
How to check that Googlebot accesses the HTTPS version properly?
Use Google Search Console to inspect a URL and see which version Googlebot crawls. If the inspected URL is HTTP while you believe HSTS is enabled, that’s proof that Googlebot is ignoring your HSTS. Also check the server logs to see which version (HTTP or HTTPS) Googlebot is actually requesting.
An additional simple test: manually submit an HTTP URL via the URL inspection tool in Search Console and request indexing. If Google indexes the HTTP version instead of following to HTTPS, your server redirect is absent or misconfigured. [To be verified]: some proxies or firewalls can interfere with redirects; test from multiple locations.
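The curl-style check described above can also be scripted. Here is a minimal standard-library sketch (function and class names are illustrative, not from any official tool) that requests a URL without following redirects and reports the raw status code and Location header, much as a crawler would see them:

```python
from urllib.request import Request, HTTPRedirectHandler, build_opener

class NoRedirect(HTTPRedirectHandler):
    """Stop urllib from silently following redirects, so the raw
    3xx response the server returns can be inspected."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # do not follow; surface the 3xx as an HTTPError

def check_http_redirect(url):
    """Return (status, location) for a single HEAD request to `url`."""
    opener = build_opener(NoRedirect)
    try:
        resp = opener.open(Request(url, method="HEAD"))
        return resp.status, resp.headers.get("Location")
    except Exception as exc:
        # urllib raises HTTPError for unhandled 3xx/4xx responses
        if hasattr(exc, "code") and hasattr(exc, "headers"):
            return exc.code, exc.headers.get("Location")
        raise
```

Running `check_http_redirect("http://example.com/")` against your own domain should return a 301 status with an HTTPS Location if the server redirect is in place; a 200 status means the HTTP version is being served directly and only HSTS-aware browsers ever reach HTTPS.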
What mistakes to avoid when configuring HTTPS + HSTS?
Classic error: activating HSTS with the includeSubDomains flag without configuring HTTPS on all subdomains. Googlebot will crawl those subdomains over HTTP, and Chrome users will see certificate errors. Result: chaotic indexing and loss of traffic.
Another trap: adding your domain to the HSTS preload list before migrating the entire site to HTTPS with server redirects in place. Once on the preload list, it is impossible to easily revert. Chrome will enforce HTTPS even if your server does not yet support it, creating user-side errors without affecting Googlebot (which will continue to crawl HTTP).
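For reference, this is roughly the header form preload submission expects (hstspreload.org requires a max-age of at least one year plus the includeSubDomains and preload directives); only send it once every subdomain serves HTTPS and the 301 server redirect is confirmed working:

```nginx
# Illustrative preload-eligible header. Committing to it is hard to
# undo: once preloaded, Chrome enforces HTTPS on the whole domain.
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
```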
- Configure a 301 server redirect from HTTP → HTTPS (Apache, Nginx, CDN)
- Test the redirect with curl or a dedicated tool before enabling HSTS
- Check in Google Search Console that Googlebot is indeed crawling the HTTPS version
- Never rely on HSTS alone to manage the HTTPS migration
- Activate HSTS only after confirming that server redirects are functioning
- Avoid includeSubDomains if not all subdomains are in HTTPS
❓ Frequently Asked Questions
Can HSTS replace a 301 server redirect for SEO?
Does Googlebot follow 307 redirects outside the HSTS context?
How can I tell whether my site uses HSTS?
Should HSTS be disabled to avoid issues with Googlebot?
Is a site on the HSTS preload list crawled better by Google?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 1 min · published on 28/10/2020