Official statement
Other statements from this video
- 0:03 Does Googlebot really ignore 307 HSTS redirects, or is there a catch?
- 0:34 Are 307 HSTS redirects really invisible to SEO?
- 0:34 Does Googlebot really ignore your forced HTTPS redirects?
- 1:05 Does Googlebot really follow HTTP-to-HTTPS redirects the way a regular browser does?
Googlebot does not interpret 307 HSTS redirects the same way a regular browser would. In practical terms, these automatic redirects to HTTPS mandated by the browser for security reasons do not impact crawling or indexing. For SEO, this means you should not worry excessively about these temporary redirects, but rather ensure that 301 permanent redirects to HTTPS are properly configured on the server side.
What you need to understand
What are 307 HSTS redirects really about?
307 Internal Redirects related to HSTS (HTTP Strict Transport Security) are a security mechanism managed by the browser, not by the server. When a site activates HSTS, it tells browsers that all future requests must be made using HTTPS only for a specified duration.
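The HSTS policy itself is nothing more than a response header the server sends once over HTTPS. As a minimal sketch (the header value below is a made-up example), here is how its directives break down:

```python
# Hypothetical Strict-Transport-Security header value, for illustration only.
HEADER = "max-age=31536000; includeSubDomains; preload"

def parse_hsts(value):
    """Split an HSTS header value into a dict of its directives."""
    directives = {}
    for part in value.split(";"):
        part = part.strip()
        if not part:
            continue
        if "=" in part:
            name, _, val = part.partition("=")
            directives[name.strip().lower()] = val.strip()
        else:
            directives[part.lower()] = True
    return directives

policy = parse_hsts(HEADER)
print(policy["max-age"])  # seconds during which the browser forces HTTPS
```

Once a browser has seen this header, it enforces HTTPS for `max-age` seconds without ever contacting the server over HTTP again.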
The browser then performs an internal 307 redirect before even querying the server. This process is invisible on the server side: there are no traces in the logs, and no actual HTTP request is sent. The browser never attempts a plain-HTTP connection; it rewrites the request to HTTPS internally.
Why doesn’t Googlebot see these redirects?
Googlebot operates differently from a standard browser. It does not persist an HSTS cache between crawl sessions, unlike Chrome or Firefox, which remember the directive for months.
When Googlebot crawls a site, it interprets the HSTS directives returned by the server, but does not apply automatic 307 redirects on subsequent visits. It continues to follow the traditional HTTP redirects (301, 302) configured on the server, which are the only ones truly relevant for SEO.
How does this distinction impact SEO?
The distinction between browser redirects and server redirects is crucial. The redirects that Google considers for indexing, PageRank, and ranking signals are only those it physically encounters during its crawl.
If your server does not return any 301 redirects from HTTP to HTTPS, but only the browser performs a 307 HSTS, Google will continue to crawl the HTTP version. This can create issues with duplicate content, dilution of PageRank between HTTP and HTTPS, and mixed signals for ranking algorithms.
- 307 HSTS redirects are managed solely by the browser, invisible to Googlebot
- Googlebot relies on classic server redirects (301, 302) for crawling and indexing
- An HTTPS site must always implement permanent 301 redirects on the server side, even with HSTS active
- Enabling HSTS enhances user security but does not replace a proper SEO configuration for redirects
- Server logs will never show traces of 307 HSTS, which can create confusion during audits
SEO Expert opinion
Does this statement truly reflect on-the-ground behavior?
Yes, this assertion from Mueller aligns with what we observe in crawl audits. When analyzing the server logs of a site with HSTS enabled, we find no traces of 307 redirects. These redirects exist solely within the user's browser lifecycle.
The risk is that some SEOs may think that enabling HSTS is sufficient for a clean migration to HTTPS. This is a mistake. Without 301 redirects configured at the Apache, Nginx, or CDN level, Googlebot will continue to index HTTP URLs, creating a massive duplicate content scenario that is particularly damaging. [To verify]: the extent to which Google automatically detects HSTS as a signal for HTTPS preference remains unclear.
What are the interpretative pitfalls of this statement?
The wording "this poses no problem for SEO" can be misleading. Mueller is only stating that the absence of visibility of 307 HSTS by Googlebot is not a problem in itself. He does not say that HSTS alone is sufficient for a successful HTTPS migration.
Some practitioners may have interpreted this statement as a green light to rely solely on HSTS. This is a major tactical error. Server redirects remain the foundational pillar of any migration, and HSTS should be viewed as an additional layer of security, not as a replacement for good SEO practices.
Are there cases where this rule does not apply?
This rule universally applies to Googlebot. However, other bots (Bingbot, social crawlers, audit tools) may have different behaviors in relation to HSTS. Some modern SEO audit tools emulate a complete browser and will therefore see the 307 HSTS.
This can create discrepancies between what your monitoring tools report and what Google actually indexes. When a Screaming Frog or OnCrawl tool shows 307s but Search Console reports none, that is normal. The key is to check the raw server logs to understand what Googlebot is truly seeing.
Practical impact and recommendations
How can you properly set up HTTPS without solely relying on HSTS?
The fundamentals remain critical: implement permanent 301 redirects on the server for all HTTP URLs to their HTTPS equivalents. This configuration should take place at the web server level (Apache .htaccess, Nginx conf) or the CDN if you are using one.
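A minimal sketch of the server-side rule described above, for Nginx (the domain names are placeholders; adapt them to your site):

```nginx
# Answer every plain-HTTP request with a permanent 301 to its HTTPS equivalent.
server {
    listen 80;
    server_name example.com www.example.com;  # placeholder domains
    return 301 https://$host$request_uri;
}
```

Unlike the browser-side 307, this 301 is a real response Googlebot receives and records during its crawl.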
Only then activate HSTS with a Strict-Transport-Security header carrying an appropriate max-age (start with 6 months, then increase to 2 years). Add the includeSubDomains and preload directives only if you plan to submit the domain to the browsers' preload list.
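As a sketch of that second step, again for Nginx (placeholder domain, and a deliberately conservative max-age):

```nginx
# Send the HSTS policy only over HTTPS, never over plain HTTP.
server {
    listen 443 ssl;
    server_name example.com;  # placeholder domain
    # 15552000 s ≈ 6 months; raise to 63072000 s (2 years) once the
    # HTTPS setup has proven stable.
    add_header Strict-Transport-Security "max-age=15552000; includeSubDomains" always;
}
```

The `always` parameter makes Nginx emit the header on error responses as well, not only on 200s.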
What mistakes should you avoid during an HTTPS migration?
The classic mistake is to enable HSTS, see that the site functions perfectly in HTTPS for visitors, and neglect server redirects. The result: Google continues to index the HTTP versions, creating a massive duplication that dilutes ranking signals.
Another trap: configuring temporary 302 redirects instead of 301s. The 302s do not send a clear signal to Google that the move is permanent, slowing down the consolidation of signals to the HTTPS version. Finally, forgetting to update canonical tags, XML sitemaps, and internal links to point to HTTPS creates inconsistencies that Google may take weeks to resolve.
How can you verify that your configuration is optimal?
Use your server logs to analyze the actual requests from Googlebot. You should see initial HTTP requests followed by 301 codes to HTTPS, and then gradually, a majority of requests directly in HTTPS. This is a sign that Google has understood and adopted your secure version.
Also, check in Search Console that the indexed URLs are indeed in HTTPS. Use the index coverage report and the URL inspection tool to confirm that Google is processing your redirects correctly. If many HTTP URLs persist for weeks after migration, it's a red flag.
- Implement permanent 301 redirects from HTTP to HTTPS at the server level
- Activate HSTS with a gradual max-age (6 months then 2 years)
- Update all canonical tags to point to the HTTPS URLs
- Generate and submit a new XML sitemap with only the HTTPS URLs
- Analyze the server logs to confirm that Googlebot is following the 301 redirects properly
- Monitor Search Console to verify the gradual shift in indexing towards HTTPS
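The sitemap item in the checklist above is easy to verify mechanically. A minimal sketch (the embedded sitemap is fabricated; point the function at your real sitemap.xml instead):

```python
import xml.etree.ElementTree as ET

# Fabricated sitemap for illustration only.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>http://example.com/old-page</loc></url>
</urlset>"""

def non_https_urls(sitemap_xml):
    """Return every sitemap <loc> URL that does not use the https scheme."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.findall(".//sm:loc", ns)
            if not loc.text.startswith("https://")]

print(non_https_urls(SITEMAP))
```

Any URL this prints should be corrected before the sitemap is submitted to Search Console.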
❓ Frequently Asked Questions
Does HSTS replace 301 redirects for SEO?
Why do my SEO tools show 307s while Search Console shows none?
Do 307 HSTS redirects have any impact on crawl budget?
Should HSTS be disabled to make crawling easier for Google?
How can I tell whether Google is indexing my HTTPS URLs rather than the HTTP ones?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 1 min · published on 28/10/2020