Official statement
Chrome is now strictly blocking all non-HTTPS content (images, scripts, CSS) loaded on secure pages. For SEO, this means visually broken pages, failing scripts, and potentially a negative signal for Google, which values user experience. Check your Search Console: mixed content errors are reported there, and the resulting breakage can hurt your Core Web Vitals.
What you need to understand
What exactly is mixed content?
Mixed content occurs when a page served over HTTPS loads resources (images, JS scripts, CSS stylesheets, iframes) via unprotected HTTP. For years, browsers tolerated this situation with just a discreet warning in the address bar — a crossed-out or grayed-out padlock.
Chrome has radically changed its stance. The browser now blocks these insecure resources by default: scripts and iframes are blocked outright, while images, audio and video are first auto-upgraded to HTTPS and blocked only if the upgrade fails. In practical terms: an image available over HTTP only no longer appears on your HTTPS page. An external non-HTTPS script won't execute. Your Bootstrap carousel that calls jQuery via an HTTP CDN? Dead.
Why is Chrome tightening its stance now?
The answer is one word: security. Mixed content opens a gaping hole. An attacker can intercept these unencrypted HTTP requests and inject malicious code — invasive ad tracking, redirections, or even phishing.
Google has been pushing towards a 100% HTTPS web for years. The ranking boost given to HTTPS dates back to 2014. The "Not Secure" label on HTTP sites in Chrome dates back to 2018. The strict blocking of mixed content is just the logical next step. And it pushes developers who were dragging their feet.
What are the concrete consequences for your site?
If your HTTPS site still loads HTTP resources, Chrome users will see a degraded page: missing images, broken layout, non-functioning JavaScript features. This destroys user experience. Bounce rates soar, session times plummet — and Google tracks these behavioral signals.
Worse: if your tracking scripts (Google Analytics, ad pixels) are blocked, you lose data and your conversions become unverifiable. And if your main CSS doesn't load, Chrome renders unstyled, barely readable raw HTML. Cumulative Layout Shift can skyrocket if elements disappear or shift after the first render.
- Automatic blocking of all HTTP resources on HTTPS pages in Chrome
- Catastrophic user experience: missing images, dead scripts, broken CSS
- Indirect SEO impact via behavioral signals (bounce rate, session time) and Core Web Vitals (especially CLS)
- Loss of analytics data if tracking tags are blocked
- Need for a comprehensive audit of all loaded resources (CDN, third-party widgets, ads)
SEO Expert opinion
Is this statement consistent with what we're observing on the ground?
Absolutely. Feedback from developers and SEO experts confirms: Chrome has indeed been blocking mixed content for several versions. This is not a future threat — it is already active. The browser console displays explicit red errors like "Mixed Content: The page was loaded over HTTPS, but requested an insecure resource. This request has been blocked."
Where it gets tricky is that many legacy sites still carry these HTTP dependencies. I've seen e-commerce sites lose 15-20% of conversions because their third-party payment script loaded a Google Fonts font over HTTP. The visitor saw a broken banking form and left. The diagnosis took three weeks because no one systematically tested under Chrome.
What is the real SEO impact beyond security?
Google has never explicitly said "mixed content is a negative ranking factor." But let's be honest: the indirect impact is massive. A broken page translates to a soaring bounce rate and a plunging session time. Google interprets these signals as "low-quality content". The result: a drop in rankings, even if technically mixed content isn't a direct ranking criterion.
Then there are the Core Web Vitals. If your CSS loads over HTTP and gets blocked, the browser must recalculate the entire layout. Catastrophic CLS. If your images disappear afterward, same punishment. And LCP can degrade if Chrome spends time trying to load resources it will eventually block. [To be checked]: we still lack quantified data on how many exact milliseconds mixed content blocking costs on LCP, but field observations show a measurable impact.
In what cases does this problem go unnoticed?
If your site is 100% under your control — no external CDN, no third-party widgets, no programmatic ads — you’ve probably migrated cleanly to HTTPS and never faced this issue. Modern CMSs (recent WordPress, Shopify, etc.) handle this natively by rewriting internal URLs to relative HTTPS.
The trap lies in third-party resources: an old WordPress plugin that hardcodes an HTTP URL, a copied-and-pasted YouTube embed from an old tutorial, an ad pixel set up three years ago and never revisited. And poorly configured HTTP → HTTPS redirections: if your .htaccess redirects the homepage but not the subfolders, you can end up with mixed pages without knowing it.
Practical impact and recommendations
How to detect mixed content on your site?
First step: open Chrome DevTools (F12), go to the Console tab, and navigate your site. Any blocked mixed content will appear in red with an explicit message. Do this across multiple page templates — don't limit yourself to the homepage. Product pages, blog articles, contact pages might load different resources.
Next, use Google Search Console, under "Security and Manual Actions". Google reports mixed content errors detected during crawl here. And run a Lighthouse audit (integrated in Chrome DevTools, Lighthouse tab): it explicitly flags non-HTTPS resources. For a comprehensive audit, Screaming Frog can crawl your site and list all resource URLs — then filter for "http://" to spot the culprits.
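To complement the DevTools and crawl checks above, a quick script can flag hard-coded HTTP sub-resources in a page's HTML. Below is a minimal sketch using only Python's standard library; the tag/attribute coverage and the sample markup are illustrative, and it will not catch URLs injected by JavaScript at runtime (for those, the DevTools console remains the reference):

```python
# Minimal sketch: scan raw HTML for sub-resources still referenced over
# plain HTTP. Standard library only; extend RESOURCE_ATTRS as needed.
from html.parser import HTMLParser

# Element -> attribute pairs that trigger a sub-resource load
RESOURCE_ATTRS = {
    "img": "src", "script": "src", "iframe": "src",
    "link": "href", "audio": "src", "video": "src", "source": "src",
}

class MixedContentScanner(HTMLParser):
    """Collects every (tag, URL) pair whose URL starts with http://."""

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        wanted = RESOURCE_ATTRS.get(tag)
        for name, value in attrs:
            if name == wanted and value and value.startswith("http://"):
                self.insecure.append((tag, value))

def find_mixed_content(html: str):
    """Return the list of insecure sub-resources found in the HTML."""
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure
```

Feed it the HTML of each page template (fetched with any HTTP client) and every non-empty result is a candidate for the fixes below.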
What corrective actions should you implement immediately?
Replace all hardcoded HTTP URLs with their HTTPS equivalents. Search in your database (in WordPress, an SQL query like SELECT * FROM wp_posts WHERE post_content LIKE '%http://%') and in your templates. For external resources (CDN, Google Fonts, jQuery), check that they support HTTPS — 99% do now.
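For WordPress specifically, beware that a raw SQL REPLACE can corrupt PHP-serialized values stored in the database; WP-CLI's search-replace command handles them safely. A hedged sketch, with a placeholder domain — always run the dry-run preview and a database backup first:

```shell
# Preview the rewrite first (placeholder domain); WP-CLI rewrites
# PHP-serialized values that a raw SQL REPLACE would corrupt.
wp search-replace 'http://example.com' 'https://example.com' --dry-run

# Apply for real once the preview looks right, leaving the GUID
# column untouched as WordPress recommends
wp search-replace 'http://example.com' 'https://example.com' --skip-columns=guid
```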
If you are using a CMS, you can use protocol-relative URLs for resources: //cdn.example.com/script.js instead of http://cdn.example.com/script.js. The browser will automatically use the same protocol as the page. That said, since virtually every CDN now serves HTTPS, pointing directly at https:// URLs is simpler and generally preferred. Either way, configure your .htaccess or Nginx to force HTTPS site-wide — not just on certain sections.
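As an illustration, a site-wide HTTPS redirect in Nginx might look like the sketch below (server names and paths are placeholders; Apache users would do the equivalent with mod_rewrite in .htaccess). The add_header line adds the complementary Content-Security-Policy: upgrade-insecure-requests directive, which asks browsers to fetch any remaining http:// sub-resources over HTTPS instead of blocking them:

```nginx
# Placeholder server names; adapt to your domains and certificates.
server {
    listen 80;
    server_name example.com www.example.com;
    # Permanent redirect of every HTTP request to its HTTPS equivalent
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com www.example.com;
    # Ask browsers to upgrade any remaining http:// sub-resources
    add_header Content-Security-Policy "upgrade-insecure-requests";
    # ... ssl_certificate / ssl_certificate_key and the rest of the config
}
```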
Should you worry about CDNs and third-party services?
Yes, that's where it gets complicated. Third-party widgets (live chat, customer reviews, display ads) may load resources that you don’t control. Contact providers to ensure they serve everything over HTTPS. If a service refuses to migrate, switch providers — this is non-negotiable in 2025.
For programmatic ads, it’s trickier: some networks still serve HTTP content. Work with your ad ops team to filter out non-HTTPS creatives. And test regularly: a new ad partner can reintroduce mixed content without your knowledge.

These technical optimizations — auditing third-party resources, rewriting URLs in the database, advanced server configuration — can quickly become complex to orchestrate on your own, especially on sites with thousands of pages. Calling in a specialized SEO agency can speed up diagnosis and ensure a clean migration without breaking user experience or losing rankings.
- Audit all pages with Chrome DevTools (Console) to detect mixed content blocks
- Check Google Search Console security section for errors reported by Googlebot
- Run a Screaming Frog crawl with a filter on HTTP URLs to identify non-HTTPS resources
- Replace all hardcoded HTTP URLs with HTTPS (database, templates, widgets)
- Enable the relative protocol (//) for compatible external resources
- Force HTTPS site-wide via .htaccess or Nginx configuration (301 redirect)
- Contact third-party widget providers to check their HTTPS compatibility
- Test after migration on multiple browsers and page types (homepage, product, blog)
❓ Frequently Asked Questions
Does mixed content directly affect my Google ranking?
How can I quickly check whether my site has mixed content?
Do other browsers also block mixed content?
What should you do if a third-party service refuses to switch to HTTPS?
Are protocol-relative URLs (//) still recommended for external resources?
Source: Google Search Central video · duration 8 min · published on 30/01/2020