What does Google say about SEO?

Official statement

Mixed content (HTTP resources within an HTTPS page) does not impact ranking, but Chrome displays a security warning because it may expose session information. Google recommends cleaning up these HTTP URLs, for example via a search/replace in the database.
🎥 Source video

Extracted from a Google Search Central video (EN) · published 13/05/2020 · duration 57:01 · statement at 45:27
Watch on YouTube (45:27) →
TL;DR

Mueller confirms that mixed content (HTTP resources loaded on an HTTPS page) does not directly affect Google rankings. However, Chrome displays a security warning that can hurt conversion rates and user trust. Simple fix: search/replace all leftover HTTP URLs in the database with their HTTPS equivalents.

What you need to understand

What is mixed content and why does Chrome warn against it?

Mixed content refers to loading HTTP resources (images, scripts, CSS, iframes) within a page served over HTTPS. Chrome blocks or warns users because an HTTP resource can be intercepted in the clear, exposing session information, cookies, or sensitive data even if the parent page is encrypted.

In practice? An e-commerce site migrated to HTTPS that leaves a product image URL on http:// instead of https:// triggers this warning. The browser then considers the connection no longer fully secure, which can drive visitors away, especially on mobile.

Mueller says 'no ranking impact': what does that mean in practice?

Google will not demote a page solely because it loads a script or an image over HTTP. The crawler does not directly penalize this technical detail in the ranking algorithm. In other words, the presence of mixed content is not a negative ranking factor the way duplicate content or catastrophic loading times are.

Crucial nuance: this does not mean the problem is without consequence. Chrome drops the padlock or shows a warning in the address bar, undermining trust, and Core Web Vitals can degrade if blocked resources break the layout or a critical script.

Why does Google still recommend cleaning up these URLs?

Because user experience is paramount. A visitor who sees a security warning tends to bounce more often — increased bounce rate, decreased conversions. Google measures these behavioral signals, and even if mixed content itself does not demote, the indirect consequences can affect positioning.

Moreover, Chrome is becoming increasingly strict: recent versions completely block certain types of mixed content (scripts, iframes) instead of just alerting. A site that does not fix it faces broken features for an increasing share of its audience.

  • Mixed content = HTTP resources within an HTTPS page, detected and blocked or flagged by Chrome.
  • No direct ranking penalty, but indirect impact through degraded UX and Core Web Vitals.
  • Recommended action: search/replace all HTTP URLs in the database to avoid warnings and traffic loss.
  • Real security risk: session interception or sensitive data through unencrypted resources.
  • Chrome is progressively tightening its blocking — what is tolerated today may break tomorrow.

SEO Expert opinion

Is this statement consistent with what we observe in the field?

Yes, in that no A/B test has ever shown an immediate drop in positions following the emergence of mixed content. Sites that migrate to HTTPS and forget a few images in HTTP do not tumble overnight. This validates Mueller's position: Google does not directly penalize this technical point in its ranking algorithm.

However, we regularly observe drops in organic click-through rates or post-click conversions on affected pages. The Chrome warning scares users, especially during transactional journeys. Therefore, the business impact does indeed exist, even if it is not purely algorithmic.

What nuances should we add to this assertion by Google?

Mueller only talks about ranking in the strict sense. He does not state that the problem is negligible. A site triggering security warnings will experience trust erosion, higher bounce rates, and potentially an impact on Core Web Vitals if blocked resources break rendering or scripting.

Another nuance: HTTPS has been a slight positive ranking factor for years. If Google detects that a site is

Practical impact and recommendations

What should I do practically to eliminate mixed content?

First step: audit all URLs in the database. A simple search/replace of http://yoursite.com with https://yoursite.com in tables containing editorial content (posts, products, pages) resolves 80% of cases. Use a tool like Better Search Replace (WordPress) or a direct SQL script if you are familiar with it.
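The database pass boils down to rewriting the scheme on your own domain's URLs while leaving third-party ones for manual review. A minimal sketch of that transformation in Python (yoursite.com is a placeholder; note that real WordPress databases also contain PHP-serialized data, which is why dedicated tools like Better Search Replace exist):

```python
import re

def upgrade_scheme(html: str, domain: str = "yoursite.com") -> str:
    """Rewrite http:// URLs that point at our own domain to https://.

    Third-party http:// URLs are left untouched so they can be reviewed
    case by case (their HTTPS support must be verified first).
    """
    pattern = re.compile(r"http://(?=(?:www\.)?" + re.escape(domain) + ")")
    return pattern.sub("https://", html)

before = ('<img src="http://yoursite.com/img/p.jpg">'
          '<script src="http://legacy-cdn.example/x.js"></script>')
print(upgrade_scheme(before))
# Only the own-domain URL is upgraded; the third-party one stays for review.
```

The same logic, applied per content field, is what a SQL `REPLACE()` over posts and products effectively does.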

Next, inspect external resources: image CDNs, third-party scripts (Google Analytics, Facebook Pixel, chatbots). Ensure that each URL starts with https:// or uses the protocol-relative form //cdn.example.com/script.js, which automatically inherits the parent page's protocol.

What mistakes should I avoid when correcting mixed content?

Don't just force HTTPS via a Content-Security-Policy (CSP) upgrade-insecure-requests directive without verifying that all resources actually exist in HTTPS. If a CDN does not support HTTPS for some legacy URLs, the browser will attempt to load them over HTTPS, fail, and break the display.
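For reference, the directive in question is a one-line response header. An example for Apache's mod_headers (to enable only once every resource is confirmed to exist over HTTPS):

```
# Apache (mod_headers): ask browsers to upgrade http:// subresources to https://
Header always set Content-Security-Policy "upgrade-insecure-requests"
```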

Another pitfall: forgetting inline resources (style or script tags containing background-image: url(http://...), for example). These cases are not always detected by automated tools and require a manual review of the source code or templates.
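A quick grep-style pass over templates catches those inline cases that crawlers often miss. A minimal sketch, assuming the templates are available as text (URLs shown are placeholders):

```python
import re

# Matches hard-coded http:// URLs anywhere in the source, including
# inside inline style attributes or <script> blocks.
MIXED_RE = re.compile(r"""http://[^\s"')]+""")

def find_http_urls(source: str) -> list[str]:
    """Return every hard-coded http:// URL found in a template or HTML string."""
    return MIXED_RE.findall(source)

tpl = '<div style="background-image: url(http://cdn.example.com/bg.png)"></div>'
print(find_http_urls(tpl))  # ['http://cdn.example.com/bg.png']
```

Run it over every template and theme file, then triage the matches by hand.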

How do I check that my site is completely free of mixed content?

Open Chrome DevTools (F12), go to the Console tab, and load a page in private browsing. Any HTTP resource will trigger a yellow or red warning mentioning “Mixed Content.” Test the strategic pages: home, product pages, checkout funnel, paid landing pages.

Also use tools like Why No Padlock or JitBit SSL Check which automatically scan a page and list all non-HTTPS resources. For a complete site audit, Screaming Frog can crawl and detect mixed content at scale, but you need to enable JavaScript rendering to capture dynamically loaded resources.
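For a scripted audit of pages you have already fetched, the standard library is enough to list the attributes that actually trigger resource loads. A minimal sketch (note that href on an <a> tag is navigation, not mixed content, so only <link href> and src-bearing tags are checked; all URLs are placeholders):

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collect http:// URLs from attributes that trigger resource loads."""

    SRC_TAGS = {"img", "script", "iframe", "audio", "video", "source", "embed"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        # <link href> loads stylesheets; <a href> is just a navigation link.
        url = d.get("src") if tag in self.SRC_TAGS else d.get("href") if tag == "link" else None
        if url and url.startswith("http://"):
            self.insecure.append(url)

page = ('<img src="http://yoursite.com/a.png">'
        '<link rel="stylesheet" href="http://cdn.example.com/s.css">'
        '<a href="http://yoursite.com/page">link</a>'
        '<script src="https://ok.example/x.js"></script>')
finder = MixedContentFinder()
finder.feed(page)
print(finder.insecure)
```

This only sees server-rendered HTML; resources injected by JavaScript still require a rendering crawler or DevTools.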

  • Run a search/replace of http:// with https:// in the database on content fields.
  • Verify that all CDNs and third-party services (analytics, pixels, widgets) are called over HTTPS.
  • Test key pages in Chrome DevTools Console to detect mixed content warnings.
  • Use a crawler like Screaming Frog with JS rendering enabled for a comprehensive audit.
  • Check Search Console: Security and Manual Actions section, although it does not systematically notify for mixed content.
  • Implement a CSP upgrade-insecure-requests only after verifying that all resources exist in HTTPS.
Cleaning up mixed content is a technically simple task on the surface but can reveal hidden dependencies — outdated CDNs, unmaintained third-party widgets, legacy templates. If your infrastructure is complex or if you manage multiple environments (dev, staging, prod) with divergent configurations, the intervention of a specialized SEO agency will save you time and avoid functional breaks. Personalized support ensures a complete audit, a prioritized correction roadmap, and post-migration follow-up to validate that each resource loads correctly in HTTPS without degrading user experience.

❓ Frequently Asked Questions

Can mixed content trigger a manual Google penalty?
No. Google does not issue manual actions for mixed content. It is a browser security issue, not a violation of the Search guidelines. Only indirect UX signals (bounces, conversions) can affect ranking.
Does forcing HTTPS via .htaccess solve the mixed content problem?
No. Redirecting the site to HTTPS does not change URLs hard-coded as HTTP in the content or templates. Each resource must be fixed at the source, in the database or in the code.
Does Chrome block all types of mixed content the same way?
No. Chrome actively blocks HTTP scripts and iframes (active mixed content) but only displays a warning for images and media (passive mixed content). Recent versions are progressively tightening this blocking.
Is the protocol-relative URL (//) good practice for avoiding mixed content?
Yes, for external resources whose HTTPS configuration you don't control. The URL inherits the parent page's protocol. But always verify that the resource actually exists over HTTPS before switching.
Does Search Console flag mixed content as an issue?
No, Search Console does not specifically report mixed content. Use Chrome DevTools, crawlers with JS rendering, or third-party services like Why No Padlock to detect these leftover HTTP resources.
