Official statement
Other statements from this video (11)
- 2:03 Do featured snippets really generate more qualified traffic than classic positions?
- 4:06 Is Google really trying to send traffic to your site, or to keep it for itself?
- 7:00 Should you stop tweeting at Google and use the 'Submit Feedback' button in Search Console instead?
- 7:42 Do Chrome and Android really influence Google rankings?
- 9:46 Is AMP really a ranking factor in Google results?
- 10:48 Does AMP really serve users, or does it lock down the web to Google's benefit?
- 12:12 Does Google really test its updates before deploying them to production?
- 15:12 Why does Google refuse to reveal how it detects spam?
- 16:02 Why do Google's Developer Advocates deliberately avoid knowing the details of ranking?
- 16:02 Why does Google refuse to reveal its hundreds of ranking factors?
- 16:54 Are user tests really essential for SEO success?
Martin Splitt confirms that HTTPS and speed remain sustainable SEO technical priorities, as they stem from a 'user-first' logic. Even though Google continuously evolves its ranking criteria, these technical fundamentals retain their importance because they directly impact browsing experience. In practical terms: a fast and secure site is no longer a 'plus'; it’s a prerequisite to avoid being left behind.
What you need to understand
Why does Google consistently link technical aspects with user experience?
Splitt's statement is based on a simple principle: Google does not promote technical criteria just to complicate the lives of SEOs. Each official technical recommendation (HTTPS, speed, mobile-first, Core Web Vitals) arises from a clear desire to encourage webmasters to enhance the real experience of visitors.
HTTPS protects user data—this is obvious for an e-commerce site, but Google now mandates it everywhere. Loading speed reduces frustration and abandonment—a site that takes 5 seconds to load loses half of its mobile traffic before the content even displays. This is not an engineer's whim: it's solid business.
What really changes if Google 'regularly changes its criteria'?
Splitt acknowledges that ranking criteria evolve. However, taken literally, this statement can be misleading. Google does not reinvent the wheel every six months—it adjusts weights, introduces new signals (like the Helpful Content Update), and removes obsolete factors.
Technical fundamentals like HTTPS and speed, however, remain stable. Why? Because they address timeless human needs: security, comfort, speed. An algorithm can evolve, but a user who waits 8 seconds for a page to load will always abandon it.
Does 'important' mean 'decisive for ranking'?
Beware of the vocabulary. Splitt says these factors are 'important', not that they are the most powerful. An ultra-fast site with mediocre content will never beat a slower competitor with highly relevant answers. Technical quality does not replace relevance.
However, with equivalent content, speed and HTTPS can make the difference. The issue is that we never know exactly how much these criteria weigh in the algorithmic mix—Google does not disclose weights. This ambiguity creates a form of anxiety among practitioners.
- HTTPS has been a confirmed ranking signal since 2014, but its weight remains modest compared to content relevance.
- Speed (and more recently Core Web Vitals) is an officially recognized ranking factor, but Google insists: a slow site with excellent content can still rank.
- The 'user-first' approach is the common thread throughout all major updates—from Panda to Helpful Content.
- Technical criteria evolve in their measurement (e.g., from Speed Update to Core Web Vitals), but the intention remains stable.
- A site that neglects HTTPS or speed will not be penalized outright, but it will lose ground to optimized competitors.
SEO Expert opinion
Is this statement consistent with real-world observations?
Yes, broadly speaking. For competitive queries, a pure HTTP site has little chance of ranking in the top 3 against HTTPS competitors. The same applies to a desktop-only site with a catastrophic LCP on mobile: it inevitably slips down mobile-first SERPs.
Where it gets tricky is the actual size of the impact. We regularly see sites with average (or even poor) Core Web Vitals ranking very well, because their content, domain authority, and link profile outweigh the technical handicap. Google isn't lying: these criteria count. But it doesn't say how much, and the real weight has to be verified niche by niche, SERP by SERP.
What nuances should be added regarding the notion of 'user-first'?
'User-first' is a Google mantra, but it is sometimes applied… selectively. Take the example of intrusive interstitials: Google has officially penalized them since 2017, yet we still see sites with aggressive popups comfortably ranking in the top 5.
Likewise, speed: a news site loaded with ads and third-party scripts can have an LCP of 4 seconds and remain dominant due to its editorial authority and backlinks. The 'user-first' mantra is not absolute—it’s just one criterion among many, and Google weighs it according to the query context. If you're searching for urgent information on a current event, Google will prioritize freshness and authority, even if the site is technically mediocre.
In what cases is this rule not fully applicable?
For niche queries with very little competition, a slow HTTP site can perfectly rank #1 if it's the only one comprehensively covering the topic. Google does not have the luxury of choice: it displays what it has.
Another case: sites with overwhelming domain authority (like Wikipedia, government sites, historical media). They receive implicit tolerance on certain technical criteria. Wikipedia is not a model of speed, but it dominates informational SERPs. The reality is that Google applies its criteria with variable flexibility depending on the level of competition and type of query.
Practical impact and recommendations
What should you do concretely to align technique with user experience?
Start by auditing your site on the two priority axes: security (HTTPS) and performance (Core Web Vitals). Use PageSpeed Insights, Lighthouse, and Search Console to identify friction points. An LCP over 2.5 seconds, a CLS above 0.1, or an INP above 200 ms (INP replaced FID in 2024) are warning signs.
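The warning thresholds above can be turned into a quick triage helper. This is an illustrative sketch, not an official tool: the function names are hypothetical, but the good/needs-improvement cutoffs match Google's published Core Web Vitals guidance.

```python
# Thresholds per Google's Core Web Vitals guidance:
#   LCP: good <= 2.5 s, needs improvement <= 4.0 s, else poor
#   CLS: good <= 0.1,   needs improvement <= 0.25
#   INP: good <= 200 ms, needs improvement <= 500 ms

def rate(value, good, needs_improvement):
    """Classify a single metric value against its two thresholds."""
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

def audit_core_web_vitals(lcp_s, cls, inp_ms):
    """Return a rating per metric for one page (lab or field values)."""
    return {
        "LCP": rate(lcp_s, 2.5, 4.0),
        "CLS": rate(cls, 0.1, 0.25),
        "INP": rate(inp_ms, 200, 500),
    }

print(audit_core_web_vitals(lcp_s=3.1, cls=0.05, inp_ms=650))
# -> {'LCP': 'needs improvement', 'CLS': 'good', 'INP': 'poor'}
```

Feed it the field values reported by Search Console or PageSpeed Insights to see at a glance which metric deserves attention first.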
Then prioritize quick wins: image compression (WebP, lazy loading), browser caching, and removal of render-blocking resources (non-critical CSS/JS). For HTTPS, if you're still on plain HTTP, migrate immediately: it has become an absolute prerequisite for trust (and for avoiding Chrome's 'Not Secure' warning).
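Two of these quick wins are easy to spot programmatically. A minimal sketch, assuming you already have the list of resource URLs a page loads (the example URLs are hypothetical): flag assets still served over plain HTTP and images that could be re-encoded as WebP/AVIF.

```python
from urllib.parse import urlparse

# Image formats that are usually worth converting to WebP/AVIF
LEGACY_IMAGE_EXTS = (".jpg", ".jpeg", ".png", ".gif")

def quick_win_report(resource_urls):
    """Flag insecure assets and legacy image formats in a resource list."""
    insecure = [u for u in resource_urls if urlparse(u).scheme == "http"]
    legacy_images = [
        u for u in resource_urls
        if urlparse(u).path.lower().endswith(LEGACY_IMAGE_EXTS)
    ]
    return {"insecure": insecure, "legacy_images": legacy_images}

report = quick_win_report([
    "http://example.com/style.css",   # mixed content after an HTTPS migration
    "https://example.com/hero.png",   # candidate for WebP/AVIF conversion
    "https://example.com/hero.webp",  # already optimized
])
print(report)
```

In practice you would feed this from a crawler or a HAR export rather than a hand-written list.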
What mistakes should be avoided during technical optimization?
Common mistake: over-optimizing speed at the expense of conversion. Removing all third-party scripts to gain 0.3 seconds of LCP, but losing your analytics tracking or lead gen tool, is counter-productive. Balance is crucial.
Another pitfall: thinking that a PageSpeed score of 100/100 guarantees good ranking. Google does not rank sites based on their Lighthouse score—it measures actual user experience through real-world Core Web Vitals (CrUX). A site with a perfect lab score can have mediocre real-world metrics if actual traffic comes from 3G mobiles or old devices.
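The lab-versus-field gap comes from how CrUX aggregates: field metrics are reported at the 75th percentile of real visits. A minimal sketch (nearest-rank percentile; the sample values are hypothetical) of why one fast Lighthouse run can disagree with field data:

```python
def p75(samples):
    """75th percentile using the nearest-rank method, as CrUX reports."""
    ordered = sorted(samples)
    rank = max(1, -(-75 * len(ordered) // 100))  # ceil(0.75 * n)
    return ordered[rank - 1]

lab_lcp = 1.2  # one Lighthouse run on a fast machine, in seconds
# Hypothetical per-visit LCP samples, including slow-network users
field_lcp = [1.1, 1.4, 2.0, 2.8, 3.5, 4.2, 4.8, 5.1]

print(p75(field_lcp))
# -> 4.2  (field LCP is 'poor' even though the lab run looked perfect)
```

The same page scores 1.2 s in the lab but 4.2 s at p75 in the field, which is exactly the divergence the paragraph above describes.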
How can you verify that your site meets these criteria in the long term?
Set up continuous monitoring of Core Web Vitals via Search Console and RUM (Real User Monitoring) tools such as SpeedCurve or Cloudflare Web Analytics. Performance fluctuates with CMS updates, new features, and ad campaigns that inject third-party scripts.
Regularly test on mobile (slow network, mid-range device): that is where problems reveal themselves. And document your optimizations in a technical log: it lets you trace impacts and avoid regressions during a redesign or migration. These technical optimizations can quickly become complex to manage in-house, especially with a hybrid stack (CMS + CDN + third-party services). In that case, engaging a specialized SEO agency gives you expert insight, advanced tooling, and personalized tracking to maximize impact without putting critical features at risk.
- Migrate the entire site to HTTPS with clean 301 redirects and a valid SSL certificate
- Audit Core Web Vitals (LCP, CLS, INP) via PageSpeed Insights and Search Console
- Optimize images: compression, modern formats (WebP/AVIF), lazy loading
- Reduce the weight of blocking resources (critical inline CSS/JS, defer/async for the rest)
- Implement a CDN to speed up the delivery of static assets
- Monitor real metrics (CrUX) and continuously adjust according to traffic developments
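The last item of the checklist, continuous monitoring, can be sketched as a simple baseline comparison. This is an illustrative sketch, not a standard tool: the metric names and the 10% tolerance are assumptions you would tune to your own site.

```python
def detect_regressions(baseline, current, tolerance=0.10):
    """Return metrics that worsened by more than `tolerance` (relative).

    Lower is better for all Core Web Vitals, so 'worse' means a higher value.
    """
    regressions = {}
    for metric, old in baseline.items():
        new = current.get(metric)
        if new is not None and new > old * (1 + tolerance):
            regressions[metric] = (old, new)
    return regressions

# Hypothetical p75 snapshots before and after a CMS update
baseline = {"LCP_s": 2.1, "CLS": 0.08, "INP_ms": 180}
current  = {"LCP_s": 2.9, "CLS": 0.08, "INP_ms": 170}

print(detect_regressions(baseline, current))
# -> {'LCP_s': (2.1, 2.9)}  LCP regressed; CLS and INP held steady
```

Run a check like this after each deploy (fed by CrUX or your RUM tool) and you catch the slow drift the article warns about before it shows up in rankings.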
❓ Frequently Asked Questions
Is HTTPS really mandatory to rank on Google?
Can a slow site with excellent content still rank well?
Have Core Web Vitals replaced the old speed signal?
Should you aim for a 100/100 PageSpeed score to stay competitive?
Can a better link profile compensate for a slow site?
🎥 From the same video (11)
Other SEO insights extracted from this same Google Search Central video · duration 19 min · published on 23/09/2020
🎥 Watch the full video on YouTube →