Official statement
Other statements from this video (49)
- 1:38 Does Google really follow HTML links hidden by JavaScript?
- 1:46 Can JavaScript hide your links from Google without destroying them?
- 3:43 Should you really optimize the first link on a page for SEO?
- 3:43 Does Google really combine the signals from several links pointing to the same page?
- 5:20 Do site-wide links in the menu and footer really dilute the PageRank of your strategic pages?
- 6:22 Should you really nofollow site-wide links to your legal pages to optimize PageRank?
- 7:24 Should you really keep nofollow on your footer and service page links?
- 10:10 Search Console Insights without Analytics: why does Google make standalone use impossible?
- 11:08 Does nofollow still influence crawling without passing PageRank?
- 11:08 Does nofollow really block indexing, or does Google crawl those URLs anyway?
- 13:50 Why does Google refuse to communicate about all of its indexing incidents?
- 15:58 Should you really index all paginated pages to optimize your SEO?
- 15:59 Should you really index all pagination pages to optimize your SEO?
- 19:53 Are URL parameters still a problem for organic search?
- 19:53 Have URL parameters really become an SEO non-issue?
- 21:50 Does Google really block the indexing of new sites?
- 23:56 Do links in embedded tweets really influence your SEO?
- 25:33 Are sitemaps really essential for Google indexing?
- 26:03 How does Google really discover your new URLs?
- 27:28 Why does Google require a canonical on ALL AMP pages, even standalone ones?
- 27:40 Is rel=canonical really mandatory on all AMP pages, even standalone ones?
- 28:09 Should you really deploy hreflang across an entire multilingual site?
- 28:41 Should you really implement hreflang on every page of a multilingual site?
- 29:08 Is AMP really a speed factor for Google?
- 29:16 Should you still bet on AMP to optimize speed and ranking?
- 29:50 Why does Google measure Core Web Vitals on the page version your visitors actually see?
- 31:23 Should you manually deindex old pagination URLs after an architecture change?
- 31:23 Should you really deindex your old pagination URLs manually?
- 32:08 Do ads on your site kill your SEO?
- 32:48 Does advertising on a site really hurt Google rankings?
- 34:47 Is rel=canonical in syndication really reliable for controlling indexing?
- 34:47 Does rel=canonical really protect your syndicated content from ranking theft?
- 38:14 Do security alerts in Search Console really block Google's crawl?
- 38:14 Does a hacked site lose its crawl budget after Google security alerts?
- 39:20 Have links in guest posts really lost all SEO value?
- 39:20 Do links from guest posts really have zero SEO value?
- 40:55 Why does Google ignore identical modification dates in your sitemaps?
- 40:55 Why does Google ignore the lastmod dates in your XML sitemap?
- 42:00 Should you really update the sitemap lastmod date for every minor change?
- 42:21 Does a misconfigured sitemap really reduce your crawl budget?
- 43:00 Can a misconfigured sitemap really reduce your crawl budget?
- 44:34 Should you really choose between reducing duplicate content and canonical tags?
- 44:34 Should you really eliminate all duplicate content, or rely on rel=canonical?
- 45:10 Should you really configure the crawl limit in Search Console?
- 45:40 Should you really let Google decide your crawl limit?
- 47:08 Do internal 301 redirects really dilute PageRank?
- 47:48 Do chained internal 301 redirects really lose SEO juice?
- 49:53 Can the JavaScript History API really force Google to change your canonical URL?
- 49:53 JavaScript and the History API: can Google really treat these URL changes as redirects?
Google measures Core Web Vitals on the version of the page that the user actually sees: the AMP version if that is what displays, the classic HTML version otherwise. This focus on real user experience changes the game for sites maintaining multiple versions of the same page. In practical terms, optimizing the HTML version has no impact if your users consistently land on the AMP version.
What you need to understand
Which page version does Google consider for Core Web Vitals?
Google measures Core Web Vitals on the version that actually displays in the user's browser. If your site offers an AMP version and that is the one that loads from the search results, the AMP version is what gets evaluated. If the user lands on the classic HTML version, that is what counts.
This logic follows the principle of real user experience: Google does not measure a theoretical or technical version, but rather what people actually see. The data reported via the Chrome User Experience Report (CrUX) reflects this on-the-ground reality, not a controlled test environment.
Why is this distinction important for AMP sites?
Many sites deployed AMP to benefit from priority placement in mobile carousels or to improve their perceived speed. But if most organic traffic lands on the AMP version, then performance optimizations on the classic HTML version become secondary for Core Web Vitals ranking.
Conversely, if your strategy is to abandon AMP and go all in on a highly optimized HTML version, you must ensure that it’s this version that users are loading. Otherwise, you are optimizing the wrong target. The consistency between technical setup and actual traffic becomes critical.
How does Google collect this real performance data?
Core Web Vitals are measured through CrUX, which aggregates anonymous browsing data from Chrome users. Each visit generates metrics (LCP, FID, CLS) that are associated with the URL actually loaded. If a user arrives at example.com/amp/article, it is this URL that will be evaluated.
This ground-level approach means that performance variations based on devices, connections, and geographies are taken into account. A site may perform differently on 4G mobile in India compared to fiber desktop in France—Google measures both realities, not a single synthetic score.
- AMP Version: measured if that’s what the user actually loads from the SERPs
- Classic HTML Version: measured if it is the default displayed version
- CrUX: collects data on the actual URL visited, not a theoretical version
- Real Traffic: metrics reflect the experience of real users, not lab tests
- Strategic Consistency: optimizing a version that no one sees does nothing for ranking
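As a rough illustration of the per-URL tracking described above, the page-level field data can be pulled from the public Chrome UX Report API, which records the AMP and HTML URLs separately. The URLs and the API key below are placeholders; this is a minimal sketch, not production code.

```python
# Sketch: compare CrUX field data for the AMP and canonical HTML versions
# of the same article via the Chrome UX Report API. The URLs and the API
# key are placeholders -- adapt them to your own site.
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_payload(url: str, form_factor: str = "PHONE") -> bytes:
    """Build the JSON body for a page-level CrUX query."""
    return json.dumps({"url": url, "formFactor": form_factor}).encode("utf-8")

def query_crux(url: str, api_key: str) -> dict:
    """Fetch field metrics for the exact URL that users load."""
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=build_payload(url),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a CrUX API key); each URL returns its own record:
#   for page in ("https://example.com/article",
#                "https://example.com/amp/article"):
#       record = query_crux(page, api_key="YOUR_API_KEY")
#       lcp = record["record"]["metrics"]["largest_contentful_paint"]
#       print(page, lcp["percentiles"]["p75"])
```

When a single URL has too little traffic to appear in CrUX, the same endpoint accepts an `origin` key instead of `url`, which returns the origin-level aggregate Google falls back to.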
SEO Expert opinion
Is this statement consistent with observed practices on the ground?
Yes, and it is even one of the few cases where Google applies exactly what it states. The publicly available CrUX data clearly shows that AMP and non-AMP URLs are tracked separately. If you compare the metrics for example.com/article and example.com/amp/article in the CrUX report, you will see two distinct lines with often very different performances.
What complicates things is that many sites assumed optimizing their HTML version would suffice, while 90% of their mobile traffic was landing on AMP without them realizing it. The wake-up call was harsh when Core Web Vitals became a ranking factor: their AMP version, never optimized, dragged the whole site down.
What nuances should be brought to this rule?
Mueller's statement is clear on the surface, but it says nothing about the traffic thresholds required for a URL to be included in CrUX. A page with 10 visits per month does not generate enough data to be evaluated—Google then relies on metrics at the domain or origin level.
Another vague point: what happens when a site offers both AMP and classic HTML, but users are split 50/50 between the two versions? Does Google aggregate the scores? Does it take the dominant version? [To be verified]—the official documentation remains vague on this point, and field reports are contradictory.
In what cases can this logic create problems?
The most insidious case: a site that abandons AMP without properly redirecting the old AMP URLs to the HTML versions. If Google continues to serve cached AMP URLs (or if backlinks point to them), these ghost pages continue to be measured and negatively impact the overall Core Web Vitals of the domain.
Another trap: sites that test their performance with Lighthouse or PageSpeed Insights on the HTML version, achieve perfect scores, and don't understand why their ranking isn't changing. Meanwhile, their real users are loading an unoptimized AMP version, and that is what CrUX measures. Testing the wrong version is the classic error here.
Practical impact and recommendations
What should you do if your site uses AMP?
First, check which version is actually served to your users. Consult your server logs, or Google Analytics filtered to mobile organic traffic: how many URLs contain /amp/ or an AMP parameter? If it's the majority, that is the version to prioritize for optimization.
Next, test the Core Web Vitals of your AMP version with PageSpeed Insights by directly pasting the AMP URL. Do not solely rely on the HTML version scores—they do not reflect what Google measures if no one visits it. Compare LCP, CLS, and FID between the two versions and focus your efforts where the real traffic lies.
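A quick way to run the log check described above is to count request paths containing an /amp/ segment. This sketch assumes a combined-style access log and the common /amp/ URL convention; adjust both to match your server and your AMP URL scheme.

```python
# Sketch: estimate what share of requests hit the AMP version by counting
# paths containing an "/amp/" segment in an access log. Both the log
# format and the /amp/ pattern are assumptions.
import re
from collections import Counter

REQUEST_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP')

def amp_share(log_lines) -> float:
    """Return the fraction of requests whose path contains /amp/."""
    counts = Counter()
    for line in log_lines:
        match = REQUEST_RE.search(line)
        if not match:
            continue
        path = match.group("path")
        counts["amp" if "/amp/" in path else "html"] += 1
    total = counts["amp"] + counts["html"]
    return counts["amp"] / total if total else 0.0

# Usage with a real file:
#   with open("access.log") as f:
#       print(f"AMP share: {amp_share(f):.1%}")
```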
How to manage a migration from AMP to classic HTML without losing performance?
If you decide to abandon AMP, the transition must be gradual and monitored. First, remove the rel="amphtml" tags from your canonical pages so that Google stops offering the AMP version. Set up clean 301 redirects from the AMP URLs to the HTML URLs.
Then, monitor CrUX for at least a full month. Data takes time to switch—Google continues to measure the old AMP URLs as long as they receive residual traffic (backlinks, cache, bookmarks). If your Core Web Vitals drop after migration, it’s often because the HTML version is not as performant as you thought.
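The 301 rules themselves normally live in the server configuration, but the underlying mapping can be sketched as a function that drops the /amp/ path segment. This assumes the common /amp/ prefix convention; sites using an amp. subdomain or an ?amp parameter need a different rule.

```python
# Sketch of the URL mapping behind the AMP -> HTML 301 redirects.
# Assumption: AMP URLs use an "/amp/" path segment.
from urllib.parse import urlsplit, urlunsplit

def amp_to_canonical(url: str) -> str:
    """Map an /amp/ URL to the HTML URL it should 301 to."""
    parts = urlsplit(url)
    path = parts.path.replace("/amp/", "/", 1)  # drop the first /amp/ segment
    return urlunsplit((parts.scheme, parts.netloc, path,
                       parts.query, parts.fragment))
```

Such a function can also be used to generate a static redirect map to feed into the web server, which keeps the rules auditable.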
What errors should you avoid when optimizing Core Web Vitals?
The number one mistake: optimizing the version that no one sees. Before launching an optimization project, confirm which URL is primarily loaded by your users. Do not assume it’s the HTML version—check with real data.
Second trap: believing that lab tests (Lighthouse) are sufficient. These tools measure a theoretical version under ideal conditions. The CrUX data reflects the real world: slow connections, low-end devices, outdated browsers. A Lighthouse score of 95 does not guarantee green Core Web Vitals if your real users are on 3G with a budget smartphone.
- Identify which version (AMP or HTML) is actually receiving your organic traffic
- Test the Core Web Vitals of the version that is actually served, not the one you prefer
- If migrating from AMP to HTML, implement clean 301 redirects
- Monitor CrUX for 28 days after any migration to confirm the switch
- Do not solely rely on Lighthouse scores—consult the CrUX ground data
- Compare performance between desktop and mobile, as they can diverge significantly
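To compare lab and field data in one call, the PageSpeed Insights v5 API returns both the Lighthouse result and the CrUX loading experience for a URL. A minimal sketch, with a placeholder URL; the response keys follow the public v5 API but should be verified against your own responses.

```python
# Sketch: fetch lab ("lighthouseResult") and field ("loadingExperience")
# data for a page from the PageSpeed Insights v5 API. The target URL is
# a placeholder.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url: str, strategy: str = "mobile") -> str:
    """Build the GET request URL for a PageSpeed Insights run."""
    query = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

def lab_vs_field(page_url: str):
    """Return (lab performance score, field overall category) for a page."""
    with urllib.request.urlopen(build_psi_url(page_url)) as resp:
        data = json.load(resp)
    lab = data["lighthouseResult"]["categories"]["performance"]["score"]
    field = data.get("loadingExperience", {}).get("overall_category")
    return lab, field

# Usage (makes a live API call):
#   score, category = lab_vs_field("https://example.com/amp/article")
```

A page with a high lab score but a "SLOW" field category is exactly the "testing the wrong version" trap described above.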
❓ Frequently Asked Questions
If my site offers both AMP and HTML, which version does Google measure for Core Web Vitals?
How can I tell which version of my page is actually measured by CrUX?
Do Lighthouse scores reflect the Core Web Vitals used for ranking?
What happens if I abandon AMP without redirecting the old URLs?
How long does it take for CrUX to switch to a new page version?
Other SEO insights extracted from this same Google Search Central video · duration 55 min · published on 21/08/2020