
Official statement

Data from the Chrome User Experience Report (CrUX) is maintained at the origin level, which means the hostname including the subdomain and the protocol. If you have different subdomains, they will be treated separately regarding Core Web Vitals.
🎥 Source video

Extracted from a Google Search Central video (statement at 1:02) · ⏱ 55:29 · 💬 EN · 📅 19/02/2021 · ✂ 26 statements
TL;DR

Google confirms that CrUX data — and thus Core Web Vitals — are measured at the full origin level: protocol + subdomain + domain. Each subdomain is evaluated in isolation. Specifically, if blog.example.com showcases excellent performance while www.example.com is slow, only the fast subdomain benefits from the boost. This granularity allows for performance segmentation… but complicates overall diagnosis.

What you need to understand

What do we actually mean by 'origin' in the context of Core Web Vitals?

In the world of browsers and web security, origin is defined by three components: the protocol (http or https), the full hostname (including the subdomain), and the port. For Core Web Vitals, Google strictly applies this logic through the Chrome User Experience Report (CrUX).

This means that https://www.example.com, https://blog.example.com, and https://shop.example.com are three distinct origins. Each collects its own user experience metrics, independently of the others. This separation is not a decision made by Google — it is a technical constraint of the web security model.
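The origin rule above can be made concrete with a small sketch. This is an illustrative helper (not part of any Google tooling) that derives the origin from a URL the same way browsers segment it — scheme plus full hostname:

```python
from urllib.parse import urlsplit

def origin(url: str) -> str:
    """Return the origin (scheme + host, including any explicit port) of a URL —
    the granularity at which CrUX aggregates its data."""
    parts = urlsplit(url)
    return f"{parts.scheme}://{parts.netloc}"

# Three distinct origins, even though they share the same root domain:
print(origin("https://www.example.com/product/42"))   # https://www.example.com
print(origin("https://blog.example.com/post"))        # https://blog.example.com
print(origin("http://www.example.com/"))              # http://www.example.com (protocol differs)
```

Note that the path is discarded entirely: every page under `https://www.example.com/` contributes to the same origin-level dataset.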

Why is there granularity at the subdomain level instead of the root domain?

CrUX collects real user data from Chrome users worldwide. This data is aggregated by origin to comply with privacy standards and the segmentation logic of browsers. Google did not build CrUX specifically for SEO — it is first and foremost a generic user experience measurement tool.

As a result, each subdomain has its own technical stack, sometimes its own hosting, and therefore potentially very different performance. A WordPress blog on a modest VPS and a React application behind a premium CDN cannot be judged by the same yardstick. Granularity by origin reflects this technical reality.

How does this separation affect Google search rankings?

Google uses CrUX data as a ranking signal within the "Page Experience" framework. Each origin is evaluated separately, so one subdomain can receive a ranking boost related to Core Web Vitals while another subdomain within the same root domain does not benefit from this.

Let’s be honest: this is not a dominant signal. But in competitive contexts where other factors are equivalent, this difference can tip the balance. A high-performing subdomain does not compensate for a lagging main subdomain if the majority of SEO traffic goes through the latter.

  • Each subdomain is measured as a distinct entity for Core Web Vitals
  • The protocol (http vs https) also matters — two different origins
  • CrUX data is never aggregated at the root domain level
  • A subdomain without sufficient Chrome traffic will not have actionable CrUX data
  • This logic applies to both desktop and mobile, with separate datasets

SEO Expert opinion

Is this statement consistent with what’s observed on the ground?

Yes, and it's easily verifiable. If you query the CrUX API or check the PageSpeed Insights report for different subdomains of the same site, you will find totally independent metrics. One subdomain may show "Good" for LCP while another is "Needs Improvement" — and this is the case across thousands of e-commerce sites where the blog is on a separate subdomain.

What’s less clear is the actual impact on ranking. Google remains very vague about the exact weight of the Page Experience signal. In my audits, I have seen sites with catastrophic Core Web Vitals on their main subdomain continue to rank well due to other strong signals (authority, content, backlinks). The boost exists, but it’s not miraculous. [To be confirmed]: Google has never published numerical data on the exact weighting of this signal.

What nuances should we add to this rule?

First point: if a subdomain does not receive enough Chrome traffic, it simply will not have CrUX data. Google requires a minimum threshold of data collected over a rolling 28-day window. In that case, the subdomain is neither penalized nor favored — it is simply absent from the dataset.

Second nuance: individual pages can also have their own CrUX data if they generate enough traffic. However, the ranking signal generally applies at the origin level. If your origin is “Poor,” an isolated page with good performance does not suffice to overturn the trend for the entire subdomain.

When does this logic pose problems?

For multi-subdomain architectures, it’s a headache. Imagine an e-commerce site with www for the catalog, account for the customer area, blog for editorial content, and m for mobile. Each has its own stack, its own hosting, and its own third-party scripts. The result: heterogeneous performance that translates into fragmented ranking signals.

This is where it gets tricky: you cannot compensate for a slow origin with a fast origin. If 80% of your SEO traffic comes through www and that subdomain is failing on Core Web Vitals, the fact that your blog is impeccable won’t save you. Each origin needs to be treated as a distinct optimization project.

Note: migrating content from one subdomain to another to "inherit" better CrUX metrics does not work instantly. CrUX data is calculated over a rolling 28-day window — any migration takes at least a month before its effect shows in the public data.

Practical impact and recommendations

What should be done practically to optimize Core Web Vitals by subdomain?

Start by mapping your origins. List all subdomains that receive significant SEO traffic. For each one, query the CrUX API or use PageSpeed Insights to retrieve actual metrics. Focus first on the subdomains generating the most organic page views — that’s where the impact will be most direct.
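Querying the CrUX API per origin can be sketched as follows. The endpoint and response shape match the public CrUX API v1; the API key is a placeholder you would replace with your own, and the actual HTTP call is left commented out so the sketch stays self-contained:

```python
import json
from urllib.request import Request, urlopen

API_KEY = "YOUR_API_KEY"  # placeholder — obtain a real key from Google Cloud
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

def build_query(origin: str, form_factor: str = "PHONE") -> bytes:
    """Build the JSON body for an origin-level CrUX query."""
    return json.dumps({"origin": origin, "formFactor": form_factor}).encode()

def p75(record: dict, metric: str) -> float:
    """Extract the 75th-percentile value CrUX reports for a metric."""
    return record["record"]["metrics"][metric]["percentiles"]["p75"]

# One request per origin — there is no way to query the root domain as a whole.
for origin in ("https://www.example.com", "https://blog.example.com"):
    req = Request(ENDPOINT, data=build_query(origin),
                  headers={"Content-Type": "application/json"})
    # resp = json.load(urlopen(req))  # uncomment once you have a real API key
    # print(origin, p75(resp, "largest_contentful_paint"))
```

An origin with insufficient Chrome traffic returns a 404 from this endpoint — the "absent from the dataset" case described above.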

Then, treat each origin as a distinct technical project. A slow WordPress blog requires specific optimizations (lazy loading, server cache, CDN for images). A React application on another subdomain will need code-splitting, strategic preloading, and JavaScript optimization. Do not look for a one-size-fits-all solution — each stack has its own bottlenecks.

What mistakes should be avoided in this context?

A classic mistake: blindly optimizing the wrong subdomain. I’ve seen teams spend weeks improving a staging subdomain or an old abandoned blog that receives no traffic, while the main subdomain remained catastrophic. Always check the distribution of organic traffic before prioritizing your efforts.

Another pitfall: believing that a good Lighthouse score is enough. Lighthouse tests in a controlled lab environment — Core Web Vitals are measured from real-world CrUX field data. A site can score 95/100 in Lighthouse and still be "Poor" in CrUX if real users have slow connections, low-end devices, or interact with heavy elements that Lighthouse does not exercise.

How to ensure each subdomain is optimally configured?

Use Search Console — its Core Web Vitals report aggregates CrUX data by origin and shows which URL groups are "Good", "Needs Improvement", or "Poor". This is your prioritization dashboard. For each failing origin, identify the problematic metrics (LCP, INP, CLS) and their root causes with tools like WebPageTest or Chrome DevTools with throttling enabled.
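The "Good" / "Needs Improvement" / "Poor" buckets follow Google's published thresholds for each metric. A minimal sketch of that bucketing, useful when triaging raw p75 values pulled from the CrUX API:

```python
# Google's published Core Web Vitals thresholds per metric: (good ≤, poor >).
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "INP": (200, 500),     # milliseconds
    "CLS": (0.1, 0.25),    # unitless layout-shift score
}

def rate(metric: str, p75_value: float) -> str:
    """Bucket a p75 value the way Search Console does."""
    good, poor = THRESHOLDS[metric]
    if p75_value <= good:
        return "Good"
    if p75_value <= poor:
        return "Needs Improvement"
    return "Poor"

print(rate("LCP", 2100))   # Good
print(rate("LCP", 3000))   # Needs Improvement
print(rate("CLS", 0.3))    # Poor
```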

Also monitor trends over 28 days. Core Web Vitals are rolling averages — an improvement is not reflected instantly. If you deploy a fix, wait at least 2-3 weeks before judging its effectiveness in CrUX. And keep an eye on regressions: a third-party script added by marketing can ruin your efforts in 24 hours.
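The lag described above can be illustrated numerically. This is a crude model — CrUX actually aggregates the full distribution of experiences, not a mean of daily values — but it shows why a fix only surfaces gradually as slow days fall out of the 28-day window:

```python
from statistics import mean

def rolling_view(daily_values, window=28):
    """Crude illustration of a rolling 28-day aggregate: each day's reported
    value averages the last `window` days of raw daily measurements."""
    return [round(mean(daily_values[max(0, i - window + 1): i + 1]))
            for i in range(len(daily_values))]

# 28 slow days (4000 ms LCP), then a fix brings daily LCP down to 2000 ms:
series = [4000] * 28 + [2000] * 28
view = rolling_view(series)
print(view[27])   # 4000 — the day before the fix
print(view[41])   # 3000 — two weeks in, still far from the real value
print(view[55])   # 2000 — only after a full window of fast days
```

This is why judging a fix less than two or three weeks after deployment is premature: the reported value is still dominated by pre-fix days.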

  • Map all subdomains receiving significant organic traffic
  • Query the CrUX API or PageSpeed Insights for each distinct origin
  • Prioritize optimizations based on organic traffic volume by subdomain
  • Treat each technical stack separately — no one-size-fits-all multi-origin solution
  • Monitor the Search Console for origin-level regressions
  • Wait 28 days after a deployment to assess real impact in CrUX

Core Web Vitals applied at the origin level require a segmented approach: each subdomain must be audited, optimized, and monitored independently. This granularity may seem cumbersome, especially for complex multi-subdomain architectures. If your infrastructure includes several critical origins with heterogeneous technical stacks, enlisting a specialized web performance SEO agency may be wise to orchestrate these optimizations coherently and measure their real impact on ranking.

❓ Frequently Asked Questions

If I migrate my content from a slow subdomain to a fast one, do I immediately inherit the good metrics?
No. CrUX data is calculated over a rolling 28-day window. Even after a migration, it takes at least a month for the new metrics to show up in the public dataset Google uses for ranking.
Is a subdomain without CrUX data penalized for Core Web Vitals?
No, it is neither penalized nor favored. If a subdomain does not generate enough Chrome traffic to reach CrUX's minimum threshold, Google simply does not apply the Page Experience signal to that origin.
Are http://exemple.com and https://exemple.com considered two distinct origins?
Yes. The protocol is part of the definition of an origin. If you still serve content over both HTTP and HTTPS on the same hostname, each protocol will have its own separate CrUX metrics.
Can CrUX data from several subdomains be aggregated for an overview of the root domain?
Not via the public CrUX API. Google offers no aggregation at the root-domain level. You can do it manually by retrieving the data for each origin and consolidating it in your own tooling, but that will never be the dataset used for ranking.
If my main subdomain has poor Core Web Vitals but my most important pages are individually good, am I protected?
Partially. Google can use page-level data when it exists in CrUX, but the dominant signal remains the origin's. A handful of fast pages does not compensate for a globally slow origin if most traffic experiences poor performance.

