Official statement
Google confirms that Search Console allows segmentation of organic traffic by subdomain or mobile version, provided dedicated properties are set up. For an SEO, this means multiplying properties in the interface to obtain granular, usable data. Without this segmentation, it's impossible to accurately measure the performance of each technical variation of a site.
What you need to understand
Why is this segmentation of properties essential?
Google does not provide native filters in Search Console to isolate traffic from a subdomain or a mobile version within a single property. The only officially supported method is to create distinct properties from the initial setup.
Specifically, if you manage blog.example.com and shop.example.com, you need to register three properties: one for the main domain and one for each subdomain. The same logic applies to a mobile version such as m.example.com. This is the only way to obtain isolated performance, indexing, and Core Web Vitals reports.
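As a minimal sketch of the rule above (one URL property per indexable host, plus the root), the helper below derives the property list from a root domain and its subdomains. The function name and domain values are illustrative, not part of any Google tooling:

```python
# Sketch: derive the Search Console URL-prefix properties to register
# from a root domain and its indexable subdomains (HTTPS assumed).
# required_properties() is a hypothetical helper for illustration.

def required_properties(root: str, subdomains: list[str]) -> list[str]:
    """One URL-prefix property for the root, plus one per subdomain."""
    props = [f"https://{root}/"]
    props += [f"https://{sub}.{root}/" for sub in subdomains]
    return props

# blog + shop + the main domain = the three properties from the example
print(required_properties("example.com", ["blog", "shop"]))
```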
What is the difference between domain property and URL property?
Search Console offers two types of properties. The domain property aggregates all variations: HTTP, HTTPS, www, non-www, and all subdomains. Useful for an overview, but it obscures the granularity you are after.
The URL property targets a specific address: https://blog.example.com only. This is the one you need to multiply for segmentation. The downside: the more active subdomains you have, the more interfaces you need to monitor.
What are the risks if we skip this configuration?
Without segmentation, you are operating in the dark. If your blog subdomain loses 40% of traffic but the shop compensates, the aggregated view hides the issue. You discover the problem weeks too late, when revenues drop or a client raises an alert.
Worse, indexing error signals or degraded Core Web Vitals get lost in a global report. You have no idea which part of the site to prioritize for correction. It's wasted time and decisions based on statistical noise.
- Create a distinct URL property for each subdomain or mobile version you want to measure independently.
- Also maintain a domain property for the consolidated view, useful for global audits.
- Document the structure of your properties in an internal wiki: who has access, what scope each property covers.
- Regularly check that new variations (m.example.com launched in a hurry, new marketing subdomain) are being tracked properly.
- Anticipate the workload: 10 active subdomains = 10 dashboards to consult daily.
SEO Expert opinion
Is this statement consistent with observed field practices?
Yes, but it overlooks a major irritation: the Search Console interface is not designed to handle dozens of properties easily. Agencies managing clients with 15 subdomains spend a great deal of time juggling accounts, with no smart consolidated view.
Google promotes the domain property as a miracle solution, but as soon as you want to isolate a specific mobile version or a test subdomain, you fall back on the URL property system. The result: accesses, permissions, and exports multiply. The Search Console API exists, but not every practitioner has the skills to automate it.
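For those who do want to automate, the consolidation loop is conceptually simple: one Search Analytics query per URL property, merged downstream. The sketch below builds such requests against the real `webmasters/v3` REST endpoint, but everything else is an assumption: the OAuth token is a placeholder (obtaining one via a service account or OAuth flow is out of scope), and the property URLs are illustrative:

```python
# Sketch: one Search Analytics API request per URL property, consolidated
# downstream. Endpoint is the real webmasters v3 one; the token and the
# property list are placeholders.
import json
import urllib.parse
import urllib.request

API = "https://www.googleapis.com/webmasters/v3/sites/{}/searchAnalytics/query"

def build_query(site_url: str, start: str, end: str,
                token: str) -> urllib.request.Request:
    """Build a POST request querying one property's search analytics."""
    endpoint = API.format(urllib.parse.quote(site_url, safe=""))
    body = json.dumps({"startDate": start, "endDate": end,
                       "dimensions": ["page"], "rowLimit": 1000}).encode()
    return urllib.request.Request(
        endpoint, data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

for prop in ["https://blog.example.com/", "https://shop.example.com/"]:
    req = build_query(prop, "2024-01-01", "2024-01-31", "YOUR_OAUTH_TOKEN")
    # rows = json.load(urllib.request.urlopen(req))["rows"]  # real execution
```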
What nuances should be added to this recommendation?
Mueller does not clarify when it becomes counterproductive to multiply properties. If your site has 50 geographic subdomains (fr.example.com, de.example.com…), creating 50 properties becomes unmanageable without custom tooling. In this case, a hybrid strategy is necessary: URL properties for the 5-10 critical subdomains, domain property for the rest, and data extraction via the API to reconstruct business views.
Another blind spot: AMP versions. If you serve AMP content from a dedicated subdomain (amp.example.com), you also need to create a distinct property. But Google does not document how to properly cross-reference AMP and non-AMP data to avoid double counting in your analytics dashboards. [To be verified] by cross-referencing with server logs.
In what situations does this approach become an operational bottleneck?
When you manage a brand ecosystem with dozens of sites and subdomains, multiplying Search Console properties creates a human bottleneck. Teams spend more time exporting and concatenating CSVs than analyzing trends.
Moreover, automatic alerts from Search Console (indexing errors, security issues) are sent property by property. If you have not set up a centralization system (webhook, IFTTT, Zapier, or custom script), you miss critical alerts buried in 40 different inboxes. This comes from hard-won field experience.
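A custom centralization script can be as simple as routing each alert to a team channel based on the hostname it mentions. A minimal sketch, assuming a subject line that contains the property URL and a hypothetical hostname-to-channel map (Search Console notification formats vary, so adapt the parsing to what you actually receive):

```python
# Sketch: route per-property Search Console alert emails to one channel.
# The subject format and CHANNELS map are assumptions for illustration.
import re

CHANNELS = {  # hypothetical property -> team channel mapping
    "blog.example.com": "#seo-content",
    "shop.example.com": "#seo-ecommerce",
}

def route_alert(subject: str) -> str:
    """Pick a channel from the hostname found in the alert subject."""
    m = re.search(r"https?://([^/\s]+)", subject)
    host = m.group(1) if m else None
    return CHANNELS.get(host, "#seo-general")  # unmapped -> catch-all

print(route_alert("Coverage issue detected for https://blog.example.com/"))
```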
Practical impact and recommendations
What should you concretely do to correctly segment your traffic?
Start by mapping the architecture of your site: list all active subdomains (blog, shop, m., amp., api., etc.). Identify those serving indexable content, and ignore purely technical ones (cdn., admin.). For each indexable subdomain, create a URL property in Search Console.
Ensure that the verification files (HTML, DNS TXT) are properly in place for each property. If you're using Google Tag Manager, check that the verification tag is deployed across all environments, including mobile versions. An unverified subdomain = zero usable data.
What mistakes should be avoided during multi-property configuration?
Do not create a property for canonicalized variants. If m.example.com systematically redirects to www.example.com, there is no need for a separate property: Google will only see one version. Conversely, if both versions coexist (a configuration error), you will observe fragmented data and duplicate-content signals.
Avoid mixing URL properties and domain properties without clear documentation. Teams lose track, grant incorrect accesses, and end up working with incomplete datasets. A shared file (Google Sheets, Notion) listing each property, its exact URL, and the people who have access is a minimum requirement.
How can you check if the segmentation is working properly?
Compare the total clicks in each URL property with the sum displayed in the domain property. If the figures do not match (discrepancy > 5%), it means a subdomain is not being tracked, or a redirect is skewing the counts. Cross-check with Google Analytics (GA4 property segmented by hostname) to validate the consistency.
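The >5% consistency check described above is a one-liner worth scripting so it runs on every reporting cycle. A minimal sketch, with illustrative click figures (the function name and numbers are assumptions, not real data):

```python
# Sketch of the >5% consistency check: compare the domain property's
# click total with the sum across URL properties. Figures are illustrative.

def segmentation_gap(domain_clicks: int,
                     per_property_clicks: dict[str, int],
                     tolerance: float = 0.05) -> tuple[float, bool]:
    """Return the relative gap and whether it exceeds the tolerance."""
    total = sum(per_property_clicks.values())
    gap = abs(domain_clicks - total) / domain_clicks
    return gap, gap > tolerance

gap, alert = segmentation_gap(
    10_000, {"blog.example.com": 6_200, "shop.example.com": 2_900})
# 9,100 tracked vs 10,000 total: a 9% gap suggests an untracked subdomain
print(f"gap={gap:.1%}, investigate: {alert}")
```

When the flag fires, cross-check with GA4 segmented by hostname to find which subdomain is leaking out of your property set.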
Test the coverage reports: deliberately trigger a 404 error on a subdomain, wait 48 hours, then check that it appears correctly in the corresponding property and not elsewhere. If the alert appears in the wrong property, it indicates that your DNS configuration or redirects are creating leaks.
- Map all indexable subdomains of the site
- Create a distinct URL property for each critical subdomain
- Verify each property via DNS TXT or HTML file
- Document the structure in a shared file with access and scopes
- Cross-check Search Console / GA4 data to validate consistency
- Test error alerts to ensure they appear in the correct property
❓ Frequently Asked Questions
Can multiple Search Console properties be merged after the fact?
Does the domain property automatically aggregate all subdomains, without limit?
How many properties can be created per Search Console account?
Is the data from a mobile URL property identical to the desktop data?
Should a distinct property be created for AMP pages hosted on a subdomain?
Source: Google Search Central video · duration 54 min · published on 24/08/2017