Official statement
Other statements from this video
- 2:12 Does Google really process indexing directives added via JavaScript?
- 3:16 Why do site changes cause temporary ranking drops?
- 5:20 Why don't the dates shown in Search Console match reality?
- 12:45 Is duplicate content across geographic domains really risk-free for SEO?
- 18:44 Do cross-promotions hurt SEO if they drift from the main topic?
- 23:20 Why does Google refuse to index all your pages even with an optimal crawl budget?
- 28:35 Do complex canonical chains really compromise your site's indexing?
- 28:35 Do canonical chains really slow the consolidation of your SEO signals?
- 29:50 Do spam comments really ruin your SEO?
- 34:54 Is mobile-first indexing really a one-way trip for your site?
- 44:30 Can you index your internal search results pages without risking a penalty?
- 47:04 Can structured data really spare you complications in SEO?
Google recommends keeping all versions of your site in Search Console even after implementing redirects. This practice allows you to monitor redirect behavior and quickly detect indexing errors. Specifically, this means maintaining HTTP, HTTPS, www, and non-www properties for comprehensive migration signal tracking.
What you need to understand
Why does Google emphasize keeping multiple Search Console properties?
Google treats each URL variation as a distinct entity in its indexing system, even when 301 redirects are properly set up. An HTTP property may continue to receive crawl activity for weeks after a redirect. Backlinks may still point to the old version. Users may have cached old URLs.
Search Console displays crawl, indexing, and performance data based on the URL version Googlebot actually visited. If you delete an HTTP property after migrating to HTTPS, you lose visibility into crawl attempts against the old version. You won't see accidental 404 errors, broken redirect chains, or SSL certificate issues.
What versions of a site should be declared in Search Console?
For a standard site, there are technically four distinct properties: http://example.com, https://example.com, http://www.example.com, and https://www.example.com. Each can be crawled independently by Google, depending on the link source or user query.
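The four variations above can be enumerated mechanically for any bare domain. A minimal sketch (the `property_variants` helper is hypothetical, not part of any Google tooling):

```python
def property_variants(domain: str) -> list[str]:
    """Return the four URL-prefix properties to declare in Search Console
    for a bare domain: HTTP/HTTPS crossed with www/non-www."""
    hosts = [domain, f"www.{domain}"]
    schemes = ["http", "https"]
    return [f"{scheme}://{host}/" for scheme in schemes for host in hosts]

# Each of these should exist as a distinct URL-prefix property:
for prop in property_variants("example.com"):
    print(prop)
```

For sites with subdomains, the same idea extends by adding each subdomain to `hosts`.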
Mueller’s recommendation aims to maintain complete visibility across all these entry points. Even if your technical setup redirects everything to HTTPS + www, signals can still arrive on the older variations for months. Partial monitoring creates blind spots in your analysis.
Does this multiplication of properties complicate daily management?
Yes, it is a trade-off between comprehensive monitoring and operational simplicity. Monitoring four properties multiplies dashboards, alerts, and data exports. For a medium-sized site, this represents a considerable management effort.
Google also offers Domain properties, which aggregate every protocol and subdomain of a domain under a single view. Kept alongside your URL-prefix properties, a Domain property gives you aggregated data without losing access to per-version reports. It's a practical compromise between granularity and operational efficiency.
- Always declare the 4 main versions (HTTP/HTTPS × www/non-www) as distinct properties in Search Console
- Use a Domain property to obtain an aggregated view without losing per-version detail
- Especially monitor non-canonical properties during the 3-6 months following a migration to detect anomalies
- Keep old properties even years after migration if they still receive residual crawl
- Export and archive historical data before any property deletion
SEO Expert opinion
Is this recommendation consistent with real-world observations?
Absolutely. HTTPS migrations and URL structure changes generate residual signals for months, sometimes years. I've observed on e-commerce sites that HTTP URLs continued to receive 15-20% of the crawl budget six months after migration, primarily through outdated external backlinks.
Without an HTTP property in Search Console, these crawl attempts remain invisible. You won't detect accidental redirect chains, loops, or soft 404 errors that dilute your crawl budget. Mueller’s advice is pragmatic and reflects the technical reality of the web: old URLs are persistent.
What nuances should be added to this directive?
The recommendation assumes sufficient monitoring resources. For a small business with a 50-page site, managing four Search Console properties may be disproportionate. In that case, prioritize the current canonical version and the aggregated Domain property. It remains to be verified whether, on very small sites, the information gain justifies the management overhead.
Another nuance: the duration of retention. Mueller says to "keep" but does not specify for how long. Based on experience, a property that receives no crawl for 12 consecutive months can be archived. Keep historical data exports but relieve yourself of active monitoring. The residual risk becomes negligible after this time on most sites.
In what cases does this rule not strictly apply?
For new sites launched directly in HTTPS with a single URL variation, the issue doesn’t arise. If you have never had an HTTP version in production, there’s no need to artificially create that property. The recommendation targets migrations and consolidations, not launches from scratch.
For very large sites with dozens of subdomains, a selective approach is necessary. Monitor strategic subdomains and create automated alerts rather than manually monitoring each variation. Theoretical completeness gives way to operational pragmatism.
Practical impact and recommendations
What should you concretely do after reading this statement?
Immediately audit your current Search Console setup. List all URL variations of your site: HTTP vs HTTPS, www vs non-www, and possibly specific subdomains. Ensure there is a distinct property for each, even those that redirect to the canonical version.
If any properties are missing, add them via the "Add property" button and verify ownership through a DNS record, an HTML file upload, or your Google Analytics account. Then add a Domain property to group all these variations under a unified view. This dual approach gives you both granularity and efficiency.
What mistakes should you avoid when managing multiple properties?
Never delete a Search Console property just because it redirects to another. The redirect is precisely the reason you need to monitor it. Server configuration errors, expired SSL certificates, or accidental .htaccess modifications can break your redirects without your knowledge.
Avoid relying solely on the aggregated domain property. It can sometimes mask version-specific anomalies. A spike in 404 errors on the HTTP version might go unnoticed in the overall figures. Regularly check individual reports, especially after technical changes.
How can you verify that your Search Console setup is optimal?
Go to Search Console and list all your active properties. For each URL variation of your site, ensure a corresponding property exists and that it has received recent crawl data. If a property shows zero impressions for several months, it's either a good sign (no residual traffic) or a symptom of a verification issue.
Manually test each URL variation in a browser to confirm that redirects are working. Use the URL inspection tool in Search Console on each version to see how Googlebot processes it. Inconsistencies between your intentions and Google’s actual behavior can be detected at this level.
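The manual tests above can be systematized. A minimal sketch that flags common problems in an observed redirect chain; `audit_redirect_chain` and its input format are hypothetical, and the hops would come from whatever tool you use to follow redirects (for example curl with the `-I -L` flags):

```python
def audit_redirect_chain(hops: list[tuple[int, str]]) -> list[str]:
    """Flag common problems in a redirect chain observed for one URL variation.

    `hops` is the ordered list of (status_code, url) pairs seen while
    following redirects, ending with the final response.
    """
    issues = []
    statuses = [status for status, _ in hops]
    urls = [url for _, url in hops]
    if len(hops) > 2:
        issues.append("chain has more than one redirect hop")
    if any(status == 302 for status in statuses[:-1]):
        issues.append("temporary (302) redirect in a permanent migration")
    if len(set(urls)) < len(urls):
        issues.append("redirect loop detected")
    if statuses and statuses[-1] != 200:
        issues.append(f"final response is {statuses[-1]}, not 200")
    return issues
```

A clean migration yields an empty list: one 301 hop straight to the canonical HTTPS + www version, answering 200.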
- Create a distinct Search Console property for each URL variation (HTTP/HTTPS, www/non-www)
- Set up a Domain property for a consolidated view
- Check the coverage reports of non-canonical properties monthly to detect anomalies
- Export and archive historical data before deleting any outdated property
- Manually test redirects and use the URL inspection tool after each server change
- Document which URL version is canonical and why, to avoid confusion within the team
❓ Frequently Asked Questions
Should I create a Search Console property for each subdomain of my site?
What happens if I delete a Search Console property after a migration?
Does the Domain property replace individual properties?
How long should old properties be kept after a migration?
How can I tell in Search Console whether my redirects are working correctly?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 54 min · published on 29/11/2018
🎥 Watch the full video on YouTube →