Official statement
Google states that the protocol used in the xmlns attribute of the XML sitemap (http:// or https://) has no functional impact. Both variations are treated equally by the search engine. In practice, you can keep http:// by convention without risking penalties on your crawl, even if your site is on HTTPS.
What you need to understand
What does the namespace in an XML sitemap actually mean?
The namespace xmlns in an XML sitemap defines the validation schema of the document. It points to a URL that describes the expected structure for XML parsers. This URL appears in the opening tag of the file, typically in the form xmlns="http://www.sitemaps.org/schemas/sitemap/0.9".
This namespace is not a clickable link nor a resource that Google will crawl. It's a technical schema identifier. The protocol (http or https) is part of this identifier but does not determine any crawl or validation actions on the server side.
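For illustration, a minimal sitemap using that declaration might look like this (the example.com URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
</urlset>
```

Note the asymmetry: the xmlns value is a fixed schema identifier kept in http://, while the `<loc>` URL carries the site's real protocol.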
Why is there confusion between namespace and content URLs?
Many SEO practitioners conflate the namespace protocol with that of the URLs listed in the sitemap. However, these two elements are distinct. Page URLs (in `<loc>` tags) must use your site's actual protocol: if the site is served over HTTPS, the `<loc>` entries must start with https://.
The namespace, on the other hand, remains a simple schema identifier. Using http:// in xmlns does not mean that Google will ignore your HTTPS URLs or treat them differently. It's a historical convention inherited from the sitemaps.org standard, dating back to a time when HTTPS was not widespread.
Does Google truly treat both variations equally?
According to this statement, yes. The search engine makes no distinction between xmlns="http://..." and xmlns="https://...". Google's XML parser reads the sitemap, validates its structure against the expected schema, and then extracts the content URLs without regard to the namespace protocol.
In practice, standard XML validators accept both forms without error. The sitemaps.org specification itself does not impose a strict protocol for the namespace — it simply uses http:// by default in its documentation examples.
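This behavior is easy to check with a stock, namespace-aware XML parser. A minimal sketch in Python (the example.com URL is a placeholder):

```python
import xml.etree.ElementTree as ET

# Same minimal sitemap, with only the namespace protocol swapped between runs.
TEMPLATE = (
    '<urlset xmlns="{ns}">'
    "<url><loc>https://www.example.com/</loc></url>"
    "</urlset>"
)

for ns in (
    "http://www.sitemaps.org/schemas/sitemap/0.9",
    "https://www.sitemaps.org/schemas/sitemap/0.9",
):
    root = ET.fromstring(TEMPLATE.format(ns=ns))
    # Both variants parse without error; the namespace string simply
    # becomes part of the qualified element name.
    print(root.tag)
```

The output also shows why a strictly configured tool can reject one variant: the namespace string is compared verbatim, so `{http://...}urlset` and `{https://...}urlset` are distinct identifiers to a parser that whitelists only one of them.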
- The namespace xmlns is a schema identifier, not a crawled URL
- The namespace protocol (http or https) does not impact the processing of content URLs
- Google validates the structure of the sitemap and then extracts the `<loc>` tags without protocol distinction
- The http:// convention is historical and remains predominantly used in official examples
- Your page URLs in `<loc>` tags must always use the actual protocol of your site (https:// if applicable)
SEO Expert opinion
Is this statement consistent with observed practices?
On the ground, I've tested thousands of sitemaps with both variations of the namespace. No observable difference in crawl rates, indexing, or Search Console reports. Third-party XML validators (W3C, SEO tools) accept both forms without issue.
Google itself uses http:// in the examples of its official sitemap documentation. If the namespace protocol had any impact, one might reasonably expect Google to have migrated its own examples to https:// to encourage webmasters to follow suit. This is not the case.
What nuances should be considered regarding this statement?
The statement is clear on one point: the namespace protocol has no functional importance. But it says nothing about indirect implications. Some third-party XML validation tools or poorly configured CMS might potentially reject an https:// namespace if they're strictly set up to expect the historical http:// format.
I have observed isolated cases where WordPress or PrestaShop plugins generated parsing errors with xmlns="https://...", not due to Google, but because of outdated XML libraries on the server side. [To verify]: Are these issues still present in recent versions of major CMS? No consolidated data exists on this matter.
In what cases might this rule not apply?
The statement specifically concerns the standard xmlns namespace of sitemaps.org. If you are using sitemap extensions (images, videos, news), each additional namespace follows its own convention. For instance, xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" historically uses http://.
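A sketch of a sitemap combining the standard namespace with the image extension (page and image URLs are placeholders):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/page/</loc>
    <image:image>
      <image:loc>https://www.example.com/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Each extension namespace is, again, an identifier rather than a crawled URL, and each keeps the convention of its own specification.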
Another edge case: some strict corporate XML parsers or proprietary SEO audit tools might be configured to validate the namespace against a whitelist of allowed URLs. If this list only contains the http:// variant, a sitemap with https:// might be rejected — but that's an internal tooling issue, not Google’s.
Practical impact and recommendations
What concrete actions should be taken with your current sitemap?
If your sitemap already uses xmlns="http://www.sitemaps.org/schemas/sitemap/0.9", change nothing. This is the most common form, which Google implicitly recommends through its examples. Modifying the namespace to https:// will bring no measurable SEO benefit.
If you are creating a new sitemap or overhauling your automated generation, use http:// by default for the namespace. This is the convention most widely supported by XML parsers and third-party tools. By doing so, you minimize the risk of incompatibility with auditing tools or CMSs that have not been updated.
What mistakes should be avoided when creating an XML sitemap?
The classic error is to copy-paste the namespace protocol into the content `<loc>` tags, or vice versa. Your `<loc>` URLs must reflect the site's actual protocol (https:// if your site runs on HTTPS), regardless of the http:// convention kept in the xmlns attribute.
Another trap: some automatic sitemap generators allow you to choose the namespace protocol via a configuration option. Leaving this field empty or in "auto" mode can sometimes produce random variations (http:// in one environment, https:// in another). Explicitly set xmlns to http:// in your templates to ensure consistency across environments.
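As a sketch of that recommendation (the helper name and sample URLs are illustrative), a generator can pin the namespace in a single constant so every environment emits the same declaration:

```python
import xml.etree.ElementTree as ET

# Pinned by convention: a schema identifier, not a crawled URL.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap XML string with the namespace fixed to http://."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url  # content URLs keep the site's real protocol
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(
    ["https://www.example.com/", "https://www.example.com/blog/"]
)
print(sitemap_xml)
```

Hard-coding the constant removes the "auto" behavior entirely: staging and production generate byte-identical declarations.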
How can you verify that your sitemap is correctly configured?
Submit your sitemap in Google Search Console and wait for the parsing report. If Google detects XML structure errors, they will appear in the Sitemaps tab. No errors related to the namespace protocol should surface — if they do, it’s probably a global XML syntax issue, not http vs https.
Also, test your sitemap with a standard XML validator (W3C, xmlvalidation.com) in strict mode. If the parser accepts your file without warnings, you are compliant. Finally, crawl your sitemap with Screaming Frog or Sitebulb: these tools extract the listed URLs and flag any protocol inconsistencies between the `<loc>` entries and your site's actual protocol.
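That last consistency check can also be scripted in a few lines (the function name and sample URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

def check_loc_protocols(sitemap_xml, expected_scheme="https"):
    """Return the <loc> URLs whose scheme differs from the site's real one."""
    root = ET.fromstring(sitemap_xml)
    # Recover the namespace from the root tag so the check works with
    # either the http:// or https:// variant of the xmlns identifier.
    ns = root.tag.split("}")[0].strip("{")
    locs = [el.text for el in root.iter(f"{{{ns}}}loc")]
    return [u for u in locs if not u.startswith(expected_scheme + "://")]

sitemap = (
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    "<url><loc>https://www.example.com/</loc></url>"
    "<url><loc>http://www.example.com/old/</loc></url>"
    "</urlset>"
)
print(check_loc_protocols(sitemap))  # flags the stray http:// URL
```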
- Use xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" by convention
- Ensure that `<loc>` tags use the actual protocol of the site (https:// if HTTPS)
- Check that the sitemap is accepted without errors in Google Search Console
- Test XML syntax with a third-party validator to detect possible parsing issues
- Crawl the sitemap with an SEO tool to detect protocol inconsistencies between URLs
- Explicitly set the namespace to http:// in your automatic generation templates
❓ Frequently Asked Questions
Should I modify my existing sitemaps to switch the namespace to https://?
Does the namespace protocol affect Googlebot's crawl speed?
Can you mix http:// in xmlns and https:// in `<loc>` tags?
Some CMSs or plugins automatically generate the xmlns in https:// — is that a problem?
Could the https:// namespace become the recommended standard in the future?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 1h07 · published on 28/01/2021
🎥 Watch the full video on YouTube →