Official statement
Other statements from this video (16)
- 6:25 Should you really add nofollow to footer links between sites in the same group?
- 10:04 Why does the new structured data testing tool take up to 30 seconds to analyze a page?
- 13:43 Does Google Discover really use the same quality algorithms as classic search?
- 15:50 Why does Google merge your multilingual pages into a single canonical URL?
- 22:00 Do you still need to tag your affiliate links with rel=sponsored?
- 24:14 Do affiliate links really hurt your site's rankings?
- 28:00 Should you really abandon display:none to differentiate mobile and desktop?
- 30:05 Can you really prioritize certain pages in Google without a dedicated meta tag?
- 34:28 Can Google really block a site at position 11 to keep it off page 1?
- 35:56 Do you still need to fill in the priority and changefreq attributes in your XML sitemaps?
- 40:17 Can you really settle a duplicate content dispute via Google Search Console?
- 44:38 Does Google always rank the original content first?
- 45:49 Can Google really demote an entire site for systematic duplication?
- 47:03 Can automated DMCA complaints harm your visibility in Google?
- 48:49 What pop-up size actually escapes Google's penalty for intrusive interstitials?
- 54:47 Does mobile-first indexing really offer an SEO advantage, or is it a myth?
Google confirms that a single implementation of structured data is sufficient, even if the content appears twice (desktop + hidden mobile). Duplicating the markup increases the risk of inconsistency between versions without providing any SEO benefit. In practical terms: prefer a single server-side integration rather than multiplying schema.org tags in the DOM.
What you need to understand
Why is the issue of duplication arising in the first place?
Many sites still use architectures where the same content appears twice in the DOM — one desktop version and one mobile version hidden via CSS (display:none). This approach, inherited from old responsive design practices, naturally raises the question: should structured data also be duplicated?
There is a strong temptation to integrate schema.org tags into each version of the template, out of concern for apparent consistency. However, this logic does not hold up against the reality of Google's crawl: the bot does not distinguish between "mobile version" and "desktop version" when it comes to structured data. It reads the complete DOM, regardless of CSS rules.
What does Google actually read in this scenario?
When Googlebot analyzes a page that contains the same block of content twice (one visible, one hidden), it sees both in the raw HTML. If you have duplicated your JSON-LD or microdata, the engine detects two identical sets of structured data for the same item.
The main risk is not a direct penalty, but gradual desynchronization — the editorial team updates one version, forgets the other, and you end up with contradictory markup. Google then has to arbitrate between the two, which can lead to erratic or absent rich snippets.
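To see the page the way the crawler does, you can count structured-data blocks in the raw HTML, ignoring CSS entirely. A minimal Python sketch (the HTML string and the detection logic are illustrative, not Google's actual pipeline):

```python
import json
import re

def ld_json_blocks(html: str) -> list[dict]:
    """Extract every JSON-LD block from raw HTML, visible or hidden.

    Like Googlebot, this reads the full DOM: a block inside a
    display:none container is parsed exactly like any other.
    """
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    return [json.loads(m) for m in re.findall(pattern, html, re.DOTALL)]

# Hypothetical page duplicating the same Product block for desktop and mobile.
html = """
<div class="desktop">
  <script type="application/ld+json">{"@type": "Product", "name": "Widget"}</script>
</div>
<div class="mobile" style="display:none">
  <script type="application/ld+json">{"@type": "Product", "name": "Widget"}</script>
</div>
"""

blocks = ld_json_blocks(html)
duplicates = len(blocks) - len({json.dumps(b, sort_keys=True) for b in blocks})
print(f"{len(blocks)} JSON-LD blocks, {duplicates} duplicate(s)")
# prints "2 JSON-LD blocks, 1 duplicate(s)"
```

Running this against rendered pages from your own templates quickly shows which URLs carry the same item twice.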
In what contexts is this issue critical?
This situation mainly affects e-commerce and media sites that have retained an outdated templating architecture. CMSs that generate mobile and desktop rendering separately (an approach that has become rare but not extinct) are particularly affected.
Sites built with pure responsive design, where the HTML is unique and only the CSS changes based on screen size, obviously do not face this problem. Likewise, modern architectures using React/Vue components with unified SSR avoid this duplication at the source.
- One inclusion is sufficient: even if the content appears twice in the DOM, structured data should only appear once.
- The main risk is desynchronization: maintaining two versions increases editorial and technical errors.
- Google reads the complete DOM: CSS rules (display:none) do not prevent the crawling of structured data.
- Prioritize server-side integration: generate JSON-LD only once in the backend instead of duplicating it in the frontend templating.
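The last point can be sketched as follows: build the schema.org object once in the backend and inject the serialized JSON-LD into the template a single time, so no frontend layer ever has to repeat it (function and field names are illustrative):

```python
import json

def product_json_ld(name: str, price: str, currency: str = "EUR") -> str:
    """Serialize a schema.org Product once, server-side.

    The returned <script> tag is injected into the page template
    exactly once, regardless of how many desktop/mobile content
    blocks the HTML contains.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
        },
    }
    payload = json.dumps(data, ensure_ascii=False)
    return f'<script type="application/ld+json">{payload}</script>'

tag = product_json_ld("Widget", "19.90")
print(tag)
```

Because the tag is produced in one place, an editorial update to the product data propagates to every rendering of the page without any risk of desynchronization.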
SEO Expert opinion
Is this recommendation aligned with observed practices in the field?
Yes, and it is even one of the few statements from Google that is perfectly consistent with what we observe in audits. Sites that duplicate their schema.org tags between mobile and desktop versions indeed face unstable rich snippet problems — sometimes displayed, sometimes absent, with no apparent logic.
Google Search Console often raises warnings for duplicate markup in these configurations. Although Google does not directly penalize, the algorithm struggles to arbitrate between two identical sources, which degrades the reliability of rich snippets. Before/after comparisons made during migrations show more stable rich snippet display once the duplication is cleaned up.
What nuances should be added to this statement?
Mueller refers here to a specific case: strictly identical content duplicated for mobile compatibility reasons. If your mobile version actually displays different content (condensed, reorganized, with different prices or availability), then yes, the structured data should reflect this difference.
Another point: this rule applies to on-page structured data (JSON-LD, microdata), not to external files like generated sitemaps or merchant feeds. The latter follow their own logic and can legitimately contain variations depending on the context. [To be verified]: Google has never clarified whether this recommendation applies to duplicated Open Graph tags, although the logic is likely similar.
In what cases does this rule not apply?
If you are using distinct URLs for mobile and desktop (architecture m.example.com vs www.example.com), each URL must obviously carry its own structured data. Mueller's statement only targets responsive sites with a single HTML containing two CSS renderings.
Similarly, for complex JavaScript applications that dynamically mount/dismount components based on screen size, the question does not arise in the same terms — the final server-side rendering (what Googlebot sees) should contain only one version of the markers, generated on the fly.
Practical impact and recommendations
What should I do if my site is currently duplicating structured data?
First step: audit all your templates to identify where the duplications are located. Look for <script type="application/ld+json"> tags that are present twice within the same DOM, or repeated microdata attributes (itemscope, itemprop) on blocks hidden in CSS.
Next, decide which version to keep — generally, the desktop version contains the most complete data, but verify on a case-by-case basis. On some e-commerce sites, the mobile version has been kept up to date while the desktop version has outdated markers. Once the source of truth is identified, remove the other and test.
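To decide on a case-by-case basis, a field-level diff of the two JSON-LD objects is more reliable than eyeballing the templates. A minimal recursive comparison (the desktop/mobile sample data is hypothetical):

```python
import json

def diff_json_ld(a: dict, b: dict, path: str = "") -> list[str]:
    """Recursively list fields that differ between two JSON-LD objects.

    `a` is the desktop block, `b` the mobile block.
    """
    diffs = []
    for key in sorted(set(a) | set(b)):
        here = f"{path}.{key}" if path else key
        if key not in a:
            diffs.append(f"{here}: only in mobile")
        elif key not in b:
            diffs.append(f"{here}: only in desktop")
        elif isinstance(a[key], dict) and isinstance(b[key], dict):
            diffs.extend(diff_json_ld(a[key], b[key], here))
        elif a[key] != b[key]:
            diffs.append(f"{here}: {a[key]!r} != {b[key]!r}")
    return diffs

# Hypothetical desktop and mobile blocks extracted from the same template.
desktop = json.loads('{"@type": "Product", "name": "Widget", "offers": {"price": "19.90"}}')
mobile = json.loads('{"@type": "Product", "name": "Widget", "offers": {"price": "17.90"}}')

for line in diff_json_ld(desktop, mobile):
    print(line)  # offers.price: '19.90' != '17.90'
```

An empty diff means either version can be kept as-is; any reported field (a stale price, a missing availability) points to the version that was left behind.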
How can I verify that the modification hasn't disrupted the indexing of rich snippets?
Use Google's rich results test tool before and after the modification. Compare the number of detected items and the warnings raised — you should see "Duplicate" alerts disappearing if they existed.
Next, monitor the Search Console for 2-3 weeks, particularly the "Rich Results" and "Enhancements" reports. A sudden drop in rich snippet visibility would indicate that you've removed the wrong version or introduced a syntax error. At the same time, track organic CTR on the modified pages — a well-detected rich snippet should maintain or improve performance.
What mistakes should be avoided during this refactoring?
Do not just mechanically remove all blocks marked "mobile" — some sites have historically better maintained this version. Make a precise diff of both data sets before making a decision.
Also avoid leaving orphan schema.org tags after removing hidden content — if you remove a product block hidden in display:none but forget its associated JSON-LD, you create the opposite inconsistency: markup describing content that no longer exists. Finally, be cautious with automatic front-end structured data generators (WordPress plugins, PrestaShop modules), which may reintroduce the duplication on the next deployment if you don't change the configuration at the source.
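One way to catch such orphaned markup is a sanity check that every item named in the JSON-LD still has matching content in the HTML body. This is a rough heuristic sketch, not a substitute for Google's validators; the sample page is hypothetical:

```python
import json
import re

def orphan_items(html: str) -> list[str]:
    """Return JSON-LD item names that no longer appear in the page text.

    Rough heuristic: after stripping script tags, each item's "name"
    should still occur somewhere in the remaining HTML.
    """
    scripts = re.findall(
        r'<script type="application/ld\+json">(.*?)</script>', html, re.DOTALL
    )
    body = re.sub(r"<script.*?</script>", "", html, flags=re.DOTALL)
    names = [json.loads(s).get("name", "") for s in scripts]
    return [n for n in names if n and n not in body]

# Hypothetical page where a hidden block was removed but its JSON-LD was kept.
html = """
<script type="application/ld+json">{"@type": "Product", "name": "Gadget"}</script>
<div class="desktop"><h1>Widget</h1></div>
"""
print(orphan_items(html))  # ['Gadget']
```

Wiring a check like this into the deployment pipeline also guards against plugins silently reintroducing stale markup later.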
- Audit all templates to identify JSON-LD or microdata duplications
- Compare the two versions and determine which is the most complete and up to date
- Test with Google's tool before/after to verify the absence of regression
- Monitor the Search Console (Rich Results reports) for 2-3 weeks post-migration
- Document the unique source of truth to avoid future regressions during CMS updates
- Ensure that automatic generators (plugins, modules) do not reintroduce duplication
❓ Frequently Asked Questions
If my site uses lazy-loading for the mobile content, should I still avoid duplicating the structured data?
Does Google actively penalize sites that duplicate their schema.org markup between mobile and desktop?
If I have separate URLs for mobile (m.example.com) and desktop, does this rule still apply?
How do I know which version of the structured data to keep if the two differ slightly?
Are Open Graph and Twitter Cards tags covered by this recommendation?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 56 min · published on 21/08/2020