Official statement
Google reminds us that Schema.org is not exclusive to its search engine — Bing, Yandex and others use it too. For e-commerce sites, consulting schema.org directly helps identify structured data properties exploitable beyond Google Rich Results alone, with potentially broader SEO benefits.
What you need to understand
Why does Google emphasize this multi-engine dimension?
Alan Kent's statement aims to broaden the perspective of SEO practitioners who often focus exclusively on Google's recommendations. Schema.org is a collaborative project launched in 2011 by Google, Bing and Yahoo, with Yandex joining soon after, so structured data benefits the entire ecosystem.
In e-commerce, this means that Product, Offer or Review markup implemented correctly can enrich display across multiple engines simultaneously. Thinking multi-engine from the start of technical design avoids having to redo the work later.
Which Schema.org properties often fly under the radar?
Google's documentation on Rich Results presents a restricted subset of Schema.org — only what triggers rich display. The schema.org website, on the other hand, lists hundreds of properties rarely mentioned in official guides.
For e-commerce, that includes aggregateRating, shippingDetails, hasMerchantReturnPolicy, gtin, mpn and brand. Some produce nothing visible on Google today, but they structure data for other engines or voice assistants that consume it differently.
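As a hedged sketch, the properties listed above can be carried by a single Product JSON-LD object. It is built here in Python so the payload can be generated programmatically; every product value is an invented placeholder, not real catalog data.

```python
import json

# Sketch of a Product JSON-LD object carrying properties that rarely
# appear in Google's Rich Results guides but are defined on schema.org.
# All concrete values below are placeholders for illustration.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example running shoe",  # placeholder product name
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "gtin": "00012345600012",        # global trade item number (placeholder)
    "mpn": "EX-RUN-42",              # manufacturer part number (placeholder)
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
    "offers": {
        "@type": "Offer",
        "price": "79.90",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

# Serialize to the JSON-LD payload that would go inside a
# <script type="application/ld+json"> tag on the product page.
json_ld = json.dumps(product, indent=2)
print(json_ld)
```

The same object serves every engine that reads JSON-LD, which is the point of the multi-engine argument: one payload, several consumers.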
Does this approach actually change SEO strategy?
Adopting a multi-engine logic means verifying schemas not just through Search Console, but by consulting schema.org directly. This avoids missing out on exploitable properties elsewhere — especially Bing Shopping, which displays structured data sometimes ignored by Google.
In practice, an e-commerce site that documents its products with complete Schema.org coverage gains technical robustness and future compatibility, with no significant marginal cost if implementation is automated.
- Schema.org is a shared standard — don't reduce structured data to Google alone
- Bing, Yandex and other engines exploit properties ignored by Google Search Console
- Consulting schema.org directly identifies missing e-commerce fields in Google guides
- Comprehensive markup today avoids a technical redesign tomorrow when an engine changes format
SEO Expert opinion
Is this statement consistent with practices observed in the field?
Yes, and it's even a necessary reminder. Most e-commerce SEO audits focus on Google criteria — Product snippet, Merchant Center, structured reviews — while neglecting Bing or aggregators that consume JSON-LD. Yet Bing still accounts for 3 to 8% of organic traffic depending on sector, sometimes with higher conversion rates.
Sites that implement only the bare minimum to trigger Google Rich Results miss visibility elsewhere. This isn't marginal optimization — it's work already done left incomplete through lack of standard knowledge.
What nuances should be added to this advice?
Beware of swinging too far the other way: marking up all Schema.org properties without strategy creates technical noise and complicates maintenance. The goal isn't adding 50 fields for the sake of it, but identifying the 5 to 10 missing properties that have real impact elsewhere.
Concretely, for an e-commerce site, shippingDetails and hasMerchantReturnPolicy are used by Bing Shopping, while gtin and mpn improve product recognition in Google Images and the Shopping Graph. But marking up depth or weight without a real use case remains wasted effort.
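To make the two Offer-level properties just mentioned concrete, here is a minimal sketch of their nested structure. The type and property names (OfferShippingDetails, MerchantReturnPolicy and their children) come from schema.org; the prices, country and return window are placeholder assumptions.

```python
import json

# Sketch of an Offer carrying the shipping and return-policy properties
# discussed above. Values are placeholders; types come from schema.org.
offer = {
    "@context": "https://schema.org",
    "@type": "Offer",
    "price": "79.90",
    "priceCurrency": "EUR",
    "shippingDetails": {
        "@type": "OfferShippingDetails",
        "shippingRate": {
            "@type": "MonetaryAmount",
            "value": "4.99",
            "currency": "EUR",
        },
        "shippingDestination": {
            "@type": "DefinedRegion",
            "addressCountry": "FR",  # assumed destination for the example
        },
    },
    "hasMerchantReturnPolicy": {
        "@type": "MerchantReturnPolicy",
        "returnPolicyCategory": "https://schema.org/MerchantReturnFiniteReturnWindow",
        "merchantReturnDays": 30,  # assumed 30-day return window
        "returnFees": "https://schema.org/FreeReturn",
    },
}

print(json.dumps(offer, indent=2))
```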
[To verify]: no public data currently proves that Google uses all Schema.org properties internally for ranking or semantic understanding. Some properties appear purely decorative on Google's side, useful only for other engines.
In what cases does this rule not apply?
If your organic traffic comes 98% from Google and you have no plans to expand to markets where Yandex or Baidu dominate, prioritizing Google remains rational. Expanding to complete schema.org brings marginal gains unless you're anticipating future changes.
Conversely, for B2B sites, marketplaces or product feed aggregators, ignoring schema.org in favor of Google guides alone is a mistake. Third-party partners, comparison engines and voice assistants often consume extended JSON-LD that Google doesn't reward visually.
Practical impact and recommendations
What should you actually do for an e-commerce site?
First step: audit current schemas not just through Search Console, but by comparing deployed markup with complete schema.org/Product and schema.org/Offer documentation. Identify missing properties with real use cases — shipping, return policy, product identifiers.
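As a minimal sketch of that audit step, the comparison can be automated by diffing the property names present in a deployed JSON-LD object against a target list. The target set below is a hand-picked subset for illustration, not the full schema.org/Product vocabulary, and the deployed object is invented.

```python
# Target properties to check for. This is an illustrative subset only;
# a real audit would build this set from the schema.org/Product page.
TARGET_PROPERTIES = {
    "name", "brand", "gtin", "mpn", "aggregateRating", "offers",
}

# Pretend this dict was extracted from a live page's JSON-LD.
deployed = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example running shoe",
    "offers": {"@type": "Offer", "price": "79.90"},
}

# Ignore JSON-LD keywords (@context, @type) and diff the rest.
deployed_props = {k for k in deployed if not k.startswith("@")}
missing = sorted(TARGET_PROPERTIES - deployed_props)
print("Missing properties:", missing)
# → Missing properties: ['aggregateRating', 'brand', 'gtin', 'mpn']
```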
Next, automate injection of these fields into existing JSON-LD. If your product sheets already contain GTIN, MPN, return policy in your database, you just need to expose them in the markup. No redesign needed: just clean mapping between business data and Schema.org properties.
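A sketch of that mapping step, under stated assumptions: `row` stands in for a product record from your database, and its column names (`title`, `gtin13`, `brand_name`, `return_days`) are invented for the example. The point is that no new data is created, existing fields are simply exposed as schema.org properties.

```python
import json

# Hypothetical product record; column names are assumptions.
row = {
    "title": "Example running shoe",
    "gtin13": "0001234560001",
    "mpn": "EX-RUN-42",
    "brand_name": "ExampleBrand",
    "return_days": 30,
}

def to_json_ld(record: dict) -> str:
    """Map existing business data onto schema.org properties."""
    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": record["title"],
        "gtin": record["gtin13"],
        "mpn": record["mpn"],
        "brand": {"@type": "Brand", "name": record["brand_name"]},
        "hasMerchantReturnPolicy": {
            "@type": "MerchantReturnPolicy",
            "merchantReturnDays": record["return_days"],
        },
    }
    return json.dumps(product)

print(to_json_ld(row))
```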
What mistakes should be avoided during implementation?
Don't mark up for the sake of marking up. Each property added should correspond to real, verifiable data on the page or in your database. Google, Bing and validators detect fictitious or contradictory schemas — risk of manual penalty or Rich Results invalidation.
Also avoid duplicating schemas between JSON-LD, Microdata and RDFa. One format is enough — JSON-LD remains the cleanest technically and simplest to maintain. Mixing syntaxes creates conflicts that engines struggle to resolve.
How do you verify markup works beyond Google?
Use third-party validators: schema.org Validator, Bing Webmaster Tools Markup Validator, or Yandex Structured Data tool. Each engine exposes its own alerts — often stricter than Google on certain properties.
Additionally, monitor server logs to spot crawlers from secondary engines (Bingbot, YandexBot) and verify they access marked-up pages properly. A crawler that never sees your JSON-LD can never exploit it, regardless of quality.
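A minimal sketch of that log check: filter access-log lines for secondary-engine user agents hitting product URLs. The sample lines and the `/product/` path convention are assumptions; real logs will have a different format and paths.

```python
# Hypothetical access-log lines (simplified combined-log style).
SAMPLE_LOG = [
    '66.249.66.1 "GET /product/42 HTTP/1.1" 200 "Mozilla/5.0 Googlebot/2.1"',
    '157.55.39.2 "GET /product/42 HTTP/1.1" 200 "Mozilla/5.0 bingbot/2.0"',
    '5.255.253.1 "GET /about HTTP/1.1" 200 "Mozilla/5.0 YandexBot/3.0"',
]

# Secondary-engine crawler tokens to look for (case-insensitive).
SECONDARY_BOTS = ("bingbot", "yandexbot")

# Keep only secondary-bot requests that reached a product page.
hits = [
    line for line in SAMPLE_LOG
    if any(bot in line.lower() for bot in SECONDARY_BOTS)
    and "/product/" in line
]
print(len(hits))  # secondary-bot hits on product pages
```

Here YandexBot appears in the log but never reaches a product page, which is exactly the situation this check is meant to surface.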
- Compare current markup with complete schema.org/Product documentation — not just Google guides
- Add missing properties with real impact elsewhere: shippingDetails, hasMerchantReturnPolicy, gtin, mpn
- Automate JSON-LD injection from business data already present in your database
- Validate markup via schema.org Validator + Bing Webmaster Tools, not Search Console alone
- Avoid fictitious or contradictory schemas — each property must match real data
- Monitor Bing/Yandex crawlers in logs to confirm they access the markup
Other SEO insights extracted from this same Google Search Central video · published on 17/01/2023