Official statement
Other statements from this video (15)
- 8:05 How does Google actually display your products in search results?
- 13:03 How does Google Images leverage product data to improve visibility?
- 21:25 Can Google Maps really boost your local sales with nearby inventory?
- 37:43 Does product structured data really improve Google's accuracy on your listings?
- 47:34 Why is Google Shopping free, and what does that change for your e-commerce SEO?
- 52:54 Does Merchant Center really improve your organic rankings?
- 56:00 Should you really send ALL your products to Google now?
- 60:09 Why does Google refuse to display some rich results despite your structured data?
- 72:42 Is structured data really essential for Google to understand your products?
- 86:42 Does structured data really improve the accuracy of the Merchant Center crawl?
- 90:52 Are supplemental feeds the key to avoiding crawl delays on volatile data?
- 111:38 Does Google really compare your product feeds with your pages to exclude your listings?
- 117:02 Should you really enable automatic price and stock updates in Merchant Center?
- 126:23 Can the Google Merchant Content API really index your products in minutes?
- 151:30 Does classic SEO really remain a priority amid the rise of AI and new search interfaces?
Google offers three methods to feed Merchant Center: direct site crawl, periodic full feed, and API for individual updates. Each approach caters to different technical and operational constraints — the choice significantly influences data freshness and responsiveness to stock changes. The statement remains silent on performance trade-offs or the selection criteria for one method over another.
What you need to understand
Why does Google allow three distinct feeding methods?
Google historically started with periodic product feeds (XML, CSV) — the simplest method for e-commerce merchants with static catalogs. However, this approach has a freshness issue: between two exports, your product data can become outdated (stockouts, price changes).
The automatic crawl emerged to circumvent this friction: Google analyzes your site as it would for organic indexing, extracts structured data (Schema.org Product), and updates Merchant Center. It’s theoretically smoother — but you lose the granular control of a feed.
The Content API, on the other hand, addresses the need for real-time responsiveness: you push updates product by product, as they occur in your system. It's the most technical method, but also the best suited to volatile catalogs (fashion, electronics, marketplaces).
How does Google crawl product data on a site?
The product crawl relies on the Schema.org Product tags present on your pages. Google scans your XML sitemap (ideally, a sitemap dedicated to product listings), discovers the URLs, then extracts prices, availability, images, descriptions via structured markup.
In practice, if you have already optimized your product listings for organic rich snippets, you’ve done 80% of the work. The Merchant Center crawl utilizes the same data — but with stricter quality requirements (standardized price formats, GTINs mandatory in some sectors).
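As a sanity check before relying on the crawl, you can lint your JSON-LD for the fields Merchant Center needs. A minimal Python sketch; the field lists below are illustrative only, since the actual requirements vary by country and product category:

```python
import json

# Fields Merchant Center commonly expects in schema.org/Product markup.
# Illustrative subset -- real requirements vary by country and category.
REQUIRED_PRODUCT_FIELDS = ["name", "image", "offers"]
REQUIRED_OFFER_FIELDS = ["price", "priceCurrency", "availability"]

def missing_product_fields(jsonld: str) -> list[str]:
    """Return the Product/Offer fields absent from a JSON-LD blob."""
    data = json.loads(jsonld)
    missing = [f for f in REQUIRED_PRODUCT_FIELDS if f not in data]
    offer = data.get("offers", {})
    missing += [f"offers.{f}" for f in REQUIRED_OFFER_FIELDS if f not in offer]
    return missing

snippet = """{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Shoe X",
  "image": "https://example.com/shoe.jpg",
  "offers": {
    "@type": "Offer",
    "price": "89.90",
    "priceCurrency": "EUR"
  }
}"""

print(missing_product_fields(snippet))  # ['offers.availability']
```

Running a check like this across a sample of templates catches the silent gaps (here, a missing `availability`) before Googlebot does.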
When should you prefer a feed over crawl or API?
The periodic feed remains relevant for stable catalogs, updated once or twice a day at most. If you manage 5,000 products with little price and stock fluctuation, exporting a daily XML via your CMS or ERP is the simplest solution to maintain.
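For reference, such a daily export can be a few lines of code. A minimal Python sketch of a Google Shopping RSS 2.0 feed, limited to an illustrative subset of `g:` attributes (a real feed needs more, such as `g:description`, `g:image_link`, and GTIN/brand where required):

```python
import xml.etree.ElementTree as ET

G_NS = "http://base.google.com/ns/1.0"  # Google Shopping feed namespace
ET.register_namespace("g", G_NS)

def build_feed(products: list[dict]) -> bytes:
    """Build a minimal Google Shopping RSS 2.0 feed (illustrative subset)."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Example shop feed"
    ET.SubElement(channel, "link").text = "https://example.com"
    for p in products:
        item = ET.SubElement(channel, "item")
        for attr in ("id", "title", "link", "price", "availability", "condition"):
            ET.SubElement(item, f"{{{G_NS}}}{attr}").text = p[attr]
    return ET.tostring(rss, encoding="utf-8", xml_declaration=True)

feed = build_feed([{
    "id": "SKU-001",
    "title": "Trail Shoe X",
    "link": "https://example.com/p/sku-001",
    "price": "89.90 EUR",
    "availability": "in_stock",
    "condition": "new",
}])
```

Scheduled via cron and written to a URL registered in Merchant Center, this is the whole "periodic feed" pipeline.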
The crawl suits sites looking to fully automate feeding without managing an intermediary file — but beware, you depend on the goodwill and crawling frequency of Googlebot. There's no guarantee on update latency.
The API becomes essential when you need to synchronize in near real-time: fast fashion, multi-vendor marketplace, limited-stock products. It's also the only viable method if you manage over 50,000 SKUs with frequent variations.
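To make the trade-off concrete, here is a hedged sketch of a per-product push to the Content API for Shopping (v2.1 `products` endpoint). The merchant ID, URLs, and OAuth token are placeholders, and the payload is a minimal illustrative subset of the product resource:

```python
import json
import urllib.request

MERCHANT_ID = "1234567"   # placeholder: your Merchant Center account ID
ENDPOINT = f"https://shoppingcontent.googleapis.com/content/v2.1/{MERCHANT_ID}/products"

def product_payload(sku: str, price: str, currency: str, availability: str) -> dict:
    """Minimal Content API v2.1 product resource (illustrative subset)."""
    return {
        "offerId": sku,
        "contentLanguage": "en",
        "targetCountry": "US",
        "channel": "online",
        "title": f"Product {sku}",                      # placeholder title
        "link": f"https://example.com/p/{sku}",         # placeholder URL
        "price": {"value": price, "currency": currency},
        "availability": availability,
    }

def push_product(payload: dict, oauth_token: str) -> None:
    """POST one product update; in production, schedule these via a queue."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {oauth_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # raises on HTTP errors: log and retry with backoff

payload = product_payload("SKU-001", "89.90", "EUR", "in stock")
```

Each call updates one offer within minutes rather than waiting for the next feed cycle, which is exactly the responsiveness argument made above.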
- The periodic feed simplifies management but limits data freshness between two submissions.
- The automatic crawl eliminates feed maintenance but subjects you to Google’s timing.
- The Content API offers the best responsiveness but requires advanced technical integration.
- Nothing stops you from combining multiple methods (base feed + API for urgent updates).
- The quality of the Schema.org markup is crucial for the crawl: markup errors mean ignored data.
SEO Expert opinion
Is this statement consistent with observed practices on the ground?
Yes, and it's even a welcome reminder. Many SEOs still believe that Merchant Center = mandatory XML feed, while automatic crawl has existed for several years. Google has gradually opened the gates to simplify onboarding for e-commerce merchants, especially small sites lacking the resources to generate a clean feed.
But — and this is where it falters — Google says nothing about the performance criteria among these three methods. Does the crawl offer the same indexing priority in Shopping as a manual feed? Does the API provide a measurable freshness advantage in terms of CTR or conversions? [To be verified] — no public data on that.
What nuances should be added to this presentation of the three methods?
Alan Kent does not mention a critical point: not all product categories are eligible for automatic crawl. Some verticals (notably regulated products: health, finance) require a manual feed with specific attributes. The crawl can silently fail if your markup does not cover mandatory fields.
Another blind spot is data governance. With a feed, you maintain control over what goes to Merchant Center — you can exclude certain products, manually enrich descriptions, and adjust margins. With the crawl, you delegate to Google the interpretation of your markup. If your Schema.org contains errors or ambiguous values, you won’t necessarily have immediate visibility on rejections.
Finally, the Content API is presented as one option among others, but for marketplaces and multi-vendor sites it is in practice the only scalable method. Generating a consolidated feed of 200,000 products, 30% of which change daily, quickly becomes unmanageable.
In what cases do these recommendations not fully apply?
If you are on a rigid e-commerce platform (some proprietary SaaS solutions), you may not have access to the Content API at all, or the integration may cost more than the responsiveness it buys. In that case, the feed remains the only realistic option, even at the cost of a freshness gap.
Multi-country international sites also need nuance: the automatic crawl does not always handle per-currency price variations or region-specific availability rules well. You often end up falling back to a feed segmented by market, with explicit targeting rules.
Practical impact and recommendations
Which method should you choose based on your technical infrastructure?
If your CMS already generates a clean product feed (Shopify, WooCommerce, Prestashop with dedicated modules), don’t complicate things: use the feed. It’s stable, predictable, and you retain control over the data sent. Plan for at least a daily update — ideally every 6 hours if your stocks move quickly.
If you have a dev team available and a large catalog (over 20,000 products), invest in a Content API integration. The ROI can be measured in fewer displayed stockouts and better price synchronization. Plan for an asynchronous queue to absorb update spikes without overloading the API.
The automatic crawl is relevant if you already have impeccable Schema.org Product markup and are looking to eliminate any feed maintenance. But first, test in parallel with a control feed — verify that Google is extracting all critical fields (GTIN, MPN, availability).
How to verify that your product data is correctly ingested by Merchant Center?
Log into Merchant Center and check the Diagnostics tab. Google lists errors by product there: missing fields, invalid formats, inaccessible URLs. If you’re using the crawl, also review the coverage reports — some products may be discovered but not indexed due to incomplete data.
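If you prefer to monitor this programmatically rather than in the UI, the Content API also exposes product statuses. A sketch of parsing such entries into actionable alerts; the response shape used here is an assumption for illustration, so check the current `productstatuses` schema before relying on it:

```python
def disapproved_offers(statuses: list[dict]) -> list[tuple[str, str]]:
    """Extract (productId, issue summary) pairs from product-status entries
    whose destination is disapproved. Entry shape is an assumed example."""
    problems = []
    for s in statuses:
        for dest in s.get("destinationStatuses", []):
            if dest.get("status") == "disapproved":
                issues = [i.get("description", "?")
                          for i in s.get("itemLevelIssues", [])]
                problems.append((s["productId"], "; ".join(issues) or "unknown issue"))
    return problems

# Example entry, mimicking a disapproval for a missing GTIN.
sample = [{
    "productId": "online:en:US:SKU-001",
    "destinationStatuses": [{"destination": "Shopping", "status": "disapproved"}],
    "itemLevelIssues": [{"description": "Missing GTIN"}],
}]
print(disapproved_offers(sample))  # [('online:en:US:SKU-001', 'Missing GTIN')]
```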
Use the product data quality reports in Merchant Center to detect non-blocking but penalizing warnings: poor-quality images, overly short descriptions, missing recommended attributes. An accepted product is not necessarily a well-ranked product in Shopping.
If you use the API, log each HTTP response. Errors 400 or 429 (rate limit) should trigger alerts — a product not updated for 24 hours can lose visibility in Shopping results if a competitor updates faster.
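A minimal sketch of that logging-and-alerting loop, with the HTTP call injected as a callable so the retry logic can be tested offline; the `alert` hook is a placeholder for your Slack or email integration:

```python
import time

def alert(message: str) -> None:
    """Placeholder alert hook: wire this to Slack or email in production."""
    print(f"[ALERT] {message}")

def push_with_retry(send, max_attempts: int = 4, backoff: float = 1.0) -> bool:
    """Run send() (one API call returning an HTTP status code).
    Retries 429 rate limits with exponential backoff; alerts on other errors."""
    for attempt in range(max_attempts):
        status = send()
        if status < 400:
            return True
        if status == 429:                          # rate limited: wait, then retry
            time.sleep(backoff * 2 ** attempt)
            continue
        alert(f"Content API returned HTTP {status}")  # e.g. 400 = invalid payload
        return False
    alert(f"still rate-limited after {max_attempts} attempts")
    return False

# Example: a fake sender that is throttled once, then succeeds.
responses = iter([429, 200])
print(push_with_retry(lambda: next(responses), backoff=0))  # True
```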
What errors should be avoided when implementing these methods?
Classic error with the feed: sending a file of 100,000 lines without segmentation by category or availability. If 30% of your products are permanently out of stock, you pollute Merchant Center and degrade your quality score. Filter upstream — only send sellable products.
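That upstream filter can be a one-liner before the feed is built; the availability values below are illustrative:

```python
def sellable(products: list[dict]) -> list[dict]:
    """Keep only products worth sending: in stock or briefly backordered."""
    SELLABLE = {"in_stock", "backorder"}  # illustrative availability values
    return [p for p in products if p["availability"] in SELLABLE]

catalog = [
    {"id": "SKU-1", "availability": "in_stock"},
    {"id": "SKU-2", "availability": "out_of_stock"},
    {"id": "SKU-3", "availability": "backorder"},
]
print([p["id"] for p in sellable(catalog)])  # ['SKU-1', 'SKU-3']
```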
Error with the crawl: relying on Google to discover everything without submitting a dedicated XML sitemap for products. The Merchant Center crawl is not prioritized over organic crawling — if your product listings are deep within the hierarchy or poorly linked, they will never be seen.
Error with the API: sending individual updates for every product variation (size, color) instead of grouping into batches. You saturate your API quotas and slow down your entire synchronization. Use batch requests to push up to 1,000 modifications in a single request.
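A sketch of that batching, assuming the v2.1 `products/batch` (custombatch) endpoint and the 1,000-entry limit mentioned above; the merchant ID and update payloads are placeholders:

```python
from itertools import islice

BATCH_LIMIT = 1000  # assumed custombatch limit of 1,000 entries per call

def batched(updates, size: int = BATCH_LIMIT):
    """Yield successive chunks of at most `size` product updates."""
    it = iter(updates)
    while chunk := list(islice(it, size)):
        yield chunk

def batch_payload(merchant_id: str, products: list[dict]) -> dict:
    """Body for POST /content/v2.1/products/batch (illustrative shape)."""
    return {"entries": [
        {"batchId": i, "merchantId": merchant_id, "method": "insert", "product": p}
        for i, p in enumerate(products)
    ]}

# 2,500 pending updates collapse into three requests instead of 2,500.
updates = [{"offerId": f"SKU-{n}"} for n in range(2500)]
payloads = [batch_payload("1234567", chunk) for chunk in batched(updates)]
print([len(p["entries"]) for p in payloads])  # [1000, 1000, 500]
```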
- Audit your Schema.org Product markup with Google's structured data testing tool before enabling the crawl.
- Set up automatic alerts on Merchant Center errors (Slack, email) to quickly react to product rejections.
- If you use a feed, add a custom_label field to segment your Shopping campaigns by margin or stock rotation.
- Test the Content API in a sandbox with 100 products before switching over the entire catalog.
- Do not mix methods without a clear strategy — a product sent by feed AND crawled can create duplicates and data conflicts.
- Document your update frequency (daily feed, real-time API) to align internal expectations and avoid misunderstandings with business teams.
❓ Frequently Asked Questions
Can you combine multiple Merchant Center feeding methods on a single account?
Does the Merchant Center automatic crawl consume organic crawl budget?
Does the Content API offer a ranking advantage in Google Shopping over a feed?
What happens if Google crawls a product page with incomplete Schema.org markup?
Should you prefer an XML feed or a CSV feed for Merchant Center?
🎥 From the same video (15)
Other SEO insights extracted from the same Google Search Central video · duration 161 min 23 s · published on 23/03/2021