
Official statement

It is recommended not to create individual pages for each product variant when they have low search demand; a noindex can be used to keep such pages out of the index.
🎥 Source video

Extracted from a Google Search Central video

⏱ 54:42 💬 EN 📅 06/06/2019 ✂ 11 statements
Watch on YouTube (71:50) →
Other statements from this video (10)
  1. 7:34 Do you really need to clean up all your URL parameters to improve crawling?
  2. 8:44 Should you block crawling of URL parameters that don't affect the main content?
  3. 18:27 Does Google really apply the same quality score to every website?
  4. 18:57 Does Google really evaluate every article on your news site?
  5. 28:21 Does a 301 really determine which URL Google will canonicalize?
  6. 40:03 Do you really need to 301-redirect your images when changing domains?
  7. 43:46 Do backlinks to a noindexed page really lose their value?
  8. 53:32 Are duplicates in Search Console really a problem for your SEO?
  9. 77:01 Why does the Jobs API outperform sitemaps for indexing your job postings?
  10. 82:36 Do sitemaps really speed up the crawling of your pages?
📅 Official statement from 06/06/2019 (6 years ago)
TL;DR

Google advises against creating separate pages for each low-demand product variant. The risk: diluting crawl budget and generating conflicting quality signals. The preferred strategy combines consolidation on a master page and targeted use of noindex for minor variations, concentrating authority on URLs with high traffic potential.

What you need to understand

Why does Google discourage the proliferation of variant pages?

The creation of pages for each product variant (color, size, material) leads to what is called thin content at scale. Even if each page has unique content, their individual search volume is often negligible.

The engine has to crawl, index, and evaluate dozens or even hundreds of URLs that do not generate any user demand signals. This fragments internal PageRank and dilutes the relevance signals that Google uses to determine which pages deserve ranking.

What does "low demand" really mean?

Google does not provide any numerical threshold — this is typical of their communication. A practitioner must interpret "low demand" based on their own data: monthly search volume under 10-20 in tools, zero organic clicks over 6-12 months in Search Console, and complete absence of mention in brand queries.

The real indicator? If a variant generates neither organic traffic, nor direct conversions, nor backlinks, there is no reason for it to exist as an indexed standalone URL. It becomes dead weight for the crawl budget, especially on an e-commerce site with thousands of SKUs.

What is the difference between consolidation and noindex?

Consolidation involves grouping all variants into a single master page with a selector (dropdown, buttons) allowing users to choose size, color, etc. The URL remains unique, the content is dynamic but served server-side for SEO.

Noindex, on the other hand, keeps variant URLs (necessary for user navigation, cart, paid campaigns) but excludes them from indexing. This is the solution when you need to maintain distinct URLs for technical or marketing reasons while avoiding cannibalization.

  • Consolidation: A single indexed URL, with all variants accessible through the user interface (recommended for minor variants with no search volume of their own)
  • Noindex: Multiple crawlable but non-indexed URLs, used when variants must exist technically for the purchase journey or ad campaigns
  • Canonical: An alternative to noindex if you want to indicate a preferred version while allowing Google to discover the variants (less drastic)
  • Decision criteria: Specific search volume + potential backlinks to the variant + inherent SEO value (if all three are zero, consolidate or noindex; see the sketch after this list)
  • Risk to avoid: Creating 50 pages for "red shoe", "blue shoe", etc. without any content differentiation or identifiable search demand
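To make these criteria concrete, here is a minimal sketch of that decision logic in Python. The function name, parameters, and return labels are illustrative assumptions, not an official Google rule; plug in your own thresholds and data sources.

```python
# Illustrative sketch of the decision criteria above; thresholds and labels are not official.
def variant_strategy(monthly_searches: int, clicks_12m: int, backlinks: int,
                     url_needed_for_cart_or_ads: bool) -> str:
    """Suggest how to handle a product-variant URL based on the three criteria."""
    if monthly_searches > 0 or clicks_12m > 0:
        return "keep indexed"              # the variant has a search identity of its own
    if backlinks > 0:
        return "canonical to master page"  # preserve link equity without duplication
    if url_needed_for_cart_or_ads:
        return "noindex, follow"           # URL must exist technically but stays out of the index
    return "consolidate into master page"  # no value of its own: merge it

# Example: a colour variant with zero demand that ad campaigns still point to
print(variant_strategy(monthly_searches=0, clicks_12m=0, backlinks=0,
                       url_needed_for_cart_or_ads=True))
# -> noindex, follow
```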

SEO Expert opinion

Is this directive consistent with field observations?

Absolutely, and it is even a recurring problem in e-commerce. Websites that index all their product variants often see a decrease in overall visibility, not because Google penalizes them, but because their crawl budget is wasted on URLs without value.

Platforms like Shopify, PrestaShop, and WooCommerce generate distinct URLs for each variant by default. Without manual configuration, this creates thousands of indexed pages that cannibalize each other and dilute ranking signals. Google's crawlers spend time on these pages instead of exploring strategic category and product pages.

What nuances should be added to this recommendation?

The directive from Mueller is valid for low-differentiation variants (standard colors, sizes). However, some cases require dedicated pages: car models by model year, smartphones by storage capacity when each has its own EAN and reviews, luxury brand clothing where each color has distinct campaigns.

The real criterion: does the variant have a distinct search identity? "iPhone 15 Pro 256GB" generates search volume distinct from "iPhone 15 Pro 512GB"; here, separate pages are justified. "Red T-shirt" vs "Blue T-shirt"? Rarely. Verify this against your Search Console and Google Trends data.

In what cases does this rule not apply?

When variants have different technical characteristics that are specifically sought: engine power, screen resolution, material composition. If users type "Gore-Tex jacket" vs "polyester jacket", each variant deserves its page.

Another exception: variants with external backlinks. If a blogger has linked to your "black ergonomic office chair" page, de-indexing that URL wastes the value of that link. In this case, a canonical to the master page preserves the link equity while avoiding duplication.

Warning: noindex blocks indexing but does not prevent crawling. If you have 10,000 variants in noindex, Google keeps visiting them and consuming crawl budget. To actually reduce crawling, add a robots.txt disallow, but only once the pages have dropped out of the index; if you block crawling too early, Googlebot can no longer see the noindex directive. Better still, remove the URLs and consolidate.
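As a sanity check, the sketch below (Python standard library, placeholder domain and URLs) flags noindexed variant URLs that are also disallowed in robots.txt, the combination that prevents Googlebot from ever seeing the directive.

```python
# Sketch: flag noindexed URLs that robots.txt also disallows (placeholder domain and paths).
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
NOINDEXED_VARIANTS = [f"{SITE}/product-red/", f"{SITE}/product-blue/"]

robots = RobotFileParser()
robots.set_url(f"{SITE}/robots.txt")
robots.read()

for url in NOINDEXED_VARIANTS:
    if not robots.can_fetch("Googlebot", url):
        print(f"CONFLICT: {url} is disallowed, so Googlebot cannot crawl it to see the noindex")
```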

Practical impact and recommendations

What should be done on an e-commerce site?

First step: a Search Console audit. Extract all indexed URLs, filter by page type (product variants), and cross-reference them with traffic data over 12 months. Any URL with zero clicks and zero impressions is a candidate for consolidation or noindex.
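A minimal sketch of that filtering step, assuming a CSV export of the Search Console Pages report; the file name, column names, and the variant URL pattern below are assumptions to adapt to your own data.

```python
# Sketch: list variant URLs with zero clicks and zero impressions in a Search Console export.
# File name, column names, and the variant URL pattern are assumptions; adjust to your export.
import csv

candidates = []
with open("gsc_pages_12_months.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row.get("Top pages") or row.get("Page") or ""
        clicks = int(row.get("Clicks", "0").replace(",", "") or 0)
        impressions = int(row.get("Impressions", "0").replace(",", "") or 0)
        looks_like_variant = "?color=" in url or "?size=" in url or "/variant-" in url
        if looks_like_variant and clicks == 0 and impressions == 0:
            candidates.append(url)

print(f"{len(candidates)} variant URLs with no organic visibility over the period")
for url in candidates:
    print(url)
```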

Next, analyze your URL structure. If each variant generates a parameter (?color=red, ?size=L), note that Search Console's legacy URL Parameters tool has been retired: handle these parameters with canonicals pointing to the parameter-free URL, or with robots.txt rules if crawl volume becomes a problem. If the variants are distinct URLs (/red-product/, /blue-product/), decide: canonical to the master page, noindex, or outright merging.

How to technically implement consolidation?

The cleanest method: a single product page with JavaScript selectors (or better, server-side rendering) that update price, visuals, and availability without changing the URL. The schema.org Product markup can then expose all variants through "offers", with a distinct Offer (SKU, price, availability) for each variation.
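As an illustration, here is the kind of JSON-LD payload a consolidated master page could embed, generated server-side. Product name, SKUs, and prices are placeholders; check Google's current product structured-data documentation for the exact properties it expects.

```python
# Sketch: JSON-LD for a master page whose offers describe each variant (placeholder values).
import json

product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Ergonomic Office Chair",
    "url": "https://www.example.com/ergonomic-office-chair/",
    "offers": [
        {"@type": "Offer", "sku": "CHAIR-BLK", "name": "Black",
         "price": "249.00", "priceCurrency": "EUR",
         "availability": "https://schema.org/InStock"},
        {"@type": "Offer", "sku": "CHAIR-GRY", "name": "Grey",
         "price": "249.00", "priceCurrency": "EUR",
         "availability": "https://schema.org/OutOfStock"},
    ],
}

# Serialize and embed inside a <script type="application/ld+json"> tag of the master page
print(json.dumps(product_jsonld, indent=2))
```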

If you need to maintain variant URLs for tracking or ad campaigns, add <meta name="robots" content="noindex, follow"> on each variant and a canonical to the master page. The "follow" allows Google to crawl internal links without indexing the page.
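A minimal server-side sketch of those two tags, to be rendered into the <head> of each variant page; the helper name and URL are illustrative, so wire it into whatever template engine or framework you use.

```python
# Sketch: build the de-indexation tags for a variant page server-side (illustrative helper).
def variant_head_tags(master_url: str) -> str:
    return (
        '<meta name="robots" content="noindex, follow">\n'
        f'<link rel="canonical" href="{master_url}">'
    )

print(variant_head_tags("https://www.example.com/ergonomic-office-chair/"))
```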

What errors should be absolutely avoided?

Never place a noindex on the master page thinking you are consolidating: you would de-index the entire product. Also make sure your canonicals point to an indexable URL (not noindexed, not a 404).

Another pitfall: applying a global noindex via robots.txt. The robots.txt file blocks crawling, not indexing. Google can index a URL it does not crawl if it receives backlinks. To de-index, you MUST have a meta tag or an HTTP X-Robots-Tag header.
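To verify what a URL actually serves, a quick check with the Python standard library can help (placeholder URL; the HTML test is deliberately crude and a real audit should parse the meta tag properly).

```python
# Sketch: report the X-Robots-Tag header and a rough meta-robots noindex check for a URL.
import urllib.request

def indexing_directives(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        header = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="ignore").lower()
    return {
        "x_robots_tag": header,
        "meta_noindex": '<meta name="robots"' in html and "noindex" in html,
    }

print(indexing_directives("https://www.example.com/product-red/"))
```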

  • Extract all variant URLs from Search Console and identify those with zero traffic over 12 months
  • Decide for each variant: consolidation (merge URL), canonical (preference signal), or noindex (index exclusion)
  • Implement meta robots and canonical tags server-side, not in client-side JavaScript (Google only picks up client-injected tags at the rendering stage, with a delay)
  • Check in Search Console that noindexed URLs gradually disappear from the index (typically 6-8 weeks)
  • Handle query-string variants (?color, ?size, etc.) with canonicals to the parameter-free URL; the legacy URL Parameters tool in GSC has been retired
  • Monitor crawl budget via crawl reports or server logs, as sketched below: the number of pages crawled per day should stabilize or decrease, indicating that Google is focusing on what matters
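One way to watch that trend without waiting for GSC's crawl stats is to count Googlebot hits per day in your server logs. The sketch below assumes an nginx/Apache combined log format and a hypothetical log path; a rigorous version should also confirm the requests really come from Google's published IP ranges.

```python
# Sketch: count Googlebot requests per day from an access log (assumed combined log format).
import re
from collections import Counter
from datetime import datetime

hits_per_day = Counter()
date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

with open("/var/log/nginx/access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" in line:  # naive filter; verify the client IP for a rigorous audit
            match = date_re.search(line)
            if match:
                hits_per_day[match.group(1)] += 1

for day, hits in sorted(hits_per_day.items(),
                        key=lambda kv: datetime.strptime(kv[0], "%d/%b/%Y")):
    print(day, hits)
```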
Product variant consolidation is often an underestimated lever. By concentrating authority on master pages with high potential, you optimize crawl budget, strengthen relevance signals, and make ranking easier. Let's be honest: these decisions require careful analysis of Search Console data, an understanding of crawl mechanics, and impeccable technical implementation. If your catalog exceeds a few hundred SKUs, or if organic traffic is stagnating despite quality content, the support of a specialized SEO agency can prove crucial in diagnosing inefficiencies and deploying a tailored consolidation strategy.

❓ Frequently Asked Questions

Is noindex enough to save crawl budget?
No. Noindex prevents indexing, but Google keeps crawling the URLs to detect any change in the directive. To really reduce crawling, add a robots.txt disallow once the pages have dropped out of the index, or remove the URLs altogether.
Can you use a canonical instead of a noindex for variants?
Yes, and it is even preferable if the variants receive external backlinks. The canonical concentrates PageRank on the master page while still letting Google discover the variants. Noindex, by contrast, blocks indexing entirely.
How do you know whether a variant is "rarely searched"?
Cross-reference three sources: monthly search volume in tools (SEMrush, Ahrefs), click data in Search Console over 12 months, and presence in Google Trends. If all three are at zero, the variant has no search identity of its own.
Should variants be de-indexed even if they convert through paid campaigns?
Yes. Paid search performance has no bearing on organic relevance. A URL can remain crawlable and noindexed for your ad campaigns while being excluded from Google's organic index.
How long does it take for Google to remove noindexed variants from the index?
Between 4 and 8 weeks on average, depending on how often your site is crawled. You can speed this up by requesting a temporary removal in Search Console, but that is not a permanent solution.
