Official statement
Google claims that deindexing pages with few impressions typically has no significant impact on a site's overall SEO. The real question is not the volume of impressions but the actual usefulness of these pages to the user and their role in the site structure. This statement reframes the debate: instead of chasing low metrics, focus on the added value of each URL.
What you need to understand
Why does Google downplay the impact of deindexing low-performance pages?
This statement comes in a context where many SEOs believe that by massively pruning weak pages, they will mechanically improve the overall "quality score" of their site. The underlying idea: Google would penalize sites with too much mediocre content by diluting their authority.
However, Google does not operate that way. The engine evaluates each page individually based on its relevance for a given query. A page with zero impressions today may very well rank tomorrow for an emerging long-tail query, if it addresses a real need. Deindexing as a knee-jerk reaction to impression volume amounts to confusing symptom and diagnosis.
What is the perceived usefulness of a page according to Google?
"Perceived usefulness" is a vague concept that Google never precisely defines, and that is the problem. Concretely, it covers several dimensions: does the page meet a search intent, even marginally? Does it play a role in internal linking? Does it provide unique information, or is it redundant?
A product page out of stock for 18 months has no objective usefulness. A technical resource page consulted 10 times a month by specialized experts has value, even if Search Console shows negligible numbers. Usefulness is not measured by impressions alone; it is also judged against the business context and the information architecture.
In what cases can deindexing still have a positive effect?
There are situations where removing content does indeed improve performance. Typically: sites with constrained crawl budget (several hundred thousand pages, large e-commerce platforms, aggregators), where Googlebot spends time on irrelevant URLs. Or in cases of massive cannibalization, where 50 almost identical pages compete for the same query and undermine all their positions.
In these configurations, deindexing is not a magic lever that lifts everything else; it is a technical hygiene operation that lets Google crawl strategic content more efficiently. The gain comes from optimizing the crawl budget, not from a hypothetical "quality bonus" redistributed to the remaining pages.
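To make the cannibalization scenario testable, here is a minimal sketch that flags queries served by several competing URLs in a Search Console performance export. The file name, the column names (query, page, impressions, clicks) and the thresholds are assumptions to adapt to your own export, not a documented format.

```python
# Hypothetical sketch: flag query cannibalization from a Search Console
# export (assumed CSV columns: query, page, impressions, clicks).
import csv
from collections import defaultdict

def find_cannibalized_queries(path, min_pages=3, min_impressions=50):
    """Group rows by query and flag queries served by many competing URLs."""
    pages_by_query = defaultdict(set)
    impressions_by_query = defaultdict(int)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            pages_by_query[row["query"]].add(row["page"])
            impressions_by_query[row["query"]] += int(row["impressions"])
    return {
        q: sorted(pages)
        for q, pages in pages_by_query.items()
        if len(pages) >= min_pages and impressions_by_query[q] >= min_impressions
    }

if __name__ == "__main__":
    for query, urls in find_cannibalized_queries("gsc_export.csv").items():
        print(f"{query}: {len(urls)} competing URLs")
```

Queries flagged this way are candidates for merging into a single stronger page rather than for blanket removal.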
- The SEO impact of deindexing is generally low if the crawl budget is not a limiting factor
- The usefulness of a page is not measured by impressions alone; its role in the architecture counts just as much
- Deindexing on reflex, based only on Search Console metrics, risks removing content that captures emerging long-tail queries
- Real gains come from targeted pruning: duplicate content, irrelevant pages, outdated content with no residual value
- Prioritize improving existing content before considering deindexing as an easy way out
SEO Expert opinion
Does this statement truly reflect on-the-ground observations?
Yes and no. In principle, Google is right: mass-deindexing weak pages does not trigger a mechanical increase in overall traffic. We have verified this in dozens of audits: removing 40% of a site's URLs does not make the rest rise in the rankings. There is no magical redistribution of "SEO juice", no premium for average quality.
But, and this is where it gets complicated, the statement overlooks the cases where deindexing solves an underlying structural problem. A site cluttered with 200,000 faceted URLs with no added value is not just "low-impression pages"; it is a crawl budget sinkhole that prevents strategic content from being indexed. In that context, pruning has a measurable impact, but not for the reasons Google highlights here. [To be verified]: Google is likely simplifying the message deliberately to keep webmasters from over-optimizing this lever.
What is the gray area that Google does not clarify?
"Perceived usefulness" is a subjective criterion without a clear evaluation grid. Google provides no thresholds and no tangible metrics. The result: everyone projects their own analytical framework onto it. One consultant will argue that a page with 5 impressions per month over 12 months is useless; another, that it captures a highly specialized query with high added value.
Concretely, Google likely assesses usefulness through aggregated behavioral signals (adjusted bounce rate, time spent, interactions) and through the page's role in the internal link graph. But for lack of transparency, we operate in the dark. This opacity pushes SEOs to test empirically, which is precisely what Google wants: for you to make business decisions, not algorithmic ones.
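For lack of official metrics, one crude proxy for "role in the internal link graph" is an internal PageRank computed over your own crawl. The sketch below uses the networkx library on a toy edge list; the URLs are invented, and the idea that Google weighs anything resembling this score is our speculation, not a documented signal.

```python
# Speculative sketch: approximate each page's weight in the internal
# link graph with PageRank, as one proxy for "usefulness".
# Edges would come from your own crawler's output.
import networkx as nx

# Hypothetical crawl output: (source_url, target_url) internal links.
internal_links = [
    ("/", "/category/shoes"),
    ("/category/shoes", "/product/red-sneaker"),
    ("/category/shoes", "/product/blue-sneaker"),
    ("/product/red-sneaker", "/guide/sneaker-care"),
]

graph = nx.DiGraph(internal_links)
scores = nx.pagerank(graph, alpha=0.85)

# A low-impression page with a high internal score is a linking hub,
# not a deindexing candidate.
for url, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {url}")
```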
In what cases does this rule absolutely not apply?
On very large technical sites (several million pages, an ultra-constrained crawl budget), intelligent deindexing changes the game. The same goes for poorly managed platform migrations, where thousands of old zombie URLs remain indexed and dilute signals, or for sites hit by massive internal spam (automated comments, polluted UGC, unsupervised auto-generated pages).
In these configurations, the impact of deindexing is not marginal; it is structural. But it is not "deindexing low-impression pages" that works, it is "eliminating the noise that prevents Google from understanding and efficiently crawling the strategic signal". A critical nuance that Google's statement completely glosses over.
Practical impact and recommendations
How do you concretely decide whether a page should be deindexed?
Start by segmenting your weak pages by type: products, categories, editorial content, SEO landing pages, technical pages. The decision criteria vary drastically with the nature of the content. A product page permanently out of stock? Deindex it. A deep-dive article with 3 impressions per month on a highly specialized query? Keep it, improve it, and strengthen its internal linking.
Then cross-reference several signals: Search Console impressions, actual clicks, conversions or micro-conversions (time spent, scroll depth, interactions), role in architecture (orphan page or linking hub?), content age, potential seasonality. A page with zero impressions in January can explode in June if it deals with a cyclical topic. Never rely on a single isolated metric to make your decision.
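As a way to formalize that cross-referencing, here is an illustrative triage sketch. Every field name and threshold is an assumption chosen to make the logic readable, not a rule Google publishes; treat it as a starting grid to tune against your own data.

```python
# Illustrative triage for one page record. All fields and thresholds
# are assumptions made for the example.
from dataclasses import dataclass

@dataclass
class PageSignals:
    impressions_12m: int
    clicks_12m: int
    conversions_12m: int
    referring_domains: int   # external backlinks
    internal_inlinks: int    # role in the site architecture
    is_seasonal: bool        # e.g. spiked in a previous June

def triage(p: PageSignals) -> str:
    """Return keep / improve / deindex based on crossed signals."""
    if p.conversions_12m > 0 or p.referring_domains > 0:
        return "keep"      # business or link value trumps low impressions
    if p.is_seasonal or p.internal_inlinks >= 5:
        return "improve"   # dormant content or structural linking hub
    if p.impressions_12m == 0 and p.clicks_12m == 0:
        return "deindex"   # no signal on any axis
    return "improve"

print(triage(PageSignals(0, 0, 0, 0, 1, False)))  # -> deindex
```

Note that the function never decides on impressions alone: a page only reaches the "deindex" branch after the business, backlink, seasonality and architecture checks have all come up empty.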
What mistakes should you absolutely avoid during a content audit?
Mistake #1: automating deindexing based on an arbitrary impression threshold ("everything under 10 impressions/month goes"). You will kill profitable long-tail content and dormant pages that can wake up on emerging trends. Mistake #2: not checking inbound backlinks; a page "weak" in impressions can receive quality links and play a role in internal PageRank.
Mistake #3: confusing "weak SEO performance" with "low business utility". A page that converts little but structures the user journey or meets a legal or informational obligation has its place in the index. Last classic mistake: deindexing without redirecting or adjusting internal linking; you create holes in the architecture and broken links that degrade the experience.
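Mistake #1 is easy to show in code. In the sketch below, a naive average-impression threshold would deindex a seasonal page that spikes once a year; the monthly series and the spike heuristic are invented for illustration only.

```python
# Sketch of mistake #1: a naive threshold vs. a check for dormant
# seasonal pages. Monthly impression series are assumed to come from
# a Search Console export; all numbers are illustrative.
monthly_impressions = {
    "/guide/christmas-returns": [0, 0, 0, 0, 0, 0, 0, 0, 0, 5, 28, 9],
    "/product/discontinued-widget": [2, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
}

THRESHOLD = 10  # "anything under 10 impressions/month goes" (the mistake)

for url, series in monthly_impressions.items():
    average = sum(series) / len(series)
    naive_verdict = "deindex" if average < THRESHOLD else "keep"
    has_seasonal_spike = max(series) > 5 * (average + 1)
    if naive_verdict == "deindex" and has_seasonal_spike:
        print(f"{url}: naive rule says deindex, but it spikes seasonally -> review")
    else:
        print(f"{url}: {naive_verdict}")
```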
What strategy should you adopt to maximize real impact?
Instead of deindexing massively, prioritize improvement and consolidation. Merge redundant content into stronger pillar pages. Rewrite weak pages with a more relevant or actionable angle. Strengthen internal linking to push PageRank toward under-exploited content. Deindexing should be the last resort, not the first reflex.
On very large sites, work in test-and-learn mode: deindex a homogeneous segment (e.g., 1,000 exhausted product pages), measure the impact over 4 to 6 weeks (crawl, indexing of strategic pages, overall traffic), then decide whether to expand or adjust. This iterative approach limits risk and gives you factual data rather than intuitions.
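Here is a minimal sketch of that measurement step, assuming you track three segment-level metrics before and six weeks after the deindexing. The metric names and values are placeholders for your own crawl-log and Search Console data, not real results.

```python
# Placeholder before/after comparison for a test-and-learn deindexing.
baseline = {"strategic_pages_crawled_per_day": 1200,
            "strategic_pages_indexed": 8400,
            "total_clicks_per_week": 15300}

after_6_weeks = {"strategic_pages_crawled_per_day": 1650,
                 "strategic_pages_indexed": 8900,
                 "total_clicks_per_week": 15400}

for metric, before in baseline.items():
    after = after_6_weeks[metric]
    delta_pct = 100 * (after - before) / before
    print(f"{metric}: {before} -> {after} ({delta_pct:+.1f}%)")
# Expand the deindexing only if crawl and indexing of strategic pages
# improve while overall clicks hold steady.
```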
- Segment your weak pages by type before any deindexing decision
- Cross-reference at least 4 signals: impressions, clicks, conversions, role in the architecture
- Check inbound backlinks and the page's role in internal linking before deindexing
- Prefer improving and merging content over pure removal
- Test on a restricted segment and measure the real impact before generalizing
- Always redirect deindexed URLs with residual value (backlinks, traffic history); a minimal sketch follows this list
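As a sketch of that last point, here is one hedged way to script the decision: removed URLs with referring domains get a 301 to a close parent page, and the rest return 410. The mapping data is invented; the output would feed whatever redirect mechanism your stack actually uses.

```python
# Hedged sketch: decide the HTTP fate of each removed URL.
# referring_domains and redirect targets are illustrative placeholders.
removed_urls = {
    "/product/red-sneaker-2019": {"referring_domains": 4,
                                  "redirect_to": "/category/sneakers"},
    "/tag/obsolete-filter": {"referring_domains": 0,
                             "redirect_to": None},
}

for url, info in removed_urls.items():
    if info["referring_domains"] > 0 and info["redirect_to"]:
        print(f"301 {url} -> {info['redirect_to']}")  # preserve link equity
    else:
        print(f"410 {url}")  # no residual value: signal permanent removal
```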
❓ Frequently Asked Questions
Should a page with zero impressions for 12 months be systematically deindexed?
Will deindexing 30% of my pages boost traffic to the remaining 70%?
How do you identify the pages that truly deserve to be deindexed?
Does deindexing affect the crawl budget on a 50,000-page site?
Should I redirect deindexed pages or simply set them to noindex?