Official statement
Mueller states that authoritative sites can maintain their overall ranking even with imperfect pages. The algorithm assesses quality at the domain level, not just page by page. This means an established site has some leeway on certain pages, but Google never clarifies how far that leeway goes.
What you need to understand
Does Google evaluate quality site by site or page by page?
Mueller's statement introduces a nuance that is rarely made explicit: the algorithm does not treat all pages in isolation. A site with strong overall authority can afford to have underperforming sections without compromising its general ranking.
In other words, Google looks at the entire domain to determine its credibility. A newspaper like Le Monde can publish a mediocre article — it won't lose all its visibility because of it. However, a niche blog with only 12 total pages does not have this latitude.
What qualifies as a “suboptimal” page according to Google?
Mueller remains deliberately vague about what constitutes a “not perfect” page. He gives no precise criteria: poor loading times? Thin content? High bounce rates? No internal links pointing to the page?
We know that Google hates outright terrible pages — such as automated thin content or pure duplication. But between “excellent” and “garbage,” there exists a vast gray area. This area is exactly where large sites can navigate with relative ease.
Does this tolerance apply to all types of sites?
No. And that's where many practitioners face challenges. A site with 5000 pages that has built authority over 10 years does not compete in the same league as a site with 150 pages that launched 18 months ago.
The “overall quality” that Mueller mentions is based on accumulated signals: backlink history, recurring traffic, brand mentions, user behavior. A young site simply hasn’t had the time to build this trust capital. Every page counts more — there is no safety net.
- Domain authority acts as a buffer: a few weak pages do not destroy the overall ranking of an established site.
- Smaller sites lack this leeway: every page carries proportionally more weight in the overall assessment.
- Google does not define where the threshold lies between “acceptable” and “penalizing”: it shifts from one site to another and over time.
- Tolerance is not permission: accumulating too many mediocre pages will eventually erode even an authoritative site.
- Context matters: a weak page in a minor section of the site has less impact than a key page in the user journey.
SEO Expert opinion
Does this statement align with what we observe in the field?
Yes, but with nuances that Mueller does not delve into. We regularly see major sites rank with objectively weak pages: catastrophic loading times, outdated content, broken internal linking. They hold on due to their overall authority.
However, the effect is not symmetrical. An average site that tries to improve a few star pages does not necessarily rise — because Google looks at the whole. Overall quality is as much a floor as it is a ceiling. A mediocre domain does not save three excellent pages if the rest pulls it down.
How far can we push this tolerance before it breaks?
Mueller provides no figures. Is it 5% of weak pages? 20%? 40%? No one knows, and Google has no interest in clarifying it.
What we observe is that sites accumulating too many zombie pages eventually suffer, even with strong authority. The Helpful Content Update hit established domains hard when they had inflated their index with mass-generated content. Tolerance exists, but it is not infinite, and the threshold shifts with algorithm changes.
What should you do if you inherit a large site with entire problematic sections?
This is the classic scenario in agencies: a site with 8000 pages where 3000 are dusty archives or auto-generated content. The temptation is to keep everything thinking “we have the authority, it will work out.”
Except that it won't. The right approach is a brutal audit, sorting by real SEO value (organic traffic, conversions, backlinks), then a page-by-page decision, as sketched below. Some pages deserve a redesign, others a merge, and many a 410 or a noindex. Domain authority is not an excuse to carry dead weight; it is capital to protect, not to squander.
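To make that sorting step concrete, here is a minimal sketch in Python. The field names, sample URLs, and weights are invented for the example; they are not Google guidance, only a way to get a review order before deciding page by page.

```python
# Illustrative "sort by real SEO value" step for an inherited large site.
# Field names and weights are assumptions for the example, not Google guidance.

pages = [
    {"url": "/buying-guide", "organic_clicks_6m": 4200, "conversions_6m": 35, "backlinks": 18},
    {"url": "/archives/2013/news-item", "organic_clicks_6m": 1, "conversions_6m": 0, "backlinks": 0},
    {"url": "/tag/auto-generated-742", "organic_clicks_6m": 0, "conversions_6m": 0, "backlinks": 0},
]

def seo_value(page: dict) -> float:
    """Composite score with arbitrary weights, only used to order the review queue."""
    return (page["organic_clicks_6m"]
            + 50 * page["conversions_6m"]
            + 20 * page["backlinks"])

# Review the lowest-value pages first: that is where the 410 / noindex decisions live.
for page in sorted(pages, key=seo_value):
    print(f"{seo_value(page):>8.0f}  {page['url']}")
```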
Practical impact and recommendations
How can you audit a site to detect these “suboptimal” pages?
First step: cross-reference Search Console and Google Analytics data. Identify indexed pages that generate fewer than 10 organic clicks over 6 months. Then analyze their backlink profile — if a page has no strategic internal links and zero external backlinks, it’s a candidate for pruning.
Next, look at behavioral metrics: bounce rate over 80%, time on page under 15 seconds, abnormal exit rates. These signals indicate that Google is sending traffic, but users are fleeing — classic signs of a weak page that ranks due to authority inertia.
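As a rough sketch of that cross-referencing step, assuming you have exported Search Console and Analytics data to CSV (file names and column names below are assumptions about your own exports; the thresholds are the ones mentioned above):

```python
# Flag candidate "suboptimal" pages by crossing Search Console and Analytics exports.
# File names and column names are assumptions about your own CSV exports.

import pandas as pd

gsc = pd.read_csv("search_console_last_6_months.csv")  # columns: url, clicks
ga = pd.read_csv("analytics_last_6_months.csv")        # columns: url, bounce_rate, avg_time_on_page

pages = gsc.merge(ga, on="url", how="left")

# Thresholds taken from this article: fewer than 10 clicks over 6 months,
# or a bounce rate above 80% combined with under 15 seconds on page.
weak = pages[
    (pages["clicks"] < 10)
    | ((pages["bounce_rate"] > 0.80) & (pages["avg_time_on_page"] < 15))
]

weak.to_csv("pruning_candidates.csv", index=False)
print(f"{len(weak)} candidate pages to review manually")
```

The output is only a shortlist: each flagged URL still needs the manual checks described below before any noindex or deletion.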
Should you always delete or noindex weak pages on a large site?
No. That would be a rookie mistake. Not every weak page is harmful: some serve ultra-specific queries, while others play a role in internal linking even without direct traffic.
The real question is: does this page add value to the site or the user? If the answer is no on both counts, then yes, noindex or delete. But if it serves as a bridge in your internal link architecture, or if it converts even with few visits, keep it and improve it rather than getting rid of it.
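One way to picture that two-part test is the hypothetical helper below; the field names and the 10-click cutoff are invented for the example, and the point is simply that a page is removed only when it fails on both user value and site value.

```python
# Hypothetical helper mirroring the two questions above: user value, then site value.
# Field names and thresholds are invented for the example.

def should_remove(page: dict) -> bool:
    """Suggest noindex/deletion only when the page fails on both counts."""
    user_value = page["organic_clicks_6m"] >= 10 or page["conversions_6m"] > 0
    site_value = page["is_internal_link_bridge"] or page["external_backlinks"] > 0
    return not (user_value or site_value)

# A page that converts despite very little traffic: keep it and improve it.
page = {
    "organic_clicks_6m": 3,
    "conversions_6m": 2,
    "is_internal_link_bridge": False,
    "external_backlinks": 0,
}
print(should_remove(page))  # False
```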
What strategy should be adopted for a new site that doesn’t yet have established authority?
Let’s be honest: you don’t get a second chance. Every published page must be solid — not perfect, but solid. No filler content, no mass-generated pages to “build volume.”
Favor depth over volume: 50 excellent pages beat 300 average pages when you are starting out. Once you have built authority (natural backlinks, recurring traffic, brand mentions), you can afford to expand, but not before.
- Audit low-performing pages quarterly (fewer than 10 clicks per month over the last 6 months).
- Prioritize redesign or merging rather than systematic deletion.
- Check the role in internal linking before any noindexing decision.
- Monitor changes post-Core Update: pages tolerated today may become problematic tomorrow.
- For new sites: zero compromise on quality until authority is established.
- Document pruning decisions to avoid repeating the same mistakes.
❓ Frequently Asked Questions
Can an authority site rank with 30% weak pages without being penalized?
Do weak pages on a large site affect crawl budget?
Should you noindex or delete weak pages on an established site?
Can a new site afford to have a few average pages?
How does Google measure the overall quality of a domain?