
Official statement

Sometimes, large websites may have pages that aren't perfect, but the site as a whole is of good quality, allowing them to rank well despite some suboptimal pages.
🎥 Source video

Extracted from a Google Search Central video (statement at 2:07)

⏱ 43:37 💬 EN 📅 23/08/2019 ✂ 9 statements
Watch on YouTube (2:07) →
Other statements from this video (8)
  1. 7:31 Should you really flag the medical review of your health content in structured data?
  2. 9:02 Does AMP/mobile equivalence really impact Google rankings?
  3. 10:08 Why does blocking a page via robots.txt prevent Google from seeing your noindex tag?
  4. 11:07 Should you really include a GTIN in your product structured data?
  5. 14:30 Do stock images really hurt your Google Images rankings?
  6. 17:38 Why hasn't your site switched to mobile-first indexing yet?
  7. 20:20 How does Google really handle duplicate content in search results?
  8. 36:10 Is two-wave JavaScript indexing really on its way out?
📅 Official statement from 23/08/2019 (6 years ago)
TL;DR

Mueller states that authoritative sites can maintain their overall ranking even with imperfect pages. The algorithm assesses quality at the domain level, not just page by page. This means that an established site has some leeway on certain pages — but how far, Google never clarifies.

What you need to understand

Does Google evaluate quality site by site or page by page?

Mueller's statement introduces a nuance that is rarely made explicit: the algorithm does not treat all pages in isolation. A site with strong overall authority can afford to have underperforming sections without compromising its general ranking.

In other words, Google looks at the entire domain to determine its credibility. A newspaper like Le Monde can publish a mediocre article — it won't lose all its visibility because of it. However, a niche blog with only 12 total pages does not have this latitude.

What qualifies as a “suboptimal” page according to Google?

Mueller remains deliberately vague about what constitutes a “not perfect” page. No precise metrics are given: slow loading times? Thin content? High bounce rates? No internal links pointing to the page?

We know that Google hates outright terrible pages — such as automated thin content or pure duplication. But between “excellent” and “garbage,” there exists a vast gray area. This area is exactly where large sites can navigate with relative ease.

Does this tolerance apply to all types of sites?

No. And that's where many practitioners face challenges. A site with 5000 pages that has built authority over 10 years does not compete in the same league as a site with 150 pages that launched 18 months ago.

The “overall quality” that Mueller mentions is based on accumulated signals: backlink history, recurring traffic, brand mentions, user behavior. A young site simply hasn’t had the time to build this trust capital. Every page counts more — there is no safety net.

  • Domain authority acts as a buffer: a few weak pages do not destroy the overall ranking of an established site.
  • Smaller sites lack this leeway: every page carries proportionally more weight in the overall assessment.
  • Google does not define where the threshold lies between “acceptable” and “penalizing”: the line shifts from one update to the next.
  • Tolerance is not permission: accumulating too many mediocre pages will eventually erode even an authoritative site.
  • Context matters: a weak page in a minor section of the site has less impact than a key page in the user journey.

SEO Expert opinion

Does this statement align with what we observe in the field?

Yes, but with nuances that Mueller does not delve into. We regularly see major sites rank with objectively weak pages: catastrophic loading times, outdated content, broken internal linking. They hold on due to their overall authority.

However, the effect is not symmetrical. An average site that tries to improve a few star pages does not necessarily rise — because Google looks at the whole. Overall quality is as much a floor as it is a ceiling. A mediocre domain does not save three excellent pages if the rest pulls it down.

How far can we push this tolerance before it breaks?

Mueller provides no figures. Is it 5% of weak pages? 20%? 40%? No one knows, and Google has no interest in clarifying it.

What we observe is that sites that accumulate too many zombie pages eventually suffer, even with strong authority. The Helpful Content Update hit hard for established domains that inflated their index with mass-generated content. Tolerance exists, but it is not infinite — and the threshold shifts with algorithm changes.

Warning: Do not confuse “temporary tolerance” and “viable strategy.” Relying on domain authority to compensate for mediocre pages is playing with fire. Core updates can redefine what is acceptable overnight.

What should you do if you inherit a large site with entire problematic sections?

This is the classic scenario in agencies: a site with 8000 pages where 3000 are dusty archives or auto-generated content. The temptation is to keep everything thinking “we have the authority, it will work out.”

Except that it won't. The right approach is: brutal audit, sorting by real SEO value (organic traffic, conversions, backlinks), then decision page by page. Some pages deserve a redesign, others a merger, and many a 410 or a noindex. Domain authority is not an excuse to carry dead weight — it is a capital to protect, not squander.
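The triage described above (sort by real SEO value, then decide page by page) can be sketched as a small decision function. This is a minimal illustration with invented thresholds and field names, not a real audit tool:

```python
from dataclasses import dataclass

@dataclass
class Page:
    """Minimal page record; fields and thresholds are illustrative assumptions."""
    url: str
    organic_clicks_6m: int   # e.g. from a Search Console export
    conversions_6m: int
    external_backlinks: int
    internal_inlinks: int

def triage(page: Page) -> str:
    """Rough page-by-page decision mirroring the audit logic above:
    keep pages with real SEO value, improve or merge borderline ones,
    and prune (410 / noindex) true dead weight."""
    has_value = (page.organic_clicks_6m >= 10
                 or page.conversions_6m > 0
                 or page.external_backlinks > 0)
    if has_value:
        return "keep"                 # earns its place in the index
    if page.internal_inlinks > 0:
        return "improve-or-merge"     # still serves the link architecture
    return "prune"                    # candidate for a 410 or noindex

# Example: an orphan archive page with no traffic, links, or conversions
dead = Page("https://example.com/old-archive", 0, 0, 0, 0)
print(triage(dead))  # → prune
```

The point of the sketch is the ordering: value signals first, internal-linking role second, deletion only as the last resort.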

Practical impact and recommendations

How can you audit a site to detect these “suboptimal” pages?

First step: cross-reference Search Console and Google Analytics data. Identify indexed pages that generate fewer than 10 organic clicks over 6 months. Then analyze their backlink profile — if a page has no strategic internal links and zero external backlinks, it’s a candidate for pruning.

Next, look at behavioral metrics: bounce rate over 80%, time on page under 15 seconds, abnormal exit rates. These signals indicate that Google is sending traffic, but users are fleeing — classic signs of a weak page that ranks due to authority inertia.
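As a sketch of that cross-referencing step, the following assumes you have exported Search Console clicks and Analytics behavioral metrics as CSVs. The column names (`clicks_6m`, `bounce_rate`, `avg_time_s`) are assumptions about your export format, and the thresholds mirror the ones mentioned above:

```python
import csv

# Thresholds taken from the audit criteria described in the text
MIN_CLICKS_6M = 10     # organic clicks over 6 months
MAX_BOUNCE = 0.80      # bounce rate above 80% is a red flag
MIN_TIME_ON_PAGE = 15  # seconds

def flag_weak_pages(gsc_csv: str, ga_csv: str) -> list[str]:
    """Cross-reference a Search Console export (clicks per URL) with an
    Analytics export (bounce rate, time on page) and return the URLs
    that fail either the traffic or the behavioral test."""
    clicks = {}
    with open(gsc_csv, newline="") as f:
        for row in csv.DictReader(f):
            clicks[row["url"]] = int(row["clicks_6m"])
    weak = []
    with open(ga_csv, newline="") as f:
        for row in csv.DictReader(f):
            url = row["url"]
            low_traffic = clicks.get(url, 0) < MIN_CLICKS_6M
            bad_behavior = (float(row["bounce_rate"]) > MAX_BOUNCE
                            or float(row["avg_time_s"]) < MIN_TIME_ON_PAGE)
            if low_traffic or bad_behavior:
                weak.append(url)
    return weak
```

The output is only a candidate list for pruning: each flagged URL still needs the role check (internal linking, conversions) described below before any noindex or deletion decision.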

Should you always delete or noindex weak pages on a large site?

No. That would be a rookie mistake. Not every weak page is harmful: some serve ultra-specific queries, while others play a role in internal linking even without direct traffic.

The real question is: does this page add value to the site or the user? If the answer is no on both counts, then yes, noindex or delete. But if it serves as a bridge in your internal link architecture, or if it converts even with few visits, keep it and improve it rather than getting rid of it.

What strategy should be adopted for a new site that doesn’t yet have established authority?

Let’s be honest: you don’t get a second chance. Every published page must be solid — not perfect, but solid. No filler content, no mass-generated pages to “build volume.”

Focus on density rather than quantity. 50 excellent pages beat 300 average pages when you're starting out. Once you’ve built authority (natural backlinks, recurring traffic, brand mentions), you can afford to expand — but not before.

  • Audit low-performing pages quarterly (fewer than 10 clicks/month over the last 6 months).
  • Prioritize redesign or merging rather than systematic deletion.
  • Check the role in internal linking before any noindexing decision.
  • Monitor changes post-Core Update: pages tolerated today may become problematic tomorrow.
  • For new sites: zero compromise on quality until authority is established.
  • Document pruning decisions to avoid repeating the same mistakes.
Google's tolerance for weak pages on large sites is real, but vague and variable. It is neither permission nor a sustainable strategy. Regular auditing, prioritizing quality over quantity, and continuous adjustments remain essential — regardless of site size. These structural optimizations, combined with complex architectural choices, can quickly become time-consuming and technical. Engaging a specialized SEO agency allows you to benefit from expert insight and tailored support to navigate these trade-offs without risking domain authority.

❓ Frequently Asked Questions

Can an authority site rank with 30% weak pages and no penalty?
There is no official threshold communicated by Google. In practice, tolerance exists but remains limited and varies with algorithm updates. Beyond a certain volume of mediocre pages, even an established site eventually suffers.
Do weak pages on a large site affect crawl budget?
Yes, indirectly. Google wastes time crawling pages with no value, which can delay the indexing of important content. Strategic pruning improves crawl efficiency and focuses Googlebot's attention on high-value pages.
Should you noindex or delete weak pages on an established site?
It depends on their role. If they contribute nothing (no traffic, no conversions, no use in the internal linking), remove them with a 410. If they serve the site's architecture, improve them rather than blindly deindexing them.
Can a new site afford a few average pages?
No. Without established authority, every page counts proportionally more in the overall assessment. Better to publish 30 solid pages than 100 pages of which 40 are mediocre: overall quality matters from day one.
How does Google measure the overall quality of a domain?
Google does not detail its method, but we know it combines several signals: backlink profile, user behavior, domain history, brand mentions, recurring organic traffic. It is a composite algorithm, not a single metric.

