Official statement

A manual action for 'thin content with little or no added value' is applied to sites containing a significant percentage of low-quality or superficial pages that do not provide much value to users.
🎥 Source video

Extracted from a Google Search Central video (5:49, in English, published 18/06/2020); this statement appears at 1:37. Six statements were extracted from the video in total.
Other statements from this video (5):
  1. 0:31 Google manual actions: how much of your site's ranking is really under human control?
  2. 1:04 Google's 'Pure spam' label: how do you avoid costly Black Hat SEO penalties?
  3. 1:37 Does Google really penalize manipulative structured data?
  4. 3:11 Do you really have to fix ALL pages to get a manual action lifted?
  5. 4:15 Manual actions vs. security issues: do you really know the difference?
TL;DR

Google applies manual actions to sites with a significant percentage of superficial pages that provide little real value to the user. This penalty specifically targets thin content generated in bulk, not legitimate short pages. For an SEO, this means auditing your site to identify at-risk areas and prioritizing editorial depth over the multiplication of hollow pages.

What you need to understand

What is a manual action for low-value content?

A manual action is a penalty imposed by a human reviewer at Google, not by an algorithm. Unlike automated filters like Panda, this sanction requires human intervention after a site has been flagged or audited. The mention of 'thin content with little or no added value' targets sites that stuff their index with hollow pages.

Google does not penalize an isolated low-quality page. The key lies in the significant percentage: if a large share of your index consists of superficial content, you enter the danger zone. A site with 1,000 pages, 50 of which are empty product listings, is unlikely to be affected. A site with 800 automatically generated pages and no unique content, however, will be.

Why does Google distinguish this type of manual action?

This specific penalty exists because certain patterns of editorial spam escape algorithmic filters. Content farms, aggregators with no added value, sites scraping RSS feeds with three lines of commentary — all these practices require a human eye to be identified with certainty.

The manual action also serves as a teaching message. Google explicitly signals the problem to you via Search Console, unlike an algorithmic drop where you have to guess what is wrong. It's a double-edged sword: you know what to fix, but you are branded until a reconsideration request is approved.

What types of content are specifically targeted?

Classic targets include satellite pages generated for every keyword combination ('plumber Paris 15', 'plumber Paris 16', etc. with the same template), affiliate sites without original content, aggregators of classifieds without curation, and blogs stuffing their index with micro-articles of 100 words to capture long-tail traffic.

More subtly: e-commerce sites with hundreds of empty or nearly identical product listings, or media outlets republishing AFP dispatches without analysis. The criterion is not raw length, but the absence of distinctive value. A short but complete product listing can be legitimate. Thirty variations of the same generic text? Not so much.

  • Significant percentage: Google does not provide a specific threshold, but field observations suggest that beyond 30-40% of hollow pages, the risk increases
  • User value: the central criterion is not length, but whether the page genuinely addresses an intention or merely aims to capture traffic
  • Manual vs algorithmic action: Panda can downrank without notification, while manual action explicitly alerts you in Search Console
  • Reversibility: unlike some penalties, this one can be lifted after corrections and a reconsideration request approved by a quality rater
  • Scope of the penalty: it can affect the entire site or specific sections, depending on the extent of the problem

SEO Expert opinion

Is this policy consistently applied?

To be honest, enforcement is erratic. Sites with thousands of automatically generated pages thrive for years, while legitimate media outlets face manual actions over sparse archives. The issue is that human reviewers apply guidelines that are open to interpretation, and part of the process remains opaque.

Field observations suggest that highly competitive sectors (finance, health, e-commerce) are scrutinized more intensely. A small thematic blog with average content may fly under the radar, whereas a news aggregator with the same quality-to-volume ratio will be flagged. [To be verified]: no official data confirms this sector prioritization, but observed penalty patterns suggest a targeted focus on high E-E-A-T verticals.

Is the threshold for 'significant percentage' truly measurable?

Google provides no precise numbers. The phrase 'significant percentage' is purposely vague to avoid gaming: if Google said 'over 30%', spammers would calibrate their farms to 29%. But for a practitioner wanting to assess their risk, it’s frustrating.

Based on successful reconsideration feedback, the threshold seems to be somewhere between 25% and 50% of the index. But caution: this is not a mathematical rule. A site with 20% of strategically placed hollow pages on competitive queries may be penalized, while another with 40% of old, rarely visited pages may pass. Context matters as much as the raw ratio. [To be verified]: these thresholds are field extrapolations, not official confirmations.

Can you be penalized for legitimate but short content?

Yes, and this is where it gets tricky. Google claims not to penalize length per se, but to evaluate user satisfaction. In practice, some legitimate pages — short definitions, currency converters, simple calculators — can be misjudged by a rushed quality rater.

I have seen sites with useful micro-tools (password generators, unit converters) receive this penalty because each tool was on a distinct page with little text. The human rater saw thin content where it was functionally justified. If you have this type of architecture, clearly document the intent in your reconsideration requests — and consider consolidating pages if they lack independent traffic.

Attention: if you receive this manual action, do not delete your lightweight content en masse without analysis. Google wants to see qualitative improvement, not just a reduction of the index. De-indexing 50% of your site can worsen the situation if the remaining pages add no more value.

Practical impact and recommendations

How can I identify at-risk content on my site?

First step: export your complete index via Search Console (Coverage report) or a Screaming Frog/Oncrawl crawl. Cross-reference these URLs with your Analytics data to isolate pages with low engagement (bounce rate >80%, time on page <20 seconds, zero conversions). These metrics often signal content that does not meet intention.

Next, segment by type of content: product sheets, blog articles, category pages, geolocalized landing pages, etc. Calculate for each segment the ratio of low-value pages to total pages. If a specific segment (e.g., 500 nearly identical geo landings) represents >30% of your index with poor engagement, that’s your point of weakness. Prioritize that for enrichment or consolidation.
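The segmentation step above can be sketched as a small script. This is a minimal illustration, assuming you have merged your crawl export and analytics metrics into rows with hypothetical field names (`segment`, `bounce`, `time_on_page`); the 80% bounce, 20-second, and 30% thresholds are the article's heuristics, not official Google values.

```python
"""Sketch of a thin-content risk audit per content segment.
Assumes one merged export (crawl + analytics); in practice you would
read it with csv.DictReader. Thresholds are illustrative heuristics."""
from collections import defaultdict

BOUNCE_MAX = 0.80   # >80% bounce rate suggests unmet intent
TIME_MIN = 20       # <20 seconds on page
RISK_RATIO = 0.30   # segment flagged above ~30% thin pages

def audit(rows):
    """rows: iterable of dicts with keys segment, bounce, time_on_page."""
    totals, thin = defaultdict(int), defaultdict(int)
    for row in rows:
        seg = row["segment"]
        totals[seg] += 1
        # A page counts as "thin" when both engagement signals are bad
        if float(row["bounce"]) > BOUNCE_MAX and float(row["time_on_page"]) < TIME_MIN:
            thin[seg] += 1
    report = {}
    for seg, n in totals.items():
        ratio = thin[seg] / n
        report[seg] = {"pages": n, "thin": thin[seg],
                       "ratio": round(ratio, 2), "at_risk": ratio > RISK_RATIO}
    return report

# Example with in-memory rows (stand-ins for your real export)
rows = [
    {"segment": "geo-landing", "bounce": "0.92", "time_on_page": "8"},
    {"segment": "geo-landing", "bounce": "0.88", "time_on_page": "12"},
    {"segment": "geo-landing", "bounce": "0.40", "time_on_page": "95"},
    {"segment": "blog", "bounce": "0.35", "time_on_page": "120"},
]
print(audit(rows))  # geo-landing flagged at_risk, blog not
```

The per-segment ratio, not the site-wide average, is what surfaces a weak section such as the 500 near-identical geo landings described above.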

What corrective actions truly work?

Consolidation is often more effective than massive enrichment. If you have 50 'plumber [city]' pages with the same template, merge them into a comprehensive regional page with a city selector. This reduces thin content and enhances the user experience. 301-redirect the old URLs to the consolidated page.

For content worth keeping but currently insufficient, enrichment must be substantial. Simply going from 150 to 300 words with filler isn't enough. Add distinctive elements: specific testimonials, FAQs, comparisons, local numerical data. The reviewer must see real editorial effort, not algorithmic padding. And if some pages have no traffic or potential, consider de-indexing them (noindex, or removal plus a 410) rather than keeping them out of inertia.
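The consolidation-plus-redirect step can be mechanized. The sketch below, with hypothetical URLs and a made-up regional target, builds the 301 mapping for merged city pages and renders it as nginx `rewrite ... permanent` lines (one common server syntax); pages you decide to keep are excluded from the mapping.

```python
"""Minimal sketch: generate 301 redirect rules when consolidating
near-duplicate 'plumber [city]' pages into one regional page.
All URLs and the target path are hypothetical examples."""

REGIONAL_TARGET = "/plumber-ile-de-france/"  # assumed consolidated page

def redirect_map(city_urls, keep=()):
    """Map each thin city URL to the regional page, skipping URLs
    worth keeping (real traffic or genuinely unique content)."""
    return {url: REGIONAL_TARGET for url in city_urls if url not in keep}

def nginx_rules(mapping):
    """Render the mapping as nginx rewrite directives (301s)."""
    return [f"rewrite ^{src}$ {dst} permanent;" for src, dst in mapping.items()]

cities = ["/plumber-paris-15/", "/plumber-paris-16/", "/plumber-paris-17/"]
mapping = redirect_map(cities, keep=["/plumber-paris-15/"])
for rule in nginx_rules(mapping):
    print(rule)
```

Generating the rules from one mapping keeps redirects consistent and avoids the chained or broken 301s that often plague manual cleanups.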

Should you wait for a penalty or act preventively?

Acting preventively is infinitely more cost-effective. Once the manual action is applied, your visibility collapses, and the reconsideration process takes weeks or even months. If your audit reveals a structural risk (>40% low-value pages, patterns of automatic generation, heavy reliance on repetitive templates), plan a gradual redesign.

Prioritize sections with high organic traffic: it’s better to correct 100 strategic pages than to spread your efforts over 1,000 zombie pages. And document your changes in a tracking file — if you do receive a penalty later, you can demonstrate your qualitative intent in the reconsideration request. Google values sites that show a trajectory of continuous improvement.
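The prioritize-and-document workflow above can be sketched in a few lines: rank pages by organic traffic, take the strategic top slice, and append each fix to a dated change log you can attach to a reconsideration request. Field names and the example figures are illustrative.

```python
"""Sketch: prioritize remediation by organic traffic and keep a dated
CSV change log as proof of work. Data and field names are made up."""
import csv
import io
from datetime import date

def prioritize(pages, top_n=100):
    """Sort pages by organic sessions, descending; fix these first."""
    return sorted(pages, key=lambda p: p["sessions"], reverse=True)[:top_n]

def log_change(buffer, url, action):
    """Append one dated row (date, url, action) to a CSV change log."""
    csv.writer(buffer).writerow([date.today().isoformat(), url, action])

pages = [
    {"url": "/guide-a/", "sessions": 1200},
    {"url": "/geo-x/", "sessions": 3},      # zombie page: not a priority
    {"url": "/guide-b/", "sessions": 450},
]
todo = prioritize(pages, top_n=2)
log = io.StringIO()  # in practice: open("changelog.csv", "a")
for p in todo:
    log_change(log, p["url"], "enriched with FAQ + comparison table")
print([p["url"] for p in todo])
```

The log is deliberately boring: date, URL, action. That trail is exactly what demonstrates a trajectory of continuous improvement to a reviewer.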

  • Audit the complete index and cross-reference with user engagement metrics (Analytics, Search Console)
  • Identify at-risk content segments (ratio of low-value pages >30% within a category)
  • Prioritize consolidation over enrichment: grouping redundant content enhances structure and UX
  • For retained content, enrich significantly with distinctive elements (not just lengthening)
  • Properly de-index pages with no value or potential (noindex or 410, not crawlable orphans)
  • Document all changes to facilitate a possible reconsideration request
Correcting a risky content architecture requires a global strategic vision: quantitative and qualitative audit, prioritization by business impact, gradual redesign with rigorous tracking. These tasks are often complex to manage alone, especially on sites with thousands of pages. Bringing in a specialized SEO agency can be wise to structure the approach, avoid technical pitfalls (bad redirects, increased cannibalization), and speed up results without tying up your internal resources for months.

❓ Frequently Asked Questions

Does a manual action for thin content affect the entire site or only certain sections?
It depends on the extent of the problem. Google can apply the penalty to the whole site if thin content is widespread, or target specific sections (e.g., only blog pages or product listings). The Search Console notification specifies the scope.
How long does it take to lift a manual action after corrections?
Once the reconsideration request is submitted, Google generally responds within 1 to 3 weeks. If the corrections are deemed insufficient, the request is rejected and you must improve further before resubmitting. Several iterations are sometimes necessary.
Can you be penalized for AI-generated content deemed low value?
Yes, if the AI content is superficial, repetitive, or adds no distinctive value. Google does not penalize AI as such, but evaluates the final quality. Well-supervised, enriched, and editorialized AI content can pass; generic AI spam cannot.
Are paginated pages or e-commerce filters considered thin content?
Not if they are technically well managed (canonical tags, noindex on non-strategic filters). The problem arises when these pages are indexable in bulk without editorial differentiation, creating thousands of crawlable, near-identical URLs.
Should low-value pages be deleted or noindexed?
It depends. If the page has traffic or conversion potential, enrich it. If it has neither traffic nor utility, de-index it (noindex) or delete it with a 301 redirect to a relevant page. Avoid leaving crawlable orphan pages with no value.

