Official statement

To resolve a manual action, you need to fix the issue on ALL affected pages. Fixing only some pages will not resolve the issue. A good reconsideration request must explain the exact problem, detail the steps taken to fix it, and document the outcome of those efforts.
🎥 Source video (statement at 3:11)

Extracted from a Google Search Central video

⏱ 5:49 💬 EN 📅 18/06/2020 ✂ 6 statements
Other statements from this video (5)
  1. 0:31 Google manual actions: how much human control really goes into your site's ranking?
  2. 1:04 Google's "Pure spam": how to avoid costly black-hat SEO penalties?
  3. 1:37 How does Google actually penalize low-value-added content?
  4. 1:37 Does Google really penalize manipulative structured data?
  5. 4:15 Manual actions vs. security issues: do you really know the difference?
TL;DR

Google confirms that a manual action can only be lifted by correcting all affected pages, not just a sample. The reconsideration request must accurately document the identified issue, the corrective actions taken on each page, and the results achieved. Partial fixes delay the lifting of the penalty and can even lead to outright rejection of the reconsideration request.

What you need to understand

What is a manual action and why does it require total correction?

A manual action occurs when a human reviewer at Google detects a clear violation of quality guidelines. Unlike automated algorithmic adjustments, these penalties target deliberate manipulations: link spam, automatically generated content, cloaking, hidden text.

Google's position is unequivocal: fixing 60% or 80% of the flagged pages is not enough. The signal sent to the engine is still that of a site violating the rules, even partially. An incomplete fix maintains the risk of manipulation and justifies the continuation of the penalty.

Why does Google reject partial fixes?

The logic is about trust. If a site demonstrates that it understands the problem, it must show its ability to eradicate it everywhere. A selective fix suggests either a misunderstanding of the issue or a desire to retain some manipulative practices.

In practice, Google's reviewers do not check page by page. They sample. If the sample still contains violations, the reconsideration request is rejected, even if 90% of the site is clean. It's harsh, but consistent with the philosophy of zero tolerance for black-hat practices.

How to structure a reconsideration request that gets approved?

Google requires three documented elements: precise identification of the problem, a description of the corrective actions taken, and proof of the result. In concrete terms: if the manual action concerns spam links, you need to list the toxic domains, document the removal requests sent, and submit a complete disavow file.

The classic trap: a vague request like "We cleaned the site." Google wants URLs, before/after examples, screenshots if relevant. The more detailed the documentation, the higher the chances of approval.

  • Comprehensive inventory: identify all affected pages via Search Console, not just a representative sample
  • Systematic correction: address every occurrence of the problem, even on low-traffic or orphan pages
  • Detailed documentation: timestamp actions, keep evidence (exports, screenshots), prepare a structured report
  • Internal validation: crawl the site after correction to ensure no problematic pages remain
  • Precise reconsideration request: explain the problem in Google's terms, describe actions page by page, attach evidence
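
The three-block structure described above can be sketched as a small report generator. This is a minimal illustration, not an official Google schema; the field names and sample figures are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class ReconsiderationRequest:
    """Three-block structure a reconsideration request should follow:
    identified problem, actions taken, result. Illustrative only."""
    problem: str                                        # exact issue, in Google's own terms
    actions: list[str] = field(default_factory=list)    # one entry per corrective step
    result: str = ""                                    # measurable outcome of the fixes

    def render(self) -> str:
        # Emit a compact, structured text block a reviewer can scan in minutes.
        lines = ["IDENTIFIED PROBLEM", self.problem, "", "ACTIONS TAKEN"]
        lines += [f"- {a}" for a in self.actions]
        lines += ["", "RESULT", self.result]
        return "\n".join(lines)

# Hypothetical example with made-up numbers, for illustration only.
request = ReconsiderationRequest(
    problem="Manual action for unnatural inbound links (pattern: paid blog networks).",
    actions=[
        "Inventoried 412 toxic domains from Search Console, Ahrefs and Majestic exports.",
        "Sent removal requests to all 412 webmasters; 147 links removed.",
        "Disavowed the 265 remaining domains via disavow.txt.",
    ],
    result="Zero remaining links from the flagged networks; post-correction crawl attached.",
)
print(request.render())
```

Keeping the request in a structure like this makes it easy to stay within the 300–500 word budget while still attaching exhaustive evidence separately.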

SEO expert opinion

Is this requirement realistic for large sites?

Let’s be honest: fixing 100% of the pages on a site with 50,000 URLs can take weeks, even months. Google knows this. However, the company believes that if a site has had the resources to generate spam on a large scale, it must have the resources to clean it up. This is as much an ideological stance as it is technical.

In practice, sites that manage to lift manual actions quickly are those that automate detection and correction. Python scripts to identify spam patterns, htaccess rules to purge automatically generated pages, crawling tools to validate comprehensiveness. Without automation, it’s mission impossible on a large site. [To be confirmed]: does Google accept a documented progressive correction with intermediate milestones? Reports from the field are contradictory.
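
As a minimal sketch of that automated detection, assuming a full-site crawl is already available as a URL-to-HTML mapping. The spam signatures here are hypothetical placeholders; real ones would come from the example pages Google cites in the manual action notice:

```python
import re

# Hypothetical spam signatures, for illustration only.
SPAM_PATTERNS = [
    re.compile(r"buy\s+cheap\s+\w+", re.IGNORECASE),   # doorway-style keyword stuffing
    re.compile(r'<div style="display:\s*none">'),       # hidden text
]

def flag_spam_pages(pages: dict[str, str]) -> list[str]:
    """Return the URLs whose HTML matches any known spam pattern.

    `pages` maps URL -> raw HTML, e.g. the output of a full-site crawl.
    """
    return sorted(
        url for url, html in pages.items()
        if any(p.search(html) for p in SPAM_PATTERNS)
    )

# Toy crawl output standing in for a real Screaming Frog / Oncrawl export.
crawl = {
    "https://example.com/about": "<p>Who we are</p>",
    "https://example.com/p123": "<p>Buy cheap widgets now!</p>",
    "https://example.com/p456": '<div style="display:none">casino poker</div>',
}
print(flag_spam_pages(crawl))  # flags the two spam pages, not /about
```

Running the same scan a second time after correction (expecting an empty list) doubles as the internal validation step.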

What real tolerance exists in the validation?

The official statement says "all pages." Experience shows that Google tolerates a minimal margin of error—let’s say 1 to 2% residual pages on very large sites. But it’s case-by-case, and relying on this is risky.

What never passes: leaving entire sections uncorrected, only fixing indexed pages while ignoring those in noindex, or addressing main pages while neglecting annexes. Human reviewers sample randomly. If a problematic page appears in their sample, it’s an immediate rejection. The cost of a second reconsideration request is high: extended delays, prolonged traffic loss, negative signals sent to Google.

How to avoid the traps of the reconsideration request?

The first classic mistake: minimizing the problem. "We deleted a few questionable links" when Google detected a systematic link pattern. The reviewer interprets this as denial or misunderstanding. Result: rejection.

The second trap: overwhelming Google with irrelevant documentation. An effective reconsideration request should be 300–500 words maximum, structured into three blocks (identified problem, actions taken, result). Attaching a 10,000-line spreadsheet without clear explanation is useless. The reviewer has 2-3 minutes per case—it must be surgical.

A word of caution: submitting a reconsideration request before completing the corrections is counterproductive. Each rejection extends processing times and creates a negative history. It is better to take an extra two weeks to validate comprehensiveness than to rush an incomplete request.

Practical impact and recommendations

How to identify ALL pages affected by a manual action?

Search Console lists sample pages, not an exhaustive inventory. You need to cross-reference several sources. The first reflex: crawl the entire site with Screaming Frog or Oncrawl in "all pages" mode, not just the indexed ones. Pages that are noindexed or blocked by robots.txt may also be affected and need fixing.

Next, identify the common pattern. If the manual action concerns duplicate content, use textual similarity detection tools (Siteliner, Copyscape in bulk) to map all occurrences. If it's link spam, export the full profile from Google Search Console, Ahrefs, and Majestic—the three databases do not fully overlap. A toxic URL may be invisible in Search Console but glaring in Ahrefs.
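
A minimal sketch of that cross-referencing, assuming each tool's export is a CSV with a linking-domain column. The column names and sample data below are placeholders, not the tools' actual export formats:

```python
import csv
import io

def domains_from_csv(csv_text: str, column: str) -> set[str]:
    """Extract the linking-domain column from one tool's CSV export,
    normalized to lowercase so the three sources can be unioned."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row[column].strip().lower() for row in reader if row[column].strip()}

# Toy exports standing in for real Search Console / Ahrefs / Majestic files.
gsc = "Linking site\nspamnet.example\nblog-a.example\n"
ahrefs = "Referring Domain\nspamnet.example\nlinkfarm.example\n"
majestic = "Root Domain\nlinkfarm.example\nblog-b.example\n"

# Set union: a domain missing from one database but present in another
# still ends up in the master inventory.
all_domains = (
    domains_from_csv(gsc, "Linking site")
    | domains_from_csv(ahrefs, "Referring Domain")
    | domains_from_csv(majestic, "Root Domain")
)
print(sorted(all_domains))
```

The union is the point: each tool sees only part of the link graph, so the master list must be built from all three before any removal or disavow work begins.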

What mistakes to avoid during correction?

Do not confuse deletion and correction. Massively deleting problematic pages can generate cascading 404 errors and worsen the situation. If pages have residual organic traffic or legitimate backlinks, it's better to rewrite or properly redirect them.
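
That triage between rewriting, redirecting, and deleting can be sketched as a simple decision rule. The thresholds and return strings are illustrative assumptions, not Google guidance:

```python
def disposition(url: str, organic_visits: int, backlinks: int) -> str:
    """Decide how to handle a problematic page: redirect, rewrite, or delete.

    Thresholds are illustrative; tune them to the site's own profile.
    """
    if backlinks > 0:
        return "301 redirect to the closest legitimate page"  # preserve link equity
    if organic_visits > 0:
        return "rewrite to comply with guidelines"            # keep residual traffic
    return "delete and return 410"                            # nothing worth preserving

print(disposition("/spam-page-1", organic_visits=0, backlinks=3))
print(disposition("/spam-page-2", organic_visits=40, backlinks=0))
print(disposition("/spam-page-3", organic_visits=0, backlinks=0))
```

Applying a rule like this consistently across the inventory avoids the cascading-404 problem while still eradicating every occurrence.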

Another frequent mistake: making surface corrections without addressing the root cause. If the manual action stems from spam comments, disabling comments on new pages without purging the old ones does not solve anything. Google wants to see a complete eradication of the problem, not cosmetic patching.

How to document corrections effectively for the reconsideration request?

Google expects verifiable evidence. For a manual action on spam links: a list of disavowed domains (with disavow date), screenshots of removal request emails sent (showing that contact was attempted before disavowal), export of the disavow.txt file submitted via Search Console.
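
A disavow file uses one `domain:` line per domain to disavow, bare URLs for individual links, and `#` lines for comments. A minimal sketch of generating one from the toxic-link inventory (the helper name and sample domains are hypothetical):

```python
def build_disavow(domains: list[str], urls: list[str]) -> str:
    """Render a disavow file in the format Search Console accepts:
    `domain:` lines for whole domains, bare lines for single URLs,
    `#` comments for context."""
    lines = ["# Disavow file generated after failed removal requests"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]   # dedupe and sort
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"

# Hypothetical inventory, for illustration only.
text = build_disavow(
    domains=["spamnet.example", "linkfarm.example"],
    urls=["https://blog-a.example/paid-post-42"],
)
print(text)
```

Keeping the generated file under version control, alongside dated exports of the removal-request emails, gives the reconsideration request exactly the timestamped evidence trail described above.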

For problematic content: corrected URLs with before/after mentions, changes made (rewriting, deletion, merging), validation that the content now respects guidelines. Some add screenshots, but it’s not essential if the description is precise. The key is to prove that the problem has been understood and systematically addressed, not approached haphazardly.

  • Crawl the entire site (indexed, non-indexed, orphan pages) to map affected pages
  • Identify the pattern or root cause of the manual action through sampling and comparative analysis
  • Correct or delete 100% of occurrences, with no exceptions for low-traffic pages
  • Validate completeness through a second crawl post-correction and manual tests on sample pages
  • Document each step with timestamps, treated URLs, actions taken, and tangible evidence
  • Write a factual, structured reconsideration request, without minimizing the problem
Lifting a manual action is a rigorous process that tolerates no shortcuts. Comprehensive correction is an absolute prerequisite, not a recommendation. For large sites or complex manual actions (link networks, massively generated content), the task can be particularly time-consuming and technically demanding.

In such situations, enlisting a specialized SEO agency can significantly speed up the process: precise diagnosis, automation of detection and correction, documentation compliant with Google's expectations, and monitoring of the reconsideration request. Expert support reduces the risks of rejection and limits prolonged traffic loss.

❓ Frequently Asked Questions

How long does it take for Google to process a reconsideration request?
The turnaround varies from a few days to several weeks depending on the complexity of the manual action and the workload of Google's teams. In general, expect between 7 and 21 days for a first response.
Can you submit several reconsideration requests in a row?
Yes, but each rejection lengthens the processing time of subsequent requests. It is better to make sure the correction is complete before submitting a new request than to pile up incomplete attempts.
Do non-indexed or noindexed pages affected by the manual action need to be fixed?
Absolutely. Google reviews the entire site, not only the indexed pages. Noindexed pages containing spam can justify keeping the penalty in place.
What happens if a reconsideration request is rejected?
Google sends a notification in Search Console explaining why the request was refused, often with examples of pages that are still problematic. You then need to fix these new occurrences and submit a new request.
Can a manual action be partially lifted on certain sections of the site?
No. Google either lifts the manual action in its entirety or keeps it in place. There is no partial lifting by section or page type. It's all or nothing.


