Official statement
Google processes reconsideration requests strictly in chronological order: submitting multiple requests for the same site does not advance the case. The team always reviews the oldest request first, ignoring later duplicates. In practice, multiplying submissions is counterproductive and can even signal suspicious behavior to manual review teams.
What you need to understand
Why is this clarification on queues happening now?
Reconsideration requests are an administrative remedy for sites hit by a manual action. When a webmaster discovers a penalty in Search Console, panic often drives them to submit several requests in quick succession: one after a partial cleanup, then others with a reworded message, hoping to speed up the process.
Google manages these requests through a strict chronological queue. If you submit a new request while an earlier one is still being processed, the team first reviews the old one and ignores the new one until the first is resolved. No priority system, no shortcuts.
What actually happens when you submit duplicate requests?
The system keeps all requests in its database, but a human reviewer only looks at the oldest unprocessed one. If you submitted a request on the 15th, then another on the 18th, and yet another on the 20th, the team examines the one from the 15th. Once it has been accepted or rejected, the one from the 18th moves to the front of the queue.
Direct consequence: your successive attempts do not change the total waiting time. Worse, if your first request was poorly formulated or premature (incomplete fixes), you waste time while Google reviews an imperfect version of your site.
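To make the mechanics concrete, here is a minimal Python sketch of the stated rule. It is an illustrative model only (Google has not published its internal tooling): every submission is stored, but review always starts from the oldest entry.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class ReconsiderationRequest:
    site: str
    submitted: str  # ISO date

queue: deque = deque()

def submit(req: ReconsiderationRequest) -> None:
    queue.append(req)  # every submission is kept...

def next_for_review() -> ReconsiderationRequest:
    return queue.popleft()  # ...but only the oldest one reaches a reviewer

# Three submissions for the same site, on the 15th, 18th and 20th:
for day in ("2020-08-15", "2020-08-18", "2020-08-20"):
    submit(ReconsiderationRequest("example.com", day))

print(next_for_review().submitted)  # -> 2020-08-15; the later two keep waiting
```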
Does this rule apply to all types of penalties?
The statement explicitly targets manual actions notified in Search Console: artificial links, low-quality content, auto-generated spam, cloaking, etc. Algorithmic penalties (core update rollouts, automated filters) do not fall under this framework: there is no reconsideration request for a traffic drop following a Core Update.
For link disavows, the logic differs: the disavow file is processed during subsequent crawls, not via a human queue. Each new upload simply replaces the previous file; no queue is involved.
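For reference, the disavow file itself is a plain-text list: comment lines start with `#`, a `domain:` prefix disavows an entire domain, and bare URLs disavow individual links (the domains below are placeholders). Each upload replaces the file in full:

```
# Links we could not get removed manually are listed below.
# Disavow every link from a domain:
domain:spammy-directory.example
# Disavow a single URL:
https://blog.example/paid-links-post.html
```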
- Reconsideration requests follow a strict chronological queue without priority
- Submitting multiple requests for the same site does not shorten the processing time
- Each request is reviewed by a human reviewer, not an algorithm
- Only manual actions are covered; algorithmic filters have no reconsideration process
- A premature or incomplete request must still be processed before any later request is considered
SEO Expert opinion
Is this statement consistent with field observations?
Absolutely. Feedback from practitioners confirms that processing times vary between 3 days and 3 weeks depending on the complexity of the case and the team's workload, but never based on the number of requests submitted. Several agencies have tested the multiple-submission scenario (knowingly or not): the outcome is identical to a single submission.
The system even seems to detect duplicates and consolidate them on the backend. Some webmasters report receiving a single response for several grouped requests, suggesting that Google merges redundant entries to avoid overwhelming reviewers. [To be verified]: no official documentation details this merging mechanism.
What real risks come from multiplying requests?
Beyond the futility, submitting 5 or 10 requests in quick succession can signal suspicious behavior to manual teams. A webmaster who spams the forms looks like someone who doesn’t understand the nature of their penalty — or worse, someone trying to pressure Google.
Let’s be honest: reviewers see hundreds of requests each week. A site that reappears every 48 hours in the queue doesn’t inspire confidence. It won’t trigger an additional penalty, but it can negatively bias the analyst’s view during the final examination. There is no official data on this, but it’s simple operational common sense.
In what cases does this rule not apply?
If your first request was rejected and you subsequently made substantial corrections, submitting a new request is perfectly legitimate. Google does not view this as a duplicate since the context has changed. The new request then enters the normal queue.
Similarly, if you have multiple distinct manual actions on the same domain (for example: artificial links + low-quality content), each reconsideration request addresses a specific penalty and follows its own queue. But for the same manual action, the FIFO (first in, first out) rule applies without exception.
Practical impact and recommendations
What steps should you take before submitting a request?
Before any submission, ensure you have completely fixed the issue identified in the manual action. Google provides examples of problematic URLs in the Search Console — it’s a sample, not an exhaustive list. Your job is to identify the pattern and then clean the entire site.
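As a hedged sketch of that pattern hunt (the sample URLs below are hypothetical), grouping the reported URLs by site section often reveals which part of the site to audit in full:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical sample URLs copied from the manual action report.
samples = [
    "https://example.com/guest-posts/casino-review-17.html",
    "https://example.com/guest-posts/payday-loans-tips.html",
    "https://example.com/blog/regular-article.html",
]

# Group by first path segment: a section that concentrates the samples
# should be audited entirely, not just the URLs Google happened to list.
sections = Counter(urlparse(url).path.split("/")[1] for url in samples)
print(sections.most_common())  # [('guest-posts', 2), ('blog', 1)]
```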
For artificial links, this means: a full audit of the backlink profile, identification of toxic domains, attempts at manual removal (contacting webmasters), and then disavowing irrecoverable links. For duplicate or spam content, massive removal or rewriting of the affected pages, then checking indexing via `site:` and server logs.
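For the log check, here is a minimal sketch assuming a standard combined access log; the removed paths and log filename are placeholders. Any cleaned URL that Googlebot still fetches with a 200 deserves a second look before you submit:

```python
import re
from collections import Counter

# Placeholder paths of pages removed during the cleanup.
REMOVED_PATHS = {"/spam-page-1.html", "/spam-page-2.html"}

# Combined log format: ... "GET /path HTTP/1.1" 200 1234 "referer" "user-agent"
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .* "(?P<agent>[^"]*)"$'
)

def googlebot_hits(logfile: str) -> Counter:
    """Count the status codes Googlebot received on the cleaned URLs."""
    statuses = Counter()
    with open(logfile, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            match = LOG_LINE.search(line)
            if not match or "Googlebot" not in match.group("agent"):
                continue
            if match.group("path") in REMOVED_PATHS:
                statuses[(match.group("path"), match.group("status"))] += 1
    return statuses

if __name__ == "__main__":
    for (path, status), count in googlebot_hits("access.log").items():
        verdict = "ok" if status in {"404", "410"} else "still live?"
        print(f"{count:5d}  {status}  {path}  {verdict}")
```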
A premature request — submitted when 40% of the work remains to be done — will be rejected. You then lose 1 to 2 weeks of processing for nothing and have to restart the cycle. It’s better to wait 3 more days to finalize the cleanup than to jeopardize an attempt out of impatience.
What mistakes should you absolutely avoid in this process?
The classic error: submitting a vague request like “I’ve fixed the issues, please reconsider.” Reviewers want factual details: number of disavowed links, list of deleted pages, technical measures implemented (robots.txt, canonicals, redirects). A hollow form will be processed more quickly — with a rejection.
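By contrast, a factual request might follow the outline below (all figures, dates and attachments are placeholders for illustration):

```
Manual action: unnatural links to our site (notified 2020-08-02)

1. Backlink audit: 1,240 referring domains reviewed (export attached).
2. Manual removal: 312 links taken down after contacting 97 webmasters
   (outreach log attached).
3. Disavow: the 84 irrecoverable domains disavowed on 2020-08-14.
4. Prevention: paid-placement program terminated; editorial guidelines now
   require rel="sponsored" on all commercial links.
```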
Another trap: submitting multiple requests with slightly different messages, hoping that a magical formulation will hit the mark. It doesn’t work. The system keeps track of all your submissions, and a reviewer who sees 4 attempts in 10 days understands that you’re floundering. This weakens your credibility.
How to effectively track the status of your request?
Search Console displays the status in real time: "Pending", "Approved", or "Rejected". You also receive an email notification when the final decision is made. There's no point in checking every 6 hours: processing usually takes between 5 and 15 business days depending on complexity.
If after 3 weeks you have no news, check that your Search Console access is still active (some webmasters lose their rights due to team changes or contractor switches). In case of an abnormal blockage, the Google Search Central help forum allows for escalation — but again, a single request is enough.
- Completely fix the issue before any submission — no premature attempts
- Precisely document the actions taken in the form (cleaned URLs, disavowed links, technical measures)
- Submit only one request per manual action — duplicates don’t speed anything up
- Wait for the complete response (5-15 days) before taking any additional action
- In case of rejection, analyze the detailed feedback provided by Google and address the missing points
- Do not confuse reconsideration requests with other processes (link disavow, hack reporting)
❓ Frequently Asked Questions
How long should you wait before submitting a new request after a rejection?
Can you cancel a pending reconsideration request if you realize it was premature?
Are reconsideration requests for multiple properties of the same domain (http/https, www/non-www) processed separately?
Should you disavow links before or after the reconsideration request for an artificial links penalty?
Can a site receive a new manual action while a reconsideration request is pending?