Official statement
Google may temporarily display the text of an old meta description in its snippets, even after it's been updated, if that content was previously on the page and deemed more relevant for the user's query. This discrepancy usually resolves itself within a few weeks. Practically, this means that a change in meta description isn't always immediately visible in search results.
What you need to understand
Does Google index a page's content history?
The short answer is: yes, partially. Google keeps track of previous versions of a page's content, particularly through its cache. When Googlebot crawls your page, it doesn't instantly erase all traces of what was there before.
The engine can therefore associate a URL with multiple variations of text that it has indexed over time. If your old meta description was part of the body of the page (or in a previous meta tag), Google may consider it a valid candidate for generating a snippet — even if you've changed it since.
Why does Google sometimes prefer the old text over the new?
The answer lies in one word: contextual relevance. Google doesn't merely read the current meta description as an instruction to follow blindly. It evaluates it as a signal among others.
If the old text contained keywords or phrases that better match a specific query, Google may decide to display it temporarily. This behavior is consistent with the fact that Google rewrites meta descriptions in roughly 60 to 70% of cases to better meet user intent.
How long does this discrepancy last?
Mueller mentions “a few weeks” as the normalization delay. This isn't a contractual obligation: the actual time depends on your site's crawl frequency, the page's depth in the architecture, and how quickly Google updates its index for that URL.
On a site with a good crawl budget and strategic pages recrawled daily, the delay can be reduced to a few days. Conversely, on a deep page of a large e-commerce site, it could extend to a month. So “a few weeks” is an averaged estimate — not an absolute rule.
- Google retains a partial history of your pages' content, including old meta descriptions.
- The perceived relevance for a given query may lead Google to temporarily display the old snippet instead of the new one.
- The normalization delay varies based on crawl budget, page depth, and index update frequency.
- This behavior is not a bug but a consequence of the dynamic functioning of intent-focused snippets.
- Forcing reindexing via Search Console can speed up the process, but with no immediate guarantee.
SEO Expert opinion
Is this statement consistent with field observations?
Yes, absolutely. I have observed this phenomenon many times while redesigning product pages or updating strategic landing pages. The snippet displayed in the SERPs can take several days — even weeks — to reflect the new meta description, especially if the old one was well aligned with certain long-tail queries.
What's interesting is that Mueller confirms here that it’s not just a matter of crawl/indexing delay, but also an algorithmic choice by Google to maximize snippet relevance. In other words, even if Google has adequately crawled and indexed your new meta, it may deliberately choose the old version for certain specific queries.
What nuances should be added to this claim?
There's still ambiguity regarding how long exactly Google retains these old versions in its snippet generation systems. Mueller talks about “a few weeks,” but we lack precise data: is it 2 weeks, 6 weeks, or more depending on the site? [To be verified] on a large sample of sites with varying crawl budgets.
Another point: Mueller doesn't specify whether this behavior applies only to meta descriptions or also to other snippet elements (title, featured snippet, rich results). My experience suggests that titles are less subject to this phenomenon — Google seems to update them more quickly. But again, empirical data supporting this is lacking.
In what cases can this delay become problematic?
Imagine a one-time promotional operation (sales, Black Friday, flash promo) where you update the meta description to push a limited-time offer. If Google continues to display the old snippet for 3 weeks, you lose impact on your CTR at the critical moment.
Another case: a correction of factual errors or tone of voice following bad buzz. If the old phrasing remains visible in the SERPs for weeks, it can hinder crisis management. In these situations, “a few weeks” is not acceptable — you need to force the issue via Search Console and closely monitor it.
Practical impact and recommendations
What should you concretely do after changing a meta description?
First, request reindexing via Search Console to speed up the process. There’s no guarantee that Google will honor it immediately, but it increases your chances of having the page recrawled quickly. Then, monitor the SERPs for strategic queries — use a rank tracking tool or conduct manual searches in private browsing.
If you notice the old snippet persists beyond 3-4 weeks on high-traffic pages, dig deeper: check that the old text isn’t still somewhere in the source code (hidden tags, old commented text block, forgotten JSON-LD). Google could pick it up from there.
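One way to run that source-code check is a small script that scans the fetched HTML for the old snippet in the usual hiding places. The sketch below is illustrative (the function name and sample page are mine, not from the video): it flags meta tags, HTML comments, and JSON-LD blocks that still contain the outdated text.

```python
import re

def find_stale_text(html: str, old_snippet: str) -> list[str]:
    """Report where an outdated snippet still appears in page source.

    Checks three places Google can still read the old text from:
    meta tags, HTML comments, and JSON-LD blocks.
    """
    needle = old_snippet.lower()
    locations = []
    # 1. Meta tags (description, og:description, etc.)
    for tag in re.findall(r"<meta\b[^>]*>", html, re.IGNORECASE):
        if needle in tag.lower():
            locations.append("meta tag")
    # 2. HTML comments (e.g. an old text block left commented out)
    for comment in re.findall(r"<!--.*?-->", html, re.DOTALL):
        if needle in comment.lower():
            locations.append("HTML comment")
    # 3. JSON-LD structured data (a forgotten "description" field)
    for block in re.findall(
        r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        html, re.DOTALL | re.IGNORECASE,
    ):
        if needle in block.lower():
            locations.append("JSON-LD")
    return locations

# Hypothetical page where the old meta survives in a comment and in JSON-LD
page = """
<html><head>
<meta name="description" content="New offer: free shipping this week.">
<!-- old meta: Summer sale, up to 50% off -->
<script type="application/ld+json">{"description": "Summer sale, up to 50% off"}</script>
</head><body>...</body></html>
"""
print(find_stale_text(page, "Summer sale, up to 50% off"))
# → ['HTML comment', 'JSON-LD']
```

A regex scan like this is a quick diagnostic; for heavily templated pages, a real HTML parser is more robust.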
What mistakes should you avoid when updating a meta description?
Don't discard everything from the old meta at once if it performed well on certain queries. If your old description generated good CTR on a specific query, carry its relevant keywords into the new version — this limits the risk that Google reverts to the old text.
Also, avoid changing the meta description without checking the impact on the page’s body content. If your new meta promises something that the content doesn’t deliver, Google may replace it with an excerpt from the body text — or even with the old meta if it deems it more consistent with the actual page content.
How can you verify that the change has been taken into account?
Use the “site:” command combined with a search for the exact title of your page to see which snippet Google is currently displaying. Complement this with monitoring using tools like SEMrush, Ahrefs, or Oncrawl that track displayed snippets in the SERPs over time.
If you manage a large site, automate this monitoring using Python scripts + Search Console API to detect discrepancies between the declared meta description and the displayed snippet. This allows you to react quickly if Google continues to show an outdated version beyond a reasonable delay.
- Request reindexing via Search Console after each strategic meta description modification
- Monitor the SERPs manually or via a rank tracker for 2-3 weeks post-modification
- Check that the old text isn’t appearing anywhere in the source code (tags, JSON-LD, HTML comments)
- Carry over the old meta's best-performing keywords into the new one if they generated CTR
- Automate snippet monitoring for high-stakes business pages
- Plan for a backup (ads, external communication) if the update needs to be immediately visible for a real-time operation
❓ Frequently Asked Questions
How long can Google display an old meta description after it has been changed?
Why does Google show the old text even though I've updated the meta description?
Can you force Google to display the new meta description immediately?
Does this phenomenon also apply to title tags?
What should you do if the old snippet undermines a time-sensitive commercial operation?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 53 min · published on 29/10/2020
🎥 Watch the full video on YouTube →