
Official statement

Google does not determine the canonical once and for all. Its algorithms continually evaluate the crawled content to detect changes. If two versions have very close duplication scores (e.g., 0.49 vs 0.51), the canonical may switch between the URLs over time, particularly depending on the crawl order and content modifications.
🎥 Source video

Extracted from a Google Search Central video

⏱ 11:24 💬 EN 📅 13/08/2020 ✂ 7 statements
Watch on YouTube (5:44) →
Other statements from this video (6)
  1. Should the canonical tag really be reserved for strict content duplication?
  2. 2:04 Is the canonical tag really just a recommendation for Google?
  3. 3:07 Why does using the canonical as a redirect sabotage your crawl budget?
  4. 7:15 Why does your Search Console data disappear for no apparent reason?
  5. 8:19 Why does Google sometimes ignore your canonical tag and serve another URL?
  6. 9:19 Should you give up on unique content on a canonicalized page?
TL;DR

Google never permanently settles on a canonical URL. Its algorithms constantly reevaluate duplication signals with every crawl. When two versions score nearly identically, the canonical may switch from one URL to another based on the order in which the bot visits or minor content changes. In practice, this technical fluctuation can lead to unpredictable indexing variations if your duplicates are not properly managed.

What you need to understand

Does Google fix the canonical after the first evaluation?

No. Google never makes a definitive decision on the canonical URL of a group of duplicate pages. Martin Splitt clarifies: algorithms continually reevaluate signals with each crawl to detect any changes in content, structure, or external signals.

This ongoing reevaluation means that a URL chosen as canonical today may be replaced tomorrow by another version, even without manual intervention on your part. The system is dynamic, not static.

What causes these switches between URLs?

The issue arises when two versions present very close duplication scores — for example, 0.49 versus 0.51. In this gray area, Google lacks a clear enough signal to consistently make a decision.

Two main factors trigger these switches: crawl order (which URL Googlebot visits first during a given session) and content modifications, even minor ones, which can slightly invert the scores. If you add a phrase or change a title on version A, it may suddenly overtake version B on the next crawl.
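To make the "0.49 vs 0.51" idea concrete, here is a toy sketch — not Google's actual algorithm, whose scoring is unpublished — using plain textual similarity. Two near-duplicate versions score almost identically against a shared base text, and one minor edit flips which version scores higher:

```python
# Toy illustration (NOT Google's algorithm): two near-duplicate pages score
# almost identically, and a minor content edit flips which one "wins".
from difflib import SequenceMatcher

BASE = "Blue widget, free shipping, in stock, ships in 24 hours."
version_a = BASE + " Limited offer."
version_b = BASE + " Limited offers!"

def score(page: str, reference: str) -> float:
    """Crude duplication score: textual similarity to a reference copy."""
    return SequenceMatcher(None, page, reference).ratio()

before_a, before_b = score(version_a, BASE), score(version_b, BASE)

# A minor modification to version A (one extra phrase) inverts the scores.
version_a += " Now with a longer description of the product."
after_a, after_b = score(version_a, BASE), score(version_b, BASE)

print(before_a > before_b, after_a > after_b)  # → True False
```

The mechanism, not the numbers, is the point: when two scores sit in a narrow band, any small change can swap the ordering on the next evaluation.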

How does this instability pose an SEO problem?

This fluctuation generates unpredictable indexing variations. You may notice that Google sometimes indexes the URL with parameters and other times the clean version, without any apparent logic. Rank tracking tools may show artificial fluctuations if the indexed URL changes.

Worse, ranking signals (backlinks, UX signals, authority) get fragmented across multiple URLs instead of concentrating on just one. You lose consistency in accumulating positive signals, diluting your ranking potential.

  • The canonical is never permanently fixed — Google reevaluates with every crawl
  • Close duplication scores (e.g., 0.49 vs 0.51) create a zone of instability where the choice easily shifts
  • Crawl order and content modifications directly influence which URL will be retained
  • This instability can fragment your ranking signals across multiple versions and hinder your performance
  • Without strong and consistent signals (canonical tag, redirects, internal linking), you leave Google to arbitrate on its own — with variable results

SEO Expert opinion

Is this statement consistent with real-world observations?

Yes, completely. Canonical URL fluctuations are regularly observed in Search Console, especially on e-commerce or media sites that generate many pages with minor variations (filters, pagination, UTM parameters).

What's new here is the official confirmation that Google never makes a definitive decision. Previously, many SEOs believed that once the canonical stabilized after a few weeks, the matter was settled. False — the system is constantly reevaluating, and if your signals remain ambiguous, the choice can switch with every crawl.

What nuances should be added to this statement?

Splitt talks about “very close” scores (0.49 vs 0.51), but Google provides no actual numbers or public thresholds. We don’t know how similar two pages must be to enter this gray area. [To be verified]: does 85% similarity suffice? 95%? No numerical data is provided.

Another point: the statement implies that explicit signals (canonical tag, 301 redirects) weigh less than expected when the contents are nearly identical. If you have a clean rel=canonical but both pages remain crawlable and their content slightly varies, Google may ignore your directive and choose the other version. This is not explicitly stated, but it’s the practical implication.

In what cases does this rule not apply?

If your canonicalization signals are strong and converging, the problem disappears. Specifically: a strict 301 redirect, a clear canonical tag, consistent internal linking always pointing to the same version, and XML sitemaps that list only the canonical version of each page.

In this case, even if two pages have similar content, Google has no reason to switch — the technical signals are clear. Fluctuation occurs mainly when you leave multiple versions crawlable and indexable without unambiguous directive. If you properly block duplicates (robots.txt, noindex, or redirect), the problem no longer arises.

Note: If you notice frequent switches of canonical URLs in Search Console, it’s a sign that your technical signals are insufficient or contradictory. Google hesitates because you're not making a clear enough call.

Practical impact and recommendations

What specific actions should you take to avoid these switches?

First action: audit all your duplicate or nearly duplicate pages. Identify groups of URLs with similar content (filters, pagination, parameters, mobile/desktop versions if they differ slightly). Use Search Console to find pages marked “Discovered – currently not indexed” or “Duplicate, Google chose different canonical than user” — these are often signals of fluctuation.
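The audit step above can be sketched as a simple clustering pass over a crawl export: group URLs that differ only by query parameters, since those clusters are the prime fluctuation candidates. The example URLs are hypothetical:

```python
# Sketch of a duplicate-group audit: cluster URLs that differ only by
# filter/pagination/tracking parameters. Example URLs are hypothetical.
from collections import defaultdict
from urllib.parse import urlsplit

def group_key(url: str) -> str:
    """Collapse a URL to scheme://host/path, ignoring query and fragment."""
    p = urlsplit(url)
    return f"{p.scheme}://{p.netloc}{p.path}"

def find_duplicate_groups(urls: list[str]) -> dict[str, list[str]]:
    groups = defaultdict(list)
    for url in urls:
        groups[group_key(url)].append(url)
    # Keep only clusters with more than one variant — fluctuation candidates.
    return {k: v for k, v in groups.items() if len(v) > 1}

crawl = [
    "https://example.com/shoes/",
    "https://example.com/shoes/?color=red",
    "https://example.com/shoes/?utm_source=newsletter",
    "https://example.com/about/",
]
print(find_duplicate_groups(crawl))
```

Real duplicate detection would also compare page content (templates can make distinct paths near-identical), but a parameter-based grouping already surfaces most candidates on e-commerce sites.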

Next, strengthen your canonicalization signals. Add or verify the rel=canonical on each duplicated page, pointing to the version you want to index. If possible, 301 redirect unnecessary versions instead of leaving them crawlable. Clean your XML sitemap to only list canonical URLs — never the variants.

What mistakes should you absolutely avoid?

Do not leave multiple versions of the same page crawlable AND indexable without a clear directive. This is the classic mistake: you think Google will “understand on its own” which version to prioritize, but if duplication scores are close, it will hesitate and switch regularly.

Avoid cascading or contradictory canonical tags. For example, page A points to B as canonical, but B points to C. Or worse, A and B point to each other. Google ignores these inconsistent signals and chooses on its own, with the risks of fluctuation that this implies.
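Cascades and cycles are easy to detect mechanically once you have a crawl export. A minimal sketch — the page-to-canonical mapping below is a hypothetical example, not data from any real site:

```python
# Sketch: detect cascading or contradictory canonical tags from a crawl.
# The mapping (page -> URL its rel=canonical points to) is hypothetical.
def check_canonical(page: str, canonicals: dict[str, str], max_hops: int = 10) -> str:
    """Classify a page's canonical as 'self', 'direct', 'chain', or 'cycle'."""
    seen = [page]
    current = page
    for _ in range(max_hops):
        target = canonicals.get(current, current)
        if target == current:
            return "self" if current == page else ("direct" if len(seen) == 2 else "chain")
        if target in seen:
            return "cycle"
        seen.append(target)
        current = target
    return "chain"

canonicals = {
    "/a": "/b",   # A points to B...
    "/b": "/c",   # ...which points to C: a chain Google may ignore
    "/c": "/c",   # C is self-canonical
    "/x": "/y",   # X and Y point to each other: a contradictory cycle
    "/y": "/x",
}
print(check_canonical("/a", canonicals), check_canonical("/x", canonicals))
# → chain cycle
```

Anything classified as "chain" or "cycle" is a signal Google will likely discard, so fixing those first removes the most ambiguous directives.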

How can you confirm that your site is well stabilized?

Monitor Search Console, under the “Pages” report (formerly “Coverage”). Look for URLs that regularly switch between “Indexed” and “Duplicate, Google chose different canonical than user”, or vice versa. If you see frequent indexing variations on the same group of pages, it’s a symptom of fluctuation.

Also, use a crawler (Screaming Frog, Oncrawl, Botify) to check that your canonical tags are all pointing to the same version and that no chain redirects exist. Test URL inspection in Search Console on your critical pages: Google should consistently return the same canonical URL with each test, not variable results.

  • Audit all groups of duplicate or nearly duplicate pages
  • Add or verify the rel=canonical on each duplicated page, pointing to the unique version to be indexed
  • 301 redirect unnecessary versions instead of leaving them crawlable
  • Clean the XML sitemap to only list canonical URLs
  • Check for the absence of cascading or contradictory canonical tags
  • Monitor Search Console for frequent indexing variations on the same URLs
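The sitemap item in the checklist above is also easy to verify automatically: parse the sitemap and flag any entry that is not in your canonical set. The sitemap content and the canonical list below are hypothetical examples:

```python
# Sketch: verify an XML sitemap lists only canonical URLs. The sitemap
# content and the canonical set are hypothetical examples.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/shoes/</loc></url>
  <url><loc>https://example.com/shoes/?color=red</loc></url>
</urlset>"""

CANONICAL_URLS = {"https://example.com/shoes/", "https://example.com/about/"}

def non_canonical_entries(sitemap_xml: str, canonical: set[str]) -> list[str]:
    """Return every sitemap <loc> that is not a known canonical URL."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text for loc in root.findall(".//sm:loc", ns)]
    return [u for u in locs if u not in canonical]

print(non_canonical_entries(SITEMAP, CANONICAL_URLS))
# → ['https://example.com/shoes/?color=red']
```

Running this on every sitemap regeneration keeps parameterized variants from slipping back in and reintroducing ambiguity.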
The stability of your canonical URLs relies on the consistency and strength of your technical signals. If Google hesitates and switches regularly, it’s because your directives aren't clear enough. Strengthen canonical tags, redirects, internal linking, and sitemaps to impose a clear choice.

These technical optimizations can be complex to diagnose and implement correctly, especially on large sites with many URL variations. If you notice persistent fluctuations despite your efforts, it may be wise to seek the support of a specialized SEO agency for an in-depth audit and a tailored canonicalization strategy.

❓ Frequently Asked Questions

Can Google change the canonical URL even if I've set a rel=canonical?
Yes. The rel=canonical is a strong signal, but not an absolute directive. If your two pages have very close duplication scores and other signals (internal linking, crawl order) contradict your canonical, Google may choose another version.
How long does it take for Google to stabilize the choice of a canonical URL?
There is no fixed duration. Google reevaluates continuously with every crawl. If your signals are clear and consistent, stabilization can occur within a few weeks. If your signals are ambiguous, the fluctuation can last indefinitely.
What does a duplication score of 0.49 vs 0.51 mean in practice?
It's a metaphor to illustrate two nearly identical pages. Google has never published numerical metrics for duplication scores. Just remember: the more similar two pages are, the higher the risk of switching.
Do canonical URL switches affect my positions in search results?
Potentially yes. If Google regularly changes the indexed URL, ranking signals (backlinks, authority, UX) fragment across multiple versions instead of concentrating on a single one, which can hurt your performance.
Should you block duplicated versions in robots.txt or noindex them to avoid switches?
Blocking in robots.txt prevents crawling, so Google can no longer detect the canonical tag. Prefer a noindex on duplicated versions or, better still, a 301 redirect to the canonical version to concentrate all signals.