Official statement
Other statements from this video
- 1:36 How do you effectively test JavaScript rendering before putting a site into production?
- 1:36 Why has testing JavaScript rendering before launch become essential for Google indexing?
- 1:38 Does migrating to JavaScript really impact SEO rankings?
- 3:40 Hreflang: why does Google still insist on this tag for multilingual content?
- 3:40 Does Googlebot really crawl every localized version of your pages?
- 3:40 Does hreflang really group your multilingual content in Google's eyes?
- 4:11 How do you make your hyper-local content URLs discoverable without losing traffic?
- 4:11 How should you structure your URLs to maximize the discoverability of hyper-local content?
- 5:14 Can user personalization trigger a cloaking penalty?
- 5:14 Can personalizing content for your users earn you a cloaking penalty?
- 6:15 Are Core Web Vitals actually measured on users or on bots?
- 6:15 Are Core Web Vitals really measured from Google's bots or from your real users?
- 7:18 Why isn't schema markup enough to guarantee the display of rich snippets?
- 7:18 Why don't rich snippets appear despite valid Schema.org markup?
- 9:14 Is dynamic rendering really dead for SEO?
- 9:29 Should you abandon dynamic rendering in favor of SSR with hydration?
- 11:40 Why does the JavaScript main thread block your pages' interactivity in Google's eyes?
- 11:40 Why does the JavaScript main thread block the indexing of your pages?
- 12:33 Initial HTML vs rendered HTML: why can Google ignore your critical tags?
- 13:12 What happens when your initial HTML differs from the JavaScript-rendered HTML?
- 15:50 Does Googlebot click the buttons on your site?
- 15:50 Should you really worry if Googlebot doesn't click your buttons?
- 26:58 Should JavaScript performance for your real users take priority over optimizing for Googlebot?
- 28:20 Are web workers really compatible with Google's JavaScript rendering?
- 28:20 Should you really be wary of Web Workers for SEO?
Martin Splitt confirms that a redesign that changes structure, content, or URLs forces Google to re-gather all ranking signals, temporarily affecting rankings. Only a strictly identical migration — same URLs, same content, same architecture — preserves positions. For SEOs, this means anticipating a post-redesign transition phase and minimizing unnecessary structural changes.
What you need to understand
What does it really mean to "re-gather the signals"?
When Google talks about gathering signals, it refers to the entire process of evaluating a page: crawling, indexing, semantic analysis, measuring Core Web Vitals, evaluating link context, depth in the hierarchy, and behavioral signals. Each URL has an accumulated history over time.
A redesign that modifies URLs or structure breaks this history. Google then has to start from scratch to rebuild its understanding of each page — even if the textual content remains exactly the same. 301 redirects pass PageRank but not all usage or contextual signals.
What constitutes a "strictly identical copy" according to Google?
Splitt mentions three cumulative criteria: same URLs, same content, same structure. In other words, if you change the depth of a page in the hierarchy but keep its URL, Google considers that the structure has changed.
Specifically, this includes the navigation path, internal linking, position in the XML sitemap, and even the HTML structure of the template. Changing an <aside> to a <div class="sidebar"> doesn't change anything, but moving content blocks or reorganizing heading levels can be enough to trigger a re-evaluation.
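One way to check whether a template change actually touches the heading hierarchy is to diff the heading outline of the old and new markup. A minimal sketch using only the standard library (the `HeadingOutline` helper and the sample snippets are illustrative, not a Google tool):

```python
from html.parser import HTMLParser

HEADINGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

class HeadingOutline(HTMLParser):
    """Collects a page's heading hierarchy as (level, text) pairs."""
    def __init__(self):
        super().__init__()
        self.outline = []
        self._open = False  # True while inside a heading tag

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self._open = True
            self.outline.append((int(tag[1]), ""))

    def handle_data(self, data):
        if self._open and self.outline:
            level, text = self.outline[-1]
            self.outline[-1] = (level, (text + data).strip())

    def handle_endtag(self, tag):
        if tag in HEADINGS:
            self._open = False

def outline(html: str):
    parser = HeadingOutline()
    parser.feed(html)
    return parser.outline

old = "<h1>Guide</h1><h2>Setup</h2><h3>Install</h3>"
new = "<h1>Guide</h1><h3>Install</h3><h2>Setup</h2>"
# A reorganized heading hierarchy shows up as a diff:
print(outline(old) != outline(new))  # True
```

If the two outlines are identical, the heading structure at least has not moved; any difference is exactly the kind of reorganization the paragraph above describes.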
Why can't Google simply transfer the signals?
Because signals are not portable metadata. They are tied to context: a page ranked in position 3 often owes that rank to its place in the architecture, its internal anchors, and its loading time in its specific technical environment.
If you change the template, the server, or even how resources are loaded, Google must revalidate these assumptions. This is a protective mechanism against manipulation: it is impossible to simply "copy" the ranking of one page to another by changing the URL.
- Any URL change forces a complete re-evaluation, even with a perfect 301
- A structural change (depth, linking, hierarchy) triggers a new contextual analysis
- A content modification — even minor — restarts semantic and thematic evaluation
- A technical redesign (CMS, server, loading time) requires gathering performance signals
- Only a pixel-perfect copy (URLs, HTML, hierarchy, content) avoids reset
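The triggers above can be turned into a rough pre-migration checklist. A minimal sketch, assuming four boolean change flags; the signal-group names are illustrative shorthand, not Google terminology:

```python
def signals_to_regather(url_changed: bool, structure_changed: bool,
                        content_changed: bool, stack_changed: bool) -> set:
    """Rough map from what a redesign touches to which signal groups
    Google would have to rebuild, per the triggers listed above."""
    if url_changed:
        # A new URL resets everything: history, context, semantics, performance.
        return {"crawl history", "contextual analysis",
                "semantic evaluation", "performance"}
    signals = set()
    if structure_changed:
        signals.add("contextual analysis")
    if content_changed:
        signals.add("semantic evaluation")
    if stack_changed:
        signals.add("performance")
    return signals

# A pixel-perfect copy regathers nothing:
print(signals_to_regather(False, False, False, False))  # set()
```

The point of the sketch is the asymmetry: a URL change alone forces a full reset, while the other variables add re-evaluation work incrementally.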
SEO Expert opinion
Is this statement consistent with field observations?
Absolutely. All SEOs who have managed major redesigns know the post-migration turbulence — even when the redirects are flawless and the content unchanged. Positions drop for 2 to 6 weeks before stabilizing, often at a slightly lower level.
What Splitt confirms is that this is not a bug or a penalty. It is the normal functioning: Google no longer trusts its old signals and must rebuild its understanding. Sites with a solid link profile recover faster, but no one escapes the transition phase.
What nuances should be added?
Splitt does not specify how long this gathering phase lasts. Depending on the site's history, its crawl budget, and how frequently its content is updated, it can range from a few days to several months. [To be checked] on low-authority or infrequently crawled sites.
Another unclear point: what about a redesign that objectively improves the structure or performance? Splitt suggests that even in that case, there will be a re-evaluation phase — thus a temporary drop before any potential gain. But the duration and magnitude of this drop remain opaque.
In what cases does this rule not apply fully?
Very large sites (e-commerce, media) with a high crawl budget and daily updates seem to recover much faster. Google already has a constant flow of fresh signals, so the redesign does not create as marked an information void.
Conversely, a small, less crawled site that changes URLs can remain in limbo for months. The crawl frequency conditions the recovery speed, but Google does not publish any metrics to estimate it in advance.
Practical impact and recommendations
What practical steps should be taken before a redesign?
Minimize unnecessary structural changes. If the sole objective is to improve design or change CMS, maintain the existing hierarchy, even if it is not perfect. An aesthetic gain never makes up for three months of lost rankings.
Document everything: current URLs, page depths, internal linking, anchors, positions in SERPs. This data allows you to compare before/after and quickly detect migration errors. An Excel spreadsheet is not enough — use Screaming Frog, Oncrawl, or Botify to map the existing structure.
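The depth inventory mentioned above can be rebuilt offline once you export the internal link graph from your crawler of choice (Screaming Frog and similar tools can emit source/target link lists; the adjacency dict below is a hypothetical export format). A minimal sketch, computing click depth as a shortest path from the homepage:

```python
from collections import deque

def page_depths(links: dict[str, list[str]], home: str) -> dict[str, int]:
    """BFS from the homepage: depth = minimum number of clicks
    needed to reach each URL through internal links."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

graph = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/blog/post-1"],
}
print(page_depths(graph, "/"))
# {'/': 0, '/blog': 1, '/products': 1, '/blog/post-1': 2}
```

Running this on the pre-redesign and post-redesign graphs and diffing the two dicts flags every page whose hierarchy depth changed, which is one of the structural triggers discussed earlier.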
What mistakes should absolutely be avoided?
Never initiate a redesign while simultaneously changing URLs, structure, and content. If you must change URLs, keep the hierarchy and content intact. If you rewrite content, keep the URLs. Each modified variable prolongs the re-evaluation phase.
Another pitfall: believing that a 301 redirect is sufficient. It transfers PageRank but not behavioral signals, crawl history, or thematic context. Google must rebuild these elements on the new URL, and that takes time.
How can the impact on ranking be limited?
If a complete redesign is unavoidable, break it down. Migrate the strategic pages first (those generating SEO traffic), monitor the recovery for 4 to 6 weeks, then migrate the rest. This limits exposure to risk.
Temporarily increase the frequency of fresh content publication on the migrated pages: Google crawls more often, gathers new signals more quickly, and recovery accelerates. Update modification dates, add dynamic elements, and intensify internal linking to these pages.
- Map out the current hierarchy and URLs before any changes
- Retain the existing structure as much as possible if it works
- Avoid modifying URLs, structure, and content simultaneously
- Prepare a comprehensive 301 redirect plan and test it in pre-production
- Monitor positions and crawling daily for 6 weeks post-redesign
- Anticipate a temporary drop in traffic and brief stakeholders
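The redirect-plan check in the list above can be partly automated before anything reaches production. A minimal sketch that lints a plain old→new mapping for chains and loops (it validates the map itself, not live server responses; the function name is illustrative):

```python
def lint_redirect_plan(plan: dict[str, str]) -> dict[str, list]:
    """Flags chains (an old URL redirecting to another redirected URL)
    and loops in a 301 plan, both of which waste crawl budget."""
    issues = {"chains": [], "loops": []}
    for old, new in plan.items():
        seen = {old}
        hops = 0
        while new in plan:  # the target is itself redirected
            if new in seen:
                issues["loops"].append(old)
                break
            seen.add(new)
            new = plan[new]
            hops += 1
        else:
            if hops:
                issues["chains"].append(old)
    return issues

plan = {"/old-a": "/new-a", "/old-b": "/old-c", "/old-c": "/new-c"}
print(lint_redirect_plan(plan))
# {'chains': ['/old-b'], 'loops': []}
```

Collapsing every chain so each old URL points directly at its final destination keeps the 301 plan clean before it is tested in pre-production.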
❓ Frequently Asked Questions
Does a visual redesign with no URL or content changes impact rankings?
How long does it take to recover rankings after a complete redesign?
Do 301 redirects transfer all ranking signals?
Can you avoid any ranking drop during a URL migration?
Should redesigns be avoided during a Core Update?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 30 min · published on 11/11/2020
🎥 Watch the full video on YouTube →