Official statement
After a migration with redirects, Google periodically recrawls the old URLs to verify the redirects. This behavior is normal and is not a waste of crawl budget. The old URLs should not be blocked.
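If you want to monitor this yourself, the sketch below fetches an old URL without following redirects and checks that it answers with a permanent 301 to the expected destination. The URLs are hypothetical placeholders, and it assumes the third-party requests library is installed:

```python
import requests  # third-party: pip install requests

def check_redirect(old_url: str, expected_target: str) -> bool:
    """Fetch old_url WITHOUT following redirects and verify it answers
    with a permanent 301 pointing at the expected new location."""
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location")
    ok = resp.status_code == 301 and location == expected_target
    status = "OK" if ok else "CHECK"
    print(f"{status}: {old_url} -> {resp.status_code} {location}")
    return ok

# Hypothetical URLs, for illustration only.
check_redirect("https://old.example.com/page", "https://www.example.com/page")
```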
Timestamp in video: 22:03
Other statements from this video (18)
- 1:02 What does a two-part rollout of the Google update mean for SEO?
- 3:14 How do exclusionary regex filters change the game for SEO analysis?
- 3:47 How is Search Console Insights transforming our SEO content strategy?
- 6:58 How can you enhance your indexing check using a quotation search?
- 8:00 Why are redirects critical during a website migration?
- 11:22 Is it true that Google Sites has no indexing issues at all?
- 16:58 How do URL parameters impact indexing in unexpected ways?
- 19:05 Why should you prefer canonical tags over URL parameters for effective crawl control?
- 26:24 Why do unknown URLs show up in Search Console?
- 28:29 Why should you check all your templates for noindex errors?
- 31:32 Why do rankings fluctuate even when there are no content changes?
- 34:19 Why should your AMP content align with the canonical page?
- 36:38 How are Page Experience signals reshaping ranking in Google News?
- 38:51 Why doesn't Google publicly address violations of SEO guidelines?
- 42:39 How does Google actually use spam reports to refine its SEO algorithms?
- 43:53 Is quality content really the only SEO pillar according to Google?
- 46:54 Is it true that Google claims subdomains and subdirectories have no SEO impact?
- 49:09 Is using text-overflow:ellipsis a mistake for your SEO strategy?
Official statement (4 years ago)
⚠ A more recent statement exists on this topic: "Why do old URLs still show up after a migration?"
TL;DR
Google regularly rechecks the redirects of old URLs after a migration. This behavior isn't a waste of crawl resources, so don't block these old URLs.
❓ Frequently Asked Questions
Can old URLs be deindexed?
No, Google keeps crawling them to verify the redirects, not to deindex them.
Can blocking them via robots.txt impact SEO?
Yes, blocking these URLs prevents Google from properly verifying the redirects.
Does this behavior affect crawl budget?
No, it is part of the expected process and should not consume your crawl budget excessively.
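Since a robots.txt block would keep Googlebot from ever seeing the redirects, a quick way to audit this is Python's standard urllib.robotparser. A minimal sketch, with a hypothetical domain and paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical domain and paths, for illustration only.
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # downloads and parses the live robots.txt

# Old, redirected URLs that Googlebot must still be allowed to fetch,
# otherwise it cannot verify the redirects.
old_urls = [
    "https://www.example.com/old-page",
    "https://www.example.com/old-section/article",
]

for url in old_urls:
    if rp.can_fetch("Googlebot", url):
        print(f"OK      {url}")
    else:
        print(f"BLOCKED {url}  <- Googlebot cannot see this redirect")
```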
🎥 From the same video (18): other SEO insights extracted from this Google Search Central video · duration 53 min · published on 24/06/2021
🎥 Watch the full video on YouTube →