What does Google say about SEO?

Official statement

After a migration with redirects, Google periodically crawls the old URLs to verify the redirects. This behavior is normal and does not waste crawl budget. Old URLs should not be blocked.
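This verification can also be done on your own side before Googlebot gets there. A minimal sketch (the URLs and the `fetch_head` callback are hypothetical, not part of the original statement) that checks each old URL answers with a permanent redirect pointing at the expected new target:

```python
# Sketch: verify that old URLs permanently redirect to their new targets
# after a migration. All URLs below are hypothetical placeholders.
from urllib.parse import urljoin

def redirect_ok(status, location, old_url, expected_url):
    """True if (status, location) is a permanent redirect to expected_url.

    `location` may be relative; it is resolved against old_url,
    as browsers and Googlebot do.
    """
    if status not in (301, 308):   # permanent redirects only
        return False
    if location is None:
        return False
    return urljoin(old_url, location) == expected_url

def audit(mapping, fetch_head):
    """mapping: {old_url: new_url}; fetch_head(url) -> (status, location).

    Returns the old URLs whose redirect is missing or points elsewhere.
    """
    return [old for old, new in mapping.items()
            if not redirect_ok(*fetch_head(old), old, new)]
```

In practice `fetch_head` would issue a real HEAD request with redirects disabled (e.g. via `urllib.request` or a similar HTTP client); keeping it injectable makes the check itself easy to test offline.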
🎥 Source video

Extracted from a Google Search Central video

⏱ 53:47 💬 EN 📅 24/06/2021 ✂ 19 statements
Watch on YouTube (22:03) →
Other statements from this video (18)
  1. 1:02 What does a two-part rollout of the Google update mean for SEO?
  2. 3:14 How do exclusionary regex filters change the game for SEO analysis?
  3. 3:47 How is Search Console Insights transforming our SEO content strategy?
  4. 6:58 How can you enhance your indexing check using a quotation search?
  5. 8:00 Why Are Redirections Critical During a Website Migration?
  6. 11:22 Is it true that Google Sites has no indexing issues at all?
  7. 16:58 How do URL parameters impact indexing in unexpected ways?
  8. 19:05 Why should you prefer canonical tags over URL parameters for effective crawl control?
  9. 26:24 Why do unknown URLs show up in Search Console?
  10. 28:29 Why should you check all your templates for noindex errors?
  11. 31:32 Why do rankings fluctuate even when there are no content changes?
  12. 34:19 Why should your AMP content align with the canonical page?
  13. 36:38 How are Page Experience signals reshaping ranking in Google News?
  14. 38:51 Why Doesn't Google Publicly Address Violations of SEO Guidelines?
  15. 42:39 How does Google actually use spam reports to refine its SEO algorithms?
  16. 43:53 Is Quality Content Really the Only SEO Pillar According to Google?
  17. 46:54 Is it true that Google claims subdomains and subdirectories have no SEO impact?
  18. 49:09 Is using text-overflow:ellipsis a mistake for your SEO strategy?
Official statement from 24/06/2021
TL;DR

Google regularly checks the redirects of old URLs after a migration. This behavior isn't a waste of crawl resources. Don't block these old URLs.

❓ Frequently Asked Questions

Can old URLs be deindexed?
No: Google keeps crawling them to verify the redirects, not to deindex them.
Can blocking via robots.txt impact SEO?
Yes: blocking these URLs prevents Google from correctly verifying the redirects.
Does this behavior affect crawl budget?
No: it is part of the expected process and should not consume your crawl budget excessively.
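As a concrete illustration of the robots.txt point: after a migration of, say, `/blog/` to `/articles/` (hypothetical paths, not from the statement), blocking the old section would hide its redirects from Googlebot.

```
# Anti-pattern: blocking the migrated paths means Googlebot can
# never fetch them, so it can never see or verify the 301s.
User-agent: *
Disallow: /blog/

# Correct: leave the old paths crawlable (no Disallow rule for them)
# so the redirects to /articles/ can be followed and confirmed.
User-agent: *
Disallow:
```

The two blocks above are alternatives for contrast; a real robots.txt would contain only the second form.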
🏷 Related Topics
Domain Age & History · Crawl & Indexing · Domain Name · Redirects

