
Official statement

Redirecting 404s to the homepage (even with a 5-second meta-refresh) is confusing for users and for Google. Google treats this as a soft 404 and will keep crawling the URL. It’s better to serve a genuine, user-friendly 404 page.
🎥 Source video

Extracted from a Google Search Central video

⏱ 57:16 💬 EN 📅 23/06/2020 ✂ 22 statements
Watch on YouTube (48:00) →
Other statements from this video (21)
  1. 1:22 Why does Google delay the mobile-first migration for some sites?
  2. 3:10 Does mobile-first indexing really improve your rankings in Google?
  3. 5:13 Should you really treat every Search Console issue as urgent?
  4. 7:07 Is optimizing internal link anchors really worth it, or a waste of time?
  5. 8:42 Should you really avoid having several pages targeting the same keyword?
  6. 9:58 Can you prove a content's editorial quality to Google with structured data markup?
  7. 11:33 Should you really stick to the supported page types for the reviewed-by schema?
  8. 14:02 Is technical cloaking really tolerated by Google?
  9. 19:36 How does Google group your URLs to prioritize its crawl?
  10. 22:04 Why does your traffic really drop after a publishing pause?
  11. 24:16 Why is Google Discover more demanding than classic search about surfacing your content?
  12. 26:31 Does unsupported structured data really influence ranking?
  13. 28:37 Do a main domain's technical errors really penalize its subdomains?
  14. 30:44 Why do your review snippets disappear and then reappear every week?
  15. 32:16 Is Domain Authority really useless for your SEO strategy?
  16. 32:16 Are backlinks dropped manually in forums and comments really useless for SEO?
  17. 34:55 Why don't all your Disqus comments get indexed the same way?
  18. 44:52 Why does Google mistake your local pages for duplicates because of URL patterns?
  19. 50:51 Should you really use unavailable_after to handle past events on your site?
  20. 50:51 Why does your mass no-index take 6 months to a year for Google to process?
  21. 55:39 Do flat URLs really hurt Google's understanding?
TL;DR

Redirecting 404 pages to the homepage—even with a 5-second meta-refresh—creates soft 404s that Google will continue to crawl unnecessarily. Users get lost, bots waste crawl budget, and your site sends inconsistent signals. The solution? A proper user-friendly 404 page with a clean HTTP 404 code.

What you need to understand

What is a soft 404 and why does Google detect it?

A soft 404 occurs when the server returns an HTTP 200 (success) code even though the requested resource no longer exists. Google sees an ‘active’ page, but its content resembles an error: often generic, text-poor, and lacking added value.

The engine detects these inconsistencies through heuristic signals: lack of unique content, identical layout to other ‘empty’ pages, and standardized title/meta tags. Result: Google marks the page as soft 404 in Search Console and continues to crawl it regularly to check if it has changed.
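
As an illustration only (Google's actual classifier is not public), this kind of heuristic check can be sketched over the status code and the served content — the markers, threshold, and function name below are invented for the example:

```python
# Illustrative heuristic, not Google's actual algorithm: flag pages that
# answer HTTP 200 but whose content resembles an error page.
ERROR_MARKERS = ("page not found", "page no longer exists", "introuvable")

def looks_like_soft_404(status_code: int, html_text: str, min_length: int = 500) -> bool:
    """Return True when a 200 response looks like a disguised error page."""
    if status_code != 200:
        return False  # a real 404/410 is unambiguous by definition
    text = html_text.lower()
    too_thin = len(text) < min_length              # text-poor, generic page
    has_marker = any(m in text for m in ERROR_MARKERS)
    return too_thin or has_marker
```

The point of the sketch: the inconsistency is between the transport-level signal (200) and the content-level signal (error-like page), and that is exactly what Google has to reconcile.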

Why don’t meta-refreshes resolve anything?

Adding a 5-second delay before redirecting doesn’t change the diagnosis. Google largely ignores meta-refreshes for indexing: it analyzes the initial content served to the bot, not what happens after a client-side timer.
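
A quick way to catch this anti-pattern in your own templates is to grep the served HTML for the refresh directive. A minimal sketch (it assumes the usual attribute order and is not a full HTML parser):

```python
import re

# Matches <meta http-equiv="refresh" ...>. Attribute order is assumed,
# so treat this as a quick template check, not a robust HTML parser.
META_REFRESH_RE = re.compile(r'<meta\s+http-equiv=["\']refresh["\']', re.IGNORECASE)

def has_meta_refresh(html: str) -> bool:
    """Detect the meta-refresh redirect anti-pattern in served HTML."""
    return bool(META_REFRESH_RE.search(html))
```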

The user lands on a page that doesn’t meet their expectations, waits a few seconds without understanding, then ends up on a homepage unrelated to their initial query. The bounce rate skyrockets, and the UX signal sent to Google is catastrophic.

How does this concretely affect crawl budget?

Every soft 404 remains in the index with an ambiguous status. Google recrawls it regularly to determine whether the page has returned or if it's still a disguised error. On a site with thousands of poorly managed historical URLs, this represents hundreds of wasted crawl requests each week.

A true 404 code is understood immediately: the page is dead, no need to return frequently. Google adjusts its crawl frequency accordingly and concentrates its budget on active resources.

  • Soft 404s unnecessarily consume crawl budget by forcing frequent recrawls
  • The HTTP 200 code on an empty page creates an inconsistency that Google must resolve on its own
  • Meta-refreshes are not considered for indexing—only the initial content counts
  • A real 404 page allows Google to quickly de-index and optimize its resources
  • User experience severely degrades with redirects to the homepage without context

SEO Expert opinion

Does this recommendation contradict widespread historical practices?

Yes, and that’s precisely where many sites still fail. For years, redirecting 404 → homepage was considered a ‘best practice’ to ‘not lose the visitor.’ Some mainstream CMS platforms even integrated it by default.

However, this logic completely ignores the crawl perspective and the medium-term SEO impact. We optimize for a hypothetical visitor at the expense of clear structural signals for the search engine. Field observations consistently show soft 404 counts inflating in Search Console under these configurations.

In what cases is a redirect from a 404 still acceptable?

There are legitimate exceptions: if a product page is deleted but a direct and relevant alternative exists in the same category, a 301 redirect to that alternative makes sense. The user finds a close answer, and Google understands the substitution.

But the key is contextual relevance. Redirecting /nike-air-max-2018 to /nike-shoes works. Redirecting to the generic homepage, never. [To be verified]: Google has never published a precise quantitative threshold regarding the soft 404/total pages ratio triggering a crawl penalty, but field feedback suggests that beyond 10-15% of soft 404s in Search Console, overall crawl frequency begins to drop.
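
Under the assumption of hypothetical paths, this decision rule can be sketched as a redirect map with a hard fallback to a real 404:

```python
# Hypothetical redirect map: only paths with a close, relevant substitute
# get a 301; everything else gets a genuine 404 — never a blanket redirect.
REDIRECTS = {
    "/nike-air-max-2018": "/nike-shoes",  # same category: substitution is clear
}

def resolve(path: str):
    """Return (http_status, location) for a removed URL."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None  # real 404: no 200, no redirect to the homepage
```

The design choice is that the default branch is the 404, not the redirect: a redirect has to be justified by a mapping entry, not the other way around.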

What is the real value of a well-designed 404 page?

A user-friendly 404 page does not just display ‘page not found.’ It offers a built-in search engine, links to main sections, and even contextual suggestions based on the requested URL. It’s an opportunity to regain engagement rather than a dead end.

From an SEO perspective, it sends a clear signal: the server returns an HTTP 404 code, Google de-indexes quickly and without ambiguity, and crawl budget is no longer wasted. Some well-optimized e-commerce sites even see measurable conversions from their 404 pages thanks to intelligent design.

Note: if you are migrating away from a 404-to-homepage redirect setup, monitor Search Console for 4-6 weeks. The volume of soft 404s should decrease gradually. If not, check that your servers really return an HTTP 404 code and not a 200 with ‘error’ content.

Practical impact and recommendations

What should you prioritize checking on your site?

Start by auditing the HTTP codes actually served. Use a crawler like Screaming Frog, Oncrawl, or Botify in ‘URL list’ mode with a sample of old deleted pages. Compare the returned HTTP code (server response header) with what Google sees in Search Console.

Next, check the ‘Coverage’ or ‘Pages’ report in Search Console: look for the ‘Excluded’ section and filter for ‘Soft 404.’ If you find hundreds or thousands of URLs, it’s a red flag. These pages siphon crawl budget for nothing.
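
A minimal stdlib sketch of such an audit — assuming you already have a plain list of old, deleted URLs; commercial crawlers do far more — records the status actually served for each one:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def audit_status(urls, timeout=10):
    """Map each URL to the HTTP status the server actually returns.

    urlopen raises HTTPError on 4xx/5xx, so the error branch is where a
    healthy 404 shows up; a 200 on a deleted page is the red flag.
    """
    results = {}
    for url in urls:
        try:
            with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
                results[url] = resp.status
        except HTTPError as err:
            results[url] = err.code
    return results
```

Feed it the deleted URLs from your crawl export, then compare the returned codes against what Search Console reports for the same pages.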

How to set up a genuine effective 404 page?

From a technical standpoint, ensure your server returns an HTTP 404 code in the response header—not a 200, not a 302. Test with curl, with browser DevTools (Network tab), or with an online HTTP status code checker.

Content-wise, design a branded 404 page with: a clear message (‘this page no longer exists’), a built-in search engine, links to the main sections of the site, and contextual suggestions based on the URL (e.g., if the URL contains ‘shoes’, suggest the shoes category). Avoid an impersonal tone—some humor or empathy improves UX.
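
The contextual-suggestion idea can be sketched with a keyword-to-section map; `CATEGORY_HINTS` and its paths below are invented for illustration:

```python
# Hypothetical keyword → section map used to build 404-page suggestions.
CATEGORY_HINTS = {
    "shoes": "/category/shoes",
    "jacket": "/category/jackets",
    "blog": "/blog",
}

def suggest_sections(requested_path: str) -> list:
    """Suggest site sections whose keyword appears in the dead URL."""
    path = requested_path.lower()
    return [target for keyword, target in CATEGORY_HINTS.items() if keyword in path]
```

An empty result simply means the 404 page falls back to its generic links and search box.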

What critical mistakes should absolutely be avoided?

Never use meta-refreshes, nor client-side JavaScript redirects to ‘improve’ a 404. Google crawls the initial HTML and ignores these tricks—you’ll just create more soft 404s.

Second trap: wildcard DNS entries or catch-all server configurations that answer any unknown URL with the homepage and a 200 code. This is common on some poorly configured shared hosting. The result: thousands of soft 404s generated automatically.

  • Audit HTTP codes with a crawler or curl on a sample of deleted URLs
  • Check the Search Console ‘Coverage’ report section ‘Soft 404’
  • Configure the server to return a true HTTP 404 code on non-existent pages
  • Create a user-friendly 404 page with internal search and contextual navigation
  • Remove all meta-refresh or JavaScript redirects from 404s
  • Regularly test with DevTools and HTTP tools to confirm server codes
Properly managing 404 errors requires technical coordination between development, hosting, and content strategy. On complex sites with a history of migrations or redesigns, identifying and correcting thousands of soft 404s can quickly become a heavy undertaking. If your current infrastructure generates these errors at scale, or if you lack the internal resources for a thorough audit and fix, a specialized SEO agency can speed up diagnosis, implement the right server configurations, and monitor progress over time with suitable professional tools.

❓ Frequently Asked Questions

Is a 410 Gone code preferable to a 404 for permanently deleted pages?
In theory, yes: a 410 signals permanent removal and Google can de-index faster. In practice the difference is minimal and Google treats both very similarly. The key point is not to return a 200.
Can soft 404s trigger an algorithmic penalty?
No direct penalty, but chronic crawl budget waste and degraded UX signals. On a large site, this slows the indexing of new pages and can indirectly affect overall ranking.
How should you handle old URLs of deleted e-commerce products?
If a similar product exists, 301-redirect to it. Otherwise, redirect to the parent category with a contextual message. As a last resort, serve a real 404 with product suggestions from the same category.
Should you block 404s in robots.txt to save crawl budget?
No. Blocking in robots.txt prevents Google from seeing the 404 code: it will keep requesting the URL without understanding that it is dead. Let Google crawl it and receive the clean 404.
How long does Google keep crawling a 404 page after first detecting it?
Google gradually reduces crawl frequency on real 404s. After a few weeks, recrawls become rare or stop entirely. Soft 404s, by contrast, are recrawled regularly because Google keeps trying to confirm their ambiguous status.
🏷 Related Topics
Domain Age & History · Crawl & Indexing · AI & SEO · Redirects


