Official statement
Other statements from this video (13)
- 1:48 Can Googlebot really crawl user-triggered events?
- 3:17 Do the Google reviews displayed on your site really influence your SEO?
- 4:25 Does incorrect structured data really hurt Google rankings?
- 6:36 Merging several pages into one: good or bad idea for SEO?
- 8:24 How does internal linking between categories really influence their ranking in Google?
- 15:06 Should you really limit keywords on category pages to avoid a penalty?
- 17:49 Are backlinks to category pages really risk-free for rankings?
- 18:49 Can product reviews hosted on your site really generate rich snippets?
- 23:39 Should you really use multiple H1 tags on the same page?
- 35:55 Is duplicate content really penalized by Google?
- 38:13 Should you really centralize all your content on a single platform to rank better?
- 53:37 Do Google Core Updates only affect content and backlinks?
- 55:10 Do you really need to use the exact keywords from user queries to rank?
Googlebot does not guarantee detecting redirects that occur after a delay (for example, 30 seconds). The bot may leave the page before the redirect activates, which compromises the indexing of the target URL. To avoid crawling issues and signal transfer problems, always favor instant server-side redirects (301, 302, 307, 308).
What you need to understand
What exactly is a timed redirect?
A timed redirect is triggered via JavaScript after a defined delay — typically between 3 and 30 seconds. The user first sees page A, then the script automatically redirects them to page B after the countdown.
These redirects are often used to display an intermediate message ("You will be redirected in X seconds...") or to give a marketing pixel time to load. The problem: Googlebot has neither the time nor the patience to wait.
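As an illustration of what such a setup looks like in a page's source (the URL, message, and 5-second delay here are made-up placeholders):

```html
<!-- Hypothetical example: page A shows a message, then JavaScript
     sends the visitor to page B after 5 seconds. Googlebot will often
     have left the page before this timer ever fires. -->
<p>You will be redirected in 5 seconds...</p>
<script>
  setTimeout(function () {
    window.location.href = "https://example.com/page-b"; // placeholder URL
  }, 5000); // 5000 ms: long enough for the bot to give up
</script>
```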
Why might Googlebot miss these redirects?
Google's bot allocates a limited crawl budget to each site. It does not wait indefinitely for a script to fire, especially if the delay exceeds a few seconds. If Googlebot leaves the page before the JavaScript redirect executes, it indexes the source URL, not the target.
Even though Google regularly improves its JavaScript rendering, there is no guarantee that your redirect will be captured. It’s a lottery, and SEO does not fare well in lotteries.
What are the concrete consequences for SEO?
If Googlebot does not follow the redirect, the destination URL does not receive ranking signals (authority, links, content) from the source URL. The result: two pages indexed instead of one, dilution of PageRank, risk of duplicate content if both versions coexist.
Worse yet, if you use a timed redirect to migrate content, Google may continue to index the old page and ignore the new one — exactly the opposite of what you are trying to achieve.
- Wasted crawl budget on intermediate pages of no value
- Uncertain signal transfer between the source URL and the target URL
- Risk of double indexing if Google indexes both versions
- Loss of control over which URL appears in search results
SEO Expert opinion
Is this statement consistent with field observations?
Yes, and not surprisingly. Tests show that Googlebot generally does not wait more than 5 seconds after the initial loading of a page. If your JavaScript redirect triggers after 10, 20, or 30 seconds, the chances of it being captured are close to zero.
Timed redirects were popular in the 2010s for dubious UX reasons ("giving users time to read a message"). Today, they persist mainly due to technical debt or a lack of awareness of their SEO impact.
What nuances should be added to this recommendation?
John Mueller talks about delays of 30 seconds, but even 3 seconds can be problematic. Googlebot can theoretically execute JavaScript, but it is not instantaneous and is not guaranteed on all pages — especially those with a low crawl budget.
If you absolutely must display a message before a redirect, reduce the delay to 0 seconds on the JavaScript side and manage the message display via CSS/animation. But honestly, a real server 302 redirect is better if the move is temporary. [To be verified]: Google does not precisely document how long its bot waits after executing JavaScript — this threshold likely varies by site and context.
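One way to read that advice, sketched here with placeholder URLs and class names: fire the redirect immediately, and let the notice be purely cosmetic (or display it on the destination page with a CSS animation instead).

```html
<!-- Hypothetical pattern: no timer at all. The redirect is immediate;
     the notice is cosmetic. URL and class name are placeholders. -->
<p class="redirect-notice">This page has moved, taking you there now...</p>
<script>
  // location.replace keeps the intermediate page out of browser history
  window.location.replace("https://example.com/new-page");
</script>
```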
In what cases does this rule not apply?
If the timed redirect is intended only for human users (for example, after a specific action like a click), Googlebot will never encounter it. But as soon as it activates upon loading the page, you're in the danger zone.
A rare exception: pages with a very large crawl budget (the homepages of major sites), where Google spends more time. But even then, why take the risk? A server redirect requires zero lines of JavaScript and works every time.
Practical impact and recommendations
What should you do if you are using timed redirects?
Start with a technical audit: identify all JavaScript timed redirects on your site. You can spot them through a Screaming Frog crawl with JavaScript mode enabled, or by inspecting the source code (look for setTimeout, window.location, meta refresh with delay).
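As a rough sketch of that source-code inspection (the regex patterns below are illustrative and will miss edge cases that a real crawler like Screaming Frog catches):

```python
import re

# Rough patterns for the two main culprits mentioned above:
# a setTimeout that touches window.location, and a meta refresh
# whose delay is non-zero. Illustrative only, not exhaustive.
JS_TIMED_REDIRECT = re.compile(
    r"setTimeout\s*\(.{0,200}?location", re.IGNORECASE | re.DOTALL
)
META_REFRESH_DELAYED = re.compile(
    r'<meta[^>]+http-equiv=["\']refresh["\'][^>]+content=["\']\s*([1-9]\d*)',
    re.IGNORECASE,
)

def find_timed_redirects(html: str) -> list:
    """Return labels of timed-redirect patterns found in an HTML page."""
    hits = []
    if JS_TIMED_REDIRECT.search(html):
        hits.append("javascript setTimeout redirect")
    if META_REFRESH_DELAYED.search(html):
        hits.append("meta refresh with non-zero delay")
    return hits
```

Run it over the rendered HTML rather than just your templates, since scripts can also be injected at runtime.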
Once identified, replace them with server-side HTTP redirects: 301 if the move is permanent, 302/307 if temporary. Configuration in .htaccess for Apache, nginx.conf for Nginx, or via your CMS if you have a redirect plugin.
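As a minimal illustration for Nginx (the paths are placeholders; the Apache .htaccess equivalent would be a one-liner like `Redirect 301 /old-page /new-page`):

```nginx
# Hypothetical nginx.conf fragment: permanent move of a single URL.
# Use 302 or 307 instead of 301 if the move is temporary.
location = /old-page {
    return 301 /new-page;
}
```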
What mistakes should be absolutely avoided?
Do not keep a timed redirect "because it works for users." Users are not Googlebot — the latter has time and resource constraints that your visitors do not have. What works in UX can kill your SEO.
Also, avoid the pitfall of meta refresh with a delay greater than 0. Even if Google theoretically supports instantaneous meta refreshes (delay=0), as soon as you add a countdown, you fall back into the same issue as JavaScript redirects.
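For clarity, the two forms side by side (the target URL is a placeholder):

```html
<!-- Treated like a regular redirect: zero delay -->
<meta http-equiv="refresh" content="0; url=https://example.com/new-page">

<!-- Risky: any non-zero delay recreates the timed-redirect problem -->
<meta http-equiv="refresh" content="5; url=https://example.com/new-page">
```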
How can you check if your site is compliant after corrections?
Use Search Console's URL Inspection tool: run a live test on the URL and check that Google properly follows the redirect. If the tool reports the target URL as indexed, that's a good sign. If you still see the source URL, the redirect is not being captured.
Also monitor your server logs: after correction, Googlebot should no longer crawl the old intermediate URLs. If you still see hits on these pages, it means an internal or external link still points to them — clean up your backlinks and internal linking.
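To see the principle in action locally (a sketch using only the Python standard library; paths and content are made up), note how an HTTP client lands on the target of a 301 with no waiting at all:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    """Tiny local server: /old-page answers with a 301 to /new-page."""

    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"target content")

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)  # port 0 = any free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client follows the 301 instantly; no timer is involved.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page")
body = resp.read()
print(resp.geturl())  # ends with /new-page
print(body.decode())  # target content

server.shutdown()
```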
- Identify all JavaScript redirects with a delay greater than 0 seconds
- Replace with HTTP 301 or 302 server-side redirects
- Remove meta refresh tags with a delay > 0
- Verify that redirects are captured via the Search Console URL Inspection tool
- Monitor logs to ensure Googlebot follows the new redirects
- Update internal links pointing to the old URLs
❓ Frequently Asked Questions
Can Googlebot detect a JavaScript redirect with a 3-second delay?
Is a meta refresh redirect with a 5-second delay preferable to a JavaScript redirect?
If I reduce the delay to 1 second, does that work better?
Can timed redirects be considered cloaking?
How do you cleanly migrate from a timed redirect to a server-side redirect?
🎥 From the same video (13)
Other SEO insights extracted from this same Google Search Central video · duration 53 min · published on 27/09/2019
🎥 Watch the full video on YouTube →