Official statement
Other statements from this video (36)
- 1:02 Should you ignore the Lighthouse score to optimize your SEO?
- 1:02 Is page speed really a Google ranking factor?
- 1:42 Are Lighthouse and PageSpeed Insights really useless for ranking?
- 2:38 Do Google's Web Vitals really model user experience?
- 3:40 Is page speed really as decisive a ranking factor as people claim?
- 7:07 Should you really inject the canonical tag via JavaScript?
- 7:27 Can you really inject the canonical tag via JavaScript without SEO risk?
- 8:28 Does Google Tag Manager really slow down your site, and should you abandon it?
- 8:31 Does GTM really sabotage your load time?
- 9:35 Is serving a 404 to Googlebot and a 200 to visitors really cloaking?
- 10:06 Serving a 404 to Googlebot and a 200 to users: is it really cloaking?
- 16:16 Are 301, 302, and JavaScript redirects really equivalent for SEO?
- 16:58 Are JavaScript redirects really equivalent to 301s for Google?
- 17:18 Is server-side rendering really essential for ranking on Google?
- 17:58 Should you really invest in server-side rendering for SEO?
- 19:22 Does serialized JSON in your JavaScript apps count as duplicate content?
- 20:02 Does application state stored as JSON in the DOM create duplicate content?
- 20:24 Does Cloudflare Rocket Loader pass Googlebot's SEO test?
- 20:44 Should you test Cloudflare Rocket Loader and third-party tools before enabling them, for SEO's sake?
- 21:58 Should you ignore 'Other Error' messages in Search Console and the Mobile-Friendly Test?
- 27:58 Should you choose one JavaScript framework over another for SEO?
- 31:27 Does JavaScript really consume crawl budget?
- 31:32 Does JavaScript rendering consume crawl budget?
- 33:07 Should you abandon dynamic rendering for SEO?
- 33:17 Should you really abandon dynamic rendering for search rankings?
- 34:01 Should you really abandon client-side JavaScript to get product links indexed?
- 34:21 Does asynchronous post-load JavaScript really block Google indexing?
- 36:05 Should you really move to a dedicated server to improve your SEO?
- 36:25 Shared or dedicated server: does Google really tell the difference?
- 40:06 Does client-side hydration really pose an SEO problem?
- 40:06 Is SSR + client hydration really risk-free for Google SEO?
- 42:12 Should you stop watching the overall Lighthouse score and focus on the Core Web Vitals metrics relevant to your site?
- 42:47 Should you really aim for 100 on Lighthouse, or is that a waste of time?
- 45:24 Will 5G really speed up your site, or is that an illusion?
- 49:09 Does Googlebot really ignore your WebP images served via Service Workers?
- 49:09 Why does Googlebot ignore your WebP images served by a Service Worker?
The 'Other Error' messages in Mobile-Friendly Test or Rich Results Test do not reflect Googlebot's actual behavior during indexing. These tools impose strict timeouts and do not cache anything to remain fast, while Googlebot can retry for hours and has robust caching mechanisms. In practice, a resource marked with an error in the testing tools can be crawled and indexed without issue in production.
What you need to understand
Why do testing tools show 'Other Error' for certain resources?
Tools like Mobile-Friendly Test or Rich Results Test are designed to provide an immediate response. Therefore, they impose very strict time limits — generally a few seconds — to load a page and all of its resources.
If a resource (CSS, JavaScript, image) takes too long to respond or exceeds an internal quota, the tool returns 'Other Error'. This does not necessarily indicate a server failure, but rather a timeout or a technical limitation on the testing tool’s side. These tools do not cache anything between tests to ensure fresh results, which makes them even more sensitive to momentary network latencies.
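The effect of that time budget can be sketched with a toy simulation. The deadlines and latency below are illustrative stand-ins, not Google's actual limits, and `fetch` is a hypothetical helper, not a real HTTP call:

```python
# Hypothetical sketch: why a strict per-resource deadline (testing tools)
# reports an error where a more patient fetcher would succeed.
# All numbers are illustrative, not Google's real values.

def fetch(resource_latency_s: float, deadline_s: float) -> str:
    """Simulate fetching a resource that responds after resource_latency_s."""
    if resource_latency_s > deadline_s:
        return "Other Error"   # the tool gives up: a timeout, not a server error
    return "OK"

slow_css_latency = 8.0  # resource momentarily slow (e.g. a cold CDN edge)

print(fetch(slow_css_latency, deadline_s=3.0))   # strict budget, like a test tool
print(fetch(slow_css_latency, deadline_s=60.0))  # patient fetcher succeeds
```

The same resource produces an error or a success depending only on the caller's patience — which is exactly the gap between the testing tools and Googlebot.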
How does Googlebot behave differently in production?
Googlebot, during actual indexing, operates with a logic of persistence. It can retry loading a resource for several hours, or even several days if necessary. It also has a sophisticated caching system that allows it to reuse resources that have been recently crawled.
This difference in behavior explains why a page can display errors in testing tools while being perfectly indexed and rendered in search results. The 3-second timeout of a testing tool has nothing to do with the patience of a crawler in production that can wait, retry, and cache intelligently.
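The retry-and-cache behavior described above can be illustrated with a minimal sketch. This is not Google's actual algorithm — attempt counts, the in-memory cache, and the `flaky_fetch` stand-in are all assumptions for illustration:

```python
# Illustrative sketch of a persistent crawler: retry a flaky resource,
# then cache the successful fetch for reuse. Not Google's real logic.

class PatientCrawler:
    def __init__(self, fetcher, max_attempts=5):
        self.fetcher = fetcher
        self.max_attempts = max_attempts
        self.cache = {}

    def get(self, url):
        if url in self.cache:              # reuse a recently crawled copy
            return self.cache[url]
        for _ in range(self.max_attempts):
            try:
                body = self.fetcher(url)
            except TimeoutError:
                continue                   # in production: wait and retry later
            self.cache[url] = body
            return body
        return None                        # give up only after many attempts

# Fake fetcher that fails twice before answering, like a momentary outage.
calls = {"n": 0}
def flaky_fetch(url):
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError
    return "<html>ok</html>"

crawler = PatientCrawler(flaky_fetch)
print(crawler.get("https://example.com/app.js"))  # succeeds on the 3rd attempt
print(crawler.get("https://example.com/app.js"))  # served from cache, no refetch
```

A one-shot testing tool stops at the first `TimeoutError`; the persistent crawler rides out the momentary outage and never hits the network again for that resource.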
What is the real technical limitation of testing tools?
Testing tools are isolated and constrained environments — they do not share the same infrastructure, quotas, or priorities as Googlebot in a real situation. Their goal is to give a quick overview, not to faithfully simulate a full crawl.
This means that an 'Other Error' can stem from a simple network fluctuation, a CDN that responds slowly, or a temporarily overloaded server. If the problem is momentary, Googlebot will likely never encounter it — or will bypass it thanks to the cache.
- Testing tools: strict timeout, no cache, limited quota to remain fast
- Googlebot in production: retries for hours, robust cache, distributed infrastructure
- An 'Other Error' often indicates a limitation of the tool, not a real indexing problem
- Always check in Search Console if the page is indeed indexed before making corrections
SEO Expert opinion
Is this statement consistent with field observations?
Absolutely. We regularly observe sites displaying 'Other Error' in testing tools, yet being perfectly indexed and ranked in search results. Testing tools are useful for an initial diagnosis, but they also generate anxiety-inducing false positives.
The real indicator remains Search Console — and more specifically, real-time URL inspection. If the page passes there, the error in the testing tools is just technical noise. Conversely, if Search Console also reports a problem, then it needs serious investigation.
What nuances should be added to this statement?
Martin Splitt is correct in principle, but that doesn’t mean we should ignore all 'Other Errors'. If a critical resource — say, the main CSS file or the script generating the content — consistently fails, even Googlebot will eventually give up or render a degraded version.
The real question is: is it a momentary timeout or a structural problem? If the error recurs on every test, over several days, and on critical resources, there is likely a performance or quota issue on the server side. Keep in mind that Google publishes no precise figures on Googlebot's number of attempts or maximum wait time — on that point, we are in undocumented territory.
In what cases does this rule not apply?
If your site is new or has low authority, Googlebot may not invest as much time and resources into retrying the load. The crawl budget plays a role: a site with a solid history will receive more patience than a new unknown domain.
Similarly, if a resource blocked by robots.txt triggers 'Other Error', it may indicate a configuration confusion — and in that case, no magic retry will solve the problem. Context always matters more than the general rule.
Practical impact and recommendations
What should you do practically when facing an 'Other Error'?
First step: don’t panic. Open Search Console and run a URL inspection on the concerned page. If Googlebot managed to render the page correctly, with all resources loaded, the error in the testing tool has no practical importance.
If Search Console confirms a problem, then investigate on the server side: network latency, CDN configuration, quota limits, too short timeouts. Use a tool like WebPageTest or GTmetrix to measure actual response times and identify bottlenecks.
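As a rough sketch of that measurement step, the snippet below times each resource fetch and flags anything exceeding a testing-tool-like budget. The budget, paths, and `time.sleep` stand-ins are all hypothetical — in practice you would swap in real HTTP requests against your own URLs, or use WebPageTest/GTmetrix as suggested:

```python
import time

# Rough sketch: time each critical resource and flag anything slower than
# a testing-tool-like budget. The fetchers below are stand-ins for real
# HTTP requests; the 3s budget is illustrative, not an official value.

BUDGET_S = 3.0

def timed(fetch):
    """Return the wall-clock time taken by a fetch callable."""
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start

resources = {
    "/static/main.css": lambda: time.sleep(0.01),  # fast stand-in
    "/static/app.js":   lambda: time.sleep(0.05),  # slower stand-in
}

for path, fetch in resources.items():
    elapsed = timed(fetch)
    flag = "SLOW" if elapsed > BUDGET_S else "ok"
    print(f"{path}: {elapsed:.3f}s {flag}")
```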
What mistakes should be avoided in interpreting these messages?
Do not confuse 'Other Error' with a real HTTP 4xx or 5xx status code. An 'Other Error' is a limitation of the tool, not a standardized server error message. This means there may be no real technical issue to correct.
Also avoid re-running the test in a loop to "force" a green result — testing tools have quotas, and you risk being temporarily blocked. If the error persists after two or three spaced-out tests, move on to the Search Console inspection instead of getting stuck on it.
How can you verify that your site is actually being crawled properly?
The best indicator remains the coverage of pages in Search Console. If your strategic pages are marked as indexed, with complete HTML rendering visible in the URL inspection, then the 'Other Error' from testing tools is just noise.
To go further, analyze the server logs: check that Googlebot is loading critical resources and not encountering repeated timeouts. If everything is smooth in the logs, then the error in the testing tool has no practical consequence on your SEO.
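The log check above can be sketched as a small script. The log format, IPs, and paths below are invented sample data — adapt the regex to your own access-log format, and remember that matching on the user-agent string alone is not proof of a genuine Googlebot hit:

```python
import re
from collections import Counter

# Minimal sketch: count HTTP statuses per path for Googlebot requests,
# to spot critical resources that repeatedly fail. Sample data is invented.

SAMPLE_LOG = """\
66.249.66.1 - - [12/May/2020:10:00:01] "GET /static/main.css HTTP/1.1" 200 "Googlebot/2.1"
66.249.66.1 - - [12/May/2020:10:00:02] "GET /static/app.js HTTP/1.1" 200 "Googlebot/2.1"
66.249.66.2 - - [12/May/2020:10:05:09] "GET /static/app.js HTTP/1.1" 504 "Googlebot/2.1"
203.0.113.7 - - [12/May/2020:10:06:00] "GET /static/app.js HTTP/1.1" 200 "Mozilla/5.0"
"""

LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) "(?P<ua>[^"]*)"')

def googlebot_status_counts(log_text):
    """Count (path, status) pairs, for Googlebot requests only."""
    counts = Counter()
    for m in LINE.finditer(log_text):
        if "Googlebot" in m.group("ua"):
            counts[(m.group("path"), m.group("status"))] += 1
    return counts

for (path, status), n in sorted(googlebot_status_counts(SAMPLE_LOG).items()):
    print(f"{path} -> {status} x{n}")
```

A path that shows mostly 200s with the occasional 5xx matches the "momentary timeout" scenario; a path that is consistently 5xx for Googlebot is the structural problem worth fixing.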
- Inspect the URL in Search Console to verify the actual rendering by Googlebot
- Compare testing tool results with server logs to detect real issues
- Measure response times of critical resources with WebPageTest or GTmetrix
- Only correct errors confirmed by multiple sources (testing tools + Search Console + logs)
- Regularly monitor index coverage to spot lasting anomalies
❓ Frequently Asked Questions
Does an 'Other Error' in the Mobile-Friendly Test mean my page won't be indexed?
Why don't the testing tools cache anything?
Should I fix every 'Other Error' reported by the testing tools?
How long can Googlebot keep retrying to load a resource?
Can 'Other Error' messages affect the JavaScript rendering of my page?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 51 min · published on 12/05/2020