Official statement
Other statements from this video (14)
- 1:43 Should you really treat Googlebot like a US user?
- 3:29 Should you change your main domain in Search Console when redirecting to a sub-page?
- 5:27 Why did Google remove blocked-resource discovery from Search Console?
- 10:46 Should you avoid JavaScript for generating your meta tags?
- 22:11 Do pages excluded from the index really consume your crawl budget?
- 27:01 Do pre-built WordPress themes really hurt your SEO?
- 27:18 Should you really drop nofollow in internal linking to avoid doorway pages?
- 28:35 Is the mobile-friendly test really enough to validate the indexing of your JavaScript?
- 29:43 Why does embedding Instagram images via iframe ruin their SEO potential?
- 36:38 Do chained 301 redirects blow up your crawl budget?
- 39:59 Is structured data enough to demonstrate a page's expertise and credibility?
- 41:31 Can Google modify your titles to add your brand name?
- 44:04 Why doesn't your well-ranked site show sitelinks or a search box?
- 48:30 ccTLD or geotargeted subfolder: which architecture should you choose for international SEO?
Google has confirmed that the Search Console API has been returning incorrect indexed count data since April. Essentially, if you are using this API to monitor your indexing or generate automated reports, your figures are wrong. Before panicking over an apparent drop in the number of indexed pages, make sure you're not relying solely on this corrupted data.
What you need to understand
What exactly is the extent of the problem with the API?
John Mueller announced that since April, the Search Console API has been returning erroneous indexing figures. Not just a slight variation — outright incorrect data. The issue specifically affects the endpoint that returns the number of indexed pages.
The Search Console web interface remains reliable; only the API is affected. If you check GSC manually in your browser, the displayed data is correct. Only the programmatic version, the one used by third-party tools and automation scripts, is malfunctioning.
Why is Google not providing any details about the nature of the error?
As usual, Mueller remains vague on the technical details. It's unclear whether the API underestimates, overestimates, or whether the errors vary from site to site. Nor is there any indication of the magnitude of the discrepancy: 5%? 50%? Completely random?
This opacity makes the situation particularly problematic for anyone relying on this data. It's impossible to correct or compensate without knowing the pattern of the error. Google merely states, "don’t rely on these figures" without providing a viable alternative.
How long should we wait before the API is reliable again?
Mueller provides no timeline for correction. The announcement comes several months after the problem began in April, suggesting that the resolution isn’t trivial. Google clearly needs time to debug this system.
In the meantime, anyone who has built automated workflows around this API is left with decision-making processes based on false data. A significant issue if you use these figures to trigger alerts, plan SEO actions, or generate client reports.
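While waiting for a fix, one pragmatic option is to put a kill switch in front of any alerting built on this endpoint. A minimal sketch, assuming you store the API figure and a baseline yourself (all names here are hypothetical, not part of any Google client library):

```python
# Kill switch for indexing alerts while the GSC API returns corrupted counts.
# The function names and threshold are illustrative, not an official API.

API_DATA_TRUSTED = False  # flip back to True once Google confirms a fix


def should_fire_indexing_alert(api_indexed: int, baseline: int,
                               drop_threshold: float = 0.2) -> bool:
    """Fire an alert only when the API data is trusted AND the drop is real."""
    if not API_DATA_TRUSTED:
        return False  # suspend alerts while the API data is unreliable
    if baseline == 0:
        return False  # no baseline to compare against
    return (baseline - api_indexed) / baseline >= drop_threshold


# A 40% apparent drop is ignored while the kill switch is off:
print(should_fire_indexing_alert(api_indexed=600, baseline=1000))  # False
```

The point is not the threshold itself but making the "is this data trustworthy?" decision explicit in the pipeline, so alerts can be re-enabled in one place once the bug is fixed.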
- The Search Console API has been returning incorrect indexed count data since April
- The manual web interface of GSC remains reliable — the bug affects only the API
- No details are provided on the exact nature of the error (underestimation, overestimation, variability)
- No timeline for correction has been communicated by Google
- Third-party tools relying on this API display incorrect data
SEO Expert opinion
Should Google have communicated about this bug sooner?
Honestly, yes. If the problem dates back to April and Mueller is only announcing it several months later, it means that for all that time SEOs have been making decisions based on corrupted data. How many falsely triggered alerts? How many client reports built on meaningless figures?
This delayed communication raises questions about the priority Google gives to API users. The web interface is functioning correctly, so the bug only affected power users and professional tools — precisely those who automate their SEO tracking. [To verify]: it’s unknown whether Google warned third-party tool developers directly before this public announcement.
Is the manual interface truly reliable 100% of the time?
Mueller claims that only the API is affected, but let's be honest: we have seen GSC display inconsistent data even in the standard interface. Update delays, unexplained variations between "Indexed Pages" and "Submitted Pages", figures fluctuating for no apparent reason.
It's hard to know if this API bug is just a symptom of a deeper issue in Google's reporting systems. Do both interfaces draw from the same databases? If one is corrupted and the other is not, it implies a fragmented architecture — which does not inspire confidence.
Should we completely abandon the API for now?
Not necessarily abandon, but don't rely on it as the sole source of truth. If you use the API to monitor indexing, always cross-reference it with the manual interface or external checks (site: queries, SERP scraping, server logs).
The real problem is that those who have automated their alerts or client dashboards will need to either suspend these processes or validate them manually — which defeats the purpose of automation. And for agencies that charge for automated reporting? A tricky situation.
Practical impact and recommendations
How can you verify that your indexing data is reliable?
First step: systematically compare the API with the manual interface of GSC. If you see a significant discrepancy, you are a victim of the bug. Document the discrepancy — it will help justify anomalies in your reports.
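Documenting the discrepancy can be as simple as appending a dated row each time you compare the two figures. A minimal sketch, assuming you read both counts yourself (the file name and field layout are illustrative, and nothing here calls Google):

```python
# Log the API-vs-interface discrepancy over time so anomalies in reports
# can be traced back to the bug. File name and columns are illustrative.
import csv
from datetime import date


def log_discrepancy(path: str, api_count: int, ui_count: int) -> float:
    """Append a timestamped row and return the relative discrepancy."""
    discrepancy = (api_count - ui_count) / ui_count if ui_count else 0.0
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(),
                                api_count, ui_count, f"{discrepancy:+.1%}"])
    return discrepancy


ratio = log_discrepancy("gsc_discrepancy_log.csv", api_count=850, ui_count=1000)
print(f"{ratio:+.1%}")  # -15.0%
```

A dated CSV like this is exactly the kind of evidence you can attach to a client report to explain why the automated figures diverge from the interface.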
Second check: use alternative data sources. Site: queries in Google provide an estimate (imprecise, admittedly, but useful for detecting sharp changes). Your server logs show how many pages Googlebot is actually crawling. These sources are independent of the GSC API and can serve as a cross-check.
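Pulling a cross-check figure from server logs can be sketched like this, assuming access logs in the usual combined format. The user-agent match is naive: it skips the reverse-DNS verification Google recommends for confirming a hit really came from Googlebot.

```python
# Count distinct URLs successfully fetched by Googlebot in an access log.
# The log format and regex are assumptions (Apache/Nginx combined format).
import re

LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3}).*Googlebot'
)


def googlebot_urls(lines):
    """Return the set of URLs Googlebot fetched with a 200 status."""
    urls = set()
    for line in lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "200":
            urls.add(m.group("url"))
    return urls


sample = [
    '66.249.66.1 - - [01/Sep/2019:10:00:00 +0000] "GET /page-a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [01/Sep/2019:10:00:05 +0000] "GET /page-b HTTP/1.1" 404 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [01/Sep/2019:10:00:09 +0000] "GET /page-a HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(len(googlebot_urls(sample)))  # 1
```

Crawled is not the same as indexed, of course, but a stable crawl volume alongside a collapsing API count is a strong hint that the bug, not your site, is to blame.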
Which third-party tools are affected by this issue?
All those relying on the Search Console API for retrieving indexing data. Screaming Frog, SEMrush, Ahrefs, Oncrawl, Botify — if these platforms display a "pages indexed" counter drawn from the GSC API, the figures are wrong.
Contact your tool providers directly to find out if they have implemented a disclaimer or workaround solution. Some may have already switched to alternative counting methods or temporarily disabled this metric. It's better to know this before panicking in front of a dashboard showing -40% indexed pages.
What should you do if you need to present a client report this week?
Be transparent. Explain that Google has confirmed a bug on the API since April and that automated indexing data is not reliable. Present the figures from the manual GSC interface instead.
If your client or management demands proof, capture timestamped screenshots of the GSC interface. It's less flashy than an automated dashboard, but it's the only reliable source for now. And honestly, it's better to delay a report by a week than to present conclusions based on meaningless figures.
- Immediately compare your API data with the manual GSC interface
- Temporarily disable automatic alerts based on the indexing API
- Document noticed discrepancies to justify anomalies in your reports
- Check with your third-party tools to see if they have issued a fix or disclaimer
- Use alternative sources (server logs, site: queries) to cross-check data
- Proactively communicate with your clients or management about this issue
❓ Frequently Asked Questions
Is the Search Console web interface affected by this bug?
Since when exactly has the API been returning incorrect data?
Has Google given a timeline for fixing this bug?
So are my third-party SEO tools displaying false indexing data?
How can I get reliable indexing data in the meantime?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 1h14 · published on 09/08/2019