
Official statement

To determine if content is underperforming, check the Performance Report in Search Console rather than the crawl stats. If you're getting a lot of impressions but few clicks, the content may need to be modified. Crawl stats are not a good indicator of content quality.
🎥 Source video

Extracted from a Google Search Central video

⏱ 13:39 💬 EN 📅 09/09/2020 ✂ 8 statements
Watch on YouTube (4:44) →
Other statements from this video (7)
  1. Should you really update your existing content rather than create new pages?
  2. 2:52 Does an active blog really improve your Google ranking?
  3. 6:18 Should you really consolidate your FAQ pages to avoid a thin-content penalty?
  4. 7:21 Should you really merge similar content to rank better?
  5. 7:34 Is word count really a Google ranking factor?
  6. 9:30 Can generated content on location pages really escape Google's duplicate-content filter?
  7. 11:33 How does Google really detect duplicate content with fingerprinting?
TL;DR

Google states that crawl stats do not reflect the quality or performance of content. Only the Performance Report from Search Console, with the impressions/clicks ratio, can identify underperforming pages. Specifically: a page crawled 200 times a day but showing a CTR of 0.5% has a relevance issue — not a crawl issue.

What you need to understand

What is the fundamental difference between crawl stats and actual performance?

Crawl stats measure Googlebot's activity on your site: how many pages it visits, how often, how many KB it downloads. These metrics reflect the bot's technical appetite, not users' interest in your content.

The Performance Report, on the other hand, shows impressions (how often your pages appear in the SERPs) and clicks (how many users actually click). This is the only data that connects your content to actual user behavior.

Why does this confusion between crawl and performance persist?

Many SEOs observe a misleading correlation: when traffic drops, they check crawl stats and sometimes see a decline in crawl budget. They conclude that Google is “neglecting” their site.

However, this logic is reversed. If Google crawls less, it's often because your pages generate fewer signals of interest (clicks, session duration, fresh backlinks). Crawl is a consequence, not a cause. Artificially increasing crawl (by multiplying sitemaps or initiating URL inspections) will never solve a bland content issue.

How can you concretely identify underperforming content in Search Console?

Open the Performance Report, filter by page, and sort by impressions in descending order. Look for pages that rack up thousands of impressions but show a CTR below 2%. That's your hit list.

These pages rank well enough to be seen, but their title, meta description, or editorial angle does not convince anyone to click. It's a clear signal: the content exists in Google's eyes, but it is dead to users.

  • Crawl stats: measure the technical activity of Googlebot (pages crawled, KB downloaded, response times).
  • Performance Report: measures actual visibility (impressions, clicks, CTR, average position) and reflects user interest.
  • A low impressions/clicks ratio indicates an issue with editorial relevance or snippet attractiveness.
  • Increasing crawl without improving content is a waste of time — it’s a symptom, not a solution.
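As a rough sketch, the filter described above can be scripted against an exported Performance Report. The column names, thresholds, and URLs below are illustrative assumptions, not the exact export format:

```python
import pandas as pd

# Hypothetical sample rows mimicking a Search Console "Pages" export
# (real exports use columns such as Page, Clicks, Impressions, CTR, Position).
data = pd.DataFrame({
    "page": ["/guide-seo", "/fiche-produit-42", "/blog/crawl-budget"],
    "impressions": [12000, 8500, 900],
    "clicks": [410, 95, 40],
})

# Recompute CTR from raw counts rather than parsing the "1.2%"-style
# strings found in the CSV export.
data["ctr"] = data["clicks"] / data["impressions"]

# Underperformers: widely seen (many impressions) but rarely clicked (CTR < 2%).
underperformers = (
    data[(data["impressions"] > 5000) & (data["ctr"] < 0.02)]
    .sort_values("impressions", ascending=False)
)
print(underperformers[["page", "impressions", "ctr"]])
```

In this sample only `/fiche-produit-42` qualifies: it is visible enough to matter but converts barely 1% of impressions into clicks.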

SEO Expert opinion

Is this statement consistent with real-world observations?

Yes, and it clarifies a frequent confusion. Across thousands of audits, we see sites with a stable or even growing crawl budget lose 40% of organic traffic, simply because their content no longer earns clicks.

Conversely, some sites see their crawl decrease after a technical overhaul (URL consolidation, removal of zombie pages) and gain traffic. Why? Because they focus quality signals on fewer pages. Google tracks user interest, not the amount of available HTML.

What nuances should be added to this recommendation?

Martin Splitt simplifies for a non-technical audience. In specific cases, crawl stats remain useful — but never alone.

For example: an e-commerce site with 500,000 items sees its crawl drop by 80% after a migration. The Performance Report also shows a decrease in impressions. Here, crawl stats confirm a technical accessibility issue (broken sitemap, blocking robots.txt, redirect loops). But the primary indicator remains impressions, not crawl.

[To be verified]: Google does not specify at what CTR threshold content officially becomes “underperforming”. In practice, a CTR below 2% in positions 3-7 is a red flag — but this threshold varies by query (brand vs informational) and industry.

In what cases does this rule not fully apply?

On very large sites (news, e-commerce, aggregators), a sudden drop in crawl can precede a decrease in impressions by a few weeks. Google cannot discover your new pages if it doesn’t crawl them.

But even in this case, the solution isn’t to “force crawl” — it’s to investigate why Googlebot is losing interest. Often, this is related to insufficient freshness signals (invisible publication dates, absence of lastmod in the sitemap, 90% duplicated content among products).

If your crawl collapses AND your impressions remain stable, you likely have a crawl efficiency issue (too many facets, URL parameters, soft 404). But if your impressions are also declining, it is indeed a content problem — and crawl stats won’t provide any actionable insights.
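One of the freshness fixes mentioned above, adding `lastmod` to the sitemap, can be sketched with Python's standard library. The URLs and dates are placeholders:

```python
from datetime import date
from xml.etree import ElementTree as ET

# Minimal sitemap carrying <lastmod> dates, one of the freshness signals
# Googlebot can use when deciding what to recrawl.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
pages = [
    ("https://example.com/guide-seo", date(2020, 9, 9)),
    ("https://example.com/blog/crawl-budget", date(2020, 8, 15)),
]
for loc, modified in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    # The sitemap protocol expects W3C datetime format (YYYY-MM-DD at minimum).
    ET.SubElement(url, "lastmod").text = modified.isoformat()

print(ET.tostring(urlset, encoding="unicode"))
```

The `lastmod` value should reflect a genuine content change; stamping today's date on every URL at each build is itself a signal Google learns to ignore.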

Practical impact and recommendations

What should you concretely do to leverage the Performance Report?

First, segment your pages by template type (product sheets, articles, categories). Each type has its own CTR benchmark. A product sheet in position 5 with a CTR of 1.8% is underperforming; an informational article in the same position with 3.2% is within the norm.

Next, export the data over 16 months (to smooth out seasonality) and cross-reference it with Google Analytics 4. Look for pages that generate increasing impressions but stagnant clicks — this often signifies that Google is testing you on new queries, but your snippet isn’t convincing.
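The template segmentation described above can be sketched with URL-pattern matching. The path conventions and per-template CTR benchmarks here are invented for illustration; real benchmarks come from your own historical data:

```python
import re
import pandas as pd

# Assumed URL conventions and per-template CTR benchmarks; both are
# illustrative, not Google-published values.
TEMPLATES = [
    (re.compile(r"^/produit/"), "product", 0.025),
    (re.compile(r"^/blog/"), "article", 0.035),
    (re.compile(r"^/categorie/"), "category", 0.02),
]

def classify(page: str) -> tuple[str, float]:
    """Map a URL path to its template name and CTR benchmark."""
    for pattern, name, benchmark in TEMPLATES:
        if pattern.match(page):
            return name, benchmark
    return "other", 0.02

# Hypothetical Performance Report rows.
df = pd.DataFrame({
    "page": ["/produit/chaussure-1", "/blog/guide-crawl", "/categorie/hommes"],
    "impressions": [4000, 6000, 10000],
    "clicks": [72, 240, 180],
})
templates, benchmarks = zip(*(classify(p) for p in df["page"]))
df["template"] = templates
df["benchmark"] = benchmarks
df["ctr"] = df["clicks"] / df["impressions"]

# A page underperforms relative to its own template's benchmark,
# not a single site-wide threshold.
df["underperforming"] = df["ctr"] < df["benchmark"]
print(df[["page", "template", "ctr", "underperforming"]])
```

Here the article at 4% CTR clears its 3.5% benchmark, while the product page at 1.8% falls short of its 2.5% bar, exactly the template-relative judgment the text calls for.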

What mistakes should you avoid in analyzing crawl stats?

Don’t panic if your crawl drops by 30% after cleaning 10,000 zombie pages. It’s normal and even desirable. Google optimizes its resource allocation — less waste to crawl, more budget on your strategic pages.

Also, avoid confusing “crawled pages” with “indexed pages”. A page can be crawled 50 times a month and remain excluded from the index (duplicate, thin content, accidental noindex). Crawl stats will never tell you this; only the Page indexing report (formerly the Coverage report) reveals it.

How to prioritize optimizations on underperforming pages?

Start with pages that accumulate more than 5,000 impressions over 3 months and a CTR below 2%. These are your quick wins: they are already visible, it just takes making them clickable.

Test titles and meta descriptions first. Inject a clear user benefit, a number, a concrete promise. If the CTR doesn't move after 4 weeks, the problem runs deeper: the editorial angle doesn't match the search intent, or your snippet is overshadowed by featured snippets or People Also Ask boxes.

  • Export the filtered Performance Report for the last 16 months, segmented by page type.
  • Identify pages with >5,000 impressions and CTR <2% — these are your priorities.
  • Rewrite titles and meta descriptions with an explicit user benefit (number, promise, differentiation).
  • Check in the SERPs if rich elements (featured snippets, PAA) are cannibalizing your clicks — adjust your content accordingly.
  • Cross-reference Performance data with GA4 to detect pages with increasing impressions but stagnant clicks.
  • Ignore variations in crawl stats as long as impressions remain stable or increase.
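The prioritization in the checklist above can be sketched numerically: estimate how many extra clicks each page would gain if its CTR reached a target benchmark, then rank by that upside. The 2% target and 5,000-impression bar follow the text; the page data is invented:

```python
# Estimate the traffic upside of lifting a page's CTR to a target
# benchmark (2% here, an assumed threshold to adjust per query type).
def potential_extra_clicks(impressions: int, clicks: int,
                           target_ctr: float = 0.02) -> int:
    """Extra clicks if CTR rose to target_ctr (0 if already above it)."""
    if clicks / impressions >= target_ctr:
        return 0
    return round(impressions * target_ctr) - clicks

# Hypothetical pages: (impressions over 3 months, clicks).
pages = {
    "/fiche-produit-42": (9000, 90),        # CTR 1.0%
    "/blog/crawl-budget": (1200, 10),       # below the 5,000-impression bar
    "/categorie-chaussures": (30000, 360),  # CTR 1.2%
}

# Keep only pages above the visibility bar, then rank by upside.
priorities = sorted(
    ((url, potential_extra_clicks(imp, clk))
     for url, (imp, clk) in pages.items() if imp > 5000),
    key=lambda t: t[1], reverse=True,
)
print(priorities)
```

This is the "quick win" logic in numbers: the category page would gain roughly 240 clicks just by reaching a 2% CTR, with no ranking improvement needed.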
Optimizing CTR in Search Console is a powerful yet technical lever. Between template segmentation, intent analysis, and iterative snippet testing, it’s easy to get lost or misprioritize. If you're managing a large site or your internal resources are limited, engaging a specialized SEO agency can save you months by structuring a rigorous analysis methodology and focusing on the pages with the highest ROI.

❓ Frequently Asked Questions

Can crawl stats still be useful in a technical SEO audit?
Yes, to diagnose accessibility problems (server response times, 5xx errors, redirect loops) or to verify that new pages are being discovered. But they never measure content quality.
What is a good average CTR in Search Console for a well-optimized page?
It depends on position and query type. In positions 3-5, a CTR of 4-6% is decent for an informational query. Below 2%, there is a snippet or editorial-angle problem.
If my impressions rise but my CTR drops, is that a bad sign?
Not necessarily. It can mean Google is testing you on less qualified or more competitive queries. Watch the final conversion rate in GA4; that is what counts.
Should I prioritize pages with many impressions or those with a very poor CTR?
Pages with many impressions AND a poor CTR. Those are the ones where a small CTR gain (going from 1.5% to 3%) immediately generates additional traffic without needing to rank better.
How do I know whether crawl budget is really a problem on my site?
If you have fewer than 100,000 pages and a technically clean site, crawl budget is never your bottleneck. Focus on content quality and acquiring fresh backlinks.

