Official statement
Google states that crawl stats do not reflect the quality or performance of content. Only the Performance Report from Search Console, with the impressions/clicks ratio, can identify underperforming pages. Specifically: a page crawled 200 times a day but showing a CTR of 0.5% has a relevance issue — not a crawl issue.
What you need to understand
What is the fundamental difference between crawl stats and actual performance?
Crawl stats measure Googlebot's activity on your site: how many pages it visits, how often, how many KB it downloads. These metrics reflect the bot's technical appetite, not users' interest in your content.
The Performance Report, on the other hand, shows impressions (how often your pages appear in the SERPs) and clicks (how many users actually click). This is the only data that connects your content to actual user behavior.
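If you prefer to work outside the Search Console interface, the same impressions/clicks data is available through the Search Analytics API. Here is a minimal sketch using google-api-python-client; the property URL and date range are placeholders, and it assumes you already have authorized credentials.

```python
# Minimal sketch: pulling Performance Report data (clicks, impressions,
# CTR, position) per page through the Search Analytics API.
# Assumes google-api-python-client and already-authorized credentials;
# the property URL is a placeholder.
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # hypothetical property

def fetch_performance(credentials, start_date, end_date):
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": start_date,   # e.g. "2024-01-01"
        "endDate": end_date,       # e.g. "2024-04-30"
        "dimensions": ["page"],    # one row per URL
        "rowLimit": 25000,
    }
    response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    # Each row exposes clicks, impressions, ctr and position -- user-facing
    # signals that crawl stats never show.
    return response.get("rows", [])
```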
Why does this confusion between crawl and performance persist?
Many SEOs observe a misleading correlation: when traffic drops, they check crawl stats and sometimes see a decline in crawl budget. They conclude that Google is “neglecting” their site.
However, this reasoning reverses cause and effect. If Google crawls less, it is often because your pages generate fewer signals of interest (clicks, session duration, fresh backlinks). Crawl is a consequence, not a cause. Artificially inflating crawl (by multiplying sitemaps or triggering URL inspections) will never fix bland content.
How can you concretely identify underperforming content in Search Console?
Open the Performance Report, filter by page, and sort by decreasing impressions. Look for pages that accumulate thousands of impressions but have a CTR below 2%. That is your priority list.
These pages rank well enough to be seen, but their title, meta description, or editorial angle does not convince anyone to click. It's a clear signal: the content exists in Google's eyes, but it is dead to users.
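This triage is easy to automate. A minimal sketch that filters a CSV export of the Performance Report; the file name, column labels and the 1,000-impression floor are assumptions to adapt to your own export, and the 2% CTR cut-off is the heuristic above.

```python
# Minimal sketch: flagging "visible but dead" pages from a Performance
# Report CSV export. File name, column labels and the 1,000-impression
# floor are assumptions; the 2% CTR cut-off is the heuristic above.
import pandas as pd

df = pd.read_csv("Pages.csv")  # Search Console > Performance > Export > Pages
# The export stores CTR as a string like "1.2%"; normalize it to a float.
df["CTR"] = df["CTR"].str.rstrip("%").astype(float) / 100

underperformers = df[(df["Impressions"] >= 1000) & (df["CTR"] < 0.02)]
print(underperformers.sort_values("Impressions", ascending=False).head(20))
```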
- Crawl stats: measure the technical activity of Googlebot (pages crawled, KB downloaded, response times).
- Performance Report: measures actual visibility (impressions, clicks, CTR, average position) and reflects user interest.
- A low impressions/clicks ratio indicates an issue with editorial relevance or snippet attractiveness.
- Increasing crawl without improving content is a waste of time — it’s a symptom, not a solution.
SEO Expert opinion
Is this statement consistent with real-world observations?
Yes, and it clarifies a frequent confusion. Across thousands of audits, we observe that sites with a stable or even increasing crawl budget can lose 40% of their organic traffic, simply because their content no longer earns clicks.
Conversely, some sites see their crawl decrease after a technical overhaul (URL consolidation, removal of zombie pages) and gain traffic. Why? Because they focus quality signals on fewer pages. Google tracks user interest, not the amount of available HTML.
What nuances should be added to this recommendation?
Martin Splitt simplifies for a non-technical audience. In specific cases, crawl stats remain useful — but never alone.
For example: an e-commerce site with 500,000 items sees its crawl drop by 80% after a migration. The Performance Report also shows a decrease in impressions. Here, crawl stats confirm a technical accessibility issue (broken sitemap, blocking robots.txt, redirect loops). But the primary indicator remains impressions, not crawl.
[To be verified]: Google does not specify at what CTR threshold content officially becomes “underperforming”. In practice, a CTR below 2% in positions 3-7 is a red flag — but this threshold varies by query (brand vs informational) and industry.
In what cases does this rule not fully apply?
On very large sites (news, e-commerce, aggregators), a sudden drop in crawl can precede a decrease in impressions by a few weeks. Google cannot discover your new pages if it doesn’t crawl them.
But even in this case, the solution isn’t to “force crawl” — it’s to investigate why Googlebot is losing interest. Often, this is related to insufficient freshness signals (invisible publication dates, absence of lastmod in the sitemap, 90% duplicated content among products).
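On the freshness side, the fix is often as simple as exposing real lastmod dates. A minimal sketch that writes sitemap entries with lastmod; the URLs and dates are placeholders for your own pages.

```python
# Minimal sketch: writing sitemap entries with an explicit <lastmod> date
# so Googlebot gets a freshness signal it can act on. URLs and dates are
# placeholders.
from xml.sax.saxutils import escape

pages = [
    ("https://www.example.com/produit-123", "2024-05-02"),
    ("https://www.example.com/guide-achat-velo", "2024-04-18"),
]

entries = "".join(
    f"  <url>\n    <loc>{escape(url)}</loc>\n    <lastmod>{lastmod}</lastmod>\n  </url>\n"
    for url, lastmod in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```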
Practical impact and recommendations
What should you concretely do to leverage the Performance Report?
First, segment your pages by template type (product sheets, articles, categories). Each type has its own CTR benchmark. A product sheet in position 5 with a CTR of 1.8% is underperforming; an informational article in the same position with 3.2% is within the norm.
Next, export the data over 16 months (to smooth out seasonality) and cross-reference it with Google Analytics 4. Look for pages that generate increasing impressions but stagnant clicks: this often means Google is testing your pages on new queries, but your snippet isn't winning the click.
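That cross-period check is easy to script. A minimal sketch comparing two Performance Report exports, assuming both files share the same column labels; the file names, the "Top pages" column and the growth/stagnation thresholds are illustrative, not official.

```python
# Minimal sketch: comparing two date ranges to find pages whose impressions
# grow while clicks stagnate. File names, the "Top pages" column and the
# +30% / +5% thresholds are assumptions to adapt.
import pandas as pd

prev = pd.read_csv("pages_months_1_to_8.csv").set_index("Top pages")
curr = pd.read_csv("pages_months_9_to_16.csv").set_index("Top pages")

merged = prev.join(curr, lsuffix="_prev", rsuffix="_curr", how="inner")
growing_impressions = merged["Impressions_curr"] > merged["Impressions_prev"] * 1.3
stagnant_clicks = merged["Clicks_curr"] <= merged["Clicks_prev"] * 1.05

suspects = merged[growing_impressions & stagnant_clicks]
print(suspects.sort_values("Impressions_curr", ascending=False).head(20))
```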
What mistakes should you avoid in analyzing crawl stats?
Don’t panic if your crawl drops by 30% after cleaning 10,000 zombie pages. It’s normal and even desirable. Google optimizes its resource allocation — less waste to crawl, more budget on your strategic pages.
Also, avoid confusing “crawled pages” with “indexed pages”. A page can be crawled 50 times a month and remain excluded from the index (duplicate, thin content, accidental noindex). Crawl stats will never tell you this — only the Coverage Report (or Indexed Pages in the new interface) reveals it.
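If you want to verify index status at scale rather than URL by URL in the interface, the URL Inspection API exposes the same coverage state. A minimal sketch, assuming google-api-python-client and authorized credentials; check the field names against the API reference for your own responses.

```python
# Minimal sketch: checking whether a URL is actually indexed (not just
# crawled) via the URL Inspection API. Assumes google-api-python-client and
# authorized credentials; verify field names against the API reference.
from googleapiclient.discovery import build

def coverage_state(credentials, site_url, page_url):
    service = build("searchconsole", "v1", credentials=credentials)
    body = {"inspectionUrl": page_url, "siteUrl": site_url}
    result = service.urlInspection().index().inspect(body=body).execute()
    # Typical values: "Submitted and indexed", "Crawled - currently not
    # indexed", "Duplicate without user-selected canonical", ...
    return result["inspectionResult"]["indexStatusResult"]["coverageState"]
```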
How to prioritize optimizations on underperforming pages?
Start with pages that accumulate more than 5,000 impressions over 3 months with a CTR below 2%. These are your quick wins: they are already visible; you just need to make them clickable.
Test the titles and meta descriptions first. Inject a clear user benefit, a number, a concrete promise. If the CTR doesn't move after 4 weeks, the problem runs deeper: the editorial angle doesn't match the search intent, or your snippet is overshadowed by featured snippets and People Also Ask boxes.
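To rank those quick wins objectively, you can estimate the clicks each page leaves on the table: impressions multiplied by the gap between a benchmark CTR and the current CTR. A minimal sketch, assuming you have tagged each URL with a template column; the per-template benchmark CTRs are illustrative assumptions, not Google figures.

```python
# Minimal sketch: ranking quick wins by the clicks they leave on the table,
# i.e. impressions * (benchmark CTR - current CTR). The "template" column
# and the per-template benchmark CTRs are illustrative assumptions.
import pandas as pd

BENCHMARK_CTR = {"product": 0.025, "article": 0.035, "category": 0.020}

df = pd.read_csv("pages_last_3_months.csv")  # export enriched with a "template" column
df["CTR"] = df["CTR"].str.rstrip("%").astype(float) / 100

quick_wins = df[(df["Impressions"] > 5000) & (df["CTR"] < 0.02)].copy()
quick_wins["potential_clicks"] = quick_wins.apply(
    lambda row: row["Impressions"]
    * max(BENCHMARK_CTR.get(row["template"], 0.02) - row["CTR"], 0),
    axis=1,
)
print(quick_wins.sort_values("potential_clicks", ascending=False).head(20))
```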
- Export the filtered Performance Report for the last 16 months, segmented by page type.
- Identify pages with >5,000 impressions and CTR <2% — these are your priorities.
- Rewrite titles and meta descriptions with an explicit user benefit (number, promise, differentiation).
- Check in the SERPs if rich elements (featured snippets, PAA) are cannibalizing your clicks — adjust your content accordingly.
- Cross-reference Performance data with GA4 to detect pages with increasing impressions but stagnant clicks.
- Ignore variations in crawl stats as long as impressions remain stable or increase.
❓ Frequently Asked Questions
Can crawl stats still be useful in a technical SEO audit?
What is a good average CTR in Search Console for a well-optimized page?
If my impressions rise but my CTR drops, is that a problem?
Should I prioritize pages with many impressions or those with a very poor CTR?
How do I know whether crawl budget is really a problem on my site?
Source: Google Search Central video · duration 13 min · published on 09/09/2020