Official statement
Google Discover applies stricter quality criteria than classic organic search because users are not actively seeking out this content. The Google News guidelines serve as a benchmark for eligibility, and the algorithms filter more conservatively. A drop in visibility in Discover often indicates that your content no longer meets the high standards required, rather than a simple algorithm adjustment.
What you need to understand
Why does Discover impose different rules compared to organic search?
The fundamental difference lies in user intent. In classic search, the user formulates a precise query: they are actively looking for information. In Discover, Google pushes content to users who have not requested it.
This asymmetry creates a risk: showing mediocre content in Discover immediately degrades the user experience and erodes trust in the system. As a result, the algorithms are calibrated to favor caution. It’s better to show nothing than to display borderline content.
What does it really mean to "follow the Google News guidelines"?
Google News has strict eligibility criteria: demonstrated expertise, editorial transparency, writing quality, content freshness, absence of clickbait. These criteria serve as an entry filter for Discover.
Let's be honest: many sites eligible for organic search would never pass the Google News threshold. No clear "About" page? No identified author? Content rewritten from other sources without added value? You probably won’t cross the Discover threshold, even if you rank properly in traditional SEO.
How should a drop in visibility in Discover be interpreted?
Mueller is straightforward: a drop in Discover traffic is not an algorithmic bug, it's a quality signal. Unlike search fluctuations, where multiple factors can be at play (increased competition, cannibalization, SERP features), Discover operates more like an on/off switch.
If your content disappears, it means the systems have reassessed its perceived quality and decided it no longer deserves proactive distribution. The problem is that Google never specifies which criterion caused the verdict to change: you are navigating blind.
- Discover applies a stricter quality filter than organic search due to the lack of explicit user intent
- The Google News guidelines serve as a benchmark for eligibility, even for sites that are not officially listed on Google News
- A drop in visibility reflects a perceived quality degradation, not just a technical algorithm adjustment
- Editorial transparency and expertise are non-negotiable prerequisites for maintaining a sustainable presence
- The selection criteria remain opaque: Google never details which specific signal caused the exclusion
SEO Expert opinion
Is this statement consistent with field observations?
Yes, and it’s even one of the rare times where the official discourse aligns with the data. Sites that perform sustainably in Discover share common patterns: identified authors, clear editorial lines, regular updates, original content with a differentiating angle.
Conversely, pure affiliate sites, content aggregators, and SEO-optimized article farms without substance all face a glass ceiling in Discover. They can generate search traffic, but Discover systematically ignores them. The exact exclusion thresholds remain to be verified, but the pattern is clear.
What areas of uncertainty remain in this explanation?
Mueller remains vague on one crucial point: the relative weighting of the different quality criteria. Editorial expertise, user engagement, freshness, originality, content depth — all play a role, but it’s unclear which one carries the most weight.
A second opacity: the role of behavioral signals. If users quickly swipe past your content in Discover without engagement, does this gradually degrade your eligibility? Google never explicitly states this, but field correlations suggest that it does. And that’s where the problem lies: you can comply with all editorial guidelines and still lose Discover if your titles don’t generate clicks.
In what cases does this rule not apply?
Major news sites (established press, historical media) seem to enjoy increased tolerance. Even with sometimes mediocre content, they maintain a stable Discover presence — probably an effect of pre-established editorial trust.
Another exception: geographically hyper-targeted content or very specific niches may appear in Discover even without fully adhering to the News guidelines, simply because the algorithm lacks quality alternatives on these topics. But this is an edge case, not a scalable strategy.
Practical impact and recommendations
What should you check first to improve your Discover eligibility?
Start with a strict editorial audit. Each page must clearly display: author with bio and expertise, publication date, site’s editorial policy, contact information. These transparency signals are not optional — they form the foundation of trust that Google seeks before activating Discover.
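Such an audit can be partially automated. The sketch below, in Python, scans a page's HTML for the transparency signals listed above; the regex patterns and marker names are assumptions for illustration (class names, paths, and markup conventions vary by site), not an official Google checklist.

```python
import re

# Hypothetical transparency markers; patterns are illustrative assumptions.
TRANSPARENCY_CHECKS = {
    "author": re.compile(r'rel="author"|class="[^"]*author', re.I),
    "publication_date": re.compile(r'datePublished|<time\b', re.I),
    "about_page": re.compile(r'href="[^"]*/about', re.I),
    "contact": re.compile(r'href="(mailto:|[^"]*/contact)', re.I),
}

def audit_transparency(html: str) -> list[str]:
    """Return the transparency signals the page appears to lack."""
    return [name for name, pattern in TRANSPARENCY_CHECKS.items()
            if not pattern.search(html)]
```

Run it over a crawl of your templates: any signal that comes back missing on every article page points to a sitewide template fix rather than a per-article one.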
Next, scrutinize writing quality. Content that is too short (under 600 words)? Rewrites with no added value? Clickbait titles? Overwhelming ads? Each of these signals degrades your implicit quality score and can gradually exclude you.
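These checks lend themselves to simple heuristics. A minimal sketch, assuming the 600-word floor mentioned above; the clickbait tests and thresholds are illustrative guesses to tune, not Google's actual cutoffs.

```python
def quality_flags(title: str, body: str, min_words: int = 600) -> list[str]:
    """Flag rough content-quality issues; thresholds are illustrative."""
    flags = []
    if len(body.split()) < min_words:
        flags.append("too_short")
    # Naive clickbait heuristics: all-caps titles, stacked exclamation
    # marks, or stock curiosity-gap phrasing.
    lowered = title.lower()
    if title.isupper() or title.count("!") >= 2 or "you won't believe" in lowered:
        flags.append("clickbait_title")
    return flags
```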
How to diagnose a drop in Discover traffic?
Google Search Console provides separate Discover data: use it. Check whether the drop is gradual or sudden. A gradual decline suggests quality erosion (content that ages poorly, declining engagement). A sudden drop is more likely an algorithm change or a manual action.
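The gradual-versus-sudden distinction can be sketched as a week-over-week comparison on the daily clicks exported from the Discover report. The ratios below (50% for "sudden", 10% for "gradual") are assumptions to calibrate against your own baseline, not documented thresholds.

```python
def classify_drop(daily_clicks: list[float], window: int = 7,
                  sudden_ratio: float = 0.5) -> str:
    """Compare the last `window` days to the preceding `window` days.

    Heuristic sketch: a drop exceeding `sudden_ratio` between adjacent
    weekly averages is 'sudden'; a smaller but real decline is 'gradual';
    otherwise 'stable'.
    """
    recent = sum(daily_clicks[-window:]) / window
    previous = sum(daily_clicks[-2 * window:-window]) / window
    if previous == 0:
        return "stable"
    change = (recent - previous) / previous
    if change <= -sudden_ratio:
        return "sudden"
    if change < -0.1:
        return "gradual"
    return "stable"
```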
Cross-reference with your engagement metrics: time on page, bounce rate, pages viewed per session. If these indicators degrade simultaneously in Discover, your content is no longer delivering what users expect — and the algorithm has detected it.
What mistakes should you absolutely avoid?
Don't fall into the volume-over-quality trap. Publishing 10 mediocre articles per day will never earn you Discover visibility; on the contrary, it dilutes your perceived editorial authority. Two or three solid pieces per week beat a continuous flow of superficial content.
Another classic mistake: copying titles that perform well in Discover without understanding why they work. A catchy title for disappointing content generates quick negative engagement — and Google picks this up immediately. You are burning your algorithmic credit for a fleeting tactical gain.
- Complete audit of editorial transparency: identified authors, "About" pages, contact details, visible editorial policy
- Verification of writing quality: sufficient length (600+ words), original angle, depth of analysis, cited sources
- Optimization of Core Web Vitals and mobile experience: Discover is primarily mobile, technical performance counts double
- Analysis of engagement metrics in GSC Discover: identify underperforming content and understand why
- Review of title strategy: balancing attractiveness and deliverability of the editorial promise
- Reduction of advertising aggressiveness: too many ads degrade the experience and signal a low-quality site to the algorithms
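The engagement-analysis item in the checklist above can be sketched in code. A minimal illustration, assuming rows exported from Search Console's Discover performance report as (page, clicks, impressions) tuples; the 3% CTR floor is an arbitrary illustrative threshold, not a Google benchmark.

```python
def underperformers(rows, ctr_floor: float = 0.03):
    """Return (page, ctr) pairs below an illustrative CTR floor, sorted by URL.

    `rows` is assumed to be (page, clicks, impressions) tuples from a
    GSC Discover export.
    """
    return sorted(
        (page, clicks / impressions)
        for page, clicks, impressions in rows
        if impressions > 0 and clicks / impressions < ctr_floor
    )
```

Pages that surface here are candidates for the title-strategy review: their headlines earn impressions but fail to convert them into clicks.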
❓ Frequently Asked Questions
Is Google Discover open to all sites, or reserved for media outlets?
Do you need to register with Google News to be eligible for Discover?
Does a Discover drop affect ranking in classic organic search?
How can you tell whether your site is eligible for Discover before it appears there?
Can evergreen content perform in Discover, or should you prioritize news?
Source: Google Search Central video · duration 57 min · published on 23/06/2020