Official statement
Other statements from this video
- 1:22 Is it true that Google delays mobile-first migration for some sites?
- 3:10 Does mobile-first indexing really improve your ranking in Google?
- 5:13 Should you really prioritize every Search Console issue as a crisis?
- 7:07 Do you really need to optimize internal link anchors, or is it a waste of time?
- 8:42 Should you really avoid having multiple pages for the same keyword?
- 9:58 Can you really prove the editorial quality of your content to Google with structured data tags?
- 11:33 Do you really need to stick to the supported page types for the reviewed-by schema?
- 14:02 Is Google really tolerant of technical cloaking?
- 19:36 How does Google group your URLs to prioritize crawling?
- 22:04 Why does your traffic really drop after a publishing break?
- 26:31 Does unsupported structured data really affect ranking?
- 28:37 Do technical errors on a main domain really penalize its subdomains?
- 30:44 Why do your review snippets seem to disappear and then reappear every week?
- 32:16 Is domain authority really useless for your SEO strategy?
- 32:16 Are manually posted backlinks in forums and comments really useless for SEO?
- 34:55 Why aren't all your Disqus comments indexed in the same way?
- 44:52 Is Google really confusing your local pages with duplicates because of URL patterns?
- 48:00 Why do 404 redirects to the homepage destroy crawl budget?
- 50:51 Should you really use unavailable_after to manage past events on your site?
- 50:51 Why does your massive no-index take 6 months to a year to be processed by Google?
- 55:39 Do flat URLs really hinder Google's understanding?
Google Discover applies stricter quality criteria than classic organic search because users are not actively seeking out this content. The Google News guidelines serve as a benchmark for eligibility, and the algorithms filter more conservatively. A drop in visibility in Discover often indicates that your content no longer meets the high standards required, rather than a simple algorithm adjustment.
What you need to understand
Why does Discover impose different rules compared to organic search?
The fundamental difference lies in the user intent. In classic search, the user formulates a precise query — they are actively looking for information. In Discover, Google pushes content to users who have not requested it.
This asymmetry creates a risk: showing mediocre content in Discover immediately degrades the user experience and erodes trust in the system. As a result, the algorithms are calibrated to favor caution. It’s better to show nothing than to display borderline content.
What does it really mean to "follow the Google News guidelines"?
Google News has strict eligibility criteria: demonstrated expertise, editorial transparency, writing quality, content freshness, absence of clickbait. These criteria serve as an entry filter for Discover.
Let's be honest: many sites eligible for organic search would never pass the Google News threshold. No clear "About" page? No identified author? Content rewritten from other sources without added value? You probably won’t cross the Discover threshold, even if you rank properly in traditional SEO.
How should a drop in visibility in Discover be interpreted?
Mueller is straightforward: a drop in Discover traffic is not an algorithmic bug, it’s a quality signal. Unlike search fluctuations where multiple factors can play (increased competition, cannibalization, SERP features), Discover operates more like an on/off switch.
If your content disappears, it means the systems have reassessed its perceived quality level and decided it no longer deserves proactive distribution. The problem is that Google never specifies which criterion caused the verdict to change: you are navigating blindly.
- Discover applies a stricter quality filter than organic search due to the lack of explicit user intent
- The Google News guidelines serve as a benchmark for eligibility, even for sites that are not officially listed on Google News
- A drop in visibility reflects a perceived quality degradation, not just a technical algorithm adjustment
- Editorial transparency and expertise are non-negotiable prerequisites for maintaining a sustainable presence
- The selection criteria remain opaque: Google never details which specific signal caused the exclusion
SEO Expert opinion
Is this statement consistent with field observations?
Yes, and it’s even one of the rare times where the official discourse aligns with the data. Sites that perform sustainably in Discover share common patterns: identified authors, clear editorial lines, regular updates, original content with a differentiating angle.
Conversely, pure affiliate sites, content aggregators, SEO-optimized article farms without substance — all face a glass ceiling in Discover. They can generate search traffic, but Discover systematically ignores them. [To be verified] regarding the exact exclusion thresholds, but the pattern is clear.
What areas of uncertainty remain in this explanation?
Mueller remains vague on one crucial point: the relative weighting of the different quality criteria. Editorial expertise, user engagement, freshness, originality, content depth — all play a role, but it’s unclear which one carries the most weight.
A second opacity: the role of behavioral signals. If users quickly swipe past your content in Discover without engagement, does this gradually degrade your eligibility? Google never explicitly states this, but field correlations suggest that it does. And that’s where the problem lies: you can comply with all editorial guidelines and still lose Discover if your titles don’t generate clicks.
In what cases does this rule not apply?
Major news sites (established press, historical media) seem to enjoy increased tolerance. Even with sometimes mediocre content, they maintain a stable Discover presence — probably an effect of pre-established editorial trust.
Another exception: geographically hyper-targeted content or very specific niches may appear in Discover even without fully adhering to the News guidelines, simply because the algorithm lacks quality alternatives on these topics. But this is an edge case, not a scalable strategy.
Practical impact and recommendations
What should you check first to improve your Discover eligibility?
Start with a strict editorial audit. Each page must clearly display: author with bio and expertise, publication date, site’s editorial policy, contact information. These transparency signals are not optional — they form the foundation of trust that Google seeks before activating Discover.
Next, scrutinize writing quality. Content that is too short (under 600 words)? Rewrites with no added value? Clickbait titles? Overwhelming ads? Each of these signals degrades your implicit quality score and can gradually exclude you.
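The audit above can be turned into a first-pass automated check. Below is a minimal sketch, assuming you already have each article's HTML as a string; the markers it looks for (`rel="author"`, a `byline` class, `<time datetime=...>`) and the 600-word threshold are illustrative choices taken from common practice and from this article's checklist, not documented Discover criteria.

```python
import re

def audit_article(html: str) -> dict:
    """Rough transparency/quality check for a single article's HTML.

    All patterns and thresholds are illustrative, not official criteria.
    """
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping for word count
    word_count = len(text.split())
    return {
        "has_author": bool(re.search(r'rel="author"|class="byline"', html)),
        "has_pub_date": bool(re.search(r"<time[^>]*datetime=", html)),
        "long_enough": word_count >= 600,  # threshold from the audit above
        "word_count": word_count,
    }
```

A failing check does not prove Discover ineligibility; it simply flags pages missing the transparency signals discussed above, so a human can review them first.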
How to diagnose a drop in Discover traffic?
Google Search Console provides separate Discover data; use it. Check whether the drop is gradual or sudden. A gradual decline suggests quality erosion (content aging poorly, declining engagement). A sudden drop more likely indicates an algorithm change or a manual penalty.
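Beyond the GSC interface, the Search Console Search Analytics API exposes the same Discover data through its `type` parameter. A sketch of the request body you would send to `searchanalytics.query` (the field names follow the API; the dimensions and row limit are example choices, and the authenticated client call itself is omitted):

```python
def build_discover_query(start_date: str, end_date: str,
                         row_limit: int = 1000) -> dict:
    """Build a Search Analytics query body restricted to Discover traffic.

    Dates are "YYYY-MM-DD" strings. Sending this body requires an
    authenticated client (e.g. google-api-python-client), not shown here.
    """
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["date", "page"],  # daily series per URL
        "type": "discover",              # restrict results to Google Discover
        "rowLimit": row_limit,
    }
```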
Cross-reference with your engagement metrics: time on page, bounce rate, pages viewed per session. If these indicators degrade simultaneously in Discover, your content is no longer delivering what users expect — and the algorithm has detected it.
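The gradual-versus-sudden distinction can be sketched as a simple heuristic over a daily clicks series exported from GSC. The 50% day-over-day cutoff, the 7-day windows, and the 70% erosion threshold below are arbitrary illustration values, not Google thresholds:

```python
def classify_drop(daily_clicks: list[int]) -> str:
    """Classify a Discover traffic drop from a daily clicks series.

    Thresholds are illustrative; tune them against your own history.
    """
    if len(daily_clicks) < 14:
        return "not enough data"
    # Sudden: one day collapses to less than half of the previous day.
    for prev, curr in zip(daily_clicks, daily_clicks[1:]):
        if prev >= 10 and curr < prev * 0.5:
            return "sudden"    # points to an algorithm change or manual action
    # Gradual: last week's average erodes well below the first week's.
    first_week = sum(daily_clicks[:7]) / 7
    last_week = sum(daily_clicks[-7:]) / 7
    if last_week < first_week * 0.7:
        return "gradual"       # slow erosion: more likely a quality signal
    return "stable"
```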
What mistakes should you absolutely avoid?
Don’t fall into the volume-over-quality trap. Publishing 10 mediocre articles per day will never earn you Discover visibility; on the contrary, it dilutes your perceived editorial authority. Better 2-3 solid pieces per week than a continuous flow of superficial content.
Another classic mistake: copying titles that perform well in Discover without understanding why they work. A catchy title for disappointing content generates quick negative engagement — and Google picks this up immediately. You are burning your algorithmic credit for a fleeting tactical gain.
- Complete audit of editorial transparency: identified authors, "About" pages, contact details, visible editorial policy
- Verification of writing quality: sufficient length (600+ words), original angle, depth of analysis, cited sources
- Optimization of Core Web Vitals and mobile experience: Discover is primarily mobile, technical performance counts double
- Analysis of engagement metrics in GSC Discover: identify underperforming content and understand why
- Review of title strategy: balancing attractiveness with delivering on the editorial promise
- Reduction of advertising aggressiveness: too many ads degrade the experience and signal a low-quality site to the algorithms
❓ Frequently Asked Questions
Is Google Discover accessible to all sites, or reserved for media outlets?
Do you need to sign up for Google News to be eligible for Discover?
Does a drop in Discover affect ranking in classic organic search?
How can you tell whether your site is eligible for Discover before appearing in it?
Can evergreen content perform in Discover, or should you prioritize news?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 57 min · published on 23/06/2020
🎥 Watch the full video on YouTube →