
Official statement

After a drop due to a core update, it is relevant to analyze the pages that now rank better to identify trends (e.g., informational vs commercial content). This helps to understand what users are looking for, but you should not copy blindly: the aim is to adapt while maintaining your commercial strategy.
🎥 Source video

Extracted from a Google Search Central video

⏱ 38:05 💬 EN 📅 14/09/2020 ✂ 15 statements
Watch on YouTube (4:43) →
Other statements from this video (14)
  1. 1:36 Do you really have to wait for the next core update to recover your lost traffic?
  2. 3:08 Do core updates really recalculate your scores continuously between two rollouts?
  3. 8:55 Why does Google want to remove the "crawl anomaly" category from Search Console?
  4. 11:09 Do you really need to implement both the Merchant Center feed AND product structured data?
  5. 13:14 Why can cleaning up your artificial backlinks cause your Google rankings to drop?
  6. 15:18 Does page speed really have so little impact on Google rankings?
  7. 15:50 Can changing your WordPress theme really kill your organic rankings?
  8. 17:17 Should you really prefer a 410 over a 404 to deindex a page quickly?
  9. 18:59 Why is your site migration stuck in "pending" in Search Console?
  10. 23:10 Does Google really ignore your tracking scripts during rendering?
  11. 24:15 Should you really limit text content on your e-commerce category pages?
  12. 28:32 Is footer content really treated like regular content by Google?
  13. 31:36 Is keyword repetition in product listings finally allowed by Google?
  14. 33:12 How does Google actually deindex an expired site or one returning 404 site-wide?
📅 Official statement from 14/09/2020 (5 years ago)
TL;DR

After a drop following a core update, analyzing the pages that are gaining traction often reveals trends about user expectations (informational vs commercial content). This analysis helps adjust your editorial strategy without abandoning your business line. The goal: to understand the dominant search intent, not to blindly clone competitors.

What you need to understand

Why Analyze Competitors After a Core Update?

Google's core updates reshuffle the ranking landscape without targeting a single criterion. When your site loses positions, the first question should not be "What did I do wrong?" but rather "What is the algorithm now valuing for this query?".

Comparing rising pages with those that stagnate or decline brings out structural trends. You identify whether Google is now favoring content that is more informational, more transactional, longer, more visual, or more technical. Specifically? If your climbing competitors are all betting on comprehensive guides with diagrams and FAQs while you stick to minimalist product sheets, that's a signal.

What’s the Difference Between Informational and Commercial Content in This Context?

Informational content answers a question, educates, and guides the user without selling directly. Commercial content drives immediate purchase (product sheet, comparison, promotional landing page). Google constantly adjusts the mix between these two intents for each query.

After a core update, a query previously dominated by product sheets can shift towards detailed blog articles. This reflects a reevaluation of user intent by the algorithm. If you remain stuck in your commercial approach while the SERP evolves towards informational, you lose ground. The opposite is equally true.

What Does "Not Copying Blindly" Mean?

Analyzing does not mean plagiarizing. If your competitors all publish a 5000-word guide, it doesn’t mean you should do the same word for word. The trap: reproducing their format without understanding the substance. You risk publishing generic content, without added value, that will not rank better.

The idea is to identify the structure, tone, and depth that now seem to be expected, then adapt them to your expertise and commercial strategy. If you sell gardening tools, you can add practical guides that steer readers toward your products, without turning your site into a cold catalog or a blog that never converts.

  • Spot trends: length, format (video, text, diagram), angle (informational, comparative, transactional).
  • Compare intent: Do the rising pages answer a different question than yours? (A rough check is sketched after this list.)
  • Adapt without renouncing: Keep your commercial positioning but adjust the content to the dominant user expectation.
  • Measure the gap: If you're at 800 words and they're at 3000, dig into why — not how many.
  • Test and iterate: a core update is never the last one. What works today may change in 6 months.
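
For the "compare intent" step, a quick first pass can be scripted. The sketch below is a minimal, illustrative example under stated assumptions: the marker word lists are hypothetical and must be adapted to your market and language, and it only classifies the titles you export from your rank tracker, not full pages.

```python
# Rough intent check for one SERP: classify the top-10 titles as
# informational or commercial based on marker words. The marker lists
# are illustrative assumptions; adjust them to your niche and language.
from collections import Counter

INFORMATIONAL = ("how", "what", "why", "guide", "tutorial", "tips")
COMMERCIAL = ("buy", "price", "best", "review", "deal", "cheap")

def classify_title(title: str) -> str:
    t = title.lower()
    if any(word in t for word in COMMERCIAL):
        return "commercial"
    if any(word in t for word in INFORMATIONAL):
        return "informational"
    return "unclear"

def intent_mix(titles: list[str]) -> Counter:
    """Count how many top-10 titles lean informational vs commercial."""
    return Counter(classify_title(t) for t in titles)

# Titles exported from a rank tracker for one impacted query (example data)
titles = [
    "How to choose a garden pruner: the complete guide",
    "Best garden pruners 2020 - reviews and prices",
    "Garden pruner - buy online",
]
print(intent_mix(titles))  # Counter({'commercial': 2, 'informational': 1})
```

If the mix clearly tilts toward one intent after the update, that is the thread to pull; if it is evenly split, lean on the deeper page-level analysis described further down.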

SEO Expert opinion

Does This Recommendation Really Change the Game?

Honestly? No, it’s not a revelation. Any seasoned SEO knows that analyzing the SERP after an update is part of the basic diagnosis. What Mueller confirms here is mostly a method that many already apply — but that he publicly formalizes. The real takeaway is the explicit reminder not to fall into blind mimicry.

The problem is that this statement remains extremely vague about the criteria to observe. Which metric should be prioritized? Content length? Semantic density? Number of media? Depth of subheadings? No concrete answer. [To be verified]: we still don't know if Google has internal thresholds for these criteria or if everything relies on post-click engagement signals.

In What Cases Does This Approach Fail?

If your competitors rising after the update benefit from a massive link profile, a domain authority ten times higher than yours, or a solid brand-search history, analyzing their content won't help you. You can produce the same content, but you won't rank the same. The gap in trust and authority overshadows everything.

Another failure case: volatile SERPs where Google tests multiple page profiles without stabilizing its choice. Analyzing at one moment doesn't guarantee anything. If you adjust your content based on Monday’s SERP, and by Wednesday everything has shifted, you’ve wasted time. This is particularly true for YMYL queries or emerging niches where the algorithm is still experimenting.

What Nuance Should Be Added to This Advice?

Mueller says "identify trends", but omits a crucial point: distinguishing correlation from causation. If 8 out of 10 rising competitors have long content, it doesn't mean length is the factor. Perhaps they have also restructured their internal links, strengthened their backlink profiles, refreshed their data, or added FAQ schema. You may miss the real lever by focusing on a visible detail.

Specifically? Don't settle for surface-level analysis. Use tools to compare backlink profiles, UX signals, Core Web Vitals, update frequency. Visible content is only part of the equation. If you ignore the rest, you risk making noise without results.

Caution: This approach assumes that Google has indeed adjusted the search intent for your query. In some cases, the drop after a core update comes from a technical issue (indexing, crawling, canonicals) or a hidden algorithmic penalty (thin content, spammy links). Analyzing competitors won't solve anything if the problem is structural on your end.

Practical impact and recommendations

How to Identify Dominant Trends Concretely?

Start by scraping the top 10 for queries where you've lost positions. Compare the average length, number of media, Hn structure, presence of schema markup, recurring sections (FAQ, comparisons, step guides). Use tools like Screaming Frog, SurferSEO, or Clearscope to automate this analysis.

Then, identify the dominant formats: does the SERP lean towards long guides, numbered lists, integrated videos, or infographics? If 7 out of 10 pages include an explanatory video, that's a strong signal. If 8 out of 10 have a structured FAQ in schema, likewise. These recurring patterns are not coincidences.
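
As a rough sketch of that audit (assuming the top-10 URLs come from a rank-tracking export or a manual copy of the SERP, since scraping Google results directly is against its terms of service), the Python snippet below fetches each page and counts words, H2/H3 headings, images, video embeds, and FAQ schema, then reports how many pages share each pattern. Dedicated tools like Screaming Frog or SurferSEO do this far more robustly; this only illustrates the logic.

```python
# Minimal competitor-page audit: length, Hn structure, media, FAQ schema.
# Assumes you already collected the top-10 URLs for the query; this is an
# illustrative sketch, not a production crawler (no retries, no politeness).
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> dict:
    html = requests.get(url, timeout=10, headers={"User-Agent": "seo-audit-sketch"}).text
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(" ", strip=True)  # rough extraction: includes nav/footer text
    json_ld = " ".join(tag.get_text() for tag in soup.find_all("script", type="application/ld+json"))
    return {
        "url": url,
        "words": len(text.split()),
        "h2": len(soup.find_all("h2")),
        "h3": len(soup.find_all("h3")),
        "images": len(soup.find_all("img")),
        "videos": len(soup.find_all(["video", "iframe"])),
        "faq_schema": "FAQPage" in json_ld,
    }

def summarize(urls: list[str]) -> None:
    pages = [audit_page(u) for u in urls]
    print(f"Average length: {sum(p['words'] for p in pages) / len(pages):.0f} words")
    print(f"Pages with a video embed: {sum(p['videos'] > 0 for p in pages)}/{len(pages)}")
    print(f"Pages with FAQ schema: {sum(p['faq_schema'] for p in pages)}/{len(pages)}")

# summarize(["https://competitor-1.example/guide", "https://competitor-2.example/guide"])
```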

What Mistakes to Avoid in This Process?

The biggest mistake: copying the form without the substance. You see a competitor at 3000 words, you tell yourself “I’ll make 3500”. The result: filler, repetition, zero added value. Google does not reward length for the sake of length. It values depth, comprehensiveness, clarity. A well-structured 1500-word piece can outperform a diluted 5000-word tome.

Another trap: neglecting your own positioning. If you're an e-commerce player, turning all your product sheets into blog articles to align with the informational SERP can kill your conversion. The goal is not to become a clone of Wikipedia, but to enrich your commercial content with relevant informational elements. A buying guide, a well-thought-out FAQ, a "how to use" section can suffice.

What to Do If the Analysis Reveals No Clear Trend?

Let’s be honest: sometimes there’s no obvious pattern. The top 10 mixes short and long formats, informational and commercial, recent and old. In this case, it means Google is still hesitant about the dominant intent — or that other criteria (authority, backlinks, UX) weigh more heavily than the visible content.

At this point, refocus your analysis on technical and off-page signals. Check the Core Web Vitals, loading speed, internal linking, and content freshness. Compare backlink profiles. If your rising competitors have all received recent links from authoritative sites, the problem may not be your content but your link-building strategy.
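
For the Core Web Vitals part of that check, one option is the public PageSpeed Insights API, which returns CrUX field data per URL. The snippet below is a minimal sketch: the API key is a placeholder, and the exact response fields should be verified against the current API documentation before relying on them.

```python
# Sketch: pull field Core Web Vitals for your page and a rising competitor
# via the PageSpeed Insights API. A Google API key is needed for more than
# a handful of requests; field names below should be checked against the docs.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def field_cwv(url: str, api_key: str, strategy: str = "mobile") -> dict:
    params = {"url": url, "strategy": strategy, "key": api_key}
    data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    # Each field metric exposes a percentile and a category (FAST / AVERAGE / SLOW)
    return {name: (m.get("percentile"), m.get("category")) for name, m in metrics.items()}

# for page in ["https://www.example.com/your-page", "https://competitor.example/rising-page"]:
#     print(page, field_cwv(page, api_key="YOUR_API_KEY"))
```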

  • Scrape the top 10 for each impacted query and compare length, Hn structure, media, schema markup.
  • Identify recurring formats: long guides, lists, videos, FAQs, comparisons.
  • Analyze the dominant intent: informational, transactional, navigational, mixed.
  • Compare backlink profiles and domain authority to avoid false leads.
  • Adapt content without renouncing your commercial strategy: enrich, not radically transform.
  • Test changes on a portion of the site before generalizing.
Analyzing competitors after a core update is a healthy reflex, as long as you don't fall into mimicry. The objective: to understand what Google now values for this query, then adjust your content while maintaining your editorial and commercial line. These optimizations can be complex to implement alone, especially when they touch on content, technical structure, and link-building. Engaging a specialized SEO agency can thus be relevant for personalized support and strategic adjustments tailored to your context.

❓ Frequently Asked Questions

How long should you wait after a core update before analyzing competitors?
At least 2 weeks, long enough for rankings to stabilize. The first days after an update are often volatile, with daily fluctuations. Analyzing too early means risking decisions based on unreliable data.
Should I analyze only direct competitors or the whole top 10?
The whole top 10, even if some sites are not direct commercial competitors. The goal is to understand what Google values for this query, not just to monitor your business rivals. A blog that is climbing can teach you as much as an e-commerce site.
If my content is already longer and more comprehensive than my competitors', why am I still losing ground?
Because length is only one criterion among many. Google may favor freshness, domain authority, UX signals (bounce rate, time on page), the backlink profile, or a better match with search intent. Long but poorly structured or diluted content does not necessarily win.
Can you rely solely on analyzing visible content to understand a drop?
No. Visible content is only part of the equation. Technical factors (speed, crawling, indexing), domain authority, link building, and UX signals also play a major role. A complete analysis must cover all of these areas.
How many queries should you analyze to draw reliable conclusions?
At least 10 to 15 queries representative of your traffic. Analyzing a single query risks giving you a biased view. The wider the sample, the more you spot cross-cutting trends rather than isolated anomalies.
🏷 Related Topics
Algorithms · Domain Age & History · Content · AI & SEO

