
Official statement

All data in Search Console is retained for 16 months. Data accumulates from the moment the site is registered in Search Console.
🎥 Source video

Extracted from a Google Search Central video

💬 EN 📅 12/01/2022 ✂ 10 statements
Watch on YouTube (2:05) →
Other statements from this video (9)
  1. 0:38 How can Google Search Console actually boost your organic traffic?
  2. 0:56 Search Console and Analytics: two tools for which distinct SEO data?
  3. 2:05 Should you really align Search Console queries with your target keywords?
  4. 2:05 Why does Google recommend analyzing image search and web search separately?
  5. 6:00 How can you check that your pages are actually indexed by Google?
  6. 6:18 Should you really index every page of your site?
  7. 8:54 Do rich results really increase visibility in search results?
  8. 8:54 Does page experience really play a decisive role in Google rankings?
  9. 9:20 Why does Google recommend checking the index coverage report first?
TL;DR

Google retains all data in Search Console for a maximum of 16 months, starting from the registration of the site in the tool. After this period, historical data is permanently deleted. This limitation poses a significant constraint for long-term trend analysis and seasonal pattern detection.

What you need to understand

Why does this 16-month limit exist?

Google does not publicly justify this 16-month retention window, but it likely stems from a technical trade-off between storage costs and user needs. For a search engine handling billions of queries daily, retaining detailed histories for every registered property represents a colossal volume of data.

This duration, however, allows for analyzing complete annual trends and comparing two identical seasonal periods — a vital minimum for identifying recurring patterns. But it blocks any multi-year analysis without prior exports.

When exactly does the countdown start?

The countdown begins the moment you register your property in Search Console, not when Google discovers the site. In other words, a site launched in 2018 but added to Search Console only now will not have any historical data prior to its registration.

Data accumulates gradually. A site added today will have only a few days of data tomorrow, then a few weeks, until it reaches the 16-month ceiling. Once that ceiling is reached, the window slides: the oldest records disappear as new ones arrive.
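This sliding window can be sketched in a few lines of Python. The helper names and dates below are illustrative, not part of any Google tooling:

```python
from datetime import date

RETENTION_MONTHS = 16

def months_ago(today: date, months: int) -> date:
    """Same day-of-month `months` earlier (day clamped to 28 to stay valid)."""
    y, m = divmod((today.year * 12 + today.month - 1) - months, 12)
    return date(y, m + 1, min(today.day, 28))

def available_window(registered: date, today: date) -> tuple[date, date]:
    """Oldest and newest dates for which Search Console data exists.

    Data starts at registration (never earlier), and once the property is
    more than 16 months old the oldest edge slides forward every day.
    """
    retention_floor = months_ago(today, RETENTION_MONTHS)
    return max(registered, retention_floor), today

# A site registered in 2018, viewed in mid-2025: only the last 16 months remain.
start, end = available_window(date(2018, 3, 1), date(2025, 6, 15))
print(start)  # 2024-02-15
```

For a property younger than 16 months, the registration date wins and the full history is still available.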

What happens to the data after 16 months?

It is permanently deleted. No accessible archiving, no recovery possible. If you haven't exported your reports before this deadline, you lose all trace of these periods.

  • Data disappears in a sliding manner: every day, records older than 16 months are erased
  • No way to recover deleted data, even by contacting Google support
  • The Search Console API adheres to the same time limit as the web interface
  • Custom filters and segments do not allow bypassing this window
  • Deleting and then re-adding a property does not reset the lost history

SEO Expert opinion

Does this limit pose real problems in practice?

Yes, and not just a little. For any site with marked seasonality or long cycles (real estate, tourism, recruitment), 16 months is often not enough to identify robust trends. Comparing three consecutive years of summer traffic? Impossible without prior exports.

The window is especially limiting for sites that have undergone major technical changes (redesign, HTTPS migration, domain change). Analyzing the long-term impact of an algorithm penalty or a fix often requires 18-24 months of hindsight. With Search Console alone, that perspective disappears.

Practically? Professionals managing sites over several years must systematically export their data or use third-party tools (SEMrush, Ahrefs, Sistrix) that store histories. [To be verified]: some tools claim to retain more than 16 months through their own crawl, but their granularity never matches Search Console for actual queries.

Could Google extend this retention period?

Technically, yes. Financially, it’s debatable. But there’s nothing to indicate that Google is considering extending this window. The company has never communicated on this point, and the limit has remained stable for years.

The most likely scenario? Google considers 16 months sufficient for the majority of use cases. If you need more, it’s up to you to implement your own storage solutions. A pragmatic stance from Google, frustrating for users.

How does this rule apply to different types of properties?

The 16-month limit applies uniformly: domain properties, URL prefixes, AMP sites, all treated the same. No preferential treatment for large sites or long-verified accounts.

A classic trap: some believe that by combining several types of properties (domain + subdomain + prefix), they extend their history. False. Each property has its own 16-month countdown, but none retains more than this duration.

Warning: If you delete a property and then re-add it, you start from scratch. The history is not restored, even if the site was previously registered. This deletion is irreversible.

Practical impact and recommendations

What should you implement right now?

The first urgency: automate the regular export of your Search Console data before it disappears. The Search Console API allows you to retrieve performance reports and store them in your own databases or spreadsheets.

For critical sites, set up a monthly script that extracts and archives data on clicks, impressions, CTR, and positions. Google Sheets with Apps Script does the job for modest volumes. For larger sites, a SQL database or BigQuery becomes necessary.
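A minimal sketch of such a monthly extraction, assuming the official google-api-python-client library. The query-building helpers are plain Python; the API call itself is shown commented out because it needs OAuth credentials and a verified property, and the site URL and `archive` function are placeholders:

```python
from datetime import date, timedelta

def month_range(year: int, month: int) -> tuple[str, str]:
    """First and last day of a month, in the YYYY-MM-DD format the API expects."""
    first = date(year, month, 1)
    next_first = date(year + month // 12, month % 12 + 1, 1)
    return first.isoformat(), (next_first - timedelta(days=1)).isoformat()

def build_query(year: int, month: int) -> dict:
    """Request body for searchanalytics.query: clicks, impressions, CTR and
    position come back per row when broken down by query and page."""
    start, end = month_range(year, month)
    return {
        "startDate": start,
        "endDate": end,
        "dimensions": ["query", "page"],
        "rowLimit": 25000,  # API maximum per request; paginate with startRow
    }

# The call itself (sketch, assuming credentials are already configured):
# from googleapiclient.discovery import build
# service = build("searchconsole", "v1", credentials=creds)
# response = service.searchanalytics().query(
#     siteUrl="https://example.com/", body=build_query(2024, 12)
# ).execute()
# archive(response.get("rows", []))  # store in Sheets, SQL or BigQuery
```

Run monthly (cron, Cloud Scheduler, or similar) and each pass archives a month that would otherwise eventually fall out of the 16-month window.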

What mistakes should you absolutely avoid?

Don’t rely on sporadic manual exports. You’ll forget; it’s inevitable. And when you realize you need 18-month-old data to analyze the impact of a Google update, it will be too late.

Another trap: believing Google Analytics compensates. GA and Search Console do not measure the same things. Search Console captures pre-click queries, impressions, average positions — metrics not found in Analytics. The two tools are complementary, not interchangeable.

Finally, never delete a Search Console property on a whim to “clean up.” You permanently lose history, and Google allows no recovery.

How can you verify that your exports are working correctly?

Test your export process over a short period before letting it run autonomously. Check that the extracted data matches the numbers displayed in the Search Console interface — minor discrepancies are normal (sampling), but significant divergences indicate a script problem.

Plan alerts if the export fails. A simple automatic email when the script doesn't run can save you from irreversible losses.
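A simple consistency check of this kind might look as follows; the 5% tolerance is an illustrative threshold, not a Google-documented figure:

```python
def exports_consistent(exported_clicks: int, interface_clicks: int,
                       tolerance: float = 0.05) -> bool:
    """True when exported totals are within `tolerance` of the numbers shown
    in the Search Console UI. Small gaps are expected (sampling, anonymized
    queries); large ones suggest a broken export script."""
    if interface_clicks == 0:
        return exported_clicks == 0
    return abs(exported_clicks - interface_clicks) / interface_clicks <= tolerance

# 9,800 exported vs 10,000 displayed: a 2% gap, acceptable.
print(exports_consistent(9800, 10000))   # True
print(exports_consistent(7000, 10000))   # False: investigate the export script
```

Wire the `False` case to whatever alerting you already use (an automatic email, a Slack webhook) so a silent export failure never goes unnoticed for months.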

  • Set up an automated monthly export via the Search Console API or a third-party tool
  • Store the data in a secure and sustainable space (cloud, SQL database, BigQuery)
  • Document your exports to easily locate historical data
  • Test the consistency of exported data with the Search Console interface
  • Never delete a property without having saved the complete history
  • Prepare an alert system in case of export failures
  • Maintain a minimum of 3 years of history for robust analyses
The 16-month limit imposed by Search Console requires rethinking SEO data management. Without an archiving system, you lose the ability to conduct multi-year analyses. For sites with strong seasonality or complex histories, this constraint becomes critical. If establishing a reliable and sustainable export infrastructure seems complex or time-consuming, the support of a specialized SEO agency may be wise — these tools and processes are part of their daily arsenal.

❓ Frequently Asked Questions

Can I recover Search Console data deleted more than 16 months ago?
No, it's impossible. Once data is erased after 16 months, no recovery is possible, even by contacting Google. The only solution is to have exported the data before its deletion.
Does the Search Console API give access to more than 16 months of data?
No, the API respects the same 16-month limit as the web interface. It provides no privileged access to older data.
If I delete and then re-add my property, does the history come back?
No, deleting a property permanently erases all its data. Re-adding the site starts from scratch, with no restoration of the previous history.
Does data start accumulating when the site goes live, or when it is added to Search Console?
Only from its registration in Search Console. A site that has existed for years but is added today will have no data prior to its registration.
Can Google Analytics replace Search Console to keep data longer?
No, these tools do not measure the same things. Search Console captures queries, impressions, and positions in search results, data absent from Google Analytics. They are complementary, not substitutable.

