
Official statement

Meta tags on non-AMP pages are analyzed both before and after JavaScript rendering. However, some systems like caching may access the content before full rendering.
🎥 Source: Google Search Central video (EN, published 10/12/2020, duration 18:24, 12 statements); this statement appears at 14:24.
Watch on YouTube (14:24) →
Other statements from this video (11)
  1. 1:01 Do you really need to contact the AdSense team to solve your PageSpeed performance issues?
  2. 1:01 Should you really delay the AdSense JavaScript to boost your SEO?
  3. 2:35 Why does Google refuse to disclose the dimensions of Googlebot's viewport?
  4. 3:07 How does Googlebot actually handle content at the bottom of the page?
  5. 3:38 Should you abandon infinite scroll to be properly indexed by Google?
  6. 4:08 Is Intersection Observer content really crawled by Googlebot?
  7. 6:24 Why does Googlebot use a 10,000-pixel viewport?
  8. 9:23 Why does Google refuse to index content that depends on the viewport?
  9. 10:11 Why does Google set its crawler's viewport width to 1024 pixels?
  10. 12:38 Do JavaScript-injected no-archive meta tags really work?
  11. 15:27 Should you render meta tags server-side, or accept that JavaScript modifies them?
📅 Official statement (5 years ago)
TL;DR

Google claims to analyze meta tags on non-AMP pages at two points: before and after JavaScript rendering. This means your meta tags can be dynamically altered without being ignored. However, be cautious: certain systems like caching access the pre-rendered content, which can create temporary discrepancies between what Google indexes and what your users see.

What you need to understand

Why does Google analyze meta tags twice?

The double analysis of meta tags is rooted in the very architecture of Googlebot. During the first pass, before JavaScript execution, the bot extracts the raw HTML sent by the server. This quick step allows it to collect immediately available information.

The second analysis occurs after the complete JavaScript rendering of the page. At this point, Google retrieves meta tags that may have been dynamically modified or injected by your scripts. This double reading ensures that client-side generated content is not ignored.

This is a significant evolution compared to historical crawlers that relied solely on static HTML. Google acknowledges the reality of SPA architectures and modern frameworks that heavily manipulate the DOM after loading.
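The two snapshots Googlebot works with can be compared mechanically. Below is a minimal Python sketch — the HTML strings are invented for illustration — that extracts the title and meta tags from a pre-render and a post-render snapshot and diffs them:

```python
# Minimal sketch: compare meta tags extracted from a pre-render HTML
# snapshot with those from a post-render snapshot. The HTML strings
# below are illustrative, not taken from a real crawl.
from html.parser import HTMLParser


class MetaTagExtractor(HTMLParser):
    """Collects <title> text and <meta name=... content=...> pairs."""

    def __init__(self):
        super().__init__()
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.meta["title"] = self.meta.get("title", "") + data


def extract_meta(html: str) -> dict:
    parser = MetaTagExtractor()
    parser.feed(html)
    return parser.meta


pre_render = ('<html><head><title>Loading...</title>'
              '<meta name="description" content=""></head></html>')
post_render = ('<html><head><title>Blue Widgets | Acme</title>'
               '<meta name="description" content="Hand-made blue widgets.">'
               '</head></html>')

before, after = extract_meta(pre_render), extract_meta(post_render)
# Fields whose value changed between the two passes
changed = {k: (before.get(k), after.get(k))
           for k in after if before.get(k) != after.get(k)}
print(changed)
```

Running the same extraction against your server response and the rendered DOM shows exactly which tags only exist after JavaScript — the tags at risk during the pre-render window.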

What changes for JavaScript pages?

For sites using React, Vue, or Angular, this statement confirms that your dynamic meta tags will indeed be taken into account. Specifically, if your title or description is injected after the first paint, Google will see them.

The catch? Not all Google systems operate on the post-render version. Martin Splitt explicitly mentions the cache as one system that accesses pre-rendered content. This means that for a certain window — between the first crawl and complete rendering — some internal tools may be working with incomplete data.

What does this distinction between AMP and non-AMP mean?

The "non-AMP pages" qualifier is not incidental. AMP pages follow strict rules that prohibit custom JavaScript: their HTML is static by design and can therefore be analyzed in a single pass.

For traditional pages, Google must manage two realities: the initial HTML and the final state after scripts. This distinction serves as a reminder that SEO optimization varies radically depending on technical stack. A static site and a JavaScript-heavy site do not share the same temporal constraints or risks of partial indexing.

  • Double pass: Googlebot extracts meta tags before and after JavaScript execution on non-AMP pages
  • Temporal discrepancy: The cache and some internal systems may access pre-rendered content for a transitional period
  • Limited guarantee: Dynamic meta tags will eventually be indexed, but may not be instantly used by all subsystems
  • Technical architecture: The AMP/non-AMP distinction reflects two radically different analysis models

SEO Expert opinion

Does this assertion match on-the-ground observations?

Splitt's statement confirms what A/B testing has shown for several years: meta tags injected via JavaScript do indeed end up in the index. On frameworks like Next.js or Nuxt with client-side hydration, title/description modifications are properly accounted for.

However — and this is where it gets tricky — the delay between the initial crawl and the final accounting remains opaque. Splitt mentions caching systems accessing pre-rendered content, but provides no timescale. Are we talking about minutes, hours, days? [To be verified], as this variable directly impacts the indexing velocity of fresh content.

What gray areas still exist in this statement?

The first vague point: what "systems" exactly access pre-rendered content? Splitt remains deliberately vague. Is it only the CDN cache, or do other components like the initial ranking system, featured snippets, and Search Console also play a role?

The second blind spot: no mention of priority between the two versions. If your meta tags differ before and after JavaScript, does Google systematically choose the post-render version? Or are there cases where the pre-render version is favored, especially for latency concerns? Tests suggest that the final version prevails, but Google never explicitly confirms this.

The third gap: the cost of crawl budget. JavaScript rendering is notoriously resource-intensive. Google does not clarify whether this double analysis affects crawl frequency on heavy sites. [To be verified] on server logs of large sites with high JS dependency.

In what cases might this rule not apply?

Sites with blocking JavaScript or execution errors will obviously not benefit from the second pass. If your scripts fail, Google will only see the pre-render version — with potentially incomplete or generic meta tags.

Another likely exception: pages with excessive rendering times. Google allocates a timeout for JavaScript execution (never officially communicated; third-party tests estimate it at between 5 and 10 seconds). If your meta tags are injected after this timeout, they risk being ignored despite Splitt's "guarantee."
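If you instrument your pages to log when each tag is injected, the timeout risk can be screened for mechanically. A hypothetical sketch — the 5-second budget is an estimate from third-party tests, not an official Google figure, and the timing data is invented:

```python
# Hypothetical sketch: given measured injection times (seconds after
# navigation start) for dynamic meta tags, flag those landing after an
# assumed render budget. Google has never published an official timeout;
# the 5-second figure below is only an estimate from third-party tests.
RENDER_BUDGET_S = 5.0


def at_risk(injection_times: dict, budget: float = RENDER_BUDGET_S) -> list:
    """Return tags injected after the assumed budget — the ones
    Googlebot's renderer might never see."""
    return sorted(tag for tag, t in injection_times.items() if t > budget)


timings = {"title": 0.8, "description": 1.2, "robots": 7.4}
print(at_risk(timings))  # → ['robots']
```

Anything this check flags should be moved earlier in the load sequence or rendered server-side.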

Attention: Do not confuse "analyzed" and "used." Google may very well analyze your meta tags post-render without using them in ranking or snippets if other signals (main content, external anchors) diverge significantly.

Practical impact and recommendations

What should you do concretely on a JavaScript-heavy site?

If your stack is based on React, Vue, or Angular, continue to dynamically inject your meta tags — it's officially supported. But always add a static fallback in the initial HTML for systems accessing the pre-rendered content.

Specifically: integrate generic but relevant title and description tags into your base template, and then enhance them via JavaScript. This ensures that no Google system encounters total emptiness during the rendering window. SSR (Server-Side Rendering) or static generation remain the safest approaches to eliminate temporal discrepancies.
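One way to enforce the fallback rule in practice — a hypothetical pre-deploy lint, not an official tool — is to check that the base template already ships non-empty, non-placeholder title and description values before JavaScript enhances them:

```python
# Sketch of a pre-deploy check (assumed workflow): verify that the
# server-sent base template contains real title/description fallbacks,
# not placeholders, before JavaScript enhancement kicks in.
import re

# Values we treat as "total emptiness" — adjust to your own templates.
PLACEHOLDERS = {"", "loading...", "untitled", "home"}


def has_static_fallback(html: str) -> dict:
    title = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
    desc = re.search(r'<meta\s+name="description"\s+content="(.*?)"',
                     html, re.I)
    return {
        "title_ok": bool(title)
        and title.group(1).strip().lower() not in PLACEHOLDERS,
        "description_ok": bool(desc)
        and desc.group(1).strip().lower() not in PLACEHOLDERS,
    }


template = ('<head><title>Blue Widgets | Acme</title>'
            '<meta name="description" content="Hand-made blue widgets.">'
            '</head>')
print(has_static_fallback(template))
```

Wired into CI, a check like this guarantees that no Google system encounters an empty head during the rendering window.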

What mistakes should you absolutely avoid?

Never rely solely on JavaScript injection for your critical pages — homepage, major categories, strategic landing pages. Any execution bug or timeout exposes you to degraded indexing.

Avoid drastically altering your meta tags between the two states. If your pre-render title says "Loading..." and your final version displays the actual title, you create a discrepancy that Google might interpret as cloaking or manipulation. Maintain semantic continuity between the two versions.
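Semantic continuity can be approximated with a simple similarity check between the two versions of a tag. A sketch using Python's standard library — the 0.3 threshold is arbitrary, chosen only to illustrate the idea:

```python
# Sketch: flag drastic divergence between the pre-render and post-render
# title using a character-level similarity ratio. The 0.3 threshold is
# arbitrary, for illustration only.
from difflib import SequenceMatcher


def continuity_ok(pre: str, post: str, threshold: float = 0.3) -> bool:
    """True when the two versions are similar enough to pass."""
    return SequenceMatcher(None, pre.lower(), post.lower()).ratio() >= threshold


print(continuity_ok("Blue widgets - Acme", "Blue Widgets | Acme shop"))  # True
print(continuity_ok("Loading...", "Blue Widgets | Acme shop"))           # False
```

A failing pair like "Loading..." vs the real title is exactly the discrepancy the article warns about.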

Last frequent mistake: ignoring render logs in Search Console. The "Indexing > Pages" tab indicates if Google is encountering JavaScript errors. An error rate above 5% should trigger immediate investigation.

How can you check that your meta tags are properly accounted for?

Use the URL Inspection Tool in Search Console and compare the raw HTML (pre-render) with the rendered output shown below it. Ideally both contain your final meta tags. If the raw HTML shows empty or generic tags while the rendered version shows the right values, you are depending entirely on JavaScript rendering — the timing issue described above.

Also test with a headless crawler like Screaming Frog in JavaScript mode. Set a rendering delay of at least 5 seconds and verify that your meta tags display correctly in the report. Then compare it with a JavaScript-disabled crawl to identify discrepancies.
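Assuming the two crawls are exported as URL → meta-tag mappings (the structure and data below are invented for illustration; adapt them to your export format), the discrepancy check can be sketched as:

```python
# Sketch: compare meta tags from a JavaScript-enabled crawl against a
# JavaScript-disabled crawl (e.g. two crawler exports loaded as dicts).
# All data below is invented for illustration.
def crawl_diff(js_on: dict, js_off: dict) -> dict:
    """Map each URL to the fields whose values differ between crawls,
    as (raw_value, rendered_value) pairs."""
    diff = {}
    for url, rendered in js_on.items():
        raw = js_off.get(url, {})
        fields = {f: (raw.get(f), v)
                  for f, v in rendered.items() if raw.get(f) != v}
        if fields:
            diff[url] = fields
    return diff


js_on = {"/widgets": {"title": "Blue Widgets | Acme",
                      "description": "Hand-made widgets."}}
js_off = {"/widgets": {"title": "Loading...",
                       "description": "Hand-made widgets."}}
print(crawl_diff(js_on, js_off))
```

Every URL this diff surfaces is a page whose indexable meta data depends on JavaScript execution — the pages to prioritize for SSR or a stronger static fallback.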

  • Implement a static fallback for title and description in the base HTML, then enriched by JavaScript
  • Favor SSR or static generation for high business stakes pages
  • Monitor JavaScript errors in Search Console and correct any rates above 5%
  • Regularly test with the URL Inspection Tool to compare raw HTML and final rendering
  • Avoid radical content changes between pre-render and post-render to prevent raising flags
  • Audit JavaScript loading times and aim for meta tag injection within 3 seconds

Google's double analysis of meta tags offers real flexibility for modern architectures, but introduces significant technical complexity. Between managing SSR, monitoring JavaScript errors, optimizing rendering times, and ensuring pre/post-execution consistency, the equation quickly becomes tricky. If your stack relies heavily on JavaScript and you lack dedicated technical resources, support from a specialized SEO agency may prove crucial in avoiding indexing pitfalls and ensuring optimal recognition by all Google systems.

❓ Frequently Asked Questions

If my meta tags are injected by JavaScript, will they still be indexed by Google?
Yes, Google analyzes meta tags after JavaScript rendering. However, some internal systems such as the cache access the pre-rendered content, which can create a delay before full indexing.
Should I drop JavaScript injection of meta tags in favor of server-side rendering?
No, but a static fallback in the base HTML is recommended. SSR remains the safest approach for strategic pages where any indexing delay hurts the business.
How long does it take between the initial crawl and the post-JavaScript meta tags being taken into account?
Google gives no official figure. Field observations suggest anywhere from a few hours to several days, depending on the site's crawl frequency and the complexity of the JavaScript rendering.
Are AMP pages analyzed differently for meta tags?
Yes. Since AMP pages forbid custom JavaScript, they undergo a single analysis of the static HTML — no double pass and no risk of a temporal gap.
How can I tell whether Google hits errors while rendering my meta tags with JavaScript?
Use the URL Inspection Tool in Search Console and check the "Indexing > Pages" report for JavaScript errors. Compare the raw HTML with the visual rendering to spot discrepancies.


