Official statement
Google combines anchor signals from multiple links pointing to the same URL on a page, not just the first one. The common belief that only the first link matters is outdated. In practice, you can place your rich anchors where it makes sense for the user without juggling the HTML order.
What you need to understand
Where does this belief that only the first link counts come from?
For years, the SEO community firmly believed that Google only considered the anchor text of the first link to a given URL found in the DOM. This conviction rested on empirical tests and old statements, from a time when the engine did treat links more simplistically.
The reasoning made sense as a guard against anchor-text stuffing on the same page. But Google never comprehensively documented this behavior. SEOs nonetheless kept structuring their internal linking by systematically placing the richest anchor first in the source code, even at the expense of usability.
What does Google actually say today?
John Mueller states that Google can combine signals from multiple links pointing to the same page. In other words, if you have three links to /product-x/ with three different anchors, the engine can aggregate this information to better understand the target page's content.
This does not mean that all links carry the same weight — position in the DOM, visibility, and semantic context likely play a role. But strict order is no longer the determining factor. You are no longer required to place your optimized anchor at the top of the page for it to be counted.
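To make the idea concrete, here is a minimal Python sketch (stdlib only) that groups every anchor text by target URL, mirroring the notion that several anchors to the same page can be aggregated. This is an illustrative model, not Google's actual weighting logic, which remains undisclosed.

```python
from html.parser import HTMLParser
from collections import defaultdict

class AnchorCollector(HTMLParser):
    """Collects every (href, anchor text) pair found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.anchors = defaultdict(list)  # href -> list of anchor texts
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.anchors[self._href].append("".join(self._text).strip())
            self._href = None

def combined_anchor_signals(html: str) -> dict:
    """Group all anchor texts by target URL -- a toy model of the idea
    that Google may combine several anchors pointing to the same page."""
    parser = AnchorCollector()
    parser.feed(html)
    return dict(parser.anchors)

page = """
<nav><a href="/product-x/">Products</a></nav>
<p>Read our guide: <a href="/product-x/">compare cordless vacuums</a>.</p>
"""
print(combined_anchor_signals(page))
# {'/product-x/': ['Products', 'compare cordless vacuums']}
```

Both the generic menu anchor and the rich contextual anchor end up attached to `/product-x/`, which is exactly the scenario described above.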
Why does this evolution change the game?
Because it frees SEO practitioners from an arbitrary technical constraint. You can now design your internal linking based on user experience, without counterproductive micro-optimizations. If your menu contains a generic link titled "Products" and a paragraph further down offers "Compare cordless vacuums," both anchors can contribute.
This also encourages a more natural approach to internal linking: varying anchors on the same page can enrich the overall semantic context. Provided, of course, that you don't fall into the opposite excess — multiplying unnecessary links degrades readability and dilutes internal PageRank.
- Google combines anchor signals from multiple links to the same URL on a page
- The order of links in the DOM is no longer the dominant criterion
- You can prioritize user experience without sacrificing SEO optimization
- Varying anchors can enrich semantic context, as long as you remain relevant
- This approach applies to internal linking as well as contextual links
SEO Expert opinion
Is this statement consistent with field observations?
Yes, overall. Tests conducted in recent years show that Google is getting better at handling multiple contexts. Sites that optimized only the first link did not consistently outperform those that prioritized overall semantic coherence.
That said, Mueller remains vague on how exactly Google combines these signals. Does it take a weighted average? Does it prioritize anchors closer to the top of the page? Anchors in editorial content rather than navigation? [To be verified] — no official data specifies the internal mechanics.
What nuances should be considered regarding this rule?
The first nuance: not all links are created equal. A link in a contextual paragraph, surrounded by semantically rich content, probably carries more weight than an isolated link in a footer. Position, visibility (above or below the fold), and context still play a role.
The second nuance: multiplying links to the same URL on a page remains risky. If you have five identical or nearly identical links, this may be perceived as internal spam. The idea is not to stuff but to vary intelligently when it is relevant for the user.
In what cases does this rule not fully apply?
On sites with complex or multi-level navigation, certain linking patterns may still favor the first link simply because of DOM order. If your main menu loads before the content (server-side rendering), that link may be crawled first — and therefore weighted differently.
Another edge case: pages whose anchors are injected late by JavaScript. If your rich contextual links appear only after user interaction, Google may not see them at all — in that case, only the static links count. Always check what Googlebot actually renders using the URL Inspection tool.
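A quick way to spot this risk before touching Search Console is to check the raw (pre-JavaScript) HTML against the anchors you expect Googlebot to count. A minimal sketch, assuming you already have the raw HTML as a string:

```python
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    """Collects the visible text of every <a> element in raw HTML."""
    def __init__(self):
        super().__init__()
        self.texts = []
        self._in_a = False
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_a = True
            self._buf = []

    def handle_data(self, data):
        if self._in_a:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_a:
            self.texts.append("".join(self._buf).strip())
            self._in_a = False

def missing_anchors(raw_html: str, expected_anchors: list) -> list:
    """Return the anchors absent from the raw (pre-JavaScript) HTML --
    a hint that they are injected client-side and may be invisible
    to Googlebot unless the page renders correctly."""
    collector = AnchorTextCollector()
    collector.feed(raw_html)
    found = set(collector.texts)
    return [a for a in expected_anchors if a not in found]

# The contextual anchor is injected by JS into #widget, so it is
# missing from the static HTML:
raw = '<nav><a href="/product-x/">Products</a></nav><div id="widget"></div>'
print(missing_anchors(raw, ["Products", "compare cordless vacuums"]))
# ['compare cordless vacuums']
```

Any anchor this flags deserves a rendering check in the URL Inspection tool before you rely on it for internal linking.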
Practical impact and recommendations
What actions should you take on your pages?
First action: audit your high-traffic pages to identify cases where you have multiple links to the same URL. Check if the anchors provide complementary semantic nuances — "buy a Dyson V12 vacuum" vs "compare Dyson models," for example.
Second action: stop manipulating HTML order to artificially place the richest anchor first. Focus on editorial coherence. If your menu contains a generic link and a contextual widget offers a detailed anchor, that's perfect — both contribute.
What mistakes should be avoided in implementation?
Don’t fall into the opposite excess: multiplying identical or redundant links to "strengthen" an anchor. Google can detect this pattern as internal spam, especially if the links are concentrated in the same block without additional value for the user.
Also avoid neglecting visual hierarchy. If your most important link is buried at the bottom of the page in tiny font, the user (and potentially Google) will consider it secondary. Combining signals does not mean that all links are equivalent — prominence matters.
How can you verify that your linking is optimized?
Use a crawler like Screaming Frog or Oncrawl to list all URLs with multiple incoming links from the same page. Export the anchors and ensure they provide relevant semantic variations, not just duplicates.
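If you export that crawl data to CSV, a short script can flag target URLs that receive several links from the same page with identical anchor text — duplicates that add no semantic variety. The column names (`source`, `target`, `anchor`) are hypothetical; adjust them to match your crawler's actual export format.

```python
import csv
import io
from collections import defaultdict

def flag_duplicate_anchors(csv_text: str) -> dict:
    """Flag (source, target) pairs where the same page links to the
    same URL more than once with identical anchor text.
    Assumes hypothetical export columns: source, target, anchor."""
    by_pair = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        by_pair[(row["source"], row["target"])].append(row["anchor"])
    return {
        pair: anchors
        for pair, anchors in by_pair.items()
        # More than one link, but fewer distinct anchors than links
        # => at least one pure duplicate.
        if len(anchors) > 1 and len(set(anchors)) < len(anchors)
    }

export = """source,target,anchor
/blog/,/product-x/,Products
/blog/,/product-x/,Products
/blog/,/product-y/,compare models
"""
print(flag_duplicate_anchors(export))
# {('/blog/', '/product-x/'): ['Products', 'Products']}
```

Pairs that come back flagged are candidates for either removing the redundant link or rewriting one anchor to add a genuine semantic variation.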
Also test with Google Search Console: inspect key pages and see which links Googlebot detects. If your rich anchors do not appear in the rendered HTML, they are loaded too late or hidden — and only the statically visible links will be counted.
- Audit pages with multiple links to the same URL and check the complementarity of the anchors
- Stop manipulating HTML order to artificially place the optimized anchor first
- Avoid multiplying redundant links without user-added value
- Check with a crawler that anchors are correctly detected and semantically varied
- Test Googlebot’s rendering via Search Console to identify invisible links
- Maintain a coherent visual hierarchy — important links should remain prominent
❓ Frequently Asked Questions
Does Google take into account all links to the same URL, or only some of them?
Should I remove multiple links to the same page to avoid dilution?
Does the order of links in the HTML code still matter?
How can I check which links Google actually detects on my pages?
Can you vary the anchors pointing to the same page to strengthen its semantics?
🎥 From the same video
Other SEO insights extracted from this same Google Search Central video · duration 55 min · published on 21/08/2020