Tuesday, 2 April 2019

How To Optimise A Website Structure with Internal Links in 2019



Disclosure: “This article is a personal opinion, based on research and my experience of almost 20 years. There is no third party advertising on this page or monetised links of any sort. External links to third party sites are moderated by me.” Disclaimer. Shaun Anderson, Hobo

These are my notes on how to optimise internal links in 2019.

[Image: internal site structure diagram]

QUOTE: “How important is the anchor text for internal links? Should that be keyword rich? Is it a ranking signal? We do use internal links to better understand the context of content on your site so if we see a link that’s saying like red car is pointing to a page about red cars that helps us to better understand that but it’s not something that you need to keyword stuff in any way because what generally happens when people start kind of focusing too much on the internal links is that they have a collection of internal links that all say have like four or five words in them and then suddenly when we look at that page we see this big collection of links on the page and essentially that’s also text on a page so it’s looking like keyword stuff text so I try to just link naturally within your website and make sure that you kind of have that organic structure that gives us a little bit of context but not that your keyword stuffing every every anchor text there.” John Mueller, Google 2015


An Introduction to ‘Internal Link Building’

QUOTE: “Most links do provide a bit of additional context through their anchor text. At least they should, right?” John Mueller, Google 2017

Whereas link building is the art of getting other websites to link to your website, internal link building is the age-old art of getting pages crawled and indexed by Google. It is the art of spreading real PageRank about a site and of naturally emphasising important content on your site in a way that has a positive, contextual SEO benefit for the ranking of specific keyword phrases in Google SERPs (Search Engine Results Pages).

External backlinks to your site are far more powerful than internals within it, but internal links have their use too.

Traditionally, one of the most important things you could do on a website to highlight your important content was to link to important pages often, especially from important pages on your site (like the homepage, for instance).

QUOTE: “If you have pages that you think are important on your site don’t bury them 15 links deep within your site and I’m not talking about directory length I’m talking about actual you have to click through 15 links to find that page if there’s a page that’s important or that has great profit margins or converts really –  well – escalate that put a link to that page from your root page that’s the sort of thing where it can make a lot of sense.” Matt Cutts, Google 2011

Highlighting important pages in your site structure has always been important to Google from a CRAWLING, INDEXING and RANKING point of view. It is also important for website users from a USABILITY, USER EXPERIENCE and CONVERSION RATE perspective.

Most modern CMSes in 2019 take the headache out of getting your pages crawled and indexed. Worrying about your internal navigation structure (unless it is REALLY bad) is probably unnecessary and is not going to cause you major problems from an indexation point of view.

There are other considerations though apart from Google finding your pages.

I still essentially use the methodology I have laid down on this page, but things have changed since I first started practising Google SEO and began building internal links to pages almost 20 years ago.

The important thing is to link to important pages often.

Google has said it doesn’t matter where the links are on your page, Googlebot will see them:

QUOTE: “So position on a page for internal links is pretty much irrelevant from our point of view.  We crawl, we use these mostly for crawling within a website, for understanding the context of individual pages within a website.  So if it is in the header or the footer or within the primary content, it’s totally more up to you than anything SEO wise that I would worry about.” John Mueller, Google 2017

As an aside, that statement on its own does not sit nicely with some patents I’ve read, where link placement does seem to matter in some instances.

When it comes to internal linking on your website we do know:

  • where you place links on a page is important for users
  • which pages you link to on your website is important for users
  • how you link to internal pages is important for users
  • why you link to internal pages is important for users

Internal linking is important to users, at least, and evidently it is important to Google, too – and it is not a straightforward challenge to deal with optimally.

Take note that Google has lots of patents related to links and anchor text, for instance:

Anchor Text Indexing

QUOTE: “Using anchor text for links to determine the relevance of the pages they point towards.” 12 Google Link Analysis Methods That Might Have Changed – Bill Slawski

Propagation of Relevance between Linked Pages

QUOTE: “Assigning relevance of one web page to other web pages could be based upon distance of clicks between the pages and/or certain features in the content of anchor text or URLs. For example, if one page links to another with the word “contact” or the word “about”, and the page being linked to includes an address, that address location might be considered relevant to the page doing that linking.”  12 Google Link Analysis Methods That Might Have Changed – Bill Slawski

Ranking based on ‘changes over time in anchor text’

QUOTE: “In one embodiment of the invention, the time-varying behavior of anchortext (e.g., the text in which a hyperlink is embedded, typically underlined or otherwise highlighted in a document) associated with a document may be used to score the document. For example, in one embodiment, changes over time in anchor text corresponding to inlinks to a document may be used as an indication that there has been update or even change of focus in the document; a relevancy score may take this change(s) into account.” The Original Historical Data Patent Filing and its Children – Bill Slawski

Ranking based on ‘Unique words, bigrams, phrases in anchor text’

QUOTE: “In one embodiment, the link or web graphs and their behavior over time may be monitored and used for scoring, spam detection or other purposes by a search engine. Naturally developed web graphs typically involve independent decisions. Synthetically generated web graphs-usually indicative of an intent to spam a search engine are based on coordinated decisions; as such, the profile of growth in anchor words/bigrams/phrases is likely to be relatively spiky in this instance. One reason for such spikiness may be the addition of a large number of identical anchors from many places; another possibility may be addition of deliberately different anchors from a lot of places. With this in mind, in one embodiment of the invention, this information could be monitored and factored into scoring a document by capping the impact of suspect anchors associated with links thereto on the associated document score (a binary decision). In another embodiment, a continuous scale for the likelihood of synthetic generation is used, and a multiplicative factor to scale the score for the document is derived.” The Original Historical Data Patent Filing and its Children – Bill Slawski

Rank assigned to ‘a document is calculated from the ranks of documents citing it’ – ‘Google Pagerank’

QUOTE: “DYK that after 18 years we’re still using* PageRank (and 100s of other signals) in ranking?” Gary Illyes from Google – Search Engine Roundtable 2017

We can only presume Google still uses Pagerank (or something like it) in its ordering of web pages.

QUOTE: “A method assigns importance ranks to nodes in a linked database, such as any database of documents containing citations, the world wide web or any other hypermedia database. The rank assigned to a document is calculated from the ranks of documents citing it. In addition, the rank of a document is calculated from a constant representing the probability that a browser through the database will randomly jump to the document. The method is particularly useful in enhancing the performance of search engine results for hypermedia databases, such as the world wide web, whose documents have a large variation in quality.” The Original PageRank Patent Application – Bill Slawski
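For reference, the calculation described there was published in the original PageRank paper as a simple formula, where T1…Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a ‘damping factor’ (usually set around 0.85):

 PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Note that each linking page divides the PageRank it passes between ALL of its outbound links – which is worth keeping in mind when we get to ‘PageRank sculpting’ later in this article.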

QUOTE: “A high pagerank (a signal usually calculated for regular web pages) is an indicator of high quality and, thus, can be applied to blog documents as a positive indication of the quality of the blog documents.”  Positive and Negative Quality Ranking Factors from Google’s Blog Search (Patent Application) – Bill Slawski

*Google evidently does not throw the baby out with the bathwater. If Google still uses Pagerank, then perhaps they still use tons of other legacy methods of ranking websites that over time are obfuscated to protect the secret sauce.

A ‘measure of quality’ based on ‘the number’ of links:

QUOTE: “A system can determine a measure of quality for a particular web resource based on the number of other resources that link to the particular web resource and the amount of traffic the resource receives. For example, a ranking process may rank a first web page that has a large number of other web pages that link to the first web page higher than a web page having a smaller number of linking web pages.” Did the Groundhog Update Just Take Place at Google? – Bill Slawski

A ‘measure of quality’ based on ‘traffic received by use of those links’

QUOTE: “However, a resource may be linked to by a large number of other resources, while receiving little traffic from the links. For example, an entity may attempt to game the ranking process by including a link to the resource on another web page. This large number of links can skew the ranking of the resources. To prevent such skew, the system can evaluate the “mismatch” between the number of linking resources and the traffic generated to the resource from the linking resources. If a resource is linked to by a number of resources that is disproportionate with respect to the traffic received by use of those links, that resource may be demoted in the ranking process.” Did the Groundhog Update Just Take Place at Google? – Bill Slawski

A ‘measure of quality’ based on link ‘selection quality score’

QUOTE: “The selection quality score may be higher for a selection that results in a long dwell time (e.g., greater than a threshold time period) than the selection quality score for a selection that results in a short dwell time (e.g., less than a threshold time period). As automatically generated link selections are often of a short duration, considering the dwell time in determining the seed score can account for these false link selections.” Did the Groundhog Update Just Take Place at Google? – Bill Slawski

Google certainly gives some weight to anchor text, including the anchor text it finds on your own site.

Links Are Like Lasers

I used a ‘links-are-lasers’ analogy way back then, to try and give beginners a simpler understanding of Google PageRank.

  1. Links Are Lasers
  2. Linking To A Page Heats Up A Page
  3. Pages Get Hot Or Cold Depending On Number & Quality Of The Links To It
  4. Cold Pages Don’t Rank For Sh*t
  5. Hot Pages Rank!

That was certainly how I used to think about link building and internal site structure. That is how I used to visualise how pages built up ‘ranking equity’ that could be spread about a site.

There was a time when you could very specifically structure a certain page to rank using nothing but links – and while you can still do that in 2019, in the end Google will pick the page on your site that is MOST RELEVANT TO THE QUERY and best meets USER EXPECTATIONS & USER INTENT (see here for more on developing SEO-friendly content for Google in 2019).

That is – you can link all you want to any one page, but if Google has a problem with that page you are trying to make rank or thinks there’s a better page on your site (with a better user satisfaction score, for instance) – it will choose to rank that other page, before the ‘well-linked-to’ page.

In the past, Google would flip-flop between pages on your site, when there were multiple pages on the site targeting the same term, and rankings could fluctuate wildly if you cannibalised your keywords in this way.

Google is much more interested, in 2019, in the end-user quality of the page ranking, and the trust and quality of the actual website itself, than the inbound links pointing to a single page or a clever internal keyword rich architecture that holds content ‘up’.

It’s much more important in 2019 for a page to meet the user intent (as Google has defined it) of a specific key phrase, and those intents can vary in complexity from keyword phrase to keyword phrase.

Internal link building works best when it is helping Google identify canonical pages to rank on your site.

As John Mueller points out in the above official video:

QUOTE: “we do use internal links to better understand the context of content of your sites” John Mueller, Google 2015

…but if you are putting complicated site-structure strategy before high-quality single-page content that can stand on its own, you are probably going to struggle to rank in Google organic listings in the medium to long-term.

So the message is: a keyword-rich anchor text system on your site IS useful, and is a ranking signal, but don’t keyword stuff it.

I have always taken that to mean we should focus on introducing as many unique and exactly relevant long-tail keyword phrases into your internal link profile as you can. This has certainly had better results for me than a page having only one anchor text phrase in its internal link profile.

How you proceed is going to be very much dictated by the site and complexity of your site, and how much time you are willing to spend on this ranking signal.

There is no single best way to build internal links on your site, but there are some efficiencies to be had, especially if your site is of a good quality in the first place. There are some really bad ways to build your site for search engines. For example, do not build your website with frames.

I focus on optimising the important pages in the website structure e.g. the pages we need to rank fast and I prioritise internal links to these pages (all the time remembering “first link priority” which I go into later).

Propagation of Relevance between Linked Pages

By making sure pages on your site link to other relevant pages, you spread PageRank (or link equity) throughout the site, and each individual link can provide even more context and relevance information to Google (which can only be of use for search engine optimisation).

QUOTE: “Assigning relevance of one web page to other web pages could be based upon distance of clicks between the pages and/or certain features in the content of anchor text or URLs. For example, if one page links to another with the word “contact” or the word “about”, and the page being linked to includes an address, that address location might be considered relevant to the page doing that linking.”  12 Google Link Analysis Methods That Might Have Changed – Bill Slawski

A home page is where link equity seemed to ‘pool’ (from the deprecated Toolbar PageRank point of view) and this has since been confirmed by Google:

QUOTE: “home pages” are where “we forward the PageRank within your website”. John Mueller, Google 2014

How you build internal links on your site today is going to depend on how large your site is and what type of site it is. Whichever it is – I would keep it simple in 2019.

I thought this was an interesting statement from Google, especially if you have a much larger site:

use-original-pagerank-formula-for-internal-links.png

If you have a smaller site, I would still err on the safe side these days, but vary your anchor text to internal pages as much as possible – WITHIN TEXT CONTENT – and to meet long-tail variations of keywords with specific user intent, rather than relying on a site-wide navigation array to beef up raw link popularity to every page on the site (as the benefits from this tactic are not so obvious these days).

Whatever you do, I recommend you avoid anything that is easily detectable as too manipulative –  Google does not reward lazy linking in 2019.

It penalises, devalues or ignores it.

How To Do Internal Link Building in 2019

Optimising internal links, says John Mueller in a webmaster hangout, is:

QUOTE: “not something I’d see as being overly problematic if this is done in a reasonable way and that you’re not linking every keyword to a different page on your site“. John Mueller, Google

As mentioned previously, this will depend on the size and complexity of your website. A very large site should keep things as simple as possible and avoid any keyword stuffing footprint.

Any site can get the most out of internal link building by descriptively and accurately linking to canonical pages that are very high quality. The more accurately described the anchor text is to the page linked to, the better it is going to be in the long run. That accuracy can be to an exact match keyword phrase or a longtail keyword variation of it (if you want to know more see my article on keyword research for beginners – that link is in itself an example of a long tail variation of the primary head or medium term ‘keyword research‘).

I silo any relevance or trust mainly through links in a flat architecture in text content and helpful secondary menu systems and only between pages that are relevant in context to one another.

I don’t worry about perfect Pagerank siloing techniques in 2019.

On this site, I like to build in-depth content pieces in 2019 that rank for a lot of long-tail phrases. These days, I usually would not want those linked from every page on a site – because this practice negates the opportunities some internal link building provides. I prefer to link to pages in context; that is, within page text.

There’s no set method I find works for every site, other than to link to related internal pages often and where appropriate. NOTE: You should also take care to manage redirects on the site, and minimise the amount of internal 301 redirects you employ on the site; it can slow your pages down (and website speed is a ranking factor) and impact your SEO in the long-term in other areas.

Takeaway 1: Internal links are still important. Internal links have value to Google for crawling, indexing and context. Internal links are important to get right for users and for rankings.

Broken Links Are A Waste Of Link Power

Websites ‘Lacking Care and Maintenance’ Are Rated ‘Low Quality’ by Google.

QUOTE: “Sometimes a website may seem a little neglected: links may be broken, images may not load, and content may feel stale or out-dated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.” Google Quality Evaluator Guidelines, 2017

The simplest piece of advice I ever read about creating a website / optimising a website was over a decade ago:

QUOTE: “make sure all your pages link to at least one other in your site”

This advice is still sound in 2019.

Check your pages for broken links.

Broken links are a waste of link power and could hurt your site, drastically in some cases, if a poor user experience is identified by Google. Google is a link based search engine – if your links are broken, you are missing out on the benefit you would get if they were not broken.

Saying that – fixing broken links is NOT a first-order rankings bonus – it is a usability issue, first and foremost.

QUOTE: “The web changes, sometimes old links break. Googlebot isn’t going to lose sleep over broken links. If you find things like this, I’d fix it primarily for your users, so that they’re able to use your site completely. I wouldn’t treat this as something that you’d need to do for SEO purposes on your site, it’s really more like other regular maintenance that you might do for your users.” GOOGLE – 2014 (John Mueller)

Takeaway 2: Broken links that are found on pages on your own site can be a frustrating user-experience, which is a big no-no in 2019. Broken links also mess up Pagerank and anchor text flow to pages on your site. Broken links on your site are often a different issue than the 404 errors Google shows in Webmaster tools. When 404s are present on your site, they hurt. When they are just 404s in Webmaster tools and not present on your pages these are less of an issue to worry about.

Internal Links Help Google Discover Other Pages On Your Website

Just because Google can find your pages easier in 2019 doesn’t mean you should neglect to build Googlebot a coherent architecture with which it can crawl and find all the pages on your website.

Pinging Google blog search via RSS (still my favourite way of getting blog posts into Google results fast) and XML sitemaps may help Google discover your pages, find updated content and include them in search results, but they still aren’t the best way at all of helping Google determine which of your pages to KEEP INDEXED or EMPHASISE or RANK or HELP OTHER PAGES TO RANK (e.g. they will not help Google work out the relative importance of a page compared to other pages on a site, or on the web).

While XML sitemaps go some way to address this, prioritisation in sitemaps does NOT affect how your pages are compared to pages on other sites – it only lets the search engines know which pages you deem most important on your own site. I certainly wouldn’t ever just rely on XML sitemaps like that… the old ways work just as they always have – and often the old advice is still the best, especially for SEO.

XML sitemaps are INCLUSIVE, not EXCLUSIVE, in that Google will spider ANY URL it finds on your website – and your website structure can produce a LOT more URLs than you have actual products or pages in your XML sitemap (something else Google doesn’t like).

Keeping your pages in Google and getting them to rank has long been assured by simple internal linking practices.

Traditionally, every page needed to be linked to other pages for Pagerank (and other ranking benefits) to flow to other pages – that is traditional, and I think accepted theory, on the question of link equity.

I still think about link equity today – it is still important.

Some sites can still have short circuits – e.g. internal link equity is prevented from filtering to other pages because Google cannot ‘see’ or ‘crawl’ a fancy menu system you’re using – or Googlebot cannot get past some content it is blocked in robots.txt from rendering, crawling and rating.

I still rely on the ‘newer’ protocols like XML sitemaps for discovery purposes, and the old tried and trusted way of building a site with an intelligent navigation system to get it ranking properly over time.

Read my article on how to get Google to index an entire website.

How Many Links Is Too Many In A Website Dropdown Navigation System?

Quite some time ago now, I answered a question in the Google Webmaster Forum about how many links in a drop-down are best:

The question was:

QUOTE: “Building a new site with over 5000 product pages. Trying to get visitors to a product page directly from the homepage. Would prefer to use a two-level drop-down on homepage containing 10 brands and 5K products, but I’m worried a huge source code will kick me in the pants. Also, I have no idea how search engines treat javascript links that can be read in HTML. Nervous about looking like a link farm.”

I answered:

QUOTE – “I’d invest time in a solid structure – don’t go for a javascript menu it’s too cumbersome for users. Sometimes google can read these sometimes it can’t – it depends on how the menu is constructed. You also have to remember if google can read it you are going to have a big template core code (boilerplate) on each and every page vying alongside flimsy product information – making it harder for google to instantly calculate what the individual products page is supposed to rank for.

I would go for a much reduced simple sitewide navigation in the menu array,

Home page links to categories > Categories link to products > Products link to related products

when you go to category links the links relevant in that category appear in the menu. Don’t have all that pop down in a dropdown – not good for users at all. Keep code and page load time down to a minimum…” Shaun Anderson

QUOTE: “JohnMu (Google Employee) + 2 other people say this answers the question.” Google Webmaster Forums

I thought, seeing as somebody from Google agreed, it was worth posting on my own blog.

The most important thing for me when designing website navigation systems is:

  1. Make it easy for the user to navigate
  2. Make it easy for Google to get to your content and index your pages

In terms of navigation from a landing page (all your pages are potential landing pages), what do you think the benefits are of giving people 5,000 navigation options?

Surely if the page meets their requirements, all you need is two buttons. Home, and “Buy Now!”; OK – a few more – but you get what I mean, I hope.

Less is more, usually.

Google says:

QUOTE: “Limit the number of links on a page to a reasonable number (a few thousand at most).” Google Webmaster Guidelines, 2018

There are benefits to a mega-menu:

QUOTE: “Mega menus may improve the navigability of your site. (Of course, it’s always best to test.) By helping users find more, they’ll help you sell more.” Jakob Nielsen

and there are drawbacks to mega-menus:

QUOTE: “In the bigger scheme of things, the usability problems mentioned here aren’t too serious. They’ll reduce site use by a few percent, but they won’t destroy anyone’s business metrics. But still: why degrade the user experience at all, when the correct design is as easy to implement as the flawed one?” Jakob Nielsen

Once you realise getting your product pages indexed is the key, don’t go for a mega-menu just because you think this is a quick way to solve your indexing problem.

With a site structure, it’s all about getting your content crawled and indexed. That’s the priority.

Do I Need an HTML Sitemap For My Site?

NO, but they can be useful:

QUOTE: “Ensure that all pages on the site can be reached by a link from another findable page. The referring link should include either text or, for images, an alt attribute, that is relevant to the target page.” Google Webmaster Guidelines, 2018

A basic HTML sitemap is an old friend, and Google actually does say in its guidelines for Webmasters that you should include a sitemap on your site – for Googlebot and users – although this can naturally get a bit unwieldy for sites with a LOT of pages:

QUOTE: “Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page).” Google Webmaster Guidelines, 2018

When it comes to internal links, the important thing is that you “ensure that all pages on the site can be reached by a link from another findable page” and then you can think about ‘escalating‘ the priority of pages via internal links.

Ensure Your Navigation System Is User & Search Engine Friendly

You can create dynamic drop-down menus on your site that meet accessibility requirements and are SEO friendly and then link your pages together in a Google-friendly way.

Just be sure to employ a system that uses CSS and JavaScript (instead of pure JavaScript & HTML tables) and unordered lists as a means of generating the fancy drop-down navigation on your website.

Then, if JavaScript is disabled, or the style sheet is removed, the lists that make up your navigation array collapse gracefully into a list of simple links. See here for more on JavaScript SEO.
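As a minimal sketch of that approach (class names and URLs here are purely illustrative, not a prescription), the menu is just a nested unordered list, with the drop-down behaviour layered on top in CSS:

 <ul class="nav">
   <li><a href="/">Home</a></li>
   <li><a href="/services/">Services</a>
     <ul class="sub-menu">
       <li><a href="/services/seo-audit/">SEO Audit</a></li>
       <li><a href="/services/keyword-research/">Keyword Research</a></li>
     </ul>
   </li>
 </ul>

 <style>
   .nav .sub-menu { display: none; }           /* hide the second level by default */
   .nav li:hover .sub-menu { display: block; } /* reveal on hover - no JavaScript required */
 </style>

Remove the style sheet and what is left is a plain nested list of simple HTML links – exactly the graceful fallback described above, and one Googlebot can crawl with no trouble.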

Remember, with Drop down menus:

  • Drop-down menus are generally fine but the JavaScript triggering them can cause some problems for search engines, users with screen readers and screen magnifiers.
  • A <noscript> alternative is necessary.
  • The options offered in a drop-down should be repeated as text links on the same page, so use unordered lists with CSS to develop your menu.

Use a “Skip Navigation” link on large mega-menu systems

Add a skip navigation link that brings the reader straight down to the main content of the page if you have a large menu system on every page. This allows users to skip the navigation array and get immediately to the page content.

You won’t want this on your visually rich page, so some simple CSS will sort this out. You can hide it from the visual browsers, but it will display perfectly in text and some speech browsers.
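A minimal sketch of the technique (the class name and id here are illustrative):

 <body>
   <a class="skip-link" href="#main-content">Skip to main content</a>
   <!-- large navigation array here -->
   <div id="main-content">
     <!-- main page content here -->
   </div>
 </body>

 <style>
   .skip-link { position: absolute; left: -9999px; } /* off-screen in visual browsers */
   .skip-link:focus { left: 0; }                     /* revealed when keyboard users tab to it */
 </style>

Screen readers and text browsers read the link as normal, and keyboard users get it back the moment they tab onto it.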

Generally, I don’t like mega-menu systems on websites, and there are discussions online suggesting that too many options promote indecision. Whether you use a mega-menu is totally up to you – there are demonstrable pros and cons both for having a mega-menu and not.

The 3 Click Rule of Website Design

My own site follows the 3-click rule of web design, as you can see in this visual Crawl Map of my site (using Sitebulb):

[Image: visual crawl map of this site, generated by Sitebulb]

RULE: “Don’t put important information on your site that is more than 3 clicks away from an entrance page” Zeldman

Many have written about the Three-Click Rule. For instance, Jeffrey Zeldman, the influential web designer, wrote about the Three-Click Rule in his popular book, “Taking Your Talent to the Web”. He writes that the Three-Click Rule is:

QUOTE: “based on the way people use the Web” and “the rule can help you create sites with intuitive, logical hierarchical structures“. Jeffrey Zeldman

On the surface, the Three-Click Rule makes sense. If users can’t find what they’re looking for within three clicks, they’re likely to get frustrated and leave the site.

However, there have been other studies into the actual usefulness of the 3 click rule by usability experts, generating real data, that basically debunk the rule as gospel truth. It is evidently not always true that a visitor will fail to complete a task if it takes more than 3 clicks.

The 3 click rule is the oldest pillar of accessible, usable website design, right there beside KISS (Keep It Simple Stupid).

The 3 click rule, at the very least, ensures you are always thinking about how users get to important parts of your site before they bounce.

QUOTE: “home pages” are where “we forward the PageRank within your website” and “depending on how your website is structured, if content is closer to the Home page, then we’ll probably crawl it a lot faster, because we think it’s more relevant” and “But it’s not something where I’d say you artificially need to move everything three clicks from your Homepage”. John Mueller, Google 2014

This is the click depth of my content on this website in Jan 2018 (as discovered by Screaming Frog):

[Image: crawl depth visualisation, from Screaming Frog]

TAKEAWAY 3 – Pages do NOT need to be three clicks away from the landing page, but it is useful to think about the concept of the 3 click rule when designing the navigation around your site. The simpler, the better.

TAKEAWAY 4 – There is a benefit to linking to important pages often, but just because a page is linked to a LOT in an internal architecture will not necessarily make the page rank much better, even with more Google PageRank pumped into it. Relevance algorithms, page quality and site quality algorithms are all designed to float unique or satisfying pages to the top of the SERPs in 2019. As a direct result of this observation, I prefer to maximise the contextual value of internal links on smaller sites (rather than just make a page ‘link popular’). I go into ‘contextual value’ below.

The Benefits of A Consistent Website Navigation & Page Layout

A key element of accessible website development is a clean, consistent navigation system coupled with a recognised, usable layout.

Don’t try and re-invent the wheel here. A simpler, clean, consistent navigation system and page layout allow users to instantly find important information and allows them to quickly find comfort in their new surroundings especially if the visitor is completely new to your website.

Visitors don’t always land on your home page – every page on your website is a potential landing page.

Ensure when a visitor lands on any page, they are presented with simple options to go to important pages you want them to go to. Simple, clear calls to action that encourage a user to visit specific pages. Remember too, that just because you have a lot of pages on your site, that does not mean you need a mega-menu. You do not need to give visitors the option to go to every page from their entry page. You do not need a massive drop down menu either. Spend the time and invest in a simple site navigation menu and a solid site structure.

A traditional layout (2 or 3 columns, with a header and a footer) is excellent for accessible website design, especially for information sites.

Remember to use CSS for all elements of style, including layout and navigation.

QUOTE: “Presentation, content and navigation should be consistent throughout the website” Guidelines for UK Government websites – Illustrated handbook for Web management teams

Google has also mentioned consistency (e.g. even your 404 page should be consistent with your normal page layouts).

There is one area, however, where ‘consistency’ might not be the most optimal generic advice and that is how you interlink pages using anchor text.

For instance, on a smaller site: is it better to link to any one page with the same anchor text, focusing all signals on one exact match keyword phrase – or is it better to add more contextual value to these links by mixing up how you manage internal links to a single page?

In short, instead of one internal link to a page saying “How To Optimise A Website Using Internal Links”, I could have 5 different links on 5 pages, all with unique anchor text, pointing to the one page (shown as HTML after this list):

  1. how to use internal links for SEO
  2. how to build internal links
  3. how to manage internal links
  4. how to optimise internal links
  5. how to SEO internal links
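In plain HTML terms, that is five links on five different pages, each with unique anchor text, all pointing at the same target URL (the path here is illustrative):

 <a href="/internal-links/">how to use internal links for SEO</a>
 <a href="/internal-links/">how to build internal links</a>
 <a href="/internal-links/">how to manage internal links</a>
 <a href="/internal-links/">how to optimise internal links</a>
 <a href="/internal-links/">how to SEO internal links</a>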

I think this provides a LOT of contextual value to a page and importantly it mixes it up:

QUOTE: “Each piece of duplication in your on-page SEO strategy is ***at best*** wasted opportunity. Worse yet, if you are aggressive with aligning your on page heading, your page title, and your internal + external link anchor text the page becomes more likely to get filtered out of the search results (which is quite common in some aggressive spaces).” Aaron Wall, 2009

From my tests, some sites will get more benefit out of mixing it up as much as possible. If you do see the benefit of internal linking using a variation of anchor text, you will need to be aware of First Link Priority (I go into this below).

TAKEAWAY 5 – Keep internal navigation consistent and clear.

First Link Priority – Do Multiple Links From One Page To Another Count?

Google commented on this in a recent hangout:

QUOTE: “Q: If I have two internal links on the same page and they’re going through the same destination page but with different anchor text how does Google treat that so from our side? A: This isn’t something that we have defined or say it’s always like this always like the first link always the last link always an average of the links there is something like that but rather that’s something that our algorithms might choose to do one way or the other so recommendation there would be not to worry too much about this if you have different links going to the same page that’s completely normal that’s something that we have to deal with we have to understand the anchor tanks to better understand the context of that link and that’s that’s completely normal so that’s not something I kind of worry about there I know some people do SEO experiments and try to figure this out and kind of work out Oh Google currently does it like this but from our point of view can change and it’s not something that we have so even if you manage to figure out how we currently do it today then that’s not necessarily how we’ll do tomorrow or how it’s always it’s across all websites.” John Mueller, Google 2018

First link priority has been long discussed by SEO geeks. Matt Cutts of Google was asked in the video above:

QUOTE: “Hi Matt. If we add more than one links from page A to page B, do we pass more PageRank juice and additional anchor text info? Also can you tell us if links from A to A count?”

At the time he commented something like he ‘wasn’t going to get into anchor text flow’ (or, as some call it, First Link Priority) in this scenario – which is, actually, a much more interesting discussion.

QUOTE: “Both of those links would flow PageRank I’m not going to get into anchor text but both of those links would flow PageRank” Matt Cutts, Google 2011

But the silence on anchor text and priority – or what counts and what doesn’t – is, perhaps, confirmation that Google has some sort of ‘link priority’ when spidering multiple links to a page from the same page and assigning relevance or ranking scores.

For example (and I am talking internally here), if you took a page and I placed two links on it, both going to the same page (OK – hardly scientific, but you should get the idea): will Google only ‘count’ the first link? Or will it read the anchor text of both links, and give my page the benefit of the text in both links, especially if the anchor text is different in each? Will Google ignore the second link? What is interesting to me is that knowing this leaves you with a question. If your navigation array has your main pages linked to in it, perhaps your links in content are being ignored, or at least, not valued.

I think links in body text are invaluable. Does that mean placing the navigation below the copy to get a wide and varied internal anchor text to a page? Perhaps.

I’m pretty sure, from plenty of observations I’ve made in the past, that this is indeed the case. I have seen a few examples that I *thought* might contradict my own findings, but on closer examination, most could not be verified. It’s a lot harder today to isolate this sort of thing – but Google is designed that way.

I think as the years go by – we’re supposed to forget how Google worked under-the-hood of all that fancy new GUI.

The simple answer is to expect ONE link – the first link – out of multiple links on a single page pointing at one other page to pass anchor text value. Follow that advice, putting your most important key phrases in at least the first link when creating multiple links, and you don’t need to know about first link priority.

A quick SEO test I did a long time ago throws up some interesting questions today – but the changes over the years at Google since I did my test will have an impact on what is shown – and the fact is, the test environment was polluted long before now.

I still think about first link priority when creating links on a page.

Google says today:

QUOTE: “I know some people do SEO experiments and try to figure this out and kind of work out ‘Oh Google currently does it like this’ but from our point of view can change and it’s not something that we have so even if you manage to figure out how we currently do it today then that’s not necessarily how we’ll do tomorrow or how it’s always it’s across all websites.” John Mueller, Google 2018

The last time I tested ‘first link priority’ was a long time ago.

From this test, and the results on this site anyway, testing links internal to this site, it seems Google only counted the first link when it came to ranking the target page.

[Screenshot: Google cache notice – “These terms only appear in links pointing to this page”]

At the time I relied on the notification “These terms only appear in links pointing to this page” (when you click on the cache) that Google helpfully showed when the word isn’t on the page.

Google took that away so now you can really only use the SERPS itself and see if you can detect anchor text influence (which is often not obvious).

If ‘first link priority’ is a ‘thing’, then you could (and I am just theorising here) place your navigation below your text in the source code to ensure first link priority goes to contextual content. This lets you vary the anchor text to important internal pages on your site, within the text content, instead of ramming one anchor text link (usually high in the navigation) down Google’s throat. Varying anchor text naturally optimises the page, to an extent, for long-tail ‘human’ searches you might overlook when writing the actual target page text. Of course, I assume here that links within text, surrounded by text, are more important (contextually) than links in navigation menus. It makes use of your internal links to rank a page for more terms – especially useful if you link to your important pages often and don’t have a lot of incoming natural links to achieve a similar benefit.
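If you wanted to experiment with that theory (and I stress this is a sketch of the idea, not a recommendation – class names and URLs are illustrative), the source-order trick is straightforward: put the content before the navigation in the HTML, then reposition the menu visually with CSS:

 <body>
   <div class="page">
     <div class="content">
       <p>Body text with a <a href="/seo-audit/">contextual link</a> that
          now comes FIRST in the source order...</p>
     </div>
     <ul class="main-nav">  <!-- sitewide navigation, second in the source -->
       <li><a href="/seo-audit/">SEO Audit</a></li>
     </ul>
   </div>
 </body>

 <style>
   .page { display: flex; flex-direction: column; }
   .main-nav { order: -1; } /* visually moves the menu back to the top */
 </style>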

Of course, you could simply be sensible when interlinking your internal pages.

Does Only The First Link Count On Google? Does the first or second anchor text link on a page count?

This has been one of the more interesting geek SEO discussions over the years.

Takeaway 6: If you believe in ‘first link priority’, you are going to have to take it into account when creating your main navigation system that appears on every page, and where that sits in the template.

How Google Treats “noindex,follow” in Robots Meta Tag on Pages

It is a misconception in the SEO community that Google handles this the way some may think:

QUOTE: “So it’s kind of tricky with noindex. Which I think is something somewhat of a misconception in general with the SEO community in that with a noindex and follow it’s still the case that we see the noindex and in the first step we say okay you don’t want this page shown in the search results. We’ll still keep it in our index, we just won’t show it and then we can follow those links. [EDIT: quotes added – Shaun] But if we see the noindex there for longer than we think this page really doesn’t want to be used in search so we will remove it completely. And then we won’t follow the links anyway. So a noindex and follow is essentially kind of the same as a noindex, nofollow. There’s no really big difference there in the long run.” John Mueller, Google

NOTE:

QUOTE: “noindex, follow” “is essentially kind of the same as a” “noindex, nofollow” John Mueller, Google

and

QUOTE: “if someone were to link that page and you have it set to noindex like well they’re linking to nowhere” John Mueller, Google 2018
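For clarity, the robots meta tag being discussed here sits in the <head> of the page:

 <meta name="robots" content="noindex, follow">

…and, per John Mueller’s comments above, in the long run Google ends up treating that much the same as:

 <meta name="robots" content="noindex, nofollow">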

What is Anchor Text?


Definition: “Words, typically underlined on a web page that form a clickable link to another web page. Normally the cursor will change to a finger pointing if you hover over such a link.”

HTML code example:

 <a href="https://www.hobo-web.co.uk/">This is anchor text!</a>

How To Optimise Anchor Text

Use Descriptive Anchor Text – Don’t Use ‘Click Here’ as it provides no additional contextual information via the anchor text in the link.

QUOTE: “When calling the user to action, use brief but meaningful link text that: 1) provides some information when read out of context 2) explains what the link offers 3) doesn’t talk about mechanics and 4) is not a verb phrase” W3C

The accessibility consultants at the W3C advise “don’t say ‘click here’”, and SEO professionals recommend the same.

If you use link text like “go” or “click here,” those links will be meaningless in a list of links. Use descriptive text, rather than commands like “return” or “click here.”

For example, do not do this:

"To experience our exciting products, click here."

This is not descriptive for users and you might be missing a chance to pass along keyword rich anchor text votes for the site you’re linking to (useful to rank better in Google, Yahoo and MSN for keywords you may want the site to feature for).

Instead, perhaps you should use:

"Learn more about our search engine optimisation products."

Assistive technologies inform the users that text is a link, either by changing pitch or voice or by prefacing or following the text with the word “link.”

So, don’t include a reference to the link such as:

"Use this link to experience our exciting services."

Instead, use something like:

"Check out our SEO services page to experience all of our exciting services."

In this way, the list of links on your page will make sense to someone who is using a talking browser or a screen reader.

NB – This rule applies in web design when naming text links on your page and in your copy. Of course, you can use ‘click here’ in images (as long as the ALT text gives a meaningful description to all users).

QUOTE: “One thing to think about with image links is if you don’t have an alt text for that then you don’t have any anchor text for that link. So I definitely make sure that your images have alt text so that we can use those for an anchor for links within your website. If you’re using image links for navigation make sure that there’s some kind of a fallback for usability reasons for users who can’t view the images.” John Mueller, Google 2017

If that wasn’t usable enough, Google ranks pages, in part, by keywords it finds in these text links, so it is worth making your text links (and image ALT text) relevant and descriptive.
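A minimal example of an image link where the ALT text doubles as the anchor text (the paths here are illustrative):

 <a href="/seo-services/">
   <img src="/images/seo-services-button.png" alt="SEO services">
 </a>

Leave the alt attribute out and, as Mueller says above, there is no anchor text at all for Google to use for that link.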

You can use keyword mapping techniques to map important key phrases to important elements on important pages on your site (like internal links).

[Screenshot: keyword mapping example]

I can provide website keyword mapping services to important pages on a site as part of my SEO audit. See my SEO auditing costs page.

Takeaway 7: Use Descriptive Anchor Text.

Is There A Limit On The Number Of Keywords In Anchor Text Links?

YES – Google will only count the first 16 words (during 2018, at least) in the string of anchor text, and this has changed at least once in the last decade (it used to be the first 8 words in the string, and this includes ‘stop words’, I think it can be demonstrated).

I tested to see if there was a maximum limit of keywords Google will pass to another page through a text link. How many words will Google count in a backlink or internal link? Is there a best practice that can be hinted at?

My first qualitative tests were basic and flawed, but time and again other opportunities for observation indicated the maximum length of the text in a link was perhaps 8 words (my first assumption was a character limit, maybe 55 characters, i.e. not a word limit, but that was flawed).

I have had a look at this a few times now, most recently about a month ago. Further observations at the time (described below) pointed to keeping the important keywords in the first EIGHT WORDS of any text link to make sure you are getting the maximum benefit, but in 2019 that number would be sixteen words (although I would STILL keep important keywords in the first 8-12 words of a link for maximum benefit).

We know that keywords in anchor text have some contextual value to Google. How many words will Google count as anchor text?

Here is some evidence.

Not many people link to you with very long anchor text links, so when people do, it is a chance to see how many keywords Google is counting as a keyword phrase in anchor text.

This blogger linked to me with an anchor text containing 21 keywords in the link:

[Screenshot: example article linking with a 21-word anchor text]

From this observation we can use Google to check a few things:

[Screenshot: search result illustrating a 16-word anchor text match]

In Google.co.uk the answer is consistently 16 words in anchor text for maximum contextual value.

Any signal within the 16 word limit in anchor text can be detected, nothing above the 16-word threshold was detected:

[Screenshot: no match detected for the 17th word in the anchor text]

Ironically, the 17th keyword in the anchor text was ‘SEO’, for which the Hobo page ranks on page one in the UK, so it is ‘relevant’ for the query on many other ranking factors – but this imposed limit is an instance where the document is not selected as a result of the limit.

How About If The Link Is In ALT Text?

Google limits keyword text in the ALT attribute to 16 words too, so it may be reasonable to think the same limit applies.

[Screenshot: ALT text test result]

and

[Screenshot: second ALT text test result]

See my article for more on optimising alternative text.

What If The Link Passes Through A 301 Redirect?

The same limits are in place.

In Google.co.uk the answer is consistently 16 words in anchor text for maximum contextual value:

[Screenshot: anchor text passing through a 301 redirect – 16 words detected]

Any signal within the 16-word limit in anchor text can be detected, nothing above the 16-word threshold was detected:

[Screenshot: anchor text passing through a 301 redirect – 17th word not detected]

Do not link large blocks of text: Google only counts the first 16 words it finds in the anchor text link, and the rest of the keywords ‘evaporate’ from it, leaving you with a 16-keyword maximum anchor text phrase with which to be descriptive and get the most out of links pointing to the page.
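A hypothetical before-and-after to illustrate the point (URL illustrative):

 <!-- Keywords early in the anchor text - counted: -->
 <a href="/page/">internal link building guide with tests and examples</a>

 <!-- Keywords buried past word 16 - likely to 'evaporate': -->
 <a href="/page/">click on this very long link right here on this page today if you would like to read about internal link building</a>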

Something to think about.

Takeaway 8: Keep anchor text links within the limit of 16 keywords max. Keywords above the 16-word threshold limit seem to ‘evaporate’ in terms of any demonstrable value that I can show they pass.

See my article on 301 redirects and implementations.

Beware Using Plugins to Automate ‘SEO-Friendly’ Internal Links

I would avoid using plugins to optimise your internal links. Yes, it’s a time-saver, but it can also look spammy.

Google has been known to frown upon such activities.

Case in point:

QUOTE: “As happened to a friend of a friend, whose rankings went deep into the well of despair shortly after installing and beginning to use SEO Smart Links. Since there hadn’t been any other changes to the site, he took a flyer on a reconsideration request and discovered that yes, indeed, there had been a penalty”:

[Screenshot: example Google penalty reconsideration request email]

“So, what had this site owner done to merit a successful reconsideration request? Simple – he removed the SEO Smart Links plugin and apologized for using it in the first place.” Dan Thies, Marketers Braintrust, 2012

You get the most from such optimisations if they are manual and not automated, anyway.

Takeaway 9: Don’t use plugins to artificially inflate link popularity to pages on your site.

Does Google Count Internal Keyword Rich Links To Your Home Page?

The last time I tested this was a long time ago, and it was an age-old SEO trick that sometimes had some benefits.

A long time ago, I manipulated first link priority to the home page of a site for the site’s main keyword – that is, instead of using ‘home’ to link to my homepage, I linked to the home page with “insert keyword”. Soon afterwards the site dropped in rankings for its main term from a pretty stable no6 to about page 3, and I couldn’t really work out any other issue.

Of course, it’s impossible to isolate if making this change was the reason for the drop, but let’s just say after that I thought twice about doing this sort of SEO ‘trick‘ in future on established sites (even though some of my other sites seemed to rank no problem with this technique).

I formulated a little experiment to see if anchor text links had any impact on an established home page (in as much a controlled manner as possible).

Result:

Well, look at the graph below.

[Graph: ranking drop for the target keyword after changing internal links]

It did seem to have an impact.

It’s possible linking to your home page with keyword rich anchor text links (and that link being the ONLY link to the home page on that page) can have some positive impact in your rankings, but it’s also quite possible attempting this might damage your rankings too!

Trying to play with first link priority is, for me, a bit too obvious and manipulative these days, so I don’t really bother much – unless with a brand new site, or if it looks natural, and even then not often – but these kinds of results make me think twice about everything I do in SEO.

I shy away from overtly manipulative onsite SEO practices in 2019 – and I suggest you do too.

Takeaway 10: Avoid anything too manipulative with internal anchor text.

Should I Use Nofollow on Internal Links (PageRank Sculpting)?

NO.

QUOTE: “I’d recommend not using nofollow for kind of PageRank sculpting within a website because it probably doesn’t do what you think it does” John Mueller, Google 2017

This was actually said by Matt Cutts, too, about 10 years ago (although even in 2019, I see pages with nofollow on internal links):

QUOTE: “Q: Does this mean “PageRank sculpting” (trying to change how PageRank flows within your site using e.g. nofollow) is a bad idea? A: I wouldn’t recommend it, because it isn’t the most effective way to utilize your PageRank. In general, I would let PageRank flow freely within your site. The notion of “PageRank sculpting” has always been a second- or third-order recommendation for us. I would recommend the first-order things to pay attention to are 1) making great content that will attract links in the first place, and 2) choosing a site architecture that makes your site usable/crawlable for humans and search engines alike.” Matt Cutts, Google 2009

PageRank sculpting was a hot topic many years ago, and then Google changed the way it handled it:

QUOTE: “When we added a help page to our documentation about nofollow, we said “a solid information architecture — intuitive navigation, user- and search-engine-friendly URLs, and so on — is likely to be a far more productive use of resources than focusing on crawl prioritization via nofollowed links.” In a recent webmaster video, I said “a better, more effective form of PageRank sculpting is choosing (for example) which things to link to from your home page.” At Google I/O, during a site review session I said it even more explicitly: “My short answer is no. In general, whenever you’re linking around within your site: don’t use nofollow. Just go ahead and link to whatever stuff.” But at SMX Advanced 2009, someone asked the question directly and it seemed like a good opportunity to clarify this point. Again, it’s not something that most site owners need to know or worry about, but I wanted to let the power-SEOs know..” Matt Cutts, Google 2009

Questions arise if you start thinking about it too much – I know I did – before Google changed the way it handled nofollow:

  1. Should you nofollow unimportant internal pages or nofollow external links in an effort to consolidate the Pagerank you have already accrued?
  2. Or should you spend your time getting other quality links pointing to your site to increase the PR you have to start off with (how you get Pagerank)?

The best long-term strategy here is simply to earn more Google PageRank in the first place, rather than PageRank sculpting with rel=nofollow links.

You can certainly control PR on a granular level (page by page in this case) – that is, which page gets available real PR from another page on your site. It’s easy to see why some SEO professionals thought that, if that’s the case, you could sculpt PageRank and channel it to important pages in a site.

The theory was that adding the attribute to (for instance) internal links to your contact page, or disclaimer, or privacy policy page would send more PageRank to more important pages on your site.

I’d long fallen out of love with PR sculpting internal pages using the attribute after testing the theory. The results were not worth it for me on the sites I worked on (some are quite large). A few years back I posted this about PR sculpting:

QUOTE: “I’ve been playing about with rel=’nofollow’ on this site for 4 months, and in all honesty, in future, I won’t be relying on nofollow to sculpt unimportant pages out of any possible link graph, just optimising those pages better, or leaving them out altogether, like I used to do in 1999. It can be a useful tool in a site redevelopment, but from here on in, I’ll be keeping nofollow for bad neighbourhoods and, pending further testing, on top level blog pages.” Shaun Anderson, Hobo

In June 2008 I also posted this about Nofollow and PR Sculpting:

QUOTE: “I tested it, and as far as I am concerned, on a 300 page site at least, any visible benefit is microscopic.” Shaun Anderson, Hobo

In theory, PR sculpting sounded cool, but was in practice, very disappointing. Some people think it works, of course, even to this day. Maybe the effects are noticeable on giant sites with millions of pages.

Notes

I should point out that you do not use rel=”nofollow” to prevent the indexing of a page – merely to control which pages any particular page shares its Pagerank with.

When it came to first link priority on a page, it appeared at that time that if you nofollowed the first link to a URL on a page, Google *might* also treat any other link to the same URL on that page as nofollowed.

Google changed the way it flowed PR through nofollowed links ten years ago, making Pagerank sculpting redundant:

QUOTE: “So what happens when you have a page with “ten PageRank points” and ten outgoing links, and five of those links are nofollowed? Let’s leave aside the decay factor to focus on the core part of the question. Originally, the five links without nofollow would have flowed two points of PageRank each (in essence, the nofollowed links didn’t count toward the denominator when dividing PageRank by the outdegree of the page). More than a year ago, Google changed how the PageRank flows so that the five links without nofollow would flow one point of PageRank each.” Matt Cutts, Google 2009

When you add ‘rel nofollow’ to internal links you spread less Pagerank around your own site.


QUOTE: “So today at SMX Advanced, sculpting was being discussed, and then Matt Cutts dropped a bomb shell that it no longer works to help flow more PageRank to the unblocked pages. Again — and being really simplistic here — if you have $10 in authority to spend on those ten links, and you block 5 of them, the other 5 aren’t going to get $2 each. They’re still getting $1. It’s just that the other $5 you thought you were saving is now going to waste.” Moz, 2009

So – it is fairly unequivocal. The Pagerank you think you are sculpting around your site is actually “evaporating” (a quote from Matt Cutts at SMX 2009), and you want MORE PageRank in your site, not less of it.
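To make the arithmetic in those quotes concrete, here is a minimal sketch. The function name and numbers are mine, and the decay factor is ignored, just as it is in the quotes above:

```python
# A worked version of the "$10 in authority"/ten-links example: how much
# PageRank each followed link receives before and after Google's 2009 change.
def pr_per_followed_link(page_pr: float, total_links: int, nofollowed: int,
                         post_2009: bool = True) -> float:
    followed = total_links - nofollowed
    if post_2009:
        # Nofollowed links still count in the denominator; their share evaporates.
        return page_pr / total_links
    # Originally, nofollowed links were excluded from the denominator.
    return page_pr / followed

print(pr_per_followed_link(10, 10, 5, post_2009=False))  # 2.0 points per followed link
print(pr_per_followed_link(10, 10, 5, post_2009=True))   # 1.0 point; the other 5 are wasted
```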

However – and here is where some confusion comes in, even in 2019 – Google does say this in its general guidelines about nofollow:

QUOTE: “Crawl prioritization: Search engine robots can’t sign in or register as a member on your forum, so there’s no reason to invite Googlebot to follow “register here” or “sign in” links. Using nofollow on these links enables Googlebot to crawl other pages you’d prefer to see in Google’s index. However, a solid information architecture — intuitive navigation, user- and search-engine-friendly URLs, and so on — is likely to be a far more productive use of resources than focusing on crawl prioritization via nofollowed links.” Google Webmaster Guidelines, 2018

However, I very much doubt “crawl prioritisation” is something 99% of webmasters need to be concerned about at all. It is a concern on a site with millions of pages, but not on most sites.

The simple answer is to NOT apply rel=nofollow to ordinary HTML internal links on your website.

How To Find Rel=Nofollow On Internal Links On A Website

[Image: how-to-find-nofollow-links-on-your-site.png]

SEMRush finds and highlights nofollow attributes on internal links across your website so that you can check whether they should be removed. In the example above, the rel=nofollow attribute was removed from every internal link, as they had been erroneously implemented in the first place.

Simply set up a site-audit using the SEMRush Audit Tool and the tool will automatically highlight any internal nofollow links (and a lot more, naturally).
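If you would rather check this yourself, a minimal do-it-yourself sketch might look like the following. It assumes the Python `requests` and `beautifulsoup4` packages, and START_URL is a hypothetical site root – an illustration, not a substitute for a full audit tool:

```python
# Minimal sketch: crawl a site and report internal links carrying rel="nofollow".
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # hypothetical site root
HOST = urlparse(START_URL).netloc

def find_internal_nofollow(start_url: str, max_pages: int = 500):
    seen, queue, flagged = set(), [start_url], []
    while queue and len(seen) < max_pages:
        url = queue.pop(0).split("#")[0]
        if url in seen or urlparse(url).netloc != HOST:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target = urljoin(url, a["href"])
            if urlparse(target).netloc != HOST:
                continue  # external link - nofollow may be legitimate there
            # BeautifulSoup exposes rel as a list of tokens, e.g. ["nofollow"]
            if "nofollow" in (a.get("rel") or []):
                flagged.append((url, target, a.get_text(strip=True)))
            queue.append(target)
    return flagged

for page, target, anchor in find_internal_nofollow(START_URL):
    print(f"{page} -> {target} ('{anchor}') is internally nofollowed")
```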

Takeaway 11: Don’t use rel=nofollow on internal links.

How Can I Optimise Anchor Text Across A Website?

Optimising your anchor text across an entire site is actually a very difficult and time-consuming process. It takes a lot of effort to even analyse internal anchor text properly.

You can use tools like SEMRush (specifically the SEMRush Audit Tool), SiteBulb Crawler, DeepCrawl, Screaming Frog or SEO Powersuite Website Auditor to check the URL structure and other elements like anchor text on any site sitewide.
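To give a flavour of what that analysis involves, here is a minimal sketch that aggregates internal anchor text per target URL. The `links` tuples are hypothetical sample data – in practice you would feed in the output of a crawler like the one sketched earlier:

```python
# Minimal sketch: count inbound internal anchor texts per target URL so that
# repetitive, keyword-stuffed internal anchors stand out.
from collections import Counter, defaultdict

links = [  # hypothetical (source, target, anchor) tuples from a crawl
    ("https://example.com/", "https://example.com/red-cars/", "red cars"),
    ("https://example.com/blog/", "https://example.com/red-cars/", "red cars"),
    ("https://example.com/about/", "https://example.com/red-cars/", "our showroom"),
]

anchors_by_target = defaultdict(Counter)
for _source, target, anchor in links:
    anchors_by_target[target][anchor.lower() or "(empty anchor)"] += 1

for target, counts in anchors_by_target.items():
    print(target, dict(counts))
# -> https://example.com/red-cars/ {'red cars': 2, 'our showroom': 1}
```

If every inbound internal anchor to a page is identical keyword-rich text, that is exactly the pattern John Mueller warns can start to look like keyword stuffing.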

If you are not technically minded, we can analyse and fix your site for you, if necessary, as part of our fixed price SEO service.

Does Google Count Keywords in Anchor Text In Internal Links? YES

Recently I looked to see if Google counts keywords in the URL to influence rankings for specific keywords, and showed how I investigated this.

In this article, I am looking at the value of an internal link and its impact on rankings in Google.

My observations from these tests (and my experience) include:

  • witnessing the impact of removing contextual signals from the anchor text of a single internal link pointing to a target page (April 15 impact in the image below)
  • watching as an irrelevant page on the same site took the place in the rankings of the relevant target page when the signal was removed (April 19 impact)
  • watching as the target page was again made to rank by re-introducing the contextual signal, this time to a single on-page element e.g. one instance of the keyword phrase in exact-match form (May 5 impact)
  • potential evidence of a SERP rollback @ May 19/20th
  • potentially successfully measuring the impact of one ranking signal versus another (a keyword phrase in one element versus another), which would seem to slightly differ from recent advice on Moz, for instance.

Will Google Count Keywords in Internal Anchor Text Links?

QUOTE: “we do use internal links to better understand the context of content of your sites” John Mueller, Google 2015

Essentially my tests revolve around ranking pages for keywords where the actual keyphrase is not present in exact-match form anywhere on the website – not in internal links to the page, nor on the target page itself.

The relevance signal (mentions of the exact match keyword) IS present in what I call the Redirect Zone – that is – there are backlinks and even exact match domains pointing at the target page but they pass through redirects to get to the final destination URL.
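For a test like this to be clean, the phrase has to be genuinely absent from the crawlable HTML. A minimal sketch of that sanity check – the `pages` mapping and the phrase are hypothetical placeholders, and in practice you would collect the HTML with a crawler like the one sketched earlier:

```python
# Minimal sketch: confirm an exact-match phrase is absent from the crawlable
# HTML (the "crawl zone") before introducing a single test link.
import re

pages = {  # hypothetical {url: html} mapping gathered from a crawl
    "https://example.com/": "<html><body>Welcome to the site</body></html>",
    "https://example.com/target/": "<html><body>The target page</body></html>",
}
phrase = "test focus keyword"  # hypothetical head term

pattern = re.compile(re.escape(phrase), re.IGNORECASE)
hits = [url for url, html in pages.items() if pattern.search(html)]
print("Signal present on:", hits or "no pages - phrase absent from the crawl zone")
```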

In the image below where it says “Ranking Test Implemented” I introduced one exact match internal anchor text link to the target page from another high-quality page on the site – thereby re-introducing the ‘signal’ for this exact match term on the target site (pointing at the target page).

Where it says ‘Test Removed‘ in the image below, I removed the solitary internal anchor text link to the page, thereby, as I think about it, shortcutting the relevance signal again and leaving the only signal present in the ‘redirect zone’.

[Image: Screenshot-2016-04-16-21.51.48.png]

It is evident from the screenshot above that something happened to my rankings for that keyword phrase and long tail variants exactly at the same time as my tests were implemented to influence them.

Over recent years it has been difficult, for me at least, to pin down with any real confidence the influence of internal anchor text on an aged domain. Too much is going on at the same time, and most of it is out of an observer’s control.

I’ve also always presumed Google would look at too much of this sort of onsite SEO activity as attempted manipulation if deployed improperly or too quickly, so I have largely avoided this kind of manipulation and focused on improving individual page quality ratings.

TEST RESULTS

  1. It seems to me that, YES, Google does look at keyword-rich internal anchor text to provide context and a relevance signal, on some level, for some queries at least.
  2. Where the internal anchor text pointing to a page is the only mention of the target keyword phrase on the site (as my test indicates), it only takes ONE internal anchor text link (from another internal page) to provide the signal required to have a NOTICEABLE influence on specific keyword phrase rankings (and so ‘relevance’).

——————————————————-

Test Results: Removing Test Focus Keyword Phrase from Internal Links and putting the keyword phrase IN AN ALT TEXT ELEMENT on the page

[Image: Screenshot-2016-05-25-15.43.32.png]

To recap in my testing: I am seeing if I can get a page to rank by introducing and removing individual ranking signals.

Up to now, if the signal is not present, the page does not rank at all for the target keyword phrase.

I showed how having a keyword in the URL impacts rankings, and how having the exact keyword phrase in ONE internal anchor text link to the target page provides said signal.

Ranking WEIRDNESS 1: Observing an ‘irrelevant’ page on the same site rank when ranking signal is ‘shortcutted’.

The graph above illustrates that when I removed the signal (removed the keyword from internal anchor text) there WAS a visible impact on rankings for the specific keyword phrase – rankings disintegrated again.

BUT – THIS TIME – an irrelevant page on the site started ranking for a long tail variant of the target keyword phrase during the period when there was no signal present at all in the site (apart from the underlying redirect zone).

[Image: Screenshot-2016-05-25-15.36.05.png]

This was true UNTIL I implemented a further ranking test – this time optimising ANOTHER ELEMENT actually on the page, which reintroduced the test focus keyword phrase (the HEAD TERM, as I have it in the first image on this page) – the first time the keyword phrase had been present in an element on the actual page for a long time.

WEIRDNESS 2 – SERP Rollback?

On May 1st I added the test focus keyword to the actual page in a specific element to test the impact of having the signal ONLY in a particular element on the page.

As expected, the signal provided by having the test keyword phrase ONLY in one on-page element DID have some positive impact (although LESS than the impact when the signal was present in internal links – and this comparison I did find very useful).

That’s not the anomaly – the results from RANKING TEST 3 were almost exactly as I expected. A signal was recognised, but that solitary signal was not enough to make the page as relevant to Google as it was when the signal was in internal links.

The weirdness begins on May 17, when I again removed the keyword phrase from the target page. I expected that, with NO SIGNAL present anywhere on the site or the page, Google rankings would return to their normal state (zero visibility).

The opposite happened.

[Image: Screenshot-2016-05-25-15.36.20.png]

WTF?

Rankings returned to the best positions they had been in for the term SINCE I started implementing these ranking tests – even WITHOUT any signal present in any of the areas I had been modifying.

Like a memory effect, the rankings I achieved when the signal was present only in internal links (the strongest signal I had provided yet) returned.

THINKING OUT LOUD

It’s always extremely difficult to test Google and impossible to make any claims 100% one way or another.

The entire ecosystem is built to obfuscate and confuse anyone trying to understand it better.

Why have rankings returned when there is no live signal present that would directly influence this specific keyword phrase?

My hunch is that this might actually be evidence of what SEOs call a SERP ROLLBACK – when Google, we think, randomly ‘rolls’ the set of results back to a previous week’s SERPs to keep us guessing.

If this is a rollback, the time frame must fall within the period of my RANKING TEST 2 (a month or so at most), as the page did not rank for these terms like this at all for the year before – and yet the rankings came back almost exactly as they were during my test period.

In the following image, I show this impact on the variant keyword (the keyword phrase, with no spaces) during this possible ‘roll back’.

[Image: Screenshot-2016-05-25-23.58.46.png]

An observation about SERP ROLLBACKS.

If a rollback is in place, it does not seem to affect EVERY keyword and every SERP equally. NOT all the time, at least.

MORE IMPORTANTLY – Why did an irrelevant page on the same website rank when the signal was removed?

The target page was still way more relevant than the page Google picked out to present in long tail SERPs – hence my question.

Google was at best confused and at worst apathetic – probably purposefully lazy – when it came to the long-tail SERPs.

Because my signal to them was not explicit, ranking just seemed to fail completely, until the signal was reintroduced and I specifically picked out a page for Google to rank.

To be clear, something else might be at play. Google might now be relying on other signals – perhaps even the redirect zone, or the relative link strength of the ‘irrelevant’ page – but no matter: Google was ranking the less relevant page and CLEARLY IGNORING any relevance signals passing through the redirect zone to my target page.

Observations

From my ranking test 2 (internal links), it is evident that modifications to internal links CAN make IRRELEVANT PAGES on your site rank instead of the target page, and for some time, IF by modifying these internal links you SHORTCUT the signal that those links once provided to the target page in a way that entirely removes that signal from the LIVE signals your site provides for a specific keyword phrase. Let’s call this the “CRAWL ZONE” – what can be picked up in a crawl of your HTML pages, as Google would do – which sits above the “REDIRECT ZONE” in my mental model, the redirect zone simply being where signals need to pass through a 301 redirect.

That test was modifying only ONE anchor text link.

This might be very pertinent to site migrations, where you are modifying hundreds of links at the same time as you migrate through redirects and change URLs and internal anchor text.

YES – the signal may return – but this doesn’t look like anything that happens over a quick timescale. Site migrations like this are potentially going to be very tricky if low-quality pages are in the mix.

Which Ranking Signal Carries The Most Weight?

In these two tests, I switched the signal from internal links to another element, this time ALT TEXT on the page, to observe the impact of the change on rankings for the test focus keyword phrase.

[Image: Screenshot-2016-05-25-21.29.04.png]

The signal I switched to (ALT TEXT) would seem to have less of an impact on rankings than internal links, which would make a Whiteboard Friday at least potentially inaccurate in terms of the weighting of relevance signals as presented on the whiteboard.

This would be UNLESS I have misinterpreted the presumed rollback activity in the SERPs in May 2016 and those rankings were caused by something else. Only time will shed some light on that, I think.

[Image: Screenshot-2016-05-25-21.15.00.png]

Yes – I replaced the signal originally in H (in Rand’s list) with an element in E (on Rand’s list) – and the result was to make the page markedly LESS relevant, not more.

Site Quality Algorithm THEORY

If you have site quality problems, then any help you can get will be a good thing.

Any advice I would offer, especially in a theory, would really need to be sensible, at worst! I have shown in past posts that substantially improving individual pages clearly does improve organic traffic levels.

This is the sort of thing you could expect in a ‘fairish’ system, I think, and we apparently have this ‘fairness’, in some shape or form, baked in.

If you improve INDIVIDUAL pages to satisfy users – Google responds favourably by sending you more visitors – ESPECIALLY when increased user satisfaction manifests in MORE HIGH-QUALITY LINKS (which are probably still the most important ranking signal other than the content quality and user satisfaction algorithms):

[Image: Screenshot-2016-05-26-01.01.06.png]

In these SEO tests, I have been DELIBERATELY isolating a specific element that provides the signal for a specific keyword phrase to rank and then SHORTCUTTING it, like in an electrical switch, to remove the signal to the target page.

What if this process is how Google also shortcuts your site from a site quality point of view e.g. in algorithm updates?

Let us presume you have a thousand pages on your site and 65% of them fail to meet the quality score threshold set for multiple keyword phrases the site attempts to rank for – a threshold Google constantly tweaks, day to day, by a fractional amount to produce flux in the SERPs. In effect, most of your website is rated low-quality.

Would it be reasonable to presume that a page rated low-quality by Google is neutered in a way that it might not pass along the signals it once did to other pages on your site, yet still remain indexed?

We have been told that pages that rank (i.e. are indexed) sometimes do not have the ability to transfer Pagerank to other pages.

Why would Google want the signals a low-quality page provides, anyway, after it is marked as low-quality or, more specifically, not preferred by users?

SEOs know that pages deemed extremely low-quality can always be deindexed by Google – but what of pages that Google has in their index that might not pass signals along to other pages?

It would be again reasonable to suggest, I think, that this state of affairs is a possibility – because it is another layer of obfuscation – and Google relies on this type of practice in lots of areas to confuse observers.

So – let us presume pages can be indexed but can sometimes offer no signals to other pages on your site.

If Google’s algorithms nuke 65% of your pages’ ability to relay signals to other pages on your site, you have effectively been shortcutted in the manner I have illustrated in these tests – and that might end up in a state of affairs where irrelevant pages on your website start to rank in place of once-relevant pages, because the pages that did provide the signal no longer pass the quality bar required to provide context and signal to your target pages (in the site structure).

Google has clearly stated that they “use internal links to better understand the context of the content of your sites“.

If my theory held water, there would be lots of people on the net with irrelevant pages on their site ranking where other, relevant pages once did – and this test would be repeatable.

If this were true, then the advice we would get from Google would be to IMPROVE pages rather than just REMOVE them – because when you remove them, you do not necessarily reintroduce the signal you need, the signal you would get if you IMPROVED the page with the quality issues (the ideal scenario in a world with no complication).

Especially if webmasters were thinking removing pages was the only answer to their ranking woes – and I think it is fair to say many did, at the outset of this challenge.

Guess what?

[Image: Screenshot-2016-05-25-19.32.25.png]

This would make that statement by Google entirely correct but monumentally difficult to achieve, in practice, on sites with lots of pages.

Site Quality punishment is a nightmare scenario for site owners with a lot of low-quality pages and especially where it is actually CONTENT QUALITY (as opposed to a purely technical quality issue) that is the primary issue.

It is only going to become a more acute problem as authorship, in whatever form Google assigns it, becomes more prevalent e.g. you can have high-quality content that is exactly what Google wants, but it will be outranked by content from authors Google wants to hear from e.g. Danny Sullivan over the rest of us, as Matt Cutts was often fond of saying.

There is incredible opportunity ahead for authors with recognized topical expertise in their fields. Google WANTS you to write stuff it WILL rank. WHO writes your content on your website might be even more important than what is written on your page (in much the same way we recognised what we called ‘domain authority’ where Google gave such ability to rank to domains with a lot of ‘link juice’).

To be clear – I still think a lot of the old stuff is still baked in. Links still matter, although ‘building’ low-quality external backlinks is a lot riskier today.

Theoretically, fixing the content quality ‘scores’ across multiple pages would be the only way to reintroduce the signal once present on a site impacted by Google Panda or site quality algorithms. This would be, from the outset, an incredibly arduous – almost impossible – undertaking for larger sites, and AT LEAST a REAL investment of time and labour. Again, I think there would be lots of sites out there in this sort of scenario, if my theory held water and you accepted that low-quality content on your site can impact the rankings of other pages.

Google actually has said this too:

QUOTE: “low-quality content on part of a site can impact a site’s ranking as a whole” Google

It is probably a nearly impossible task for content farms with multiple authors of varying degrees of expertise in topics – mostly zilch. The only way I can see of recovering from that would be at best distasteful and at worst highly unethical, and rather obvious, so I won’t print it.

Let’s look at that statement in full, from Google, with emphasis and numbers added by me:

QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus 1. removing low quality pages, 2. merging or 3. improving the content of individual shallow pages into more useful pages, or 4. moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.” Google

To recover from Google Panda and site quality algorithms – unless you are waiting for Google to release a softer Panda – you really need to focus on doing ALL of points 1-3. Lots of webmasters stop at number 1, thinking that will be sufficient, when any SEO with any experience knows that is way too simple for what Google wants this entire process to achieve – to take TIME. Accusations have been made in many a forum that the aim is to drive the cost of organic SEO UP to levels comparable with, and beyond, Adwords.

I advised a client to do no. 4 (moving low-quality pages to a different domain) and move an old, very-low-quality blog with zero positive signal for the business to another domain, to expedite the ‘this content isn’t here anymore – don’t rate my site on this’ process for re-evaluation by Google.

Site quality problems are, BY DESIGN, MEANT to take a long time to sort out – JUST LIKE GOOGLE PENGUIN and the clean-up of unnatural links. But, contrary to the complaints of many webmasters who accuse Google of being opaque on this subject, Google tells you exactly how to fix Google Panda problems, and Matt Cutts has been telling people all along to “focus on the user” – which is probably an absolute truth he can feel morally correct in relaying to us (and which many guffawed at as lies).

If a site quality algorithm were deployed in this fashion, then punishment would be relative to the infraction, causing the maximum amount of problems for the site owner relative to the methods used to generate rankings. All that once helped a site rank could be made to demote it and hold it under the water, so to speak. In a beautiful system, I think, you would actually be penalising yourself, not so much Google penalising you, and users would indeed determine the final ranking order in organic SERPs, rather than it being ‘sharded’ by Google for their own benefit.

We would, of course, need to assume Google has a ‘Quality Metric’ separate from relevance signals that is deployed in this fashion.

Guess what?

If your site is impacted by this shortcut effect, then identifying important pages in your hierarchy and user journey and improving them is a sensible way to proceed, as you may well be providing important signals for other pages on your site, too.

Why does Amazon rank for everything during these updates? That is the accusation, at least, and this theory would have an answer.

I would presume that when you remove an important signal from your website, you don’t have many other pages that provide said signal. Amazon ALWAYS has multiple higher-quality pages on the same topic, and so other signals to fall back on – EASY for an algorithm not to f*&^ up. Amazon, too, probably has every other positive signal in bucket loads, let’s not forget.

Site quality algorithms deployed in this manner would be a real answer to a wayward use of ‘domain authority’, I’ve long thought.

What about webmasters who have, in good faith, targeted low-quality out of date content on a site and removed it, in order to combat Panda problems?

This was natural after Google said in effect to clean up sites.

I imagine somewhere in Google’s algorithm there is a slight reward for this activity – almost as if Google says to itself: “OK, this webmaster has cleaned up the site and brought the number of lower-quality pages down, thereby incrementally improving quality scores, so we will allow traffic to improve a little” – but NOT to the extent that would ever bring traffic levels back to a site hit by Content Quality Algorithms (after May 2015, especially).

Google, I think, must seek to reward white hat webmasters (on some level) if the intent is to adhere to the rules (even if those recommendations have been slightly misunderstood) – or what is the point of listening to them at all? Most distrust Google, and most, evidently, consistently fail to understand the advice given.

Again – if my theory held water, there would be a lot of webmasters who spent a lot of time cleaning up sites to comply with Panda who DO see positive numbers month to month in terms of increased organic traffic to a site – but rarely do they see a quick return to former ranking glory WITHOUT a severe investment in sitewide page quality improvement.

I have certainly observed this, but many sites miss the ‘site quality’ aspect of Panda in that it is NOT just about ‘content quality’.

From my own experience – you cannot just delete pages to bring traffic levels back to a site in the same numbers after a Panda algorithm change that impacts your site. You must also improve remaining content, substantially.

It is baked into the system that if you have ranked with low-quality techniques, it is going to take a monumental effort deployed quickly to get your site moving again.

Confirmation Bias? Conspiracy theory?

You can tell me.

This theory posits that the site quality metric can shortcut your site and cause irrelevant pages to rank, and maybe even relevant pages to rank lower than they could if ALL the pages on the site were high quality. It is just a theory, though.

I’ve deployed these tactics on this very site over the last few years to see if they drove traffic (and this is why I do SEO the way I currently do it). When I was focused on it, these were the results:

[Image: Screenshot-2016-05-25-23.37.34.png]

SEO has become part of the legitimate long-term marketing mix; chasing quick or even manipulated results is now, at the least, a potentially business-damaging exercise.

Conversely, though, if you achieve ranking nirvana through legitimate improvements to site and content quality, you can rest assured the effort required to dislodge you is probably going to be relative to the effort you put in (if applied correctly in the first place) – a great barrier to entry for all but your most dedicated competitors.

On that point, those that do rank – and Google themselves – are more than happy with that scenario.

I did like it when SEO was fast, gratification was instant and risk was a distant prospect. Today, risk always seems close by, and gratification is an increasingly distant prospect.

I am wondering if the black hat SEO tests might be more fun.

How To Proceed

Improve pages in the user journey. Try to NOT present low-quality pages to users.

I think it is reasonable to say that identifying DEAD PAGES is still an incredibly important first step, but how you handle the challenge from there is of equal importance.

Removal of stale, irrelevant, out-of-date, obviously low-quality content on a domain should still be a prerequisite for most webmasters. Canonicals and redirects are STILL your VERY BEST FRIENDS when merging ANY PAGES (but pay close attention to Google dicking about with your redirect chains, or all your work goes in the toilet).
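One practical check here: when you merge or move pages, follow the redirect chain hop by hop and make sure the old URL reaches its destination in a single 301. A minimal sketch, assuming the Python `requests` package and a hypothetical OLD_URL:

```python
# Minimal sketch: follow a redirect chain so you can see whether a merged or
# moved URL reaches its final destination in one hop.
import requests

OLD_URL = "https://www.example.com/old-page/"  # hypothetical merged-away URL

resp = requests.get(OLD_URL, allow_redirects=True, timeout=10)
for hop in resp.history + [resp]:
    print(hop.status_code, hop.url)
if len(resp.history) > 1:
    print("WARNING: multi-hop chain - point the old URL straight at the final page")
```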

Paying close attention to where important signals lie on your site is the only way to attempt to protect them, especially during periods of change, where Google is happy to shortcut your ranking ability.

If you fail to preserve certain signals during changes, you remove signals, and irrelevant pages can rank instead of relevant pages.

If you ultimately don’t improve pages in a way that satisfies users, your quality score is probably coming down, too.

Does Google Count Keywords In The URL As A Ranking Signal?

Do you need keywords in URLs to rank high in Google in 2019?

No.

Is there a benefit to having keywords in URLs?

Yes.

QUOTE: “I believe that is a very small ranking factor.” John Mueller, Google 2016

While I wouldn’t necessarily rip apart a site structure JUST to change URLs, poorly-thought-out page URLs are often a sign of other, less obvious, sloppy SEO. If a big clean-up is called for, then I would consider starting from basic SEO best practice and using search-engine-friendly URLs.
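For what a ‘search-engine-friendly’ URL means in practice, here is a minimal sketch of a slug helper – a hypothetical illustration, not a prescription for any particular CMS:

```python
# Minimal sketch: turn a page title into a search-engine-friendly URL slug -
# lowercase, hyphen-separated, stripped of punctuation and accents.
import re
import unicodedata

def slugify(title: str) -> str:
    # Normalise accents, e.g. "Optimisé" -> "Optimise"
    ascii_title = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    # Keep letters/digits; collapse everything else into single hyphens
    return re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")

print(slugify("How To Optimise A Website Structure!"))
# -> how-to-optimise-a-website-structure
```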

