Link metrics that matter

February 4, 2016

Stephen Kenwright

Director of Search

A good link passes 3 things: PageRank, TrustRank and traffic.

PageRank

Toolbar PageRank will never be updated again and has arguably been useless since 2011. The SEO industry is forced to rely on third parties’ best-guess metrics as to how PageRank is transferred, with Moz’s Domain Authority usually considered the front runner.

“Domain Authority is a score (on a 100 point scale) developed by Moz that predicts how well a website will rank on search engines. Use Domain Authority when comparing one site to another or tracking the “strength” of your website over time. We calculate this metric by combining all our other link metrics – linking root domains, number of total links, MozRank, MozTrust etc. – into a single score.”

So Domain Authority is meant to predict how well a website ranks – not how much authority can be transferred from one site to another.

That would depend on, among other things, the link’s location and type. A link further up the page is likely to transfer more authority than one further down. Links in author bios are likely to be discounted completely. A sitewide link will probably transfer more authority – unless it’s in a sidebar or a footer, in which case it might transfer none.

The Mozscape Index is another sticking point. Have you ever audited a backlink profile? In my experience Open Site Explorer can pull 50-60% of the links accessible in a backlink profile at any one time – it has to be combined with Search Console, Bing Webmaster Tools, Ahrefs and Majestic to get a decent sample. Even then, Search Console displays only a snapshot of the links Google has access to. At one point we were estimating this at 10% – which is why a reconsideration request is sometimes turned down citing link examples that we haven’t even seen before.

Open Site Explorer also can’t see a website’s disavow file so it not only displays linking URLs that might not be counted but it uses these when calculating its metrics too. Several of our clients have upwards of 75% of their link profiles in their disavow files – several more gain upwards of 10,000 anchor text links every week which are disavowed but still used to calculate Moz metrics. This is obviously going to make any kind of “link velocity” metrics completely useless as well – another factor used to calculate Domain Authority.


An actual link profile with brand terms redacted. Yes, most of this is in the disavow file.

So Domain Authority is a made-up metric, based on factors including the number of links – which is inaccurate – and MozRank and MozTrust, which are also made up.

Returning to Moz’s use case for Domain Authority – “use Domain Authority when comparing one website to another”. Ignoring the fact that Moz states it should really only be used to estimate how well a site will rank and not how much authority could be transferred from one site versus another, I think it’s worth asking ourselves why we would need to compare two websites:

The answer is that we’ve got finite resources. But despite the industry’s dependence on Moz metrics and PageRank, I don’t think link metrics are the first thing any link builder looks at.

  1. First we’ll look at relevance. There are a bunch of websites in the Philippines with great metrics that the brand we’re working with just isn’t relevant to…and notice I said that we’re not relevant to them, not the other way around. When you reach out to a blogger you shouldn’t treat them like you’re asking them to do you a favour, or like they’ve won the lottery either. You’re giving them something – a story, something to write about – because your brand is relevant to them and their audience, not the other way around.
  2. Second we’ll look at legitimacy. We’ll visit the social profiles and read some posts. Does the site sell links? Is there a disclosure policy? Is there an affiliate scheme? Who does the blogger link out to? A handy trick is to go to bing.com (yes, seriously) and do a linkfromdomain: search (all one word, as if you were doing a site: search in Google) – you can see every external site a website is linking to.
  3. Then we’ll look at reach. Does the site get comments? Does the owner get tweeted at? Does the site get links? Your link target is not a domain – your target is a person. You should treat them like you’re trying to win a customer.
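The linkfromdomain: check in step 2 can be scripted when you’re vetting a long list of prospects. A minimal sketch – the helper function is my own, it just builds the search URL, since Bing exposes no official API for this operator:

```python
from urllib.parse import quote_plus

def bing_linkfromdomain_url(domain: str) -> str:
    # Builds the Bing search URL for the linkfromdomain: operator,
    # which lists every external site a given domain links out to.
    return "https://www.bing.com/search?q=" + quote_plus(f"linkfromdomain:{domain}")

print(bing_linkfromdomain_url("example.com"))
# https://www.bing.com/search?q=linkfromdomain%3Aexample.com
```

Paste the resulting URL into a browser and skim the outbound domains – a blog linking out to casinos and payday loans fails the legitimacy check immediately.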

So regardless of whether you think link metrics are worthless or not, you probably only use them to differentiate between two sites with all other things being equal. Why would you need to do this? Well, if you’ve got a link budget. You’ve got £50 and only one blogger can have it. Link metrics are a hangover from the days when we paid for links and we just don’t live in that world anymore.

This is the exact same reason why there’s no point caring about the IP address blocks of linking sites anymore. IP addresses never had any bearing on the amount of authority passed – but if one site on a network was caught selling links, Google would take down the whole thing.

TrustRank

On the subject of paid links:

Paid links pass PageRank. When a website is caught selling links Google doesn’t prevent that site from ‘voting’ for other sites, which is basically how linking works – but it doesn’t trust those votes so it doesn’t count them.

Majestic’s Trust Flow metric is meant to emulate how TrustRank works and the principle is totally sound (I’d go so far as to say that I actually like the Trust Flow metric).

  1. Start with a list of “trusted” sites – literally a manually compiled list of websites you think Google is likely to trust.
  2. Calculate how many links are in the chain between the website you’re looking at and a trusted site.
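The two steps above can be sketched as a simple propagation over a link graph. This is a toy model under my own assumptions – trust halving with each hop, which is not Majestic’s actual formula – but it shows the principle:

```python
def trust_scores(link_graph, seeds, banned=frozenset(), decay=0.5, max_hops=3):
    """Toy TrustRank-style propagation: trust starts at 1.0 on the seed
    list and decays with each hop; known link sellers end the chain."""
    scores = {site: 1.0 for site in seeds}
    frontier = set(seeds)
    for hop in range(1, max_hops + 1):
        next_frontier = set()
        for site in frontier:
            for target in link_graph.get(site, []):
                if target in banned:
                    continue  # a known link seller passes no trust at all
                passed = decay ** hop
                if passed > scores.get(target, 0.0):
                    scores[target] = passed
                    next_frontier.add(target)
        frontier = next_frontier
    return scores

graph = {"trusted.example": ["blog.example"], "blog.example": ["shop.example"]}
print(trust_scores(graph, seeds=["trusted.example"]))
# {'trusted.example': 1.0, 'blog.example': 0.5, 'shop.example': 0.25}
```

Note what the `banned` set does: one flagged domain in the chain and everything past it gets nothing – which is exactly the behaviour described below.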

The problems are basically the same as those with Domain Authority:

  1. We can’t really know which websites Google trusts and which it doesn’t. Even if we knew the criteria Google is using, neither MozTrust nor Trust Flow has scored as many websites as Google has.
  2. The numerical score gives no indication of whether any of the links in the chain come from sites that Google knows are link sellers. One paid link – or more worryingly, even a legitimate link from a domain that has been caught selling links – and the chain ends. No TrustRank is passed.

It’s for this reason that Toolbar PageRank was actually a better measure of TrustRank than PageRank – we could tell when a site had been caught selling links because its PageRank was halved or reduced to zero. It does beg the question: the sites caught selling advertorials a few years ago often had their PageRank halved – like the Independent. Does Google trust links from these websites?

Looking at the patent Google filed for TrustRank back in 2009:

“A search engine system provides search results that are ranked according to a measure of the trust associated with entities that have provided labels for the documents in the search results.”

So a website labels another website – e.g. links out – and since 2009 Google has determined the trustworthiness of the linking source. The patent also goes into “annotations” – e.g. anchor text.

“For example, an entity [this is 2009 and Google is talking about entities] such as a digital camera expert [remember Google is looking for expertise, authority and trust] operating a website devoted to digital cameras, may create an annotation associating the label “professional review” with a particular review of a digital camera on some third party site (e.g., on the site of a news publication). In addition, the system maintains information about trust relationships between entities, such as individual users, indicating whether (or the degree to which) one entity trusts another entity.”

So let’s make some assumptions based on the patent:

  1. If your website is labelled (e.g. linked to) by another website using a particular annotation (e.g. anchor text), the links will pass more trust if you have that anchor text on your page because Google understands that what the linking site is talking about directly references what’s on the other side of the link. So for example: if you’re pushing out a survey and websites such as news publications are linking to your homepage and referencing the survey results you could improve the trust flowing through the link by creating an optimised page containing the survey results and linking to that instead. So I would suggest going through your link profile, looking at the anchor text you’ve got, and tweaking the copy on your page to make sure it reflects your inbound links where that’s possible
  2. The trust flowing through a link depends on the relationship between the two entities, so getting more than one link from the same author increases the amount of trust flowing through the links. I write 2-3 blog posts per week on various sites and about half of them link to Screaming Frog’s SEO Spider page. Therefore Google assumes that I really fucking trust Screaming Frog and as long as I’m not selling or buying links (I’m not) you can take those links to the bank. So “one night stands” in link building have got to end – maybe you get more PageRank with a more diverse link profile but you get more TrustRank if you keep getting referenced in the same trusted places.
  3. There is a trust relationship between the entities i.e. sites. A link is an indication that one website trusts another website. Google then corroborates whether it trusts the linking entity using a “second trust rank” (this is in the patent) to form what it calls a “trust factor”. The relationship between the entities is crucial and Google needs to understand whether the site being linked to trusts the site linking to it. Three years after filing the TrustRank patent Google launched the Disavow Links tool. Do you think Google did this to help penalised businesses? If Google cared about penalised businesses it would roll out Penguin more than once every 18 months.
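The first suggestion above – checking that your page copy reflects your inbound anchor text – is easy to automate. A rough sketch, where the anchor list and page copy are made-up inputs (in practice you’d export anchors from Search Console or Majestic):

```python
def anchors_missing_from_page(page_text, anchors):
    # Returns the inbound anchor texts that don't appear anywhere in the
    # landing page copy -- candidates for tweaking the on-page copy.
    text = page_text.lower()
    return [a for a in anchors if a.lower() not in text]

page = "Full survey results: 72% of marketers build links monthly."
anchors = ["survey results", "link building survey", "marketing stats"]
print(anchors_missing_from_page(page, anchors))
# ['link building survey', 'marketing stats']
```

A substring match is crude – a real version would tokenise and stem – but it surfaces the obvious mismatches quickly.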

Disavow links tool

The disavow file is a massively multiplayer trust rating engine, where millions of SEOs dictate to Google whether they trust a linking domain or not. Google has flat-out denied that adding a website to your disavow file – even at scale – will influence how that website ranks. Google will not say a word as to whether it will continue to trust a website that appears in a hundred thousand disavow files.

As I said above: Moz and Majestic can’t see anyone’s disavow files; may or may not be able to detect whether a link is paid; and may or may not begin with a pool of trusted sites that looks anything like Google’s.

Traffic

It’s obviously difficult to estimate in advance how much traffic a website is likely to send. Your best guess is to judge how engaged the website’s audience is, but the real differentiator here is what you’re offering to the audience on top of what the journalist or blogger is writing about.

This is what we call link earning.

Earning coverage is relatively straightforward:

  1. Once you’ve decided on your media target work out what you can offer them that’s newsworthy. Data or an expert opinion usually do fine as long as you’ve got your audience and timing right.
  2. Make it as easy as possible for the site to cover your story. Write a press release containing quotes from someone in your organisation, make sure the format is something that works on the website (e.g. if you want a website to embed something make sure the dimensions work). Journalists are busy and they don’t want to work for a story.

Turning that coverage into links is more difficult. You need to hold something back on your own website. As above: host the full survey results on your site; visualise the data; publish a full interview with your thought leader on your blog; create a quiz or a game that a journalist can’t steal. Having something prepared in the first place is much more effective than getting back in touch with a site owner who hasn’t linked and asking them to edit their page.

If a journalist annotates your website with a call to action – “see the results/play the game here” – people will click through. We make sure we do this because we report on traffic.

Branded3 reports on links before and after we build them.

  1. Before, we include the URL, name and contact details of the blogger or journalist we want to reach out to, as well as the angle we’ll approach them with. We know what they write about, so we know what we think they’re most likely to cover. We send this over to our clients in the form of a seeding list – we cross-reference with internal PR teams to make sure we’re not going after contacts the client already has a relationship with, for example.
  2. Afterwards, we include the URL, the referral traffic, conversions and assisted conversions, and the social shares and comments the post has received. No link metrics at all – and we make sure to point out that a post with more social shares isn’t better than one with fewer; it’s simply an indication that the page has been seen. We report on “nofollow” links; we report on non-linking mentions. A non-linking mention is an annotation – there’s every chance it can pass trust even if it doesn’t pass PageRank. A “nofollow” link can pass traffic even if it doesn’t pass trust.

Some reports get more detailed. We update referral traffic to see whether more has been sent. We use 90 day cookies in the ecommerce reports in Analytics to see if more referral traffic has converted. Know your customers’ purchase cycle. If you place a link on the 20th of Jan and report on the 1st Feb you won’t take into consideration conversions if it’s a 6-8 week purchase cycle. Revisit it. You could even use SEMrush to see what the linking pages rank for, estimating how much search traffic the page still gets. Funnily enough nobody asks for that.
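The purchase-cycle point is simple date arithmetic worth building into a reporting checklist. A minimal sketch – the eight-week cycle is just the example figure above, not a universal default:

```python
from datetime import date, timedelta

def earliest_report_date(link_placed, cycle_weeks=8):
    # Don't report conversions before one full purchase cycle has elapsed,
    # or you'll undercount what the link actually delivered.
    return link_placed + timedelta(weeks=cycle_weeks)

print(earliest_report_date(date(2016, 1, 20)))
# 2016-03-16
```

Reporting on the 1st of February for a link placed on the 20th of January is six weeks too early on this cycle.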
