
Using The X-Robots-Tag HTTP Header To Sneakily Delete Your Link Farms

Warning: the strategy below might be risky.

Zoe asked a question this week about the best way to remove old link exchange pages. I responded on Twitter but wanted to post about it as well.

It’s well known that having loads of links pages on your website isn’t a good idea. One links page with 20 links is fine; ten links pages, each with 100 links, will probably do more harm than good.

If you have been given a penalty by Google, deleting all these links pages is the first thing you need to do. But what if your site is ranking well and you want to get rid of the pages before Google gives you a penalty? Surely if you remove the pages everybody will stop linking to you and your rankings will fall?

The key is to find a way of silently removing your links pages, without alerting all your link partners to the fact that their link has gone.

My solution is to make use of the X-Robots-Tag HTTP header, which Googlebot now supports. This lets you apply the noindex directive to a page server-side, so visitors can’t see any trace of it in your robots.txt or your source code.

header('X-Robots-Tag: noindex, nofollow', true);

This will remove your links pages from Google and probably won’t alert your link partners.

You might also like to cloak the X-Robots-Tag header so that it is only sent to Googlebot; you don’t want people to be able to run a header checker and see your sneaky plan.
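For what it’s worth, here’s a minimal sketch of that cloak in PHP, matching on the user agent alone (which, as the comments below point out, is easy to spoof):

<?php
// Send the noindex header only to requests claiming to be Googlebot;
// everyone else gets a normal response with no X-Robots-Tag to spot.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (stripos($ua, 'googlebot') !== false) {
    header('X-Robots-Tag: noindex, nofollow', true);
}
?>

Remember that header() only works before any output has been sent, so this needs to sit at the very top of the links page.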

By Patrick Altoft. at 2:41PM on Wednesday, 19 Nov 2008

Patrick is the Director of Strategy at Branded3 and has spent the last 11 years working on the SEO strategies of some of the UK's largest brands. Patrick’s SEO knowledge and experience are highly regarded by many, and he’s regularly invited to speak at the world’s biggest search conferences and events. Follow Patrick Altoft on Twitter.

comments

  • wesley

    Ok, that’s pretty darn unethical..

  • http://www.blogstorm.co.uk Patrick Altoft

    Without a doubt it’s totally unethical.

  • http://www.seohome.co.uk gabs

    Sneaky..

  • Rick

Cheers Patrick, I always wondered how to get around this.

  • http://yoast.com Joost de Valk

    Patrick, you’re smart but a stupid fuck at the same time :) (and I’ve been doing this on some sites ever since I wrote the post you linked to ;) )

    But this is why you’re stupid: don’t noindex the freaking page! People will see that it’s not indexed and start bitching. Just index, nofollow it!

    Still love you though :)

  • Craig Mullins

Lovely. Now I need to cloak myself as Googlebot. :)

  • Tim

    Wouldn’t your link partners work out that the pages which link to their website are no longer indexed, let alone displaying toolbar PageRank?

  • http://yoast.com Joost de Valk

    @Craig if the cloaking is done well you won’t be able to :)

  • http://yoast.com Joost de Valk

    @Tim that’s what I said as well, but Patrick hadn’t moderated my comment yet ;)

  • http://www.pablogeo.com Pablo

    Pretty darn interesting indeed. Thanks for sharing Patrick :)

  • http://hubpages.com/hub/Acai-Berry-Free-Trial-Review Acai

@Tim I would say about 5-10% of webmasters may notice, but who cares? You’re still benefiting from the hundreds of other people you have conned.

  • Pingback: Silently Delete Your Link Farm | Sha Money Maker

  • http://www.5ubliminal.com/ 5ubliminal

Reinventing the wheel only a year and a bit later ;)
    Check out my post.

  • http://darussia.com/ DaRussia

Damn, good idea. I just thought about something like that, because blocking directories through robots.txt is easily detectable.

  • Bob

Craig: “Lovely. Now I need to cloak myself as Googlebot.”

Make sure you also come from a Googlebot IP address, in case the pages check that instead of the user agent.

  • Pingback: Cuando te dan enlaces con queso | Dictina

  • http://andymax.com/ Andy Max Jensen

    Sneaky …and Joost …even more sneaky ;-)
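Picking up Bob’s point above: the stricter check a site can run is a reverse-then-forward DNS lookup on the connecting IP, which a spoofed user agent alone won’t pass. A rough PHP sketch (the function name is mine, purely illustrative):

<?php
// Verify Googlebot by IP: reverse DNS must land in googlebot.com,
// and the hostname must resolve forward to the same IP again.
function is_real_googlebot($ip) {
    $host = gethostbyaddr($ip);
    if ($host === false || !preg_match('/\.googlebot\.com$/i', $host)) {
        return false;
    }
    return gethostbyname($host) === $ip;
}
?>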