
Using X-Robots HTTP Header To Sneakily Delete Your Link Farms

Warning: the strategy below might be risky.

Zoe asked a question this week about the best way to remove old link exchange pages. I responded on Twitter but wanted to post about it as well.

It’s well known that having loads of links pages on your website isn’t a good idea. One links page with 20 links is fine, but ten links pages each with 100 links will probably do more harm than good.

If Google has given you a penalty then deleting all these links pages is the first thing you need to do. But what if your site is ranking well and you want to get rid of the pages before Google penalises you? Surely if you remove the pages everybody will stop linking to you and your rankings will fall?

The key is to find a way of silently removing your links pages, without alerting all your link partners to the fact that their link has gone.

My solution is to make use of the X-Robots-Tag HTTP header that Googlebot now supports. This lets you apply the noindex directive server-side, so visitors can’t see any trace of it in your robots.txt or your page source.

header('X-Robots-Tag: noindex, nofollow', true);

This will remove your links pages from Google and probably won’t alert your link partners.

You might also like to cloak the X-Robots tag so that it only appears to Googlebot – you don’t want people to be able to run a header checker and see your sneaky plan.
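A minimal PHP sketch of that cloaking idea (the `is_googlebot` helper and the simple user-agent check are my own assumptions, not something from the post; a stricter version would also verify the requesting IP via reverse DNS, since anyone can spoof the user agent with a header checker):

```php
<?php
// Hypothetical helper (an assumption for illustration): naive
// user-agent sniff for Googlebot. A header checker that doesn't
// spoof Googlebot's UA will never see the tag.
function is_googlebot($userAgent) {
    return stripos($userAgent, 'Googlebot') !== false;
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

// Only the crawler gets the noindex/nofollow header; regular
// visitors (and your link partners) get a normal response.
if (is_googlebot($ua)) {
    header('X-Robots-Tag: noindex, nofollow', true);
}
```

Note this only fools casual checks; a suspicious link partner fetching the page with Googlebot’s user agent string would still see the header.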

By Patrick Altoft. at 2:41PM on Wednesday, 19 Nov 2008

Patrick is the Director of Strategy at Branded3 and has spent the last 11 years working on the SEO strategies of some of the UK's largest brands. Patrick’s SEO knowledge and experience is highly regarded by many, and he’s regularly invited to speak at the world’s biggest search conferences and events. Follow Patrick Altoft on Twitter.


17 Responses to “Using X-Robots HTTP Header To Sneakily Delete Your Link Farms”

  1. wesley says:

    Ok, that’s pretty darn unethical..

  2. Without a doubt it’s totally unethical.

  3. Rick says:

    Cheers Patrick, always wondered how to go around this.

  4. Patrick, you’re smart but a stupid fuck at the same time :) (and I’ve been doing this on some sites ever since I wrote the post you linked to ;) )

    But this is why you’re stupid: don’t noindex the freaking page! People will see that it’s not indexed and start bitching. Just index, nofollow it!

    Still love you though :)

  5. Craig Mullins says:

    Lovely, Now I need to cloak myself as Googlebot now. :)

  6. Tim says:

    Wouldn’t your link partners work out that the pages which link to their website are no longer indexed, let alone displaying toolbar PageRank?

  7. @Craig if the cloaking is done well you won’t be able to :)

  8. @Tim that’s what I said as well, but Patrick hadn’t moderated my comment yet ;)

  9. Pablo says:

    Pretty darn interesting indeed. Thanks for sharing Patrick :)

  10. Acai says:

    @Tim I would say about 5-10% of webmasters may notice, who cares? You’re still benefiting from the hundreds of other people you have conned.

  11. [...] got this cool tip via blogstorm. It’s about deleting your old links from your link farm pages without alerting your link [...]

  12. 5ubliminal says:

Reinventing the wheel only a year and some later ;)
    Check out my post.

  13. DaRussia says:

Damn, good idea, I just thought about something like that, because banning directories through robots.txt is easily detectable.

  14. Bob says:

    Craig: “Lovely, Now I need to cloak myself as Googlebot now.”

    Make sure you also come from the Googlebot IP address, in case the pages use that instead of User Agent.

  15. [...] needless to say, in my opinion Joost de Valk should be raised to sainthood. Thanks also to Patrick Altof for the tip about [...]

  16. Sneaky …and Joost …even more sneaky ;-)

Leave a Reply