Using The X-Robots-Tag HTTP Header To Sneakily Delete Your Link Farms

November 19, 2008

Patrick Altoft

Director of Strategy

Warning: the strategy below might be risky.

Zoe asked a question this week about the best way to remove old link exchange pages. I responded on Twitter but wanted to post about it as well.

It’s well known that having loads of links pages on your website isn’t a good idea. One links page with 20 links is fine; ten links pages, each with 100 links, will probably do more harm than good.

If you have been given a Google penalty, then deleting all these links pages is the first thing you need to do. But what if your site is ranking well and you want to get rid of the pages before Google penalises you? Surely if you remove the pages, everybody will stop linking to you and your rankings will fall?

The key is to find a way of silently removing your links pages, without alerting all your link partners to the fact that their link has gone.

My solution is to make use of the X-Robots-Tag HTTP header, which Googlebot now supports. This allows you to add the noindex directive to a page server-side, so that visitors can’t see any trace of it in your robots.txt or your source code.

// Send the robots directives as an HTTP header; must run before any output.
header('X-Robots-Tag: noindex, nofollow', true);
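If you would rather not touch application code, the same header can be set at the web-server level. Here is a minimal Apache sketch using mod_headers (assuming mod_headers is enabled; the filename pattern is just an illustration):

<FilesMatch "^links.*\.php$">
    Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>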

This will remove your links pages from Google and probably won’t alert your link partners.

You might also like to cloak the X-Robots-Tag header so that it is only sent to Googlebot; you don’t want people to be able to run a header checker and see your sneaky plan.
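Here is a minimal PHP sketch of that cloak, assuming a simple user-agent match is enough for your purposes. Note that user agents are trivially spoofed, so a stricter version would also verify the crawler with a reverse-DNS lookup on the requesting IP.

<?php
// Hypothetical example: only send the noindex header when the visitor
// identifies itself as Googlebot. Anyone running a header checker with
// a normal user agent will never see it.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (stripos($ua, 'Googlebot') !== false) {
    header('X-Robots-Tag: noindex, nofollow', true);
}
?>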
