Using The X-Robots-Tag HTTP Header To Sneakily Delete Your Link Farms
Warning: the strategy below may be risky.
Zoe asked a question this week about the best way to remove old link exchange pages. I responded on Twitter but wanted to post about it as well.
It’s well known that having loads of links pages on your website isn’t a good idea. One links page with 20 links is fine; ten links pages each with 100 links will probably do more harm than good.
If you have been given a penalty in Google then deleting all these links pages is the first thing you need to do. But what if your site is ranking well but you want to get rid of the pages before Google gives you a penalty? Surely if you remove the pages everybody will stop linking to you and your rankings will fall?
The key is to find a way of silently removing your links pages, without alerting all your link partners to the fact that their link has gone.
My solution is to make use of the X-Robots-Tag HTTP header that Googlebot now supports. This lets you add the noindex directive to a page server-side, so visitors can’t see any trace of it in your robots.txt or your source code.
// PHP: call this before any page output is sent
header('X-Robots-Tag: noindex, nofollow', true);
This will remove your links pages from Google and probably won’t alert your link partners.
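If your site isn’t built on PHP, the same header can be set in whatever layer builds the response. Here’s a minimal sketch in Python (a hypothetical WSGI handler; the function name and page content are my own, not from the post):

```python
def links_page(environ, start_response):
    """Serve a links page while telling search engines not to index it.

    The X-Robots-Tag header carries the same directives as a robots
    meta tag, but leaves no trace in the page source or robots.txt.
    """
    headers = [
        ("Content-Type", "text/html"),
        # Equivalent to <meta name="robots" content="noindex, nofollow">
        ("X-Robots-Tag", "noindex, nofollow"),
    ]
    start_response("200 OK", headers)
    return [b"<html><body>Links page</body></html>"]
```

Human visitors still see the page and your link partners still get their link; only crawlers that honour the header drop it from the index.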
You might also like to cloak the X-Robots-Tag header so that it is only sent to Googlebot – you don’t want people to be able to run a header checker and see your sneaky plan.
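A crude way to do that cloaking is to key the header off the visitor’s User-Agent string. This is only a sketch (the function name and the substring check are my own, and a more robust approach would verify Googlebot by reverse DNS, since User-Agents are trivially spoofed):

```python
def extra_robots_headers(user_agent):
    """Return extra response headers, adding X-Robots-Tag only for Googlebot.

    Anyone checking headers with a browser or a header-checking tool
    (which sends its own User-Agent) sees a normal response.
    """
    if "Googlebot" in (user_agent or ""):
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}
```

Merge the returned dict into the response headers just before sending them.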