
The Internet is Out of Date

While checking the latest submissions to Digg's Upcoming queue (effectively its spam section) at the weekend I spotted an article entitled “Ten Tips to the Top of Google”.

Ever keen to see what tips Digg spammers are offering these days, I decided to check it out.

My initial thought was that it was yet another of the many totally misleading SEO articles offering bad advice to confused webmasters.

However, with a bit more research it turned out that the article was a copy of an old (published March 2004) version of Jill Whalen's Ten Tips article, which is actually quite a good resource, although it doesn't appear to have been updated since 2005. So the problem here isn't that the article is bad; it was perfectly fine when first published. It's just a few years out of date.

This got me thinking about the millions of pages of blog posts, article-directory articles and forum posts published over the last 5 years that are starting to become dangerously out of date. Large, well-known brands are giving bad advice every day purely because articles they wrote and syndicated in 2004 are still being republished around the web.

Google has an algorithm that ranks search results based on the sites that have attracted the most votes (i.e. links) from other websites. This works perfectly well for the first few years of a search engine's lifetime, and for ranking time-independent data, but what is Google going to do in 10 or 20 years' time? Just because an article has attracted 10,000 links over the last 5 years doesn't mean it's relevant, or even accurate, today.
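As a toy sketch of why this breaks down over time (not Google's actual algorithm, and with made-up numbers), ranking purely on accumulated link counts looks something like this:

```python
# A toy model of "most votes wins" ranking: pages are ordered by raw
# inbound link counts. Real PageRank also weights each vote by the
# linking page's own importance, but the time-blindness is the same.
inbound_links = {                    # hypothetical counts
    "old-article.example": 10_000,  # earned slowly since 2003
    "fresh-site.example": 1_200,    # earned in the last six months
}

# Because counts only ever grow, the stale page wins indefinitely.
ranking = sorted(inbound_links, key=inbound_links.get, reverse=True)
print(ranking)  # ['old-article.example', 'fresh-site.example']
```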

Some pages will be updated and can remain relevant as long as the author is willing to keep publishing new information. Other pages, such as blog posts and news articles, are out of date within months of being published and are very unlikely to ever be updated.

Google is clearly thinking about this issue with its new meta tag allowing webmasters to set an expiry date for their pages, but I really don't see it becoming widely used. The search results seem to be getting more out of date all the time in some niches, and there is no way webmasters will voluntarily remove top-ranking pages just because they are old.
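For reference, the tag in question is the unavailable_after directive, which tells Googlebot to drop a page from the results after a given date. A sketch of the markup, assuming the RFC 850 date format from Google's announcement (the date itself is just an example):

```html
<!-- Ask Googlebot to stop returning this page after the date below -->
<meta name="googlebot" content="unavailable_after: 31-Dec-2007 23:59:59 EST">
```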

The only way for Google to solve the issue is to stop relying on overall link numbers and to work out rankings based on the rate at which a site has been attracting links over the last few months. Google is supposed to give users the results they want now, not the results they might have wanted 2 years ago. A site that attracted 10,000 links in 2003 and 100 in 2007 should not rank as well as a new site that gained 10,000 links in the last 6 months.
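A minimal sketch of what rate-based ranking might look like, assuming a simple exponential decay on each link's vote (the half-life here is an arbitrary choice, not anything Google has published):

```python
from datetime import date

def link_velocity_score(link_dates, today, half_life_days=180):
    # Each link's vote halves every half_life_days, so recent links
    # dominate the score. Hypothetical illustration only.
    return sum(0.5 ** ((today - d).days / half_life_days)
               for d in link_dates)

today = date(2007, 7, 16)
# The example from the post: 10,000 links earned in 2003 plus 100 in
# 2007, versus 10,000 links earned in the last six months.
old_site = [date(2003, 6, 1)] * 10_000 + [date(2007, 3, 1)] * 100
new_site = [date(2007, 5, 1)] * 10_000

print(link_velocity_score(old_site, today))  # roughly 90
print(link_velocity_score(new_site, today))  # roughly 7,500
```

Under a decayed count like this the newly popular site outscores the stale one by nearly two orders of magnitude, even though their lifetime totals are similar.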

Consider a new site that launches a TV advertising campaign and gets a flurry of new links in the first few weeks. This is what people want to see in the search results, not some 10-year-old site that doesn't even bother to advertise any more. At the very least people want a mix of old and new sites; at present a lot of search results are just full of outdated information.

By Patrick Altoft at 7:40PM on Monday, 16 Jul 2007

Patrick is the Director of Strategy at Branded3 and has spent the last 11 years working on the SEO strategies of some of the UK's largest brands. Patrick's SEO knowledge and experience are highly regarded by many, and he's regularly invited to speak at the world's biggest search conferences and events. Follow Patrick Altoft on Twitter.

comments

  • http://seoandstuff.com John King

    I think Google needs to get a better handle on page topicality in relation to information being out of date. SEO changes very quickly (we're talking months), whereas a blog article like “The History of Ties” likely won't change very quickly. Site topicality is too broad; it needs to be narrowed down to specific pages. On SEO blogs lots of the info is about marketing (which doesn't change that often), yet posts about Google's search algorithm change all the time.

    I don’t envy the guys at Google!

  • http://noodleinvite.com.au Andy

    Great insight… it makes me think of public libraries that maintain collections of useless books like “Getting the Most out of Windows 98”.

    I like your suggestion for improving the algorithm based on recent links. I'm trying to improve my site's ranking for the term “free SMS”, and bemusingly a site that has been dead for some time (now with a spammy landing page) is still ahead of me.

  • http://optimalizalas.info Longhand

    Yes.
    Looking at Google's patents on ranking criteria, we can see they assess the freshness and relevance of documents along several dimensions (the trend of link growth and of anchor texts over time), but the age of the domain and PageRank probably still carry too much weight in the ranking.

  • http://tempo-cs.net/tarsashaz Társasházkezelés

    Google Patent :)
    I like your suggestion for improving the algorithm based on recent links.