
Why PageRank is broken and how it’s being fixed

The original PageRank algorithm looked at links and web pages through the eyes of a web surfer following the random walk model.

The surfer would visit a page and then click at random on one of its links, arriving at another page. Eventually the surfer would have visited every page on the web, and the pages visited most often (i.e. the ones with the most inbound links, weighted by the importance of the pages linking to them) would be deemed the most important.
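For readers who want to see the mechanics, here is a minimal sketch of that random-surfer model as the classic PageRank power iteration. The toy graph, page names and damping factor are hypothetical; this illustrates the published algorithm, not Google's production code.

```python
import numpy as np

# Toy web graph: page -> pages it links to (hypothetical example).
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = sorted(links)
index = {p: i for i, p in enumerate(pages)}
n = len(pages)
damping = 0.85  # probability the surfer follows a link rather than jumping to a random page

# Column-stochastic transition matrix: every outgoing link gets equal probability.
M = np.zeros((n, n))
for page, outlinks in links.items():
    for target in outlinks:
        M[index[target], index[page]] = 1.0 / len(outlinks)

# Power iteration: repeatedly apply the random-surfer step until the ranks settle.
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - damping) / n + damping * M.dot(rank)

for page, score in sorted(zip(pages, rank), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```

The key assumption baked into that matrix is the one the article goes on to question: every link on a page is given exactly the same probability of being clicked.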

Google is trying to create an algorithm that mirrors human behaviour and displays the results that real people want to see. This becomes a problem because humans don’t follow the random walk model.

Consider a person viewing a news story or blog post: how likely is it that they click a navigation link or a link in the footer compared with a link in the middle of the article? What is the chance they click on a link at the top of the post compared to a link right down at the bottom? Treating all links equally isn’t a viable way to decide which sites are most important.

Today Bill Slawski analyses a patent from Yahoo that proposes a new method of handling PageRank. The patent was filed by Yahoo, but I’m sure Google is using the exact same methods.

The document discusses how a search engine could use user data (for example toolbar or AdSense data) to see which links were clicked on most from different web pages. Links that receive the most clicks are deemed to be more important and are given a higher weight. This weight can be combined with a user “satisfaction” figure determined by the amount of time the person stays on a page before moving on. Pages with a higher satisfaction score are deemed more important.
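Here is a rough sketch of how such signals might be folded into the link graph, assuming hypothetical click counts and dwell times. The patent describes the general idea; the formula below is just one plausible way of combining the two signals, not the patented method.

```python
import numpy as np

# Hypothetical usage data for links leaving one page: clicks observed per link
# and average dwell time (seconds) on the destination page before moving on.
outlinks = {
    "article-body-link": {"clicks": 480, "avg_dwell_seconds": 95},
    "nav-link":          {"clicks": 60,  "avg_dwell_seconds": 12},
    "footer-link":       {"clicks": 10,  "avg_dwell_seconds": 5},
}

def satisfaction(dwell_seconds, half_life=60.0):
    """Map dwell time to a 0..1 'satisfaction' score (assumed saturating curve)."""
    return 1.0 - np.exp(-dwell_seconds / half_life)

# Edge weight = how often the link is actually followed, scaled by how satisfied
# users appear to be with the page they land on.
raw = {
    link: data["clicks"] * satisfaction(data["avg_dwell_seconds"])
    for link, data in outlinks.items()
}

# Normalise so the weights can replace the uniform 1/N transition probabilities.
total = sum(raw.values())
weights = {link: w / total for link, w in raw.items()}

for link, w in sorted(weights.items(), key=lambda x: -x[1]):
    print(f"{link}: transition probability {w:.2f}")
```

Under the original model each of those three links would get an identical one-third probability; weighting by observed behaviour shifts almost all of the “vote” to the in-article link and away from the navigation and footer links.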

Looking at how search engines hope to mirror human behaviour makes it very easy for webmasters to insulate themselves from algorithmic fluctuations and updates. Buying footer links for traffic is a bad idea, so why should it be a good idea to buy them for search engine rankings?

Google is only going to get better at mirroring human behaviour, so why not make sure you are ahead of the game?

By Patrick Altoft. at 11:39PM on Wednesday, 16 Jan 2008

Patrick is the Director of Strategy at Branded3 and has spent the last 11 years working on the SEO strategies of some of the UK's largest brands. Patrick’s SEO knowledge and experience is highly regarded by many, and he’s regularly invited to speak at the world’s biggest search conferences and events. Follow Patrick Altoft on Twitter.

comments

  • BH Spamer

If they’re going to implement this stuff into their algorithms… well… finally some Chinese traffic would become valuable… ;-)

  • http://www.sciencebase.com David Bradley

    Basically, they need some kind of social-bookmarking random walk mashup…I reckon.

    db

  • http://www.gadgetvenue.com Matthew

There was some talk a year or more ago about the use of analytics data that could affect how well your site was doing. I have a number of contacts who removed their analytics code thinking that Google was spying on them and affecting their rankings, since it could get all the data it needed about visitors from the tracking code. I have no clue whether that data is or is not used by Google for those purposes though.

    • http://www.brianchappell.com Brian

@Matthew the problem with that thought process is that the rich would stay rich. Google wants to display the most resourceful page for a given query. Are you telling me the most trafficked page is always the most resourceful? I wouldn’t think so.

  • http://www.wayneliew.com Wayne Liew

Well, I have recently finished my two-part series on building post-level links. Apart from the reasons that you have stated here, building post-level links seems to be easier than building sidebar navigation links.

Post-level links are also much cheaper if you are used to trading links. If what is said in your article is true, the price of post-level links might rise.

    Let us see where Google will proceed from now on.

  • http://www.technomoney.net/ Ruchir

    No matter how intelligent bots become, human behavior is pretty difficult to emulate perfectly.

  • http://bloggingfingers.com Matt Jones

If the ‘satisfaction’ is measured by the time a user spends on a page, that isn’t very fair. If the user was looking for something that didn’t take long to read/watch, then they would leave quickly, giving the page a lower ranking even if it had exactly what that user wanted.

Also, the number of clicks a link receives isn’t fair either… that number depends largely on how well the site was promoted, i.e. a budding marketer might put their page on StumbleUpon, getting loads of traffic and clicks, but that doesn’t mean their content was any better than that of the person who didn’t use StumbleUpon.

    An insolvable problem! I may post about this….

  • http://www.manishpandey.com Manish Pandey

    Absolutely Patrick,

With the advancements in technology like checking human behaviour and vips, search engines are improving day by day. With new search engines like chacha.com, which work on the principle of voting for sites, I don’t understand why any search engine would implement that tactic. It could be cheated so easily.

  • http://seoroi.com/seo-roi-quality/why-google-is-broken/ Why Google’s PageRank Is Really Broken

Posted on this a few weeks back, yet Sphinn didn’t decide it was good enough to go hot, lol. The bottom line is that PageRank sees links as votes nowadays, whereas the idea behind it was the random walk model. It means a lot for their business model, so you might care to check out my name link.

    Cheers
    Gab

  • Pingback: Gabfire web design » Around the blogsphere: 17 Jan 2008