Google to blame for New York Times issues, not SEO

August 28, 2007
Patrick Altoft

Director of Strategy

Last week's New York Times article discussing the problems they face when outdated content rises to the top of the search results reminded me of my blog post about the outdated internet.

The NYT blames the situation on an:

unhappy byproduct of something called search engine
optimization, which The Times has been using to make money by driving
traffic to its Web site. Technically complex, search engine
optimization pushes Times content to or near the top of search
results, regardless of its importance or accuracy.

Having an old, out-of-date article ranking at the top of the search results purely because of the trust that Google places in nytimes.com is hardly the fault of search engine optimisation. The NYT shouldn't feel obliged to fix the issue either, beyond making sure the original stories link to any subsequent retractions or updates.

The problem lies in how Google ranks web pages. As pages get older they attract more authority and rank higher, even when the number of inbound links to a page stops increasing, or in some cases starts to fall.
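To make that concrete, below is a minimal, entirely hypothetical sketch (in Python) of a ranking score in which domain trust and page age dominate the raw link count. The weights, example pages, and the toy_rank_score function are all invented for illustration and do not reflect Google's actual algorithm; the point is only to show how an old article on a highly trusted domain can keep outranking a fresher, better-linked page.

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    domain_trust: float  # hypothetical 0-1 trust placed on the domain
    age_years: float     # years since publication
    inbound_links: int   # current inbound link count

def toy_rank_score(page: Page) -> float:
    """Invented score: trust and age dominate the link signal."""
    # Authority grows with age on a trusted domain...
    authority = page.domain_trust * (1 + 0.2 * page.age_years)
    # ...while extra links give only diminishing returns.
    link_signal = page.inbound_links ** 0.5
    return authority * 10 + link_signal

old_story = Page("nytimes.com/2002/old-story", domain_trust=0.95,
                 age_years=5, inbound_links=40)
fresh_story = Page("example.com/2007/correction", domain_trust=0.40,
                   age_years=0.1, inbound_links=120)

for p in (old_story, fresh_story):
    print(f"{p.url}: {toy_rank_score(p):.1f}")
    # old-story scores ~25.3, correction ~15.0: the stale page wins
```

Under these made-up weights the five-year-old story on the trusted domain outscores a new page with three times as many inbound links, which is exactly the behaviour the NYT is complaining about.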

The articles mentioned in the NYT column probably haven't attracted any new links for years, and yet Google still believes they are relevant. If the situation carries on, entire search results will be dominated by out-of-date information.

In response to the story, Bryan Eisenberg has posed a set of excellent questions about journalistic ethics, but I really don't see why journalists should need to alter their pages in any way. The only obligation journalists have is to update original articles with links to the corrected versions, as appropriate.

The rest of the issue is up to the search engines to solve – there is
no way in the world commercial websites are going to remove thousands
of articles from their archives just in case some are inaccurate.
