Google has lost the war on webspam

January 18, 2016

Stephen Kenwright, Director of Search
Google has made huge changes to its algorithm over the last 18 months, the most recent being last weekend’s update to its “core” algorithm.

Panda and Penguin have been conspicuous by their absence, as have manual actions. It’s no longer normal to get a notification in Search Console (formerly Webmaster Tools) that Google has found unnatural links. It still happens from time to time, but rarely for big brands. It’s a million miles away from 2012-13, when half the high street was penalised by Matt Cutts and co.

By all accounts Google has made almost no effort to deal with the spam infesting its SERPs since late 2014. Google rolled out Penguin 3.0 in October 2014 and, though we were arguably premature with our analysis (going live with some winners and losers before anyone at Google had even confirmed the update), little actually changed once the dust had settled. There have been no fundamental shifts in how Google algorithmically deals with webspam since 2013. If you’re still suffering from the effects of those algorithms, you’re expected to know what to do and just get on with it.

Does anyone believe that Google is “hard at work” tinkering with those algorithms and the updates just aren’t ready? If site owners have taken the search engine’s advice – removed the links, replaced the content – Google can just run the same version of the algorithm it ran last time across the newer data set. Less spam, problem (partially) solved. It’s not like they can’t afford the bandwidth.

When Google does attempt to tackle manipulative tactics, the response is lacklustre at best. This week we’ve seen manual actions in Search Console – without messages – for sites that were once hacked but no longer exist. We know of some sites we’d hardly call “spammy” that have been deindexed completely for a day or two.

What is Google playing at?

It’s tempting to argue that Google can’t police itself without Matt Cutts at the helm. I don’t think that this is true.

The overwhelming majority of shifts in the SERPs have centred on core parts of the algorithm. The latest is still too fresh for real analysis, but I can promise you that it’s meant to deal with user intent in the SERPs. The previous “Phantom” updates have been similar.

Google has realised that it’s easier – and more sustainable – to fix its issues with user experience than it is to fix the issues with spam.

Nobody is conspiring to make Google harder to use (except maybe Microsoft’s Bing). But there will always be plenty of cowboys looking to make a quick buck who can beat Google’s filters and make money from spam.

We’re heading for a new age of user signals. Google knows that Penguin won’t solve its issues with links. The SEO industry is catching on to that fact too and Google is about ready to admit defeat.

We need to face the proposition that it might actually be easier for Google to dial down the influence of links than to police webspam in its results.
