Recently Google decided that the internet wasn’t quite big enough and started creating extra pages on a number of websites.
Googlebot does this by making up random words, entering them into web forms and indexing the results.
You can see that blogstorm.co.uk has 57 of these auto-generated pages indexed in Google already.
It’s quite easy to block these pages from being indexed, as other people have pointed out, but 99.9% of webmasters won’t know about the issue. The best way to “fix” the problem is to add a noindex,follow robots meta tag to the pages you don’t want indexed. Don’t use robots.txt: blocking crawling wastes any links you happen to gain to these pages, because Google can’t follow the links on a page it isn’t allowed to fetch.
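Concretely, the tag goes in the head of each auto-generated page. This is a minimal sketch — the page itself will vary depending on what your form handler outputs:

```html
<!-- In the <head> of each auto-generated results page.
     "noindex" keeps the page out of Google's index;
     "follow" still lets any link value flow through its links. -->
<meta name="robots" content="noindex,follow">
```

Unlike a robots.txt Disallow rule, this still allows Googlebot to crawl the page, so links pointing at it aren’t wasted.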
As long as a site has plenty of authority (PageRank & TrustRank), Google adding a hundred or so extra pages isn’t going to have much effect on your rankings. It might even bring in some long tail traffic (provided the pages are optimised), but you need to keep an eye on the situation to make sure Google isn’t creating thousands of near-duplicate pages.
My guess is that these auto-generated pages could cause a big issue for some people, so watch this space.