Have you ever opened up your Google Reader account or personalized Google Homepage and spotted a feed that you didn’t remember subscribing to? If you have, it might be due to a security issue with the way Google handles RSS subscription requests.
Clicking the "Subscribe using Google" button on most blogs takes you to a page saying “Google offers two different ways to keep up-to-date with your favorite sites”, with the option to click either “Add to Google homepage” or “Add to Google Reader”.
You can see it in action by clicking on the button below (don’t worry, this won’t auto-subscribe you to anything):
The problem is that an unscrupulous website can copy the "Add to Google homepage" and "Add to Google Reader" links and open them in a hidden IFRAME for every visitor. Anybody who visits the site while signed in to a Google account is then automatically subscribed to the RSS feed in both Google Reader and the Google homepage.
All a site needs to do is add code along the following lines to its pages (the src of each IFRAME is the corresponding "Add to Google" link, with the BlogStorm feed swapped for its own), and it gets a bunch of new readers.
<iframe width="1" height="1" border="0" scrolling="0"
<iframe width="1" height="1" border="0" scrolling="0"
It is worth noting that none of the other RSS readers I tested had this vulnerability.
If you want to see the security issue in action and are signed in to your Google account, click this link. Please be aware that this will auto-subscribe you to the BlogStorm RSS feed, so if you don’t want to read about internet marketing and general web design topics you might want to be careful.
Why would somebody want to do this?
Now most of you are saying “Why would a blogger want to get readers in this way?”
Well, there are two answers. The first is simple: blogs like to show off a large number of Feedburner subscribers, so if you have no morals, a low-quality blog and a desire for lots of subscribers, this is the way to get them.
The second is a bit more sneaky. Imagine you are doing some affiliate marketing: what is the most valuable piece of real estate on the web? Where would huge corporations pay millions per day to place an advert? The answer is right below the search box on the Google homepage.
With this exploit, thousands of people could suddenly see your best offers plastered right underneath the Google search box they use hundreds of times per week. Some people will just assume Google put them there, and many will trust Google’s recommendation and buy the products.
I wonder how long this will take to get fixed?
Today is the last day that webmasters can use the old Google Analytics interface.
In case you haven’t quite found your way around the new interface I wanted to remind readers about the Google Analytics tutorial we released last month as well as the second part where we answered your questions.
If there are any more questions, now is the time to ask!
Over the past year my websites have had the good fortune to be mentioned on a lot of very high traffic blogs. Engadget, Gizmodo, CrunchGear and TechCrunch are some of the larger ones, but since BlogStorm launched it has also attracted links from loads of search-related blogs in the 1,000 to 10,000 RSS reader range.
The thing that always amazes me is how little traffic these blogs actually send you. For instance, Engadget has 600,000 RSS readers and normally sends about 1,000 visitors, while TechCrunch has 250,000 readers and sends about 2,000 visitors, even when they write an entire post about your site.
Some of the search blogs that have linked to BlogStorm have thousands of readers and only sent a handful of visitors.
I know you are probably thinking that most people read these blogs in RSS readers, but even so you would still expect a spike in traffic from people clicking through from their reader.
Obviously I love receiving any amount of traffic and links, but it just goes to show that a blog having 600,000 subscribers doesn’t mean they all actually read it. The only way to judge the real readership is by looking at the number of comments each post gets.
People are always talking about blogging’s A-list but I still haven’t been able to find a clear definition. I’m pretty sure there is an RSS subscriber threshold you have to cross before you make the A-list.
My guess is that, on average, any blogger with more than 10,000 RSS subscribers is probably A-list. What do you think? Is this the best way to measure a blog’s success? Are you an A-list blogger?
In tribute to the amazingly popular icanhascheezburger.com, which, in spite of failing the “can people type my domain name into the address bar” test, is serving half a million page views each day, I’ve compiled a lolcats SEO edition. Enjoy!
Don’t worry – until this morning I didn’t know what lolcats were either.
Yesterday I was trying to take advantage of the Favourite blogs meme run by Mashable.
Mashable were displaying a Google Blog Search widget showing all the posts linking to their post. Mashable has loads of readers, so I figured that if BlogStorm could be the first blog listed in Google Blog Search I would be onto a winner.
Under the Publicize tab in Feedburner there is a service called PingShot which notifies various services when you post. I have five additional services listed, including the Google Blog Search pinging service, so whenever I publish a post Feedburner pings a bunch of sites to tell them.
Feedburner also offers a pinging service of their own so you can tell them when you post.
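For the curious, here is roughly what one of those pings looks like under the hood. This is only a sketch of the standard weblogUpdates XML-RPC convention these services use; the Google Blog Search endpoint URL and the exact response fields are assumptions on my part, so check them before relying on it.

import xmlrpc.client

# Assumed endpoint for the Google Blog Search pinging service
GOOGLE_BLOG_SEARCH_PING = "http://blogsearch.google.com/ping/RPC2"

def ping_blog_search(blog_name, blog_url):
    # weblogUpdates.ping is the conventional method name: it passes the blog's
    # name and homepage URL and asks the service to come and re-crawl the feed
    server = xmlrpc.client.ServerProxy(GOOGLE_BLOG_SEARCH_PING)
    return server.weblogUpdates.ping(blog_name, blog_url)

# Example: ping straight after publishing a post
result = ping_blog_search("BlogStorm", "http://www.blogstorm.co.uk")
print(result)  # typically a struct with an 'flerror' flag and a human-readable 'message'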
Google is fast!
Armed with these tools I published the post, pinged Feedburner, Feedburner pinged Google and I sat back, waited a few seconds and started refreshing Google Blog Search to see how long it would take to get indexed.
This is the cool part: the blog post was indexed in Google Blog Search in under a minute. I don’t know exactly how long it took, but it was definitely less than sixty seconds. Pretty cool, huh?
Was it a fluke?
Tonight I published another post and tried again. This time it took just 120 seconds to be indexed in Google Blog Search. I took some screenshots below.
How fast can you get your blog posts indexed? Maybe some of you could take screen capture videos and we can see who is the fastest?
While checking the latest submissions to the Digg Upcoming queue (a.k.a. the spam section) at the weekend I spotted an article entitled “Ten Tips to the Top of Google”.
Ever keen to see what tips Digg spammers are offering these days I decided to check it out.
My initial thought was that it would be one of the many SEO articles that are totally misleading and offer bad advice to confused webmasters.
However, with a bit more research it turned out that the article was a copy of an old (March 2004) version of Jill Whalen’s “Ten Tips” article, which is actually quite a good resource, although it doesn’t appear to have been updated since 2005. So the problem here isn’t that the article is bad; it was perfectly fine when first published, it’s just a few years out of date.
This got me thinking about the millions of blog posts, article directory submissions and forum posts published over the last 5 years that are starting to become dangerously out of date. Large, well-known brands are giving bad advice every day purely because articles they wrote and syndicated in 2004 are still being republished around the web.
Google has an algorithm that ranks search results based on the sites that have attracted the most votes (i.e. links) from other websites. This works perfectly well for the first few years of a search engine’s lifetime, and for ranking time-independent data, but what is Google going to do in 10 or 20 years’ time? Just because an article has attracted 10,000 links over the last 5 years doesn’t mean it is relevant or even accurate today.
Some pages will be updated and can remain relevant as long as the author is willing to keep publishing new information. Other pages, such as blog posts and news articles, are out of date within months of being published and are very unlikely to ever be updated.
Google is clearly thinking about this issue with its new meta tag allowing webmasters to set an expiry date for their pages, but I really don’t see it becoming widely used. The search results seem to be getting more out of date all the time in some niches, and there is no way webmasters will voluntarily remove top-ranking pages just because they are old.
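For anyone who wants to try it anyway, the tag (Google calls the directive unavailable_after) is just one line in the page’s head section. Something like the snippet below should work, although it is worth double-checking the exact date format against Google’s own announcement before relying on it:

<meta name="googlebot" content="unavailable_after: 31-Dec-2008 23:59:59 GMT">

In theory Googlebot drops the page from the index after that date, which is exactly why I can’t see anyone adding it to a page that still ranks.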
The only way for Google to solve the issue is to stop relying on overall link counts and work out rankings based on the rate at which a site has been attracting links over the last few months. Google is supposed to give users the results they want now, not the results they might have wanted 2 years ago. A site that attracted 10,000 links in 2003 and 100 in 2007 should not rank as well as a new site that gained 10,000 links in the last 6 months.
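To make the idea concrete, here is a purely back-of-the-envelope sketch (nothing to do with how Google actually works, and the 12-month half-life is plucked out of the air for the example) of what weighting links by recency would do to those two sites:

# Toy illustration: weight each link by its age so recent links count for more.
# The half-life figure is an arbitrary assumption, not anything Google uses.
def recency_weighted_score(link_ages_in_months, half_life_months=12):
    return sum(0.5 ** (age / half_life_months) for age in link_ages_in_months)

# Site A: 10,000 links gained back in 2003 (call them roughly 48 months old)
site_a = recency_weighted_score([48] * 10000)

# Site B: 10,000 links gained in the last 6 months (average age about 3 months)
site_b = recency_weighted_score([3] * 10000)

print(round(site_a), round(site_b))  # roughly 625 vs 8400: same link totals, very different scores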
Think about a new site that launches a TV advertising campaign and gets a flurry of new links in the first few weeks. This is what people want to see in the search results, not some 10-year-old site that doesn’t even bother to advertise any more. At the very least people want a mix of old sites and new sites; at present a lot of search results are just full of outdated information.
Here are mine, in no particular order:
These are blogs that I look forward to reading; to be honest the list could quite easily have been extended to 50. There are so many great blogs at the moment.
Believe it or not July 14th marks the 10th birthday of the very first blog, according to the WSJ.
The very first weblog was run by a guy called Jorn Barger and was a collection of interesting things he found on the web. The site is still going at robotwisdom.com. Interestingly, Barger doesn’t seem to have made the most of his blog: he ended up homeless and broke by 2005, and in 2007 he can be found on Blogger asking for donations to cover his hosting fees.
From the WSJ article:
We are approaching a decade since the first blogger — regarded by many to be Jorn Barger — began his business of hunting and gathering links to items that tickled his fancy, to which he appended some of his own commentary. On Dec. 23, 1997, on his site, Robot Wisdom, Mr. Barger wrote: “I decided to start my own webpage logging the best stuff I find as I surf, on a daily basis,” and the Oxford English Dictionary regards this as the primordial root of the word “weblog.”
It’s hard to describe how much I wish I had started blogging in 1997. Maybe with a few years’ head start some of us might have created our very own Engadget.
I’ve only been blogging for around 18 months but before that I read quite a few blogs. How long have you been blogging?
Often described as “voodoo” by frustrated webmasters, the use of mod_rewrite and htaccess files is one of the more advanced tasks a web developer has to face.
The good news is that unless you are looking for really advanced solutions you don’t have to fully understand how they work to use them on your website. Most of the htaccess and mod_rewrite tips on this page can simply be cut and pasted into a text file and uploaded to your server.
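To give a flavour of the sort of thing the guide covers, here is one of the most commonly pasted rules: redirecting the non-www version of a domain to the www version. The domain is obviously a placeholder you would swap for your own, and as with anything htaccess related it is worth testing on a copy of your site first.

# Force the www. version of the domain (replace example.com with your own)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]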
Daniel at Daily Blog Tips has a tip today about backing up your blog that I wanted to expand on slightly.
Most of you will be able to use the WordPress backup plugin, but BlogStorm doesn’t run on WordPress so I have to use another script.
The script I use is Backup2Mail and it is quite simple to install. It sends a backup email every 24 hours (or however often you run the cron job) to your Gmail account.
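If you are setting up something similar, the only slightly fiddly bit is the cron entry itself. The path and filename below are purely illustrative (they depend entirely on where you put your own backup script), but a crontab line along these lines runs the script at 3am every day:

# minute hour day-of-month month day-of-week  command
0 3 * * * php /home/youruser/scripts/backup-script.php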
If you don’t have a backup system in place for your files and database you need to start thinking about it.
Thanks to Online Backup for showing me this tip.
Since most internet marketers offer advice on avoiding the Google Sandbox I thought it would be fun to give some tips on how you can get your new site into the sandbox with just a few simple steps.
The sandbox is the place websites go before they have earned enough trust to get good Google rankings. Once a site earns enough trust it moves out of the sandbox and into the trustbox, and your traffic increases. Some sites never earn enough trust to achieve top rankings, and the number and quality of links needed for good rankings varies dramatically across different niches.
The key point to remember is that for each competitive search term there are only 10 sites that can rank on the front page (9 if you assume Wikipedia will be in the top 10). Even if your mortgage website has 500,000 links it still won’t rank at number 1 unless it has more trust than all the other mortgage sites.
Common symptoms of the sandbox are not ranking for the keywords in your title tags and being outranked by popular bloggers who cover your stories. For example, if you write a blog post about a new gadget and the post is covered by Gizmodo and Engadget, they will rank on the first page for the gadget’s name while you will be nowhere to be found, even though they link to you as the source.
A lot of people blame the sandbox when their sites don’t rank very well. In most cases it is simply because the site doesn’t have enough links. As a general rule, unless your site has more quality (natural) links than the top-ranking competitor sites, it probably isn’t sandboxed.
How to get into the sandbox:
- 1) Get the site indexed using your signature at Digital Point
- 2) Submit to 500 free web directories
- 3) Publish articles on your website and submit them to 200 article directories
- 4) Use ezine articles to find content for your website
- 5) Sell sitewide links
- 6) Buy sitewide links
Next week: how to get into the trustbox