My preferred method of reading blogs is to subscribe to their RSS feed using Netvibes and I sometimes forget that a lot of people don’t use an RSS reader.
When building BlogStorm I forgot to give people the option to sign up by email until infonote reminded me.
Thanks to this reminder I added the email subscribe option on the left-hand side, and 15 people signed up in the first few hours.
Feedburner offers email subscriptions as a free service. Visit Publicize > Email Subscriptions in your feed control panel to turn it on. Make sure you configure the options to suit your site.
The moral of this post: make sure you turn on the email subscription option in Feedburner and give users a way to keep up with your blog via email.
Most web publishers use a CPC network such as Adsense or a CPM network
that shows banner adverts and pays a tiny amount such as $1 per
thousand impressions. I prefer to use affiliate programs that pay me a
commission for every product I sell.
Having been banned from Adsense three times in the past for no reason at
all, I’ve had to learn how to make money without relying on Google to
send me a cheque. The reason for my first ban was that I signed up
with a site and never got round to adding the Adsense code. Then one
day I added the code to the revenue sharing program in the Digital
Point forums, quickly earned $60 and then got banned. Google didn’t
even offer an explanation. The next two bans were for starting accounts when I was already banned.
The best thing about affiliate schemes is that you can’t get banned easily
and you have email & phone access to loads of people who want to help
you make as much money as possible.
Making money using affiliate marketing is far too large a subject to cover in one blog post, so I will be writing a series, probably a few posts per week, explaining exactly how I make over $10,000 per month from affiliate marketing on various blogs.
If you have any specific questions please post them in the comments. Otherwise, wait until tomorrow for the first post, which is about link baiting – the first
thing you need to do when launching a new affiliate site.
The first step – link bait
The most important aspect of link baiting for affiliate sites is to make sure you don’t have any affiliate links in your posts. This might sound strange, but bloggers won’t link to sites with affiliate links. You need to wait until your posts are a week or two old before you start adding affiliate links.
Remember that any visitors from feed readers or social bookmarking
sites are very unlikely to want to buy anything. Visitors who find
your posts by searching on Google are very likely to want to buy something.
A link baiting campaign for a new affiliate site can help the site get
good rankings very quickly and avoid the dreaded Google Sandbox.
Affiliate sites are very hard to build links for, so starting your
site’s life as a blog allows you to build links for free and then flip
the blog into an affiliate site after a few months.
Read the rest of the link baiting article tomorrow.
The blogosphere today is so competitive that most bloggers don’t stand a chance. To give yourself the best shot at success you need to make your posts stand out from the crowd. Your content needs to be remarkable and, most importantly, linkable.
It’s no secret that blogs like Engadget and Gizmodo get the most links to their posts. In fact, Engadget averages 2,348 links to every single post. Their reporting isn’t that much better than everybody else’s, so what do they do that other sites don’t?
One answer is in the quality and originality of their images. Engadget has a team of talented writers and people who can produce cool images within minutes of a new story breaking. The images are not just good pictures of a particular gadget; they are original, never-seen-before images that other bloggers can use in their posts.
If you are the source of a breaking news story you will get far more exposure and links if you can find a unique image to add to the story. If a larger site is looking for stories, they are far more likely to choose a site with a stunning image than a boring, text-only blog post.
What if my topic doesn’t need images?
Every topic needs images to illustrate certain points. Whether you are trying to post about website bounce rates or how Google crawls websites, images can work wonders to make your blog appear remarkable.
Long term traffic
Including a stunning image in every blog post brings benefits long after the post has sunk into your archives. Having thousands of images on a site will bring a huge amount of traffic from Google Images, which you can hopefully leverage to your advantage.
Michael Gray has started an interesting discussion on the value of article marketing over at his blog. A number of SEO pros are weighing in on the discussion in the comments (myself included) and it’s well worth a read.
Most of you will be aware that for over a year I ran the largest article submission business on the Digital Point Forums. I used some software to automate the process and could submit an article to 250 sites in about an hour. The submission was exactly the same as manual submission and you could include 4 links in a signature with the anchor text of your choice.
Having submitted thousands of articles and seen first-hand where they were syndicated and how they affected the rankings of the websites, it became more and more apparent to me early this year that the process just wasn’t working anymore. Obviously my charges of $30 to submit 2 articles to 250 sites were not steep and nobody was expecting a miracle, but if the process wasn’t working then I didn’t want to keep providing the service.
In the past article submission worked very well, especially with Yahoo, so anybody bulk submitting prior to 2007 is likely to have seen good results. As with most SEO tactics, Google has simply figured out a way to stop it working.
How to do article marketing in 2007
If you are really struggling for links then you might want to submit to the top few article sites on this list, but otherwise I would really recommend steering clear of article sites. They are full of low-quality duplicate content that never attracts natural links, and they are joining directories in the web’s spam-filled wasteland.
The key is to get your unique articles syndicated by as many real websites as possible. Most people write one article and try to syndicate it, but with many webmasters becoming more and more aware of duplicate content you really need to start writing multiple versions of your articles.
My favourite methods
Create a webmasters’ area on your site and let people know that if they want content for their site you will write custom articles for them on subjects related to your site. Include a couple of links, some images & a signature and you will both get a lot of value.
If you want a slightly more scalable method, sit down and write 10 or 20 brand-new articles and upload them to your site in a zip file for people to download. Most will include links to your site, and the articles will be published on real sites rather than spam-filled directories.
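As a minimal sketch of the zip-file approach (assuming the articles sit as plain `.txt` files in a local folder – my assumption, not anything from the post), a few lines of Python can bundle them for download:

```python
import zipfile
from pathlib import Path

def bundle_articles(article_dir: str, output_zip: str) -> int:
    """Bundle every .txt article in article_dir into one zip file
    that webmasters can download. Returns the number of files added."""
    paths = sorted(Path(article_dir).glob("*.txt"))
    with zipfile.ZipFile(output_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in paths:
            # Store by filename only so the archive contains no local paths.
            zf.write(path, arcname=path.name)
    return len(paths)
```

Drop the resulting zip somewhere linkable and mention it in your webmasters’ area.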
Promoting off site content
When your content gets picked up on another site don’t simply sit back and hope. Start seeding the content at sites like StumbleUpon, Digg and Del.icio.us to maximise the value of your link.
Whats your favourite article marketing method?
After being listed in 5th place on the Alexa Movers & Shakers list last week, I just saw that BlogStorm has a mention in Time Magazine’s “25 Websites We Can’t Live Without” article.
If you jump to page 4 you will see that we aren’t actually one of the 25, just an add-on to the Technorati entry, but it is nice to get a mention all the same.
I doubt Time will send much traffic but the article might get on Digg which will be extra publicity for us.
Due to the success of BlogStorm over the last few weeks we are moving to a new server today.
The site runs cron jobs pretty much continually and serves so much embedded content and RSS feeds that, even with the heavy caching we were using, the server sometimes only had 86% uptime, which is clearly not acceptable.
If you can read this post then the site has moved; if you spot any bugs please let me know in the comments.
Google Images can drive a surprising amount of traffic to your website
or blog and it’s much easier to get good rankings in the Image search
results than the normal results. The majority of visitors arriving
from Google Images won’t want to read your pages or buy any products – they are just looking for an image to use on their web page, PC or
blog post. Over 500 people have already downloaded the script to use
Google Images to build links so clearly people are getting traffic
from Google Images already and want to make the most of it.
Getting good rankings in Google Images isn’t actually very hard. The
first thing to do is make sure you have the basic page elements in
place so that Google can figure out what your images are likely to
contain. Search engines are experimenting with using software to
detect the contents of your images but until this is fully operational
they are reliant on using the rest of the page to determine what the image contains.
The key elements that search engines use are the page title tag, the alt
text of the image, the filename of the image, the title attribute of the image and
the text immediately surrounding the image. A good example would be a
blog post with the page title “Paris Hilton goes to prison” and an
image with the following html code:
<img src="images/paris-hilton.jpg" alt="Paris Hilton in prison" title="Paris Hilton in prison" />
The image should be placed either at the beginning or after the first
paragraph of the post for maximum effect.
The long tail of image search
While some keywords can attract a lot of traffic (especially in the
adult industry) the best way to take advantage of Google Images is by
ranking on the first page for thousands of image searches. This is
very hard for new sites unless they are some kind of web shop with
lots of products so the key is to concentrate on making sure the site
is as well optimised for image search as possible so it will gain good
results in the future.
A lot of web shop software programs are quite poor in terms of general
SEO, but they fare even worse for image search optimisation. You would
need to make sure all the images have alt text and keyword-filled
filenames, and make changes where appropriate.
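As a rough sketch of that kind of audit, here’s a small script using Python’s standard-library HTML parser; the specific checks (missing alt text, generic camera-style filenames) are my own illustration rather than a tool from the post:

```python
from html.parser import HTMLParser

class ImageAudit(HTMLParser):
    """Collect <img> tags that are missing alt text or that use
    generic filenames (e.g. IMG_1234.jpg) instead of keywords."""
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "")
        filename = src.rsplit("/", 1)[-1]
        if not attrs.get("alt"):
            self.problems.append((src, "missing alt text"))
        # Camera-default prefixes suggest the filename carries no keywords.
        if filename.lower().startswith(("img_", "dsc", "image")):
            self.problems.append((src, "generic filename"))

def audit_images(html: str):
    parser = ImageAudit()
    parser.feed(html)
    return parser.problems
```

Run it over each product page’s HTML and fix whatever it flags.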
Choosing an image size
There is no “one size fits all” solution for Google Images; you need
to consider what type of images your visitors will be looking for and
size your images accordingly. Visitors searching for web icons will
want much smaller images than people looking for Britney Spears
wallpaper. Check out the results for the search term you are targeting
and look at the sizes of the other images. I prefer to make my images
towards the larger end of the scale so that visitors are not
disappointed to see a low quality small image that they can’t use.
Maximising your traffic
The Google Images search results are laid out in a very clear way so
the user can see straight away which image they are looking for and
whether it is the right size for their application. To make the most
of your Google Images rankings you need to make sure that your image
is the best one on the page and that it stands out from all the others
in some way.
To illustrate this point check the rankings for the iPhone image
results. Bearing in mind the searchers are likely to want a picture of
the shiny new Apple iPhone the companies with pictures of fake iPhone
prototypes are likely to be getting a much lower click through rate
than the number 1 result from Gizmodo, which shows two shots of the
iPhone in a single good-quality image. If you have good rankings in Google
Images don’t be afraid to alter the image or (keeping the dimensions
the same) replace it with a better or more interesting one. The image
search results take a while to update, so it might take some weeks
before the new image shows, but the results of changing the image can
be quite impressive.
Ever since Digg became the biggest blogger hangout on the web it has
been the target of thousands of webmasters, bloggers and internet
marketers desperate to use the power of Digg to promote their sites.
The appeal of Digg is that the demographic contains so many webmasters
that if your content is seen by 20,000 diggers there is a very good
chance 500 of them might write about your site and link to it.
Unfortunately a lot of people seem to think that getting dugg is an
easy way to promote low quality content and it’s this attitude that is
increasingly causing a lot of bloggers to fail dismally in getting
quality links. The problem is that a lot of bloggers sacrifice writing
good content to concentrate on pushing out top 10 lists every couple of days.
No matter how many people read your article one simple fact remains:
if it’s not remarkable bloggers won’t write about it. Can you imagine
Engadget deciding that the latest “Top 10 iPhone competitors” article
to be dugg is so amazing that they really need to point it out to
their readers? It just wouldn’t happen.
If you want to get links you have to push out remarkable content that
isn’t available anywhere else on the web. Getting dugg is a great way
to get remarkable content in front of thousands of bloggers but unless
your content is so good that they simply have to share it with their
readers the only links you will get are from sites that scrape the
Digg RSS feed.
What many people are calling linkbait or Link Bait, I call content.
Call me old fashioned, but if you try to create a buzz (and links) by
creating contrived content, the buzz you hear will be flies. – Eric
As if Google Street View wasn’t cool enough Bill Slawski has found a
patent application indicating that
Google might be able to use Optical Character Recognition (OCR) to
pinpoint the locations of thousands of businesses across the world.
At present the GPS coordinates are based on a particular street and
the actual coordinates of the individual addresses have to be
estimated. If you have ever tried adding your business to Google Local
you will probably have noticed the location was slightly inaccurate.
Google allows you to drag the location marker so you can pinpoint your
business, but this method isn’t really scalable across thousands of businesses.
OCR isn’t new, but in the past it has been difficult for an
algorithm to figure out which parts of a street photo to read. This
latest patent looks at the picture and then does a database lookup to
see which businesses are expected and matches them up to the picture.
Once the business has been located in the picture the accurate GPS
coordinates can be updated on the database.
From the Database assisted OCR for street scenes and other images patent:
Optical character recognition (OCR) for images such as a
street scene image is generally a difficult problem because of the
variety of fonts, styles, colors, sizes, orientations, occlusions and
partial occlusions that can be observed in the textual content of such
scenes. However, a database query can provide useful information that
can assist the OCR process. For instance, a query to a digital mapping
database can provide information such as one or more businesses in a
vicinity, the street name, and a range of possible addresses. In
accordance with an embodiment of the present invention, this mapping
information is used as prior information or constraints for an OCR
engine that is interpreting the corresponding street scene image,
resulting in much greater accuracy of the digital map data provided to users.
I found a neat new widget today that allows you to make a bit of extra cash from your websites.
The ChipIn interface is very cool and works perfectly. There are widgets for most of the social networking sites and blogging platforms such as Blogger and WordPress.
Widgets that ask your readers to donate money are not always hugely successful. If you give people a specific reason to donate money, such as a new server to keep a free tool running, then you stand a greater chance of success. My guess is that this widget will work well on sites that either give away a lot for free or have a very large and strong community. If you have a blog then it is well worth testing ChipIn for a week or two.
To test ChipIn I created a Beer Money widget which you can see below:
Adding a forum to your website can seem like a great way to increase the amount of natural content and build traffic to your site. However, by their nature forums are full of duplicate content and noisy, low-value pages, and they attract very few external links.
Link equity has become a major factor over the last 12 months. If you have too many pages on your site then your link equity will be spread too thinly and your site will not reach its full potential in Google.
Aaron Wall sums this up nicely:
If you are wasting link equity getting low value noisy pages indexed then your high value pages will not rank as well as they could because you wasted link equity getting low value pages indexed. In some cases getting many noisy navigational pages indexed could put your site on a reduced crawling status (shallow crawl or less frequent crawl) that may preclude some of your higher value long tail brand specific pages from getting indexed.
Aside from building some natural links the best way to stop a forum using up too much of your valuable link equity is to stop the low value, noisy pages being indexed. My preferred method is to use the following robots.txt file for vBulletin:
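A sketch of the kind of block list I mean (the exact script names vary between vBulletin versions and your forum’s URL layout, so treat these paths as examples rather than a definitive list):

```
User-agent: *
Disallow: /forum/search.php
Disallow: /forum/memberlist.php
Disallow: /forum/member.php
Disallow: /forum/calendar.php
Disallow: /forum/printthread.php
Disallow: /forum/sendmessage.php
Disallow: /forum/newreply.php
Disallow: /forum/newthread.php
Disallow: /forum/misc.php
Disallow: /forum/online.php
Disallow: /forum/profile.php
```

The idea is to leave the thread and forum listing pages crawlable while keeping search, member and utility pages out of the index.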
To remove any noisy pages already indexed you would need to block them using robots.txt and then visit Google Webmaster Central and use the “Url Removals” tool.
Most forum packages will have unique titles and meta description tags on all the pages – if yours doesn’t then you will need to figure out how to add these. The other pitfall to watch out for is your forum software appending session IDs to the URLs – this will harm your site in the search engines.
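vBulletin, for instance, historically exposed its session as an `s` query parameter. A minimal sketch of stripping such parameters when generating canonical URLs (the parameter names `s` and `sid` are assumptions and vary by forum package):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url: str, session_params=("s", "sid")) -> str:
    """Remove session-id query parameters so search engines see one
    canonical URL instead of a different URL per visitor session."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k not in session_params]
    return urlunsplit(parts._replace(query=urlencode(query)))
```

You could use something like this in a template or rewrite layer so every internal link points at the session-free version of the page.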
It is hard to cover all aspects of forum SEO in a blog post so if you have any questions please ask in the comments.
Digg.com has a pretty major problem today after the new threaded comment system was widely slammed by Digg users.
Almost 4,000 users have dugg the story and the first comment, “No kidding. This sucks”, has received over 1,000 diggs so far.
Mashable has a number of interesting questions but the key aspect is for Digg to figure out how long to keep the comment system running and what to do about the revolt.
Is this a justifiable revolt? Do the majority of Diggers really think that the new system is awful, or is it just a vocal minority that’s winning the argument? Will this turn into another catastrophe, and will Digg again be forced to change something because the users didn’t like it? And, most importantly, what will happen if Digg users massively revolt some big feature that brings revenue to Digg, like the upcoming restaurant & product reviews, as announced by Kevin Rose? So far, the feedback on that idea hasn’t been positive.
How Facebook stopped the revolt
Facebook had a similar issue last September when they launched news feeds so people could gain easy access to updates from their friends. Users hated it and threatened to boycott the site.
Facebook solved the issue by allowing 100,000 users to help shape the future of the site. This open consultation has helped Facebook continue to grow and stopped the user revolt in its tracks.
It’s very hard for a site to launch a new project and be told straight away that it sucks. It’s even harder to accept the comments and pull the new features from the site.
In most cases the best method is to let users trial the features and then allow them to offer feedback and suggestions and even vote on how the system should work. Give the power back to the users – they only want what’s best for the site.
If your community is especially vocal like Digg then you really can’t afford to let a user revolt happen for more than 24 hours.
Thanks Rob from Battery Recycling for the tip on this.