With thousands of low-value web directories available, choosing a decent one to submit to is a hard task. A lot of webmasters decide to submit to 500 free ones, but unless you have plenty of natural links to counteract the unnatural directory links you might end up with a spammy link profile.
The value in submitting to directories is to gain some high-PR links and co-citation. Decent directories have a lot of incoming links and PR themselves, so they will help your site get more pages indexed. This is of particular importance to affiliate sites as it can be hard to attract natural incoming links.
The key is to find a few (fewer than 10) decent places to get your links from. The first two on my list are Yahoo and Business.com; for me they are great starting links for affiliate and commercial sites. If you are running a blog you can probably attract enough natural links not to need these, but for most other new sites they are must-have links in my opinion.
If you do get accepted to the Yahoo directory make sure you use the NOYDIR tag on your web pages otherwise Yahoo might use your directory description in the normal search results pages.
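As a sketch, here is one way to check that a page already carries the directive; the `head` string and `has_noydir` helper below are my own illustrative examples, not part of any library. The real tag is simply a robots meta tag containing the NOYDIR value:

```python
import re

# Hypothetical <head> snippet: NOYDIR tells Yahoo not to replace your
# search snippet with the Yahoo Directory description; NOODP does the
# same for the Open Directory Project description.
head = '<meta name="robots" content="noydir, noodp">'

def has_noydir(html):
    """Return True if a robots meta tag contains the NOYDIR directive."""
    match = re.search(r'<meta\s+name="robots"\s+content="([^"]*)"', html, re.I)
    return bool(match) and "noydir" in match.group(1).lower()

print(has_noydir(head))  # True
```

A regex check like this only handles simple, double-quoted markup; for anything messier you would want a proper HTML parser.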
My next favourite directory is the Aviva Directory. The reason this stands out is because the owners have been linkbaiting. 99% of directories don’t have any natural links but Aviva has even been on the front page of Digg a few times so has more trusted links than most of the other directories put together.
I’m not going to discuss any other directories by name as there are plenty of other lists on the web already. The best way to find a few more is to search Google for directories in your niche. Use search terms like “cell phone directory” or “cell phone add link” if your site is about cell phones.
Other good methods include searching for all the places your competitors' sites are listed. Use Yahoo Site Explorer to find links to their sites and search Google for all the places their URLs are listed.
A big issue for me with directory submission is knowing when to stop. Once you have been listed in about 10 decent directories there really isn’t any point in getting any more of these types of links.
For more details on how to choose a directory Aaron Wall has some discussion about checking the cache date to see how trusted the pages are. Trusted pages are crawled more frequently than non-trusted ones.
Last Monday TIME magazine released an article titled “25 Sites We Can’t Live Without”. The article was, not surprisingly, widely read and linked from a lot of popular blogs as well as getting over 2500 diggs.
It might come as a surprise then that TIME has now removed the article from their website barely a week after releasing it. The page now redirects (badly) to the TIME homepage. Doing a Google search for the article title brings up an almost identical article from 2006 so clearly TIME has simply recycled old content for this latest version and had no intention of keeping the new one live.
Here is how they do the redirect:
<meta http-equiv="refresh" content="0;url=http://www.time.com">
It seems TIME needs to learn how to properly redirect a web page.
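For comparison, a proper redirect answers with an HTTP 301 status code, which tells search engines the move is permanent so the old URL's links are passed to the target page; a meta refresh passes nothing. A minimal sketch using Python's standard library (purely illustrative; in practice you would use your web server's own redirect rules):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer every request with a permanent (301) redirect."""

    def do_GET(self):
        # 301 signals a permanent move, so crawlers transfer the old
        # URL's link equity to the Location target.
        self.send_response(301)
        self.send_header("Location", "http://www.time.com")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet

# To serve: HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

The same effect is achieved in Apache with a `Redirect permanent` directive or in PHP with a `header("Location: …", true, 301)` call.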
Linkbait is the art of creating a piece of content so good that lots of people want to link to it. TIME has clearly achieved this part, but by removing the content after only a few days they haven't maximised the number of links they could get and have damaged their credibility at the same time.
I will be removing the link from my blog as there doesn’t really seem much point in linking to the TIME homepage, they have quite enough links already.
Thanks to Tim from web design agency leeds for the tip!
Linkbait is the act of adding content to a website with the aim of attracting links from other sites. The content can take a variety of different forms from a unique tool or a breaking news story to a well written article or controversial image.
Sometimes linkbait is intentional but quite often the best linkbait is conceived quite by accident.
Yesterday I spotted that John Chow was no longer ranking for the search term “John Chow” on Google.
I didn’t want to post about it until John had a chance to comment and try to figure out what happened.
Quite a few people search for John Chow: an AdWords ad we set up 24 hours ago has had about 400 impressions so far.
Now as a John Chow dot com reader I think he deserves to rank. The blog offers loads of information and Google is being a bit harsh to apply a penalty.
What I think happened
Here is an extract from Google Webmaster Guidelines:
Some SEOs and webmasters engage in the practice of buying and selling links, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. Buying links in order to improve a site’s ranking is in violation of Google’s webmaster guidelines and can negatively impact a site’s ranking in search results.
Matt Cutts has more on paid links but you get the general idea – Google doesn’t like people buying and selling links.
Although Google hates paid links they are still very effective and will continue to be effective as long as you follow some simple rules when you buy and sell them. This is where John Chow has gone wrong I believe.
The paid links on the right hand side are totally unrelated to his content so stand out like a sore thumb.
Buying and selling links
If you want to buy and sell links you need to make sure the links are related to your site. If you start buying links from a page that is selling links to casino and v!agra sites then you are asking for trouble. If your site is about mobile phones and you are selling links to p0rn sites then you are going to get a much harsher penalty than if you sell to other phone sites.
When you buy from a good text link broker you can choose the exact pages your link is going on and look at the other outbound links. When you sell links via a broker you can accept or reject the links as you wish. Don’t just accept all the links.
How John Chow can get his rankings back
The best way to get his ranking back is to go through the Google Webmaster Guidelines and solve any issues he can find. Once the site is clean then he needs to submit a reinclusion request in Google Webmaster Central.
Obviously nobody except Google really knows what’s going on here, and they aren’t about to tell anybody. If you have a theory please let me know in the comments.
Most web publishers use a CPC network such as AdSense or a CPM network that shows banner adverts and pays a tiny amount such as $1 per thousand impressions. I prefer to use affiliate programs that pay me a commission for every product I sell.
Having been banned from AdSense 3 times in the past for no reason at all, I’ve had to learn how to make money without relying on Google to send me a cheque. The reason for my first ban was that I signed up with a site and never got round to adding the AdSense code. Then one day I added the code to the revenue sharing program in the Digital Point forums, quickly earned $60 and then got banned. Google didn’t even offer an explanation. The next two bans were for starting accounts when I was already banned.
The best thing about affiliate schemes is you can’t get banned easily and you have email & phone access to loads of people who want to help you make as much money as possible.
Making money using affiliate marketing is way too large a subject for me to cover in one blog post, so I will be doing a series of probably a few posts per week explaining exactly how I make over $10,000 per month from affiliate marketing on various blogs.
If you have any specific questions please post them in the comments. Otherwise wait until tomorrow for the first post which is about link baiting – the first thing you need to do when launching a new affiliate site.
The first step – link bait
A key aspect of link baiting for affiliate sites is to make sure you don’t have any affiliate links in your posts. This might sound strange but bloggers won’t link to sites with affiliate links. You need to wait until your posts are about a week or two old before you start adding affiliate links.
Remember that any visitors from feed readers or social bookmarking sites are very unlikely to want to buy anything. Visitors who find your posts by searching on Google are very likely to want to buy something.
A link baiting campaign for a new affiliate site can help the site get good rankings very quickly and avoid the dreaded Google Sandbox. Affiliate sites are very hard to build links for, so starting your site’s life as a blog allows you to build links for free and then flip the blog to an affiliate site after a few months.
Read the rest of the link baiting article tomorrow.
Michael Gray has started an interesting discussion on the value of article marketing over at his blog. A number of SEO pros are weighing in to the discussion in the comments (including myself) and it’s well worth a read.
Most of you will be aware that for over a year I ran the largest article submission business on the Digital Point Forums. I used some software to automate the process and could submit an article to 250 sites in about an hour. The submission was exactly the same as manual submission and you could include 4 links in a signature with the anchor text of your choice.
Having submitted thousands of articles and seen first hand where they were syndicated and how they affected the rankings of the websites, it became more and more apparent to me early this year that the process just wasn’t working any more. Obviously my charge of $30 to submit 2 articles to 250 sites was not steep and nobody was expecting a miracle, but if the process wasn’t working then I didn’t want to keep providing the service.
In the past article submission worked very well, especially with Yahoo, so anybody bulk submitting prior to 2007 is likely to have seen good results. As with most SEO tactics Google has just figured out a way to stop it working.
How to do article marketing in 2007
If you are really struggling for links then you might want to submit to the top few article sites on this list, but otherwise I would really recommend steering clear of article sites. They are full of low quality duplicate content that never attracts natural links and are joining directories in the web’s spam-filled wasteland.
The key is to get your unique articles syndicated by as many real websites as possible. Most people write one article and try to syndicate it but with many webmasters becoming more and more aware of duplicate content you really need to start writing multiple versions of the articles.
My favourite methods
Create a webmasters area on your site and let people know that if they want content for their site you will write custom articles for them on subjects related to your site. Include a couple of links, some images & a signature and you will both get a lot of value.
If you want a slightly more scalable method sit down and write 10 or 20 brand new articles and upload them to your site in a zip file for people to download. Most will include links to your site and the articles will be published on real sites rather than spam filled directories.
Promoting off site content
When your content gets picked up on another site don’t simply sit back and hope. Start seeding the content at sites like StumbleUpon, Digg and Del.icio.us to maximise the value of your link.
What’s your favourite article marketing method?
After being listed in 5th place on the Alexa Movers & Shakers list last week I just saw that BlogStorm has a mention in Time Magazine’s 25 Websites We Can’t Live Without article.
If you jump to page 4 you will see that we aren’t actually one of the 25, just an add-on to the Technorati entry, but it is nice to get a mention all the same.
I doubt Time will send much traffic but the article might get on Digg which will be extra publicity for us.
Due to the success of BlogStorm over the last few weeks we are moving to a new server today.
The site runs cron jobs pretty much continually and serves so much embedded content and RSS feeds that, even with the heavy caching we were using, the server sometimes only had 86% uptime, which is clearly not acceptable.
If you can read this post then the site has moved, if you spot any bugs please let me know in the comments.
Google Images can drive a surprising amount of traffic to your website or blog and it’s much easier to get good rankings in the Image search results than the normal results. The majority of visitors arriving from Google Images won’t want to read your pages or buy any products – they are just looking for an image to use on their web page, PC or blog post. Over 500 people have already downloaded the script to use Google Images to build links so clearly people are getting traffic from Google Images already and want to make the most of it.
Getting good rankings in Google Images isn’t actually very hard. The first thing to do is make sure you have the basic page elements in place so that Google can figure out what your images are likely to contain. Search engines are experimenting with using software to detect the contents of your images, but until this is fully operational they are reliant on using the rest of the page to determine what the image contains.
The key elements that search engines use are the page title tag, alt text of the image, filename of the image, title tag of the image and the text immediately surrounding the image. A good example would be a blog post with the page title “Paris Hilton goes to prison” and an image with the following HTML code:
<img src="images/paris-hilton.jpg" alt="Paris Hilton in prison" title="Paris Hilton in prison">
The image should be placed either at the beginning or after the first paragraph of the post for maximum effect.
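Pulling those elements together, here is a sketch of a hypothetical helper (the function names are my own, not from any library) that builds a keyword-filled filename and matching alt/title text from a post title:

```python
import re

def slugify(title):
    """Lowercase the title and join its words with hyphens for a filename."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

def image_tag(title, alt_text=None):
    """Build an <img> tag whose filename, alt and title all carry keywords."""
    alt_text = alt_text or title
    return '<img src="images/{}.jpg" alt="{}" title="{}">'.format(
        slugify(title), alt_text, alt_text)

print(image_tag("Paris Hilton goes to prison", "Paris Hilton in prison"))
# <img src="images/paris-hilton-goes-to-prison.jpg"
#      alt="Paris Hilton in prison" title="Paris Hilton in prison">
```

Generating the filename from the title keeps the image keywords consistent with the page title, which is exactly the signal alignment described above.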
The long tail of image search
While some keywords can attract a lot of traffic (especially in the adult industry) the best way to take advantage of Google Images is by ranking on the first page for thousands of image searches. This is very hard for new sites unless they are some kind of web shop with lots of products so the key is to concentrate on making sure the site is as well optimised for image search as possible so it will gain good results in the future.
A lot of web shop software programs are quite poor in terms of general SEO but they fare even worse for image search optimisation. You need to make sure all the images have alt text and keyword-filled filenames, and make changes where appropriate.
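As a rough sketch of that audit, a small script (my own illustration, assuming simple double-quoted attribute markup; a real audit would use a proper HTML parser) that lists images with no alt text:

```python
import re

def images_missing_alt(html):
    """Return the src of every <img> tag with missing or empty alt text."""
    missing = []
    for tag in re.findall(r"<img\b[^>]*>", html, re.I):
        src = re.search(r'src="([^"]*)"', tag)
        # alt="[^"]+" requires at least one character, so alt="" counts
        # as missing too.
        if src and not re.search(r'alt="[^"]+"', tag):
            missing.append(src.group(1))
    return missing

page = '<img src="shoe.jpg" alt="Red running shoe"><img src="IMG_1234.jpg">'
print(images_missing_alt(page))  # ['IMG_1234.jpg']
```

Running something like this over a shop's product pages quickly shows which templates need fixing.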
Choosing an image size
There is no “one size fits all” solution for Google Images; you need to consider what type of images your visitors will be looking for and size your images accordingly. Visitors searching for web icons will want much smaller images than people looking for Britney Spears wallpaper. Check out the results for the search term you are targeting and look at the sizes of the other images. I prefer to make my images towards the larger end of the scale so that visitors are not disappointed to see a low quality small image that they can’t use.
Maximising your traffic
The Google Images search results are laid out in a very clear way so the user can see straight away which image they are looking for and whether it is the right size for their application. To make the most of your Google Images rankings you need to make sure that your image is the best one on the page and that it stands out from all the others in some way.
To illustrate this point check the rankings for the iPhone image results. Bearing in mind the searchers are likely to want a picture of the shiny new Apple iPhone, the companies with pictures of fake iPhone prototypes are likely to be getting a much lower click-through rate than the number 1 result from Gizmodo, which shows 2 images of the iPhone in a good quality image. If you have good rankings in Google Images don’t be afraid to alter the image or (keeping the dimensions the same) replace it with a better or more interesting one. The image search results take a while to update so it might take some weeks before the new image shows, but the results of changing the image can be quite impressive.
Ever since Digg became the biggest blogger hangout on the web it has been the target of thousands of webmasters, bloggers and internet marketers desperate to use the power of Digg to promote their sites. The appeal of Digg is that the demographic contains so many webmasters that if your content is seen by 20,000 diggers there is a very good chance 500 of them might write about your site and link to it.

Unfortunately a lot of people seem to think that getting dugg is an easy way to promote low quality content and it’s this attitude that is increasingly causing a lot of bloggers to fail dismally in getting quality links. The problem is that a lot of bloggers sacrifice writing good content to concentrate on pushing out top 10 lists every couple of days.
No matter how many people read your article one simple fact remains: if it’s not remarkable, bloggers won’t write about it. Can you imagine Engadget deciding that the latest “Top 10 iPhone competitors” article to be dugg is so amazing that they really need to point it out to their readers? It just wouldn’t happen.
If you want to get links you have to push out remarkable content that isn’t available anywhere else on the web. Getting dugg is a great way to get remarkable content in front of thousands of bloggers but unless your content is so good that they simply have to share it with their readers the only links you will get are from sites that scrape the Digg RSS feed.
What many people are calling linkbait or Link Bait, I call content. Call me old fashioned, but if you try to create a buzz (and links) by creating contrived content, the buzz you hear will be flies. – Eric
As if Google Street View wasn’t cool enough, Bill Slawski has found a patent application indicating that Google might be able to use Optical Character Recognition (OCR) to pinpoint the locations of thousands of businesses across the world.
At present the GPS coordinates are based on a particular street and the actual coordinates of the individual addresses have to be estimated. If you have ever tried adding your business to Google Local you will probably have noticed the location was slightly inaccurate. Google allows you to drag the location marker so you can pinpoint your business but this method isn’t really scalable across thousands of businesses.
OCR isn’t new but in the past it has been difficult for an algorithm to figure out which parts of a street photo to read. This latest patent looks at the picture and then does a database lookup to see which businesses are expected and matches them up to the picture. Once the business has been located in the picture the accurate GPS coordinates can be updated in the database.
From the Database assisted OCR for street scenes and other images patent:
Optical character recognition (OCR) for images such as a street scene image is generally a difficult problem because of the variety of fonts, styles, colors, sizes, orientations, occlusions and partial occlusions that can be observed in the textual content of such scenes. However, a database query can provide useful information that can assist the OCR process. For instance, a query to a digital mapping database can provide information such as one or more businesses in a vicinity, the street name, and a range of possible addresses. In accordance with an embodiment of the present invention, this mapping information is used as prior information or constraints for an OCR engine that is interpreting the corresponding street scene image, resulting in much greater accuracy of the digital map data provided to users.