Posted 6 months ago in Blogstorm

VIDEO: Watch #B3Brunch for expert tips on Penguin 2.1 & Hummingbird

Last Thursday saw our October installment of Brunch with Branded3, an expert Google+ Hangout with our Head of Search, Tim Grice, and our Head of SEO Strategy, Matthew Jackson.

This month’s subject was Google’s latest updates, Penguin 2.1 and Hummingbird, so Tim and Matthew answered questions including, “How will Hummingbird affect my SEO strategy?” and “I’ve had a manual penalty revoked, why are my ranking and visibility being hit now by Penguin 2.1?”

Check out all of Tim and Matthew’s advice in the video below, and if you have any more questions on this, feel free to get in touch with us, and make sure you follow us on Google+ to get involved in next month’s Hangout.

You can watch all of our past Brunch with Branded3 Hangouts on our YouTube channel here.

Video transcription


Tim Grice: Okay. Hi everybody. Thanks for joining us today for this Branded3 Brunch. It’s myself, Tim Grice, Head of Search at Branded3, and Matthew is joining as well, our Head of SEO Strategy. We have a few questions ready to start the brunch with, but if you’ve got any questions please feel free to add them in the messenger on the right there, and we’ll try our best to get through them.

We will just dive straight into it, really, and I guess I will go ahead with the first one.
We have a question asking if Hummingbird is a penalty to somebody’s website. Will it cause a penalty? In our opinion, in my opinion, Hummingbird isn’t a penalty at all. Nobody even noticed it for a month, so if it had been a penalty we would probably have heard about it before Google announced it. For me Hummingbird, as you have probably seen on lots and lots of blogs, is more of a fundamental change to Google’s algorithm. They have changed it, taken away the bad bits, kept the good bits, and added in this new, more intelligent way of ranking websites.

So Hummingbird, no, is not a penalty. If you have a website which has relied on creating pages optimised for lots of different keywords, specifically with the intention of driving SEO traffic, so you have a keyword for each page, then yes, you probably have seen an impact, because Hummingbird understands the meaning of words better, or that’s what it is supposed to do, anyway. The keywords will be bucketed together, so you won’t be able to rank just because you’ve got the keyword on the page. You will have to have the authority as well, because Google is able to understand the meaning, giving other sites with good authority the chance to rank.

Matthew: Yeah. To add to what Tim is saying there, for me, you have to think about why Google has introduced Hummingbird. If you think about how people searched maybe six to twelve months ago, people were very used to searching for exactly the specific thing they were looking for; they would search with very short keywords. But now, with mobile search becoming a lot more relevant, and certainly all the speech-related searches that you can now do on Google and on things like Siri, Google is trying to interpret those searches a lot better. If I just speak into my phone I’m going to be telling Google exactly what I’m looking for, rather than on a desktop, where I can’t be bothered to type too many words.

So Google is trying to interpret that actual spoken query a bit better and trying to interpret what people are looking for within that search result. So that’s what Hummingbird is about to me. It’s Google trying to interpret what people are searching for better, rather than anything to do with penalties because of SEO practices or anything like that. It’s really just Google trying to update its algorithm to better understand new search queries.

[no audio 3:57-5:55]

Matthew: …rising amounts of ‘not provided’ visits that we get in analytics. It’s now becoming a lot more difficult to interpret: if in the past we have been seeing visits come in via a long-tail keyword, actually, no, we can’t really tell how that has been affected since Hummingbird was released, and I think that’s one of the reasons we’re not really seeing much impact at the moment. It has been difficult to actually tell that Hummingbird was there in the first place, really.

Tim: Okay guys. Sorry about that. We have been having some problems. It keeps going on to mute for whatever reason, so apologies if you missed any of that. It sounds like it’s okay. If you want to put any messages in the side bar I’m hoping you can hear us okay at the minute.

So moving on to the next question then: how will Hummingbird affect your SEO strategies?
Well, I guess like we’ve said in previous answers, it will affect your strategy. You no longer have to pick a keyword for every page or make sure you have a page exactly optimised for every single keyword string that you can find. What you need to do is answer the user’s query better, and that may mean that instead of doing keyword research alone you actually do a little bit of market research as well. You try to understand who your user is, what they are likely to search for and the type of things they are really looking for.

So in terms of your SEO strategy, you shouldn’t be fixated on making sure you have every single keyword variation on the page, or having a page for every single keyword variation. Instead you should be focused on developing a page that answers a bucket of queries correctly even if the keywords aren’t included, because our understanding of Hummingbird is that Google is beginning to understand the meaning of different keywords. So if you’re asking Google to look for something that is the best, that may mean great, it might mean awesome, it might mean fantastic, and Google is now beginning to understand all these different meanings of words. You therefore optimise your page to be the best answer to that particular question.

[no audio 8:04-8:34]

Matthew: …Yeah. I think this is definitely something related to the patent there that looks at very similar, related queries. In the past, if you typed in ‘auto insurance’ Google would understand that it related to car insurance, and it’s trying to get better at understanding those kinds of similar queries. Again, coming back to that point about how users are evolving their searches because of mobile search and voice search, people are just asking questions rather than thinking about the specific query that might return the best results. They just want to get results as quickly as they can, so Google is trying to understand how those queries relate to each other.

Tim: We now have a question about Hummingbird and mobile, and how it might affect desktop and mobile searches differently. Personally I have not seen anything that suggests Hummingbird is designed for mobile; of course we have heard about it being designed for voice search, but I think, as an algorithm, the way it works will be similar on desktop and mobile. I don’t think there are any great differences there. I don’t know if you have an opinion on that.

Matthew: Obviously, I think Google is developing its Android systems, and it’s using things like the new card system in there to try to integrate with things like your diary, so if there’s something in your diary coming up Google will tell you about it using the card system.
Again, I think the search algorithm is trying to really integrate with things like Android and trying to make things easy to use with that, and I think Hummingbird comes back to that. So it is really intended for where user search is going, in terms of actual changes to search behaviour and preferences. People are using mobile a lot more, so Google really wants to capitalise on that and make sure that their algorithm works in that way, really.

Tim: So we’re being asked a lot of questions down the side here, and also questions we have got before the brunch.

Why did nobody spot that Hummingbird had taken effect, and why did it take Google a month to announce the change? Normally when Penguin or Panda hits there are messages all over forums and blogs saying that something is going on.

I think, fundamentally, it’s because it didn’t actually change that much, or certainly not for the keywords that SEOs will be focusing on. The only sites I could imagine would possibly have seen this before it was announced are sites that were relying heavily on long-tail queries, sites that had developed capture pages for as many keywords as possible. But then, to be honest, they probably put that down to a Panda algorithm change, and probably a lot of SEO agencies did too.

But that’s the only way it would have been spotted. For a lot of head terms, the big keywords that we constantly monitor, there probably hasn’t been that much change, and that’s why nobody actually spotted it. It will only have been sites that rely on long-tail search, and if they did see a drop they would probably put it down to the Panda algorithm update, or refresh, or whatever it is called now. That’s probably why it wasn’t seen.

Matthew: I absolutely agree with that. I think, again, ‘not provided’ is probably one of those things that has made it a bit more difficult to actually tell how Hummingbird has affected your site. If you get search traffic through some of those bigger terms it’s obviously easy to tell when you’re getting it, and you won’t really be affected on those kinds of terms.

Some of the examples that Google used in its posts about Hummingbird are probably the easiest way to explain it. So, for example, if I searched for the best steakhouse in Leeds, which is near our offices, and there was a restaurant called The Steakhouse in Leeds, that restaurant was going to come up at the top of the results because it’s called a steakhouse and it’s got a Leeds location.
Now Hummingbird understands that query a bit better: a steakhouse is a steak restaurant, so if people are looking for steak restaurants they might be looking for variations around that. An Italian restaurant might do good steaks, or it might be an American restaurant that does good steaks. So instead of returning just that steakhouse website, Google is going to return other results relating more to steak restaurants in Leeds and variations around them.

They are the sort of queries that are going to be affected and sites that are going to be affected. I can see a lot of smaller local businesses who used to rely on a bit of traffic because they had various keywords in their URLs or their domains being affected by it just because Google is trying to understand that search a bit better and return users to somewhere that is showing the best results, because they’ve got a lot more reviews or something like that, really.

Tim: We’re getting a few questions through, guys, about Penguin and link penalties, but just to finish off the Hummingbird bit: if you’ve got any more questions on Hummingbird or anything about content and that kind of thing, please send them through.

Someone has asked, does Hummingbird affect Google’s reliance or reduce Google’s reliance on links?

Now, my opinion is that it doesn’t. I don’t think there’s any reason why it would. What it does do is reduce our reliance, as online marketers, on building pages for every single keyword variation we can possibly think of. That’s it. It reduces that reliance because Google now understands that when we say something is good it could mean great, it could mean best, whatever, so we don’t have to optimise our pages for every single term. We can put away the keyword tool; it’s literally just about creating the best answer for who we perceive our user to be.

So no, it absolutely doesn’t reduce reliance on links. In fact, the most authoritative sites will probably see an increase in traffic now, because they have probably got pages that answer questions extremely well; they just weren’t using the right keyword variations that other sites were benefitting from. It is taking traffic away from sites that have low authority but were ranking for a lot of keywords because they created a lot of pages on long-tail variations, and giving it to the high-authority sites that have the answer but maybe didn’t use the right keyword combination.
That’s my opinion on Hummingbird.

Matthew: Yeah. I don’t think we are going to be changing any of our strategies that we have been talking about for the last couple of years in terms of creating really good websites that give the user the best experience, take advantage of things like user reviews and other things like that to really make sure that the site is the best it can be. Then really make sure that we’re marketing it in a very user friendly, likable, linkable, sharable manner, so I really can’t see it affecting any SEO strategies as far as we’re concerned.

That’s not to say that if you’ve got a bit of a different SEO strategy that might have worked in the past, in terms of other linking strategies and things like that, Hummingbird won’t affect that, and some of the queries and keywords that you are using within your site. It depends on what sort of SEO strategy you have in place. If it’s solid and would have worked well previous to Hummingbird, I think it will still be a good strategy anyway.

Tim: Okay, so we’re going to move onto what everybody wants to talk about: penalties, manual actions, Penguin 2.1. We’ve had a question in here, and I won’t put names to any penalty questions. It asks for advice for a site that had a manual penalty revoked, but has still been hit in terms of rankings and visibility by Penguin 2.1.

Now, we’ve seen this with quite a lot of websites that we have worked on and business owners that we have spoken to over the past couple of years. What they do is clean up their links. Brilliant. That’s what Google wants. They disavow them. They get the manual action revoked. They may see a slight climb in rankings and visibility after it’s been revoked, some of them a full recovery, and then Penguin comes along and they see a drop-off in rankings at that point.

There are a lot of reasons, but I think there are a couple of main ones we can presume are in effect here. One is that when you disavow your links, Google uses that disavow file to do a look-up against your link profile and then decides whether or not you have done enough work, and disavowed enough, to pass that manual review. It doesn’t necessarily mean that any of those links that were passing value stop passing it the minute you press ‘submit’.

In my opinion. That is my own opinion.

What Google will do is collect the data within the disavow tool. They will be using lots of manual tactics to detect networks and anchor text patterns and all that kind of thing, and when they release algorithms like Penguin it is based on the data they collect during that process: pulling together the disavow work, pulling together any networks they have uncovered, pulling together any anchor text patterns they have found, and then they will release Penguin. So even though you may get a manual penalty revoked by disavowing your links, it doesn’t necessarily mean that you won’t get hit by further Penguin algorithm updates.
Having links in the disavow file will stop you from receiving a manual action, but if they were passing any benefit, it doesn’t mean Google won’t stop that benefit whenever they roll out a new algorithm. So you may still lose value from links you’ve disavowed, effectively.

The second part to that, in my opinion, is that there are links within your profile that you can’t find. Even if you pull together all the sources, [inaudible 19:26] Majestic, Ahrefs, the Webmaster Tools data, you will still not find the whole of your link profile, and that link profile will be actively devalued in stages when Google rolls out things like Penguin and link spam [sounds like 19:42] algorithms.

So there will be some links that you haven’t found. You have done enough to get yourself out of a manual penalty, but there will be links that you haven’t found that are still passing value that Penguin will still be picking up. That’s why you will still see drops in your rankings afterwards, particularly on websites that have a history of extremely aggressive, manipulative link acquisition. This will affect them.

If you have been hit by Penguin 2, 2.1 or any of the Penguins, and you previously had a manual penalty revoked, should you start again? I don’t know. It’s a business decision, really. If you are refreshing your link profile through all these different tools every single month, like you should be if you have any link issues, and you keep finding hundreds or thousands of new links every single month, and you have been doing that for six or seven months, then it could be that you’re never going to see your whole profile. You are always going to suffer at some point.

The advice we give is to keep your link profile refreshed and up to date, and keep bunging it into the disavow file, and hopefully you will dodge these Penguin updates. But the fact is that you may have bad links now that are still passing value even though you have disavowed them; they may not be passing value in the future, and it’s impossible to know what that balance is. I don’t know if you want to…

Matthew: Yeah. I mean, with its updates to the Penguin algorithm, Google is obviously just trying to get better at understanding what’s a bad link and what’s a good link. It’s now got a lot more data than it used to have, so you might have been able to get out of a manual penalty six months ago, but now, with all the data Google has got from people using the disavow tool, its algorithm is just getting better anyway; it’s understanding those bad links a lot better. So originally your links may have passed a manual review, but the algorithm is now so good that it’s finding those bad links and really trying to get rid of them. That’s why Penguin starts to come into effect, really.

I think, as Tim says, keep refreshing. Make sure you are really on top of things. Maybe even start getting more aggressive in terms of disavowing links that you may have thought, ‘Well, these are okay. These might just about pass.’ Try to understand what Google is looking for: it’s really looking for absolutely natural links, so those are the only ones, really, that we can look at keeping and building more of. So it’s really these natural approaches, creating the content and creating that nice natural outreach that gets people to engage with the site.

Tim: One other thing about Penguin. I don’t know if anyone has read my post from last week, after the Penguin algorithm had updated. I’m not trying to big myself up or anything, but we have seen a few instances of sites starting to recover now from Penguin, which is quite interesting, really, because it’s the first instance I’ve actually seen of some genuine recoveries. We have traffic data to back it up, Searchmetrics data and ranking data to back it up, and these sites have been suffering for probably 12 to 18 months now.

They have either had a manual action that was revoked and then still suffered from Penguin, or they have just been suffering from the original Penguin update. It’s good to see those sites now starting to recover. Given they’ve been hit for 12 or 18 months, I’m kind of undecided as to whether there’s a time-based element there, because obviously if you could spam your site to death, get your rankings for a couple of months and then just stick everything in a disavow file before the next Penguin update runs, you would never be in danger of being hit by Penguin, despite the load of spam links you produced, because you had put them in your disavow file.

Google really doesn’t want it to work that way. It wants to say, ‘Look, you have been naughty. You shouldn’t have done that and we really want to affect your business in the future. So we are going to put this Penguin algorithm based penalty on you,’ and it might well be that that is now time based.

That’s not something I know for definite, but the recoveries that we have seen seem to suggest it to a little bit of an extent. That’s not to say that by getting rid of all of your bad links and making sure your profile is actively being cleaned up, you can’t recover at any algorithm update.

Matthew: Just to be clear on that question, really: should you start again or not? If you have been doing nothing but manipulative link spam for the past four or five years, and your link profile is largely made up of nothing but SEO-type links, then you may want to think about it.

My advice would be, if you can, disavow as much as you can and start implementing a good, solid SEO strategy that allows you to collect links naturally. Mark, just to pick up your question about the penalty hitting in April with no Webmaster Tools notification, so that you have no way of submitting a reconsideration request: I would just go ahead and disavow as many links as you can possibly find, and keep an eye on your Webmaster Tools.

I think with any drops that you know are related to links, whether it be Penguin, whether it be a manual action, or whether it be just some sort of devaluation that sees your rankings drop away, I would absolutely disavow the links no matter what. They will just hamper any progress from your SEO efforts in the future.

Tim: We do have a question now which moves away from Penguin and Hummingbird, about the impact of ‘not provided’ on marketers: what do we see really happening, and what are we going to do about it?

So all your keywords have gone. Nobody’s got any keywords. Everybody is in the same position. We can see there’s traffic coming through, but we have no idea what the person searched for in order to find our pages. The short answer is that there’s no accurate way of dealing with this. The way we’re trying to do it, and we’re still testing multiple models at the minute, is to use the click-through rate data in your Webmaster Tools to create our own click-through models that we can then apply to your rankings within Google organic search, to give some indication of what keywords might be driving traffic to a website. But it is just an indication. We can then group landing pages together, and when traffic comes to a landing page, try to estimate how much of it might be from non-brand organic search, or from organic search full stop. That’s the best, really, we can offer. As marketers it means that we can’t go out there looking at our keyword data.
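As a rough illustration of the click-through modelling Tim describes, here is a minimal sketch. The CTR-by-position figures, keywords and search volumes are all invented for illustration; in practice the CTR curve would be derived from your own Webmaster Tools impression and click data.

```python
# Sketch of a position-based click-through model: apply assumed CTR
# figures per ranking position to known rankings and search volumes
# to estimate how traffic might split across keywords.

# Hypothetical average organic CTR by ranking position.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimate_keyword_visits(rankings, monthly_searches):
    """Estimate monthly visits per keyword from rank and search volume."""
    estimates = {}
    for keyword, position in rankings.items():
        # Fall back to a small long-tail CTR for positions we haven't modelled.
        ctr = CTR_BY_POSITION.get(position, 0.02)
        estimates[keyword] = round(monthly_searches[keyword] * ctr)
    return estimates

# Invented example data.
rankings = {"car insurance": 3, "cheap car insurance": 1}
volumes = {"car insurance": 10000, "cheap car insurance": 2000}
print(estimate_keyword_visits(rankings, volumes))
```

This only ever gives an indication, as Tim says: the estimates can then be summed per landing page and compared against actual landing-page traffic in analytics.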

I guess it ties into Hummingbird. We can’t go out and create a landing page and optimise it perfectly with keywords that we might have seen around [inaudible 27:28] dates, and maybe this is what Google wants. Instead we just have to focus on the user and providing the best answer for them. Not based on keyword data, but based on our own research and understanding of our customers and users of the website.

Yeah. It’s a nightmare. I don’t agree with Google’s move on it, but everyone is in the same position and we have to work with what we’ve got. It makes things difficult. It doesn’t make SEO any less important. It just makes it more difficult to optimise and understand the value.

Matthew: From my point of view, I have always come from a data-heavy background, doing lots of data research and things like that, so losing this information in analytics is a really big blow to SEO in general. But I think there are ways we can try to get as much information from other sources as possible. If you’re running PPC campaigns you’re going to be getting lots of data in terms of what keywords people are using to find your site. Make sure you use broad match; maybe not all the time, maybe just run it for a week or so. Put your spend up a little bit and use broad match to try to understand what kind of variations people might use to search for your site [inaudible 28:38].

That might help in that sense. As Tim says, Webmaster Tools data is becoming more important. We’re starting to look at that and use it a little. It’s not hugely accurate, but at least it gives us a bit of a flavour of what people are searching for, and the click-throughs they are making. I think there are a few ways around it. We just need to work out the best ones and try to get as much from that data as possible.

Tim: Okay guys, I think a few people are still having problems with sound. If you can make sure your laptops, computers, desktops, whatever, are muted, that should help with the sound, and obviously check your microphones on Google+; that might also help.

Just to move back onto link building a bit, and to answer some questions on that: any resources for pulling together a disavow file?

Are we talking about pulling together links, or pulling together a disavow file? If it’s the disavow file, in my opinion it’s fairly straightforward. Get as many links as you can. Manually classify them: are they SEO links or naturally occurring links? If they are for SEO, add them into your disavow file at domain level, domain level only, and upload the text file to the disavow tool.

I have seen some files that include annotations and stuff like that, but just forget all that. I don’t think anybody is actually going to read it. Put them in the file, everything at the domain level. If you’re looking for a manual action to be revoked, let Google worry about whether whole domains are bad or whether it’s just certain posts; you make sure you have highlighted all the domains that you have manipulative links on. That would be my advice for putting together a disavow file. Again, if you want to add anything.
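For reference, the disavow file itself is plain text with one entry per line: a `domain:` prefix disavows a whole domain, a bare URL disavows a single page, and lines starting with `#` are comments. A minimal example, with invented domains:

```text
# Comments like this are ignored by the disavow tool.
# Whole domains, as Tim recommends:
domain:spammy-directory.example
domain:paid-links.example
# A single URL can also be disavowed:
http://article-network.example/guest-post-123
```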

Matthew: We have created a lot of in-house tools, mainly using just Excel. It’s fairly straightforward in terms of some manipulations using formulas that are readily available in Excel, really, once we have the links from our various different sources, to then create the text that we need for the disavow file. As Tim says, it’s fairly easy to put an actual disavow file together. It’s just the syntax you need to get right, in terms of ‘domain’ colon if you’re disavowing the whole domain, or if it’s just a URL that you want to disavow then you just put that in.

So it’s fairly straightforward to create those disavow files. It’s literally just looking through all the links you’ve got, trying to understand what those links are, and maintaining a consistent process throughout [inaudible 31:36].
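As a sketch of the spreadsheet-style processing Matthew describes, assuming a hand-classified list of links (the column names and classification labels here are invented for illustration), the domain-level disavow entries could be generated like this:

```python
from urllib.parse import urlparse

# Turn a manually classified link list into domain-level disavow entries,
# the same job Matthew describes doing with Excel formulas.
# The input shape ("url", "classification") is assumed, not a real export format.

def build_disavow_lines(rows):
    """Emit one 'domain:' entry per unique domain classified as SEO."""
    domains = set()
    for row in rows:
        if row["classification"].lower() == "seo":
            domains.add(urlparse(row["url"]).netloc)
    return ["domain:" + d for d in sorted(domains)]

# Invented example: two manipulative links on one domain, one natural link.
links = [
    {"url": "http://spam-dir.example/widgets", "classification": "SEO"},
    {"url": "http://spam-dir.example/gadgets", "classification": "SEO"},
    {"url": "http://news.example/feature", "classification": "natural"},
]
print("\n".join(build_disavow_lines(links)))  # prints: domain:spam-dir.example
```

Deduplicating at domain level matches Tim’s earlier advice to disavow everything at the domain level only.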

Tim: Another question here: is it good practice to start disavowing links without any manual action or any notification in Webmaster Tools?

Absolutely, yes. If you have bad links that you know about, and you have seen the impact of bad links, whether it be a penalty, an algorithm or a devaluation, then yes, get them in the disavow tool. You can guarantee Google, I’m guessing, have found most of them already. They are going to keep finding bad links. They are going to keep devaluing the signals that these manipulative links give off. So absolutely add them into the disavow file and save yourself any sort of trouble with a manual action.

If they are going to lose value – if they are bad links they are going to lose value anyway, so it’s best to get rid of them now and save yourself from dealing with any sort of penalty.

Matthew: Yeah. We have worked with a few newer clients that have come on board with no real signs of any penalties or any issues with Penguin, but we still advise that we have a look through the links to make sure they’re not at risk from future updates. It’s not just about now. It’s those future updates, and as I said earlier Penguin is getting much better; going from 2.0 to 2.1 it does seem like there has been a real shift in what they have done.

So definitely keep looking at your backlink profile on a regular, ongoing basis. People keep saying, ‘Oh, is SEO dead?’ I think this is probably one of the biggest things that we need to look at as SEOs. We have to make sure the on-page stuff is right, and we need to make sure the backlink profile is correct and looks as natural and as nice as possible. From there it’s more about marketing than SEO: it’s about the PR, the strategy that you’re using to create new natural links through the things you’re putting on sites.

So for me it’s definitely important to have a look at your backlink profile, whether you have been hit with a penalty or not.

Tim: I don’t think we have any more questions. If you have any, please add them or send them through to us.

Just something I wanted to touch on. Everyone is asking, ‘How do we create links now? How do we go about link building,’ or whatever you want to call it. If you want to call it link building.

The truth is, the way you go about it is to become worth talking about. If you have nothing for anybody to talk about, no way to have a conversation online about your business and your website, then any link you build, any link you acquire, is going to look unnatural, because you have nothing there to link to. It’s just a link to your internal pages with certain anchor text, maybe even branded, but there’s nothing there to really link to beyond a list of products or a really generic description of a service that you offer.

That is not worth linking to. So before you even think about how many links you need to acquire or where you stand in comparison to your competitors, ask: why are you worth talking about? Why is the content your site constantly produces worth having a great outreach team or PR team go out and approach bloggers and journalists with? Why is it worth it? If you begin to get that bit right, the outreach and promotional efforts around your content, and ultimately the links that come to your site, will be much easier.

There is no way to manipulate any more. Even if you create the best post on the best site with a brilliant, natural-looking link in the middle, if it points back to your site for no other reason than that your site exists, it’s not going to stand you in good stead in the future. Your links need to point back to something worth linking to.

Matthew: Yeah, I think a great example, for anyone who didn’t see Matt Cutts’ Webmaster Tools video yesterday: he’s talking about guest posting and how Google now sees it. In the video he describes exactly that same thing. It’s okay to guest post; it’s just that you have to have a real, genuine reason to guest post. There’s no point in there being a link in that guest post just for the sake of a link being there. It really has to be that you’re the expert in your field, you’re the one running the websites that have this information on there.

So really just make sure you’re creating that guest post because you’re the expert in that field and you’re giving someone a really good piece of advice, a really good piece of content. Matt Cutts is saying that guest posting hasn’t all been wiped out by Google by any means; it’s just that you really need to make sure those guest posts are worth being there, and not just there for the sake of a link in the post.

That’s one of the reasons we created our PR team, which we’ve had now for probably over a year. We’re not a traditional PR company in that sense, but we’re really making sure that the activity and the content we’re creating works well with people out there. Journalists are genuinely interested in putting those kinds of things in their articles and on their sites, both in print and online. It really helps reinforce the idea that we’ve created something people genuinely want to talk about.

We just need to make sure it’s getting linked to when people talk about it.

Tim: Yeah. We had a comment at the bottom asking whether those are just old-school PR tactics in terms of generating links. Yes, it is. It is old-school PR, perhaps with a digital twist in the way we architect it and bring it about; I guess it has SEO knowledge infused within it. Absolutely: why would anybody want to talk about my business, my products, my services? We have to focus on that. Sometimes your product and service might not be that interesting, so you have to look at your target audience.

What conversations are they having online, and how do I engage them, whether that’s with content, with outreach, whatever it is? How do I engage in that particular conversation and ultimately get people talking about my website and business?

We had another question about manual penalties: after you have had a manual penalty revoked, how often should you refresh your link profile?

As often as possible is the answer, but in terms of when you can actually physically do it, a lot of these link tools only update every six to eight weeks or so, so anywhere between a month and two months you should be pulling the data again and seeing what additional link information you can find. So refreshing every six to eight weeks is ideal. You might want to do it a little more often if you have a huge link profile with 10,000 domains in it, because you’re likely to find a lot more links when you do the refresh, but typically we would say six to eight weeks.

Matthew: You’ve also got to look at Google Webmaster Tools. When you do a reconsideration request, Webmaster Tools updates its information, so after you’ve submitted that reconsideration request and Google has processed it, that’s a good point to refresh your Webmaster Tools links and go through those as well. So I would definitely say that even once you’ve had a manual penalty revoked, you can’t really just sit back and wait.

It’s a case of needing to be a bit more active and carrying on: going into those links, understanding what’s in there, and checking whether there’s anything that should be removed again.

Tim: Just one last bit before we wrap up, guys; I think we’re due to finish about now anyway. People ask about this new way of acquiring links by spending time on PR, and how much time you should spend on it. Let’s presume you have the assets to talk about and just talk about the outreach side of it. I guess how much time you spend on it depends on how good your team is.

But you probably need a couple of days in a typical month to make contact with the right people and to follow up, and beyond that it depends on the size of the campaign, how much reach you actually want, and your objectives for the campaign. It’s more about, as I say, becoming worth talking about and then going out there and doing some really effective outreach.

Anyway, guys, if you’ve got any more questions, please send them through. Please email brunch@branded3 and we will get back to you and answer those questions as quickly as possible. Hopefully we’ll do another one before Christmas, and we’ll see you on the next one.

Matthew: Just to quickly mention, we’re running a couple of seminars at the end of the month. There’s one at the Leeds office on the 24th of October, and we’re also running a seminar on the 1st of November at our London office in Farringdon. If you’re interested in attending either of those events: Tim will be speaking, Patrick, our head of search, will be speaking, and Andy Machin, our head of creative, will be speaking, as well as a couple of guest speakers.

So have a look on the website; there are details on there, and you can get tickets there as well if you want.

Tim: If anybody wants to come to any of them get in touch with us. Thanks a lot, guys.

Matthew: Thanks.

Tim: Bye.

By Felicity Crouch at 4:42PM on Monday, 21 Oct 2013

Felicity is Branded3’s Marketing Manager and develops and executes a creative marketing strategy for the agency to encourage new business. With a background in journalism and digital project management, Felicity manages a large number of marketing channels effectively to raise Branded3’s profile and facilitate the growth of the company. Follow Felicity Crouch on Twitter.
