Link-based penalties have been around for a long time now – and quite frankly, they’ve been covered to death!
The next logical thought running through your mind right now is probably: 'Why contradict yourself and write another post on such well-trodden ground?'
The answer is actually pretty simple. And it’s based on real data from the past few weeks of working at the coal face on this stuff. The game has changed.
Recovering from either partial-match Google manual penalties or the even less predictable Penguin algorithmic issues requires more time and data analysis than ever before.
While it was once possible to use automated link classification and removal software, those days appear to be gone. If you want to truly 'guarantee' removal of a manual penalty – or a link profile healthy enough to recover from Penguin – in no more than two attempts, the answer is labour-intensive manual analysis. Let's look at what my team and I have been finding over the past couple of weeks…
Anchor text is the text that an HTML link is attached to, pointing out to another website or source on the internet. A common approach, now significantly dated, was to link through the exact keyword you were attempting to rank for. This technique was hugely successful, but has since been stamped as unnatural.
Various tools can help you identify your anchor text profile – Majestic SEO, Ahrefs, Cognitive SEO, Open Site Explorer and so on. Below is an example of how it should appear, ordered by the number of links coming from each anchor text:
- Brand Name
- White noise
- Keyword 1
- Keyword 2
Many people in SEO believe that as long as you maintain this natural appearance in your anchor text profile, you’ll remain off Google’s radar. However, this is proving to be far from the truth.
If you approach link building with the idea that 'some exact match anchor text is OK', you run the risk of being caught out. Some websites still incorporate this into their strategy and are being penalised – regardless of the strength of their domain or link profile.
Avoid linking unnaturally through to your website; follow a natural approach of linking through a variety of branded anchor texts, as well as white noise such as 'click here' and 'website'.
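To make the buckets above concrete, here is a minimal sketch of how you might classify an exported list of anchor texts into brand, white noise, exact match and partial match. The brand name, keyword list and white-noise phrases are made-up assumptions for illustration – in practice you'd load them from your own keyword research.

```python
# Hypothetical sketch: bucket anchor texts into the categories
# discussed above. BRAND and KEYWORDS are assumptions, not real data.
from collections import Counter

BRAND = "acme"                                        # assumed brand name
KEYWORDS = {"cheap blue widgets", "buy widgets"}      # assumed target keywords
WHITE_NOISE = {"click here", "website", "here", "read more", "this site"}

def classify_anchor(anchor: str) -> str:
    """Assign a single anchor text to one of the profile buckets."""
    a = anchor.strip().lower()
    if BRAND in a:
        return "brand"
    if a in WHITE_NOISE:
        return "white noise"
    if a in KEYWORDS:
        return "exact match"
    # Any keyword term appearing inside a longer anchor counts as partial
    if any(word in a for kw in KEYWORDS for word in kw.split()):
        return "partial match"
    return "other"

def anchor_profile(anchors: list[str]) -> Counter:
    """Count how many links fall into each anchor-text bucket."""
    return Counter(classify_anchor(a) for a in anchors)
```

Running `anchor_profile` over a full backlink export gives you the distribution to compare against the natural-looking shape above – a heavy "exact match" bucket is the warning sign.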
Google is improving at understanding co-citation in articles without links, and a recent study showed the positive impact that links carrying the rel=”nofollow” attribute can have. Stop thinking about the 'link equity' a site can pass to yours and think about the value that link has within that piece of content on that website. If you're creating great content and placing it in highly relevant areas where your audience engages, you will get eyeballs through to your website, and with more eyeballs there's an increased chance of people linking to your site.
Businesses in particular niches, such as banking, exchange and travel, all need widgets that are placed on partner websites.
If the link from that widget isn’t correctly handled, your website can be in trouble. These widgets often have several links, not just one, so this practice must be applied to EVERY single link – not just the one visible to the user.
Common links from the widget can be through the logo, text surrounding the widget fields and through an iframe. Links inside iframes, contrary to popular belief, do pass value and are crawled by Google so it’s important to analyze the quality of the site that you post this onto.
If a widget's links – which often sit site-wide in the sidebar – come through branded anchor text, many are led to believe this is OK. However, site-wide links aren't good to have, and I would immediately recommend applying the nofollow attribute to those links.
The best form of practice to follow is this: if your website is distributing widgets and it will sit in a site wide position (often in the sidebar), automatically nofollow all links to your website from that widget. If the widget is going to be positioned on a specific page, and only that page, assess the quality of the website and if you’re going to allow a followed link to your site, ensure it’s through branded anchor text.
If you aren't able, or don't have the capacity, to individually check each website that requests your widget, simply give all websites the version of your widget where ALL links are nofollowed.
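Since a widget often contains more links than the one the user can see, it's worth auditing the embed markup programmatically rather than by eye. Here is a small sketch, using only the Python standard library, that flags every link in a widget's HTML that is missing rel="nofollow". The example markup is invented for illustration.

```python
# Sketch: find every <a> tag in a widget's embed HTML that does NOT
# carry rel="nofollow". Uses only the standard library html.parser.
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect the href of every followed (non-nofollow) link."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold multiple space-separated tokens, e.g. "nofollow noopener"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel_tokens:
            self.followed.append(attrs.get("href"))

def followed_links(widget_html: str) -> list:
    """Return hrefs of all links that would pass equity."""
    auditor = LinkAuditor()
    auditor.feed(widget_html)
    return auditor.followed
```

If `followed_links` returns anything for the site-wide version of your widget, a link slipped through without the attribute – exactly the hidden-link scenario described above.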
Historic guest posts can wreak havoc on your website if the sites you previously had relationships with have since been hit by a Google penalty.
Matt Cutts recently condemned Guest Posting and signalled it will be coming to an end soon. A lot of businesses and people are concerned by this ousting of one of the most utilised methods of acquiring links.
Google’s finger is very much on the pulse of guest posting and they’re actively searching for more and more networks practicing this method. The critical thing here to bear in mind, however, is what is being classified as a ‘guest post’. There is nothing wrong with creating good content and sharing it with relevant sites. We would class this as a ‘guest post’ and it certainly is not against Google guidelines. The issue comes when that process is based on poor quality content with paid links. This, in my opinion, is not guest posting, just article spam in a different set of clothes.
My advice for (low-quality) guest posting is to avoid it. Move your time into digital PR and building relationships that require great content to promote your brand in the arenas with more eyeballs – not sites that will place any content as long as it has a link in it.
Check ALL historic link building campaigns and the metrics of each host website to see whether it has been hit by a penalty. The following methods will help you identify whether a site where your content has historically been placed has been hit:
- Google search for the website: is it indexed? If not, it has been removed from Google's index.
- Does the website have any PageRank? If it has PR 0 or PR n/a – I would remove any content.
- Check the Search Visibility of the website by using Searchmetrics (or something similar) and if you notice any drops, compare it to a history of Google algorithm updates to identify if it’s penalty related.
- Using Majestic SEO or Ahrefs, check the website's referring domains – have they taken a hit? If so, it's most likely because other sites have removed their links due to poor quality.
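The four checks above can be rolled into one pass over your list of host sites. The sketch below assumes you've already gathered the metrics into simple records; the field names and thresholds (a 50% visibility drop, a 30% loss of referring domains) are illustrative assumptions, not fixed rules – in practice the values come from Google itself, a PR checker, Searchmetrics and Majestic/Ahrefs.

```python
# Sketch: apply the four penalty-detection checks above to a record
# of site metrics. Field names and thresholds are assumptions.

def penalty_signals(site: dict) -> list:
    """Return the reasons (if any) a host site looks penalty-hit."""
    reasons = []
    if not site.get("indexed", True):
        reasons.append("deindexed by Google")
    if site.get("pagerank") in (0, None):
        reasons.append("PR 0 or n/a")
    if site.get("visibility_drop_pct", 0) >= 50:            # assumed threshold
        reasons.append("sharp search-visibility drop")
    if site.get("referring_domains_change_pct", 0) <= -30:  # assumed threshold
        reasons.append("referring domains falling away")
    return reasons
```

Any site returning one or more signals goes on the list for content removal or, failing that, the disavow file.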
I have seen many articles and blogs that completely dismiss directory links – and, for the most part, they're right to.
Directories were the easy way of getting a large number of exact match links. Place your content on the wrong directory, however, and it would automatically be spun and replicated; you may have reaped the rewards then, but you could now be left on your knees grovelling.
Cognitive SEO has a great feature that identifies your links by webpage type – directories being one of them – which speeds up this process. Identifying all the directories manually is much more thorough, though, as the software is usually only around 90% accurate.
There are still some highly valuable directories in existence, which will significantly help your search visibility. What I am stressing here is, DON’T automatically disavow/remove ALL directory links. Analyse each individual directory link and if the metrics are powerful, ensure it has branded anchor text and keep it.
If the directory fits all of the right metrics but has exact match anchor text, try to change it to branded; if you can't, remove it.
However, if you have already submitted a reconsideration request and the sample link data from Google includes directories, you will have to become much more aggressive with the directory links you previously approved.
The final point I want to cover is specialist link penalty software, such as LinkRisk and Link DTOX. These pieces of software are an essential part of our link removal process, but they shouldn’t be 100% relied upon.
They have both helped us successfully recover clients from their respective penalties, so this isn't a complete dismissal – they're integral in analysing large link profiles and in seeing how disavow files have affected the general health of a website's link profile.
Once you have analysed your link profile with your chosen penalty software, manually review each link in its respective group. We have noticed a definite shift in Google's approach to penalties, especially if you have Penguin or a partial manual action. When you're getting very close to successfully recovering from a penalty, Google WILL pick out individual links to hold you back on, so you have to be very thorough – and, where necessary, aggressive – with your removals and your disavow file.
The main example we’ve seen of this is exact match anchor text on very good sites.
If there is one piece of advice I can leave you with, it is to be thorough. Your first priority HAS to be exact match anchor texts and guest posts, as this is what Google is being very aggressive about when declining reconsideration requests – and there's a high chance it's part of Penguin's algorithm!
If you're submitting reconsideration requests, study and analyse the three sample links Google sends back, as these tell you A LOT about what Google is looking at in your link profile. Look at the anchor text, the type of website, the metrics and the page the link points to on your website – are there patterns? If so, follow them up!
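Spotting a pattern across only three sample links is easy to do by hand, but if you're tracking samples across multiple reconsideration attempts, tallying their shared attributes helps. The sketch below is a hypothetical illustration – the link records and field names are invented, and the idea is simply to surface which anchor type, site type or target page keeps recurring.

```python
# Sketch: tally the attributes of Google's sample links across
# reconsideration attempts to surface recurring patterns.
# The records and field names below are made up for illustration.
from collections import Counter

def common_patterns(sample_links: list) -> dict:
    """Count each attribute value across the sample links."""
    patterns = {}
    for field in ("anchor_type", "site_type", "target_page"):
        patterns[field] = Counter(link.get(field) for link in sample_links)
    return patterns
```

If, say, every sample points at the same landing page or every anchor is exact match, that's the pattern to follow up on before the next submission.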
In several recent messages we have seen examples of very good quality sites being picked out as 'spam' simply because they were using partial match, or compound, commercial anchor texts. Filtering by a certain level of authority or a certain metric just won't cut it any more. While we still use an automated classifier to help prioritise link lists initially, the only sure way of checking is to manually inspect each link for the following:
> Anchor text > partial AND exact match should come under a LOT of scrutiny here.
> Trust and Citation > Majestic's metrics can be really useful in segmenting links for manual inspection. Follow our guide here for how to do that.
> DA > or 'Domain Authority' can help to a degree, but we use it only as a secondary check, preferring the above as a measure of quality and trust.
> On page check > Check for obvious signs of paid link activity – lots of links out (especially using commercial anchors), little content above the fold, rarely updated content etc.
> External backlinks > This can take an age, so to keep it quick check the number of backlinks and the link types. If the number is high, the links are all directories, and the site doesn't deserve that many, it's probably best to stay away.
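The checklist above can be sketched as a simple per-link red-flag scorer. All the thresholds and field names here are assumptions for illustration (a Trust Flow floor of 15, a DA floor of 20, 100+ outbound links as a paid-link smell) – tune them to your own data rather than treating them as rules.

```python
# Sketch scoring a single link against the manual checklist above.
# Thresholds and field names are illustrative assumptions.

def link_red_flags(link: dict) -> list:
    """Return the checklist items this link fails."""
    flags = []
    if link.get("anchor_type") in ("exact match", "partial match"):
        flags.append("commercial anchor text")
    if link.get("trust_flow", 0) < 15:               # assumed Majestic floor
        flags.append("low Trust Flow")
    if link.get("domain_authority", 0) < 20:         # secondary check only
        flags.append("low DA")
    if link.get("outbound_links", 0) > 100:          # paid-link smell
        flags.append("excessive outbound links")
    if link.get("backlinks_mostly_directories", False):
        flags.append("backlink profile dominated by directories")
    return flags
```

A link with no flags is probably safe to keep; one or two flags means a proper manual look; several flags and it's a candidate for removal or the disavow file.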
Would you add anything to our list? If so, please share it in the comments below. More data helps us all!