What You Need To Know About Google Penalties

Stuart Shaw 13 years ago

If your website experiences a sudden fall in ranking for its main keywords, it could be down to a number of issues, but an algorithm change by Google is often a major factor.

Google have released a number of algorithm changes over the years, and each time a new one is released there are winners and losers, with the drop in rankings usually incorrectly blamed on Google penalties.

If you suspect a Google penalty, it first makes sense to check on any of the social networking sites or SEO Forums to see if any Google algorithm changes have been made, which could be the cause of the problem.

Where the reduction in traffic from Google organic (non-paid) search is very extreme, a penalty is much more likely, especially if the site has breached one or more of the Google Webmaster Guidelines.

Contents

Panda

Link based penalties

Myths on penalties

How to check if it's a penalty, not an update issue

Panda

TL;DR – Google has changed how it handles brand-related search queries, marking the beginning of the end for leveraging other popular brands with high search volume for your own gain.

We have been closely monitoring the SERP flux since it was announced that Panda 4.2 was rolling out a few weeks ago and there does now appear to be an eye-opening pattern emerging. And it hits at the very heart of our understanding of Panda.

Alarmingly, many sites that had historically appeared for other brand related searches have seen a massive drop in search visibility and it appears to be related to how Google is increasing its entity and semantic understanding.

While in the past a search for a brand such as Facebook would have featured a real mix of domains it now appears that patterns are emerging – patterns that fill SERPs ONLY with URLs that directly relate to the entity that is Facebook.

There also appears to be a separation between what we are describing as commercial and non-commercial intent brand searches. For example, searching for brands like eBay, PayPal and Rightmove is different to searching for Sony TV. The first three are entity, non-commercial searches, but the latter clearly has some commercial intent behind it as a search term. As a result its SERP contains a mix of domains and commercial options, while the SERPs for the former are very, very different.

Brands like House of Fraser, Debenhams, Currys, etc. will rank for a large number of brand queries due to the products they stock. They have the right to rank for these queries and it’s great to see that they have not been wrongly impacted. This type of query has implicit purchase intent behind it, whereas queries such as PayPal, eBay and Skyscanner should return their brand entities – website, brand information, etc.

Let’s dive into the data to understand this in more detail.

Meaningful traffic vs Meaningless traffic

A website ranking for a large brand keyword is seen as meaningless due to the nature of the search. Users who search for the likes of eBay, Skyscanner and Rightmove aren’t looking for an alternative website.

Google has now taken this intent and run with it, and as a result, a search for Skyscanner returns:

  • Skyscanner website
  • Skyscanner related ‘In the news’
  • Skyscanner mobile website
  • Skyscanner app on iTunes
  • Skyscanner Facebook page
  • Skyscanner Google Play app
  • Skyscanner Twitter page

There is no other commercially related brand/website on the first page for this brand-related query, whereas before there was.

1 - Skyscanner

This is incredibly relevant to the user’s search for Skyscanner as it displays a lot of different options of Skyscanner entities, from its app to the Facebook page.

The inclusion of apps in organic desktop results is something we’ve not seen a lot of before, but we can see from the iTunes website in the US that they’re now appearing for a lot more search terms, which has increased its search visibility sharply over the last month – 36% growth.

2 - iTunes Apple

We are also now beginning to see huge growth in the UK for the Apple iTunes store on brand-related queries where the brand has an app.

3 - iTunes Apple

Another website to have thrived through this update is the BBC. This is a sign to me that Google has created a list of verified sources that deserve to rank for brand related keywords. Another to have done well through the update is London Stock Exchange.

5 - BBC

Facebook, Twitter and Google Play have all seen an increase in visibility and rankings for brand related queries. Facebook’s largest growth this week has come from brand queries - see below. This is evidence that Google is pushing brand entities in search results through Panda 4.2.

4 - Facebook

Google Play used to be nowhere for a lot of brand related queries. Now, if you have an app on Google Play, it is beginning to appear on Page 1.

It is also interesting to note that this pretty significant change may actually be flying under the radar to a degree, for a couple of key reasons.

If we look at popular SERP volatility trackers such as Mozcast we can’t see hugely drastic shifts but this could be explained by the fact that they monitor key commercial intent SERPs and don’t monitor, or heavily weight, change in brand search results as much.

The other point is an internal one. While search visibility may show huge drops, the effect of no longer ranking for low-converting searches will be much less pronounced on the metric that matters – revenue.

With updates like this, there are always losers. If you have historically gained search visibility by utilising other brands’ search volume through content, you will see a drop.

A great example of someone being impacted by this is Windows Phone – windowsphone.com – who have experienced a 90% drop in search visibility.

All of the significant drops that contributed to this loss are mainly brand related keywords that the website used to rank for. Their apps used to rank highly for the keywords that have been impacted (see below), but we have noted there are some technical questions to be asked about the website’s current redirect structure. Nonetheless, it’s odd.
6 - Windows Phone

Where they haven’t got it quite right…

Soundcloud have seen a large increase in search visibility as a direct result of this update due to the vast amount of authors/DJs that they have with brand mentions forming part of their name.

For example, DJ Paypal now ranks above PayPal’s twitter.

7 - PayPal

 

8 - Soundcloud

 

9 - Soundcloud

Another example of difficulty in understanding the nature of a term is ‘Broadband Choices’ – a large brand in its own right, but the keyword itself is vague and open to interpretation.

The search result for this lists the brand and then GoCompare in third, clearly showing the update has not impacted this SERP and that Google are perhaps targeting large/global brands first.

What does this mean for marketers?

In conclusion, we’re seeing a huge shift in brand related queries. If you have a website ranking well for a brand that isn’t related to your own, you are more than likely going to see this drop in the next few weeks, if not already.

This is because Google appear to be favouring ‘credible sources’ over other websites ranking well in organic search for brand terms that aren’t their own.

To boost your own brand presence in organic search, it’s a good idea to get the following set up to be on the positive side of this update.

  • Wikipedia
  • If you have an app, list it on iTunes
  • YouTube
  • News
  • Stocks, if listed

We’re seeing this now in the UK and I imagine that this has already rolled out aggressively across the US.

The main takeaway is to strip things right back, again, and always focus on building an awesome brand. As a brand, you want to develop across the right channels to ensure this presence is earned.

Creating a brand footprint is essential and this can be done through earning citations in big pieces of content; the distribution of award winning content; creating a large social following with great engagement; developing mobile friendly websites and apps for all devices/platforms; and making noise through offline activity to earn all of the above naturally as an added extra value.

Link based penalties

Recovering from either partial-match Google manual penalties or the even more unpredictable Penguin algorithmic issues requires more time and data analysis than ever before. While it was once possible to use automated link classification and removal software, it seems those days are gone. To truly 'guarantee' removal of a manual penalty, or a link profile worthy of recovery from Penguin, in no more than two attempts, the answer is labour-intensive manual analysis. Let's look at what my team and I have been finding over the past couple of weeks...

Anchor texts

Anchor text is the visible text that an HTML link is attached to, pointing to another website or source on the internet. A once-common approach, now significantly dated, was to link through the keyword you were attempting to rank for. This technique was hugely successful, but has since been stamped as unnatural. Various tools can help you identify your anchor text profile, such as Majestic SEO, Ahrefs, Cognitive SEO and Open Site Explorer, and below is an example of how it should appear, ordered by the number of links coming from each anchor text:

  1. Brand Name
  2. www.brandname.com
  3. White noise
  4. Keyword 1
  5. Compound
  6. Keyword 2
  7. ETC

Many people in SEO believe that as long as you maintain this natural appearance in your anchor text profile, you’ll remain off Google’s radar. However, this is proving to be far from the truth. If you approach link building with the idea that ‘some exact match anchor text is ok’, you are running the risk of being caught out.

Some websites are still incorporating this into their strategy and are being caught out – the strength of their domain or link profile doesn’t matter. Avoid linking unnaturally through to your website and follow a natural approach of linking through a variety of branded anchor texts, as well as white noise – such as ‘click here’ and ‘website’ etc. Google is improving at understanding co-citation in articles without links, and a recent study showed the positive impact that rel=”nofollow” attributes assigned to links can have.
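To keep an eye on this, the anchor text distribution above can be approximated from any backlink export. Here is a minimal Python sketch; the brand terms and ‘white noise’ phrases below are placeholder assumptions – substitute your own brand and navigational anchors:

```python
from collections import Counter

# Placeholder category lists - substitute your own brand/navigational anchors.
BRAND_TERMS = {"brand name", "brandname", "www.brandname.com", "brandname.com"}
WHITE_NOISE = {"click here", "website", "here", "visit site", "read more"}

def classify_anchor(anchor):
    """Bucket one anchor text into brand / url / white noise / keyword."""
    a = anchor.strip().lower()
    if a in BRAND_TERMS:
        return "brand"
    if a in WHITE_NOISE:
        return "white noise"
    if a.startswith("http") or a.startswith("www."):
        return "url"
    return "keyword"  # anything left is a potential exact/partial match anchor

def anchor_profile(anchors):
    """Return the percentage share of each anchor category."""
    counts = Counter(classify_anchor(a) for a in anchors)
    total = sum(counts.values())
    return {cat: round(100 * n / total, 1) for cat, n in counts.items()}

# Toy data - in practice, feed in the anchor column of a Majestic/Ahrefs export.
sample = ["Brand Name", "click here", "cheap widgets", "www.brandname.com",
          "Brand Name", "website", "cheap widgets uk"]
print(anchor_profile(sample))
```

If the ‘keyword’ bucket dominates the distribution, that is exactly the red flag described above.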

Stop thinking about the ‘link equity’ a site can give to yours and think about the value that link has in that piece of content on that website. If you’re creating great content and placing it in highly relevant areas where your audience engages, you will get eyeballs through to your website, and with more eyeballs there’s an increased chance of people linking to your site.

Widgets

Businesses in particular niches, such as banking, exchange, travel etc. all have the need for widgets as they’re placed on partner websites. If the link from that widget isn’t correctly handled, your website can be in trouble. These widgets often have several links, not just one, so this practice must be applied to EVERY single link – not just the one visible to the user. Common links from the widget can be through the logo, text surrounding the widget fields and through an iframe. Links inside iframes, contrary to popular belief, do pass value and are crawled by Google so it’s important to analyse the quality of the site that you post this onto.

If links from widgets, which often reside site-wide in the sidebar, come through branded anchor text, many are led to believe that this is OK. However, site-wide links aren’t good to have and I would immediately recommend applying the nofollow attribute to them.

The best form of practice to follow is this: if your website is distributing widgets and it will sit in a site wide position (often in the sidebar), automatically nofollow all links to your website from that widget. If the widget is going to be positioned on a specific page, and only that page, assess the quality of the website and if you’re going to allow a followed link to your site, ensure it’s through branded anchor text. If you aren’t able, or don’t have the capacity, to individually check each website that requires your widget, simply give all websites the nofollowed version of your widget where ALL links are nofollowed.
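As an illustration of that ‘nofollow everything’ fallback, here is a rough Python sketch that stamps rel="nofollow" onto every anchor in a widget’s embed code. The widget markup is hypothetical, and the naive regex approach is for illustration only – production code should use a real HTML parser:

```python
import re

def nofollow_all(widget_html):
    """Add rel="nofollow" to every <a> tag in a widget's embed HTML.
    Naive regex sketch - a real implementation should use an HTML parser."""
    def add_rel(match):
        tag = match.group(0)
        if 'rel="' in tag:
            # Append nofollow to an existing rel attribute if it is missing.
            return re.sub(r'rel="([^"]*)"',
                          lambda m: ('rel="%s"' % m.group(1)
                                     if "nofollow" in m.group(1)
                                     else 'rel="%s nofollow"' % m.group(1)),
                          tag)
        return tag.replace('<a ', '<a rel="nofollow" ', 1)
    return re.sub(r'<a\s[^>]*>', add_rel, widget_html)

# Hypothetical widget embed code.
widget = '<div class="widget"><a href="https://example.com">Example</a></div>'
print(nofollow_all(widget))
# -> <div class="widget"><a rel="nofollow" href="https://example.com">Example</a></div>
```

Running this over the nofollowed version of your widget before distribution means every link is covered, not just the visible one.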

Guest posts

Historic guest posts can play havoc with your website if the sites you previously had relationships with have since been hit by a Google penalty. Matt Cutts recently condemned guest posting and signalled it will be coming to an end soon. A lot of businesses and people are concerned by this ousting of one of the most utilised methods of acquiring links. Google’s finger is very much on the pulse of guest posting and they’re actively searching for more and more networks practising this method.

The critical thing here to bear in mind, however, is what is being classified as a 'guest post'. There is nothing wrong with creating good content and sharing it with relevant sites. We would class this as a 'guest post' and it certainly is not against Google guidelines. The issue comes when that process is based on poor quality content with paid links.

This, in my opinion, is not guest posting, just article spam in a different set of clothes. My advice for (low quality) guest posting is to avoid it and move your time into Digital PR, building relationships that require great content to promote your brand in the arenas with more eyeballs, not ones that will simply place any content with a link in it. Check ALL historic link building campaigns and the metrics of each website to see if it has been hit by a penalty. The following methods will help you identify whether a site where your content has been placed historically has been hit:

  1. Google search for the website – is it indexed? If not, it has been removed by Google.
  2. Does the website have any PageRank? If it is PR 0 or PR n/a, I would remove any content.
  3. Check the search visibility of the website using Searchmetrics (or something similar) and, if you notice any drops, compare them against a history of Google algorithm updates to identify whether they are penalty related.
  4. Using Majestic SEO or Ahrefs, check the website’s referring domains – have they taken a hit? If so, it’s most likely because other sites have removed their links due to poor quality.

Directories

I have seen many articles and blogs that completely dismiss directory links and, for the most part, it’s true. Directories were the easy way of getting a large number of exact match links. If you placed your content on the wrong directory, however, it would automatically be spun and replicated; you would have reaped the rewards then, but could now be left on your knees grovelling. Cognitive SEO has a great feature that allows you to identify your links by webpage type – one of them being directories – which speeds up this process. Doing this manually and identifying all the directories yourself is much more thorough, as software is usually only around 90% accurate. There are still some highly valuable directories in existence, which can significantly help your search visibility.

What I am stressing here is, DON’T automatically disavow/remove ALL directory links. Analyse each individual directory link and if the metrics are powerful, ensure it has branded anchor text and keep it. If the directory fits all of the right metrics and has exact match anchor text, try and change it to branded and if you can’t, remove it. However, if you have already submitted a reconsideration request and the sample link data from Google are directories – you will have to become much more aggressive with the directory links that you have previously approved.

Penalty Software

The final point I want to cover is specialist link penalty software, such as LinkRisk and Link Detox. These pieces of software are an essential part of our link removal process, but they shouldn’t be relied upon 100%. They have both helped us successfully recover some of our clients from their respective penalties, so this isn’t dismissing them completely; they’re integral in analysing large link profiles and in analysing how disavow files have had an impact on the general health of a website’s link profile.

Once you have analysed your link profile with your respective penalty software, you should manually review each link in its respective group. We have noticed a definite shift in Google’s approach to penalties, especially if you have Penguin or a partial manual action. When you’re getting very close to successfully recovering from a penalty, Google WILL pick out individual links to hold you back on, so you have to be very thorough and, where necessary, aggressive with the use of removal and your disavow file. The main example we’ve seen of this is exact match anchor text on very good sites.

Penalty Recovery

If there was one piece of advice I can leave you with here it is to be thorough. Your first priority HAS to be exact match anchor texts and guest posts, as this is what Google are being very aggressive with when declining reconsideration requests, and there’s a high chance it’s included as part of Penguin’s algorithm! If you’re submitting reconsideration requests, study and analyse the three sample links that they send back, as this tells you A LOT about what Google are looking at in your link profile. Look at the anchor text, the type of website, the metrics and the page the link is pointing to on your website – are there patterns? If so, follow it up! In several recent messages we have seen examples of very good quality sites being picked out as examples of 'spam' simply because they were using partial match, or compound, commercial anchor texts.

Simply being able to filter by a certain level of authority or a certain metric just will not cut it any more, and while we will use an automated classifier to help initially prioritise link lists, the only sure way of checking is to manually inspect each one for the following:

  • Anchor text – partial AND exact match should come under a LOT of scrutiny here.
  • Trust and Citation – Majestic's metrics can be really useful in segmenting links for manual inspection. Follow our guide for how to do that.
  • DA – 'Domain Authority' can help to a degree, but we use it only as a secondary check, preferring to use the above as a measure of quality and trust.
  • On page check – check for obvious signs of paid link activity: lots of links out (especially using commercial anchors), little content above the fold, rarely updated content etc.
  • External backlinks – this can take an age, so to keep it quick check the number and link type. If it's a high number, all directories, and the site doesn't deserve that many, then probably stay away.

Would you add anything to our list? If so, please share it in the comments below. More data helps us all!

Myths, busted

Google’s increasingly cryptic advice surrounding manual penalty recovery has created a void that is increasingly being filled with misinformation, and has spawned a cottage industry of poorly informed vendors feeding off that lack of understanding.

It’s something I see day after day now as site owners come to us at their wits’ end having, as far as they are concerned, done ‘everything’ they can to remove said penalty.

I’m going to put this out there: some of the ‘advice’ written on manual penalty recovery has been guesswork, and some of those attempting to recover sites have little clue about just how thorough you need to be in order to achieve a revoke.

As harsh as that may sound, over 12 months or so of fixing this for others I have seen more than enough evidence to support this opinion, and today I want to attempt to tackle some of the myths surrounding the harshest of all Google’s penalties to date.

Having played with various ways of segmenting link data, including our own process based around link trust, I wanted to share some of my own ‘opinions’ to help counterbalance the misguided information out there about manual penalties.

Myth: Using sample data from Google Webmaster Tools

Fact: Make sure you grab ALL link data

The most common misconception when it comes to dealing with unnatural links is that you can just grab your link data from Google Webmaster Tools or Majestic SEO and go through your links. The truth is this is not enough. You have to remember that when Google place a manual penalty on a site, yes, they are only looking at a snapshot of your link profile, but we have no idea which part. The only thorough and conclusive way of dealing with a manual penalty is to grab all the link data you can get hold of.

At Zazzle Media we have had good success in using the following combined data sources:

  • Majestic SEO
  • Ahrefs
  • Open Site Explorer
  • Google Webmaster Tools (Both Sample data and Latest)

If you grab your link data from all of these, copy it all into one Excel spreadsheet and remove the duplicates, you have your entire link profile in one document. From here you can start going through it and finding the links that are causing the issues.
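That merge-and-dedupe step is also easy to script rather than doing by hand in Excel. A sketch in Python, assuming each export is a CSV; the column holding the link URL varies by tool, so `url_column` is an assumption to map onto each export’s actual header:

```python
import csv

def dedupe_links(rows, url_column="URL"):
    """De-duplicate link rows by normalised URL. Tools disagree on case and
    trailing slashes, so normalise before comparing."""
    seen, merged = set(), []
    for row in rows:
        url = row.get(url_column, "").strip().lower().rstrip("/")
        if url and url not in seen:
            seen.add(url)
            merged.append(row)
    return merged

def merge_link_exports(paths, url_column="URL"):
    """Read each tool's CSV export and combine into one de-duplicated list."""
    rows = []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            rows.extend(csv.DictReader(f))
    return dedupe_links(rows, url_column)
```

Calling `merge_link_exports(["majestic.csv", "ahrefs.csv", "ose.csv", "gwt.csv"])` (illustrative file names) then leaves you with a single list of links to review.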

Tip: We have seen a lot of people exporting the standard CSV exports from link data sites. Majestic, for example, will only give you the first 2,500 links from the CSV export in Site Explorer. If your site has more than 2,500 links you will need to run an advanced report in Majestic to get all of them. The same applies to Open Site Explorer, and in Ahrefs you will have to complete a raw export.

Myth: Automated link analysis tools can do the job

Fact: Do not fully trust link analysis software

There are some very good link analysis tools becoming established in the market that are able to help detect unnatural links. Link Detox seems to be the largest and most popular; others are emerging on the scene, such as Cognitive SEO, which has recently launched an automatic unnatural link detection feature in its software.

These tools are good and you are able to feed all your link data into them from other sources; however, they are not entirely accurate. They are a good start, but we have seen a lot of healthy links being pulled in as unnatural and also a lot of unnatural links being reported as healthy. The tools are good in the sense that they will check whether the links are indexed in Google and have several ways of detecting spammy link networks, but there are many other factors that they fall down on.

One example would be links built with spun content. We have seen a lot of sites that have entire link profiles built by submitting artificially spun articles to websites with links placed within them. The issue here is these links can be placed on perfectly healthy websites that will not be detected by the unnatural links tools but Google can easily work out that this link should be penalised because of the content on the page.

By all means use the link analysis tools as a guide, but always go through your links manually to make sure you catch all the unnatural ones, and make sure you do not remove any that are healthy and may be providing a lot of link juice. You can quite easily cause a lot of damage if you remove the wrong links from your profile.

Myth: Just remove low quality and low DA links

Fact: Do not leave a single unnatural link in the profile

When trying to deal with a manual penalty on their site, a lot of people are very careful about which links they remove. They may find an unnatural link that is a PR3 and want to leave it because it may be helping – a big mistake. It is in fact these links that are causing the problem, and they must be removed.

The best way to think of this situation is that when you submit your site for reconsideration with Google, a human – a real person – will be looking over your links to see if you can be allowed back into their trusted zone. When a human is involved, you can be assured they can turn down any reconsideration if they see a single unnatural link left.

For this reason it is critical that all the unnatural links are captured and dealt with before reconsidering.

Myth: Remove the unnatural links, disavow any that cannot be removed

Fact: Make sure you disavow ALL unnatural links before starting link removal

Once all the spammy links pointing to a site have been gathered, it’s now time to start the disavow and link removal process. In this situation, most people start the link removal process first, get a good number of links removed and then disavow any links that they have failed to remove. This is not the correct way, and the reason is simple. Even though you have removed the flagged unnatural links, Google will not know this until they recrawl the linking sites and register the change. This ultimately means that you could file a reconsideration request and Google may turn it down simply because they don’t know about a link that you have already removed. If you disavow all your unnatural links before the link removal process, you are effectively telling Google about all the unnatural links you want to deal with before you have dealt with them.
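That ‘disavow first’ workflow starts with a disavow file in the format Google’s disavow tool expects: ‘#’ for comment lines, `domain:` entries for whole domains, and bare URLs for individual pages. A small Python sketch to generate one – the domains and URLs shown are illustrative placeholders, not real spam sources:

```python
def build_disavow_text(domains, urls):
    """Build disavow file contents in the format Google's disavow tool
    expects: '#' comment lines, 'domain:example.com' entries for whole
    domains, and bare URLs for individual pages."""
    lines = ["# Disavow file generated ahead of link removal outreach"]
    lines += ["domain:%s" % d for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines) + "\n"

# Illustrative placeholder entries only.
print(build_disavow_text(
    ["link-farm.example", "spammy-directory.example"],
    ["http://spun-articles.example/post-1.html"],
))
```

Save the output as a plain-text file (e.g. disavow.txt) and upload it through the Disavow Tool in Webmaster Tools before you start the removal outreach.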

We have had sites’ penalties revoked when we submitted a reconsideration request the day after submitting the disavow file. From this we have worked out that the human reviewers at Google must overlay the disavow profile immediately when looking at the site. This ultimately means that if you disavow ALL the unnatural links and show a good percentage of them removed, you will get revoked.

Myth: A simple outreach email asking for the link to be removed is enough

Fact: Spend time on writing a good removal email to get the highest conversion rate

It is well known now that, although not impossible, it is very difficult to recover from a manual penalty by disavowing links alone. However, it is possible to recover by disavowing all your unnatural links and removing some of them. At Zazzle Media we have had recoveries after removing only around 15% of the links, so that is a good benchmark to go on. Obviously, the more links you are able to remove the better.

When it comes to removing links, create a good, well-written email that explains the situation and details the links that need to be removed. Some of the emails we have seen are quite threatening, in the sense that they brutally demand the links be taken down ‘or else’. From the conversion rates we have seen, this is not needed.

A simple email explaining the situation, and that you have lost traffic because of it, often does enough. It can help to add a small amount of fear by explaining that sites providing unnatural links to other sites can themselves be targeted for penalisation – this often helps conversion too. Always remember, you need to have as many links removed as possible to improve your chances of being revoked. Do not skimp on this part of the process.

Myth: When a site recovers, rankings will be restored

Fact: A site’s rankings will not immediately increase, and may drop even further, before they start to recover

When we pull a site out of a manual penalty, most people expect to see an increase in rankings as a result. Unfortunately this is simply not the case, and it is quite simple to see why once you think about it.

Think of a site that may be on page 3 for its keyword term when it starts up. At that point it may have used spam to game the algorithm and progress to page one, or even position one. At this point the site owners are going to be extremely pleased with the results and expect them consistently. Then the manual penalty hits, and they are reduced to lower rankings than when they started.

From this point they remove a load of links and recover. Originally, and even during the manual penalty stage, the site may still have been propped up by the PageRank of the unnatural links. Now that these links have been removed, there is only one thing that can happen: the site will drop even further in rankings.

The only way to see the site’s rankings return is to build a consistent number of good quality links that will restore the lost PageRank.

There is a lot of information about manual penalties and a lot of information that isn’t necessarily correct. However the main points I have raised cover off many of the most common issues we've come across in relation to manuals. Hopefully, then, you guys can read through this, take away some points and push your site through the recovery process faster.

How To Test For A Google Penalty

When a penalty is suspected, you can start by checking the number of URLs Google has indexed. If none are indexed, the probability of a Google penalty is high, especially if you know that your site was previously indexed.

Where your site or certain pages have been dropped from the index, always investigate any recent web server outages. Google remove websites that return 404 errors when Googlebot crawls them.

Searching for the exact company and domain names in Google is another method. If you no longer rank for these terms having previously ranked highly for your own brand name, a penalty is likely to have been incurred. The exception to this rule is a new website with few backlinks, which may not yet be indexed by Google, or which lacks enough trust to rank well.

However, not all of the penalties handed out by Google result in a loss of PageRank. Various filters can be triggered by unnatural irregularities in backlinks, by excessive link buying, or by reciprocal link exchange, particularly when you use similar anchor text in links.

To avoid a penalty or SERP filter, take particular care when embarking on any link-building program. Specifically, avoid reciprocal link exchange becoming the mainstay of your SEO campaign and if you do buy links, make sure that they are not placed in the footer of the site you're listed on or under sub-headings such as 'Our Sponsors'. Links placed naturally within content on relevant sites are much more sensible and less detectable.

If you suspect that your website has received a penalty, check your site for breaches of the guidelines and, where appropriate, request reconsideration from your Webmaster Tools account.

Intriguingly, some web sites which are in violation of the guidelines receive an e-mail from Google advising them to clean up their act, warning of a penalty and website de-indexing. When the breach is removed from the offending site, Google usually clears the penalty and re-indexes the site, as many so-called penalties are actually 'filters' triggered by irregularities found by Google's algorithm.

Why have I got a Google penalty?

If your website has received a penalty, here is a guide to help you diagnose the cause and find a resolution.

Link building

In-bound link over-optimisation is the most common cause of a penalty. Poor SEO techniques, including aggressive link building using the same keywords in link anchor text, cause problems, especially when the website is relatively new and picking up rankings quickly.

When managing link building campaigns, vary the link text and incorporate a variety of different keywords. Also ensure you have a fair spread of backlinks using your brand name or URL, and that links are acquired to pages other than just your homepage.

There is strong evidence that Google has recently introduced some new automatic over optimisation filters into their algorithm. These seem to have the effect of applying a penalty to a page over optimised for the same keyword by link building.

Keyword stuffing

Remove excessive keyword stuffing in your website content. Always use natural, well-written web copywriting techniques – a bit like this fine piece of literature.

Banned Domain/URL Links

Run a test on all outbound links from your site to see if you are linking to any sites that have been banned. These will be sites showing PageRank 0 with a greyed-out toolbar PageRank indicator.

If your site gets a significant number of links from banned sites or sites that have been identified as manipulative link networks or other Black Hat sources, then your ranking may drop. Always remove links from Black Hat link farms or banned domains. Ultimately, adopt ethical link building practices to avoid these problems in the first place.
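As a starting point for such an audit (a sketch only; the HTML and domain names below are stand-ins for your own pages), the outbound links on a page can be collected with Python's standard-library HTML parser, leaving you a list of external URLs to check against whatever blacklist or quality criteria you prefer:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCollector(HTMLParser):
    """Collect href targets that point at external domains."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links have no host; same-host links are internal.
        if host and host != self.own_domain:
            self.outbound.append(href)

# Stand-in page; in practice feed in each of your site's real pages.
page = """
<a href="/about">About</a>
<a href="https://partner-site.example/widgets">Partner</a>
<a href="https://mysite.example/contact">Contact</a>
"""

collector = OutboundLinkCollector("mysite.example")
collector.feed(page)
print(collector.outbound)  # → ['https://partner-site.example/widgets']
```

Running this across a site crawl gives you the full set of external destinations to review for banned or low-quality domains.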

Devalued Links

When you sit there scratching your head trying to work out the reasons for a drop in Google’s rankings, it's worth bearing in mind that Google continually devalues links from manipulative sources, considering them a means for spammers to artificially raise the ranking of their sites. Paid links are a particular target for the ‘spam team’, and Google applies continual algorithm tweaks to combat link spam.

When link devaluation is applied, as it has with reciprocal links as well as those from many paid text links, low quality web directories and link farms, the recipient website may suffer a drop in ranking. The severity of any fall is associated with the website's reliance on that particular type of linking.

Spammy Link Neighbourhoods

Check you are not linking to any bad neighbourhoods, link farms or doorway pages.

If in doubt, we recommend quality checking all of your outbound links to external sites using the Bad Neighbourhood detection tool. Whilst this isn't perfect, it may spot "problem sites". Another good tip is to search for the HTML homepage title of sites that you link to. If the sites don't come up in the top 20 of the SERPs, then linking should be avoided.

Cross Linking / Sitewide Links

If you run more than one website and the Google penalty hits all sites at the same time, check the interlinking between those sites. Extensive keyword-optimised interlinking of websites, particularly those hosted on the same IP address, can be viewed as "link schemes."

Site-wide links such as those offered in a Blogroll should be avoided, particularly if they’re keyword optimised. The reality is that site-wide links do little to further improve visibility in the SERPs, as Google only counts one link from a site to another. There is some evidence that the extensive use of unnatural looking site-wide links can lower Google’s trust in a site, which can reduce its ranking.

Paid Links

There is significant evidence that link buying can hurt rankings, and Matt Cutts (head of Google's spam team) mentions this on his Google SEO blog. Matt states that Google also devalues links from companies selling text links, so that they offer zero value to the recipient in terms of improving website ranking or PageRank.

Reciprocal Links

Excessive reciprocal linking may trigger a penalty or cause a SERP filter to be applied when the same or very similar link anchor text is used and large numbers of reciprocal links are added in a short time, leading to a high link accrual rate. Reciprocal linking should be restricted to companies you have some business relationship with, rather than solely for SEO benefit.

Lots of reciprocal link building with low quality sites or websites that have an unrelated theme is not recommended. This can lead to a Backlink Over Optimisation Penalty, better known as a BLOOP to SEO experts, which causes a sudden, often drastic, drop in SERP ranking.

Duplicate Content and Websites

Duplicate content in its own right is not thought to trigger penalties, but it can be responsible for the removal of content and for placing all duplicate pages into Google's supplemental index, resulting in pages not ranking in the SERPs. This can result in a sudden loss of traffic to a site, similar to that caused by a penalty.

Google will not index duplicate content and any site that utilises large amounts of content featured elsewhere on the Internet is likely to suffer.

Many webmasters fail to realise the negative effect of having multiple sites on the web serving up the same content. Having duplicate content domains dilutes your link equity as there is a tendency for people to link to more than one, spreading links across multiple domains.

Aliased domains should be retired (retaining the domain with the most search engine visibility), with each one 301 redirected to the domain being kept.

Algorithm changes made around the time of the Google Panda update have tightened up on duplicate content and Alias Domains can trigger the removal of website content from the Google index.

Not Using 301s For Redirects

Meta Refresh and JavaScript automated re-directs regularly result in penalties as the pages using them are perceived to be doorway pages. To avoid penalties, use a 301 re-direct or Mod Rewrite technique instead.
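For sites running on Apache, a minimal mod_rewrite sketch of a proper 301 redirect might look like the following (the domain names are placeholders; adapt the pattern to your own aliased and retained domains):

```apache
# .htaccess on the aliased domain — send every request to the
# retained domain with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^(.*)$ https://www.kept-domain.example/$1 [R=301,L]
```

Unlike a Meta Refresh or JavaScript redirect, this returns the 301 status at the HTTP level, which is the signal search engines use to pass ranking equity to the destination URL.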

Content Feeds

Whilst content feeds are widely used online, there is some evidence that pulling in large amounts of duplicate content through such feeds can have a negative effect on ranking; particularly post-Panda.

Non Compliance With Google Webmaster Guidelines

Since 2007, Google may alert webmasters it believes have broken its guidelines via the Webmaster Console, advising them that their site has been removed from Google for a period of time.

However, blatant spam or significant breaches of the rules will result in a site being banned or filtered from the SERPs without notification. Where notification of a violation of Google's guidelines is received, the usual advice is to resolve the problem(s) and then submit a 'reconsideration request.'

Google SERPs Filters

There is clear evidence that over-optimising a single keyword by adding too many backlinks or site-wide links can trigger a Google filter whereby the recipient page of these links no longer ranks in the organic SERPs for the particular keyword.

The Google TrustRank of the website may also be affected, leading to a ranking reduction for other keywords. Affected websites can retain rankings for long-tail keywords that have not been over-optimised, particularly on pages that have not been subjected to aggressive link building but may have one or two decent natural links.

Recovering Takes Time

Recovering from a Google penalty normally involves fixing the cause of the problem(s) and then waiting for Google to lift its sanctions. A full recovery of Google rankings may take two to three months, although some sites have recovered in a matter of weeks following full and thorough resolution of the infringements.

The Google algorithm can automatically remove penalties if the affected website is still indexed by Google. If your website has been de-indexed and has lost PageRank, you will need to make a re-inclusion request.

 
