What a year it's been – a year in which Google has improved its algorithms, allowing them to become more aggressive and proactive in crawling websites. This means that more websites than ever are being affected by its updates.
With Panda incorporated into the main Google algorithm through ‘soft updates’, your website will be analyzed every time Google crawls it to assess the quality of its content. Whether content sits above the fold, potential duplication and spam signals will all be taken into account and measured.
Google has also taken the first steps towards doing this with Penguin, which will become more and more intertwined with the core algorithm, much as Panda has.
Google is also continuing to tackle spam-filled niches, such as payday loans and the gambling/online gaming industries. Websites benefiting from cloaking and link networks are still ranking well for highly competitive terms and making a lot of money.
This yearly round-up looks at the major updates: what each update contained, along with useful further reading.
Google refreshed its page layout algorithm, which penalises websites that deliver a poor user experience by displaying too many ads above the fold. Websites that take this ad-heavy approach are interested in one thing: making money from AdSense and third-party cost-per-click facilities.
Major flux was recorded by algorithm trackers such as MozCast, with the dates narrowed down to the 24th and 25th of March.
There was a lot of chatter on the Webmaster forums and in the Twittersphere amongst the SEO community about this being the ‘soft’ integration of Panda to the main algorithm.
However, this was never confirmed or denied by Google.
The 2.0 version of the Payday Loan algorithm was released just prior to Panda 4.0, but the exact date of its release was hard to pinpoint as the updates were so close to one another. This update was later clarified to target spam-filled websites specifically, not queries.
Google released the original Payday Loan algorithm on June 11th 2013 and has since been refining this specific algorithm to try and combat this spam-filled niche. It’s an important niche to conquer due to the high volume of searches and the money made through it. Loan sharks using black hat methods target a lot of vulnerable people, and this genuinely poses a threat to the everyday Joe looking for a legit payday loan.
A new, more aggressive and integrated Panda 4.0 algorithm was released, which positively and negatively affected a lot of websites.
eBay was the standout website publicly announced as being most affected by Panda 4.0. This was due to a large number of pages it had created to target specific keywords – landing pages, effectively.
eBay.co.uk saw a 48.44% drop in search visibility and eBay.com saw a 48.1% drop.
They were created to try and deliver a more user-friendly approach to eBay visitors, but were put live before the necessary content was uploaded to make them Google friendly. You can read more about this here.
eBay has since begun to recover, and it was determined that this was a ‘manual Panda’ action, meaning Google had seen these pages and manually applied the Panda 4.0 algorithm to them – a unique case.
Matt Cutts announced this update in an SMX Advanced keynote. Matt made it apparent that this update was specifically released to target spam-filled queries, not websites as the 2.0 update did, which is why the two updates were released so close to one another.
John Mueller, Webmaster Trends Analyst at Google, surprisingly announced on the 25th June that Google would be dropping all authorship photos from the SERPs. The drop was complete by the 28th June, leaving rel=”author” markup almost completely redundant after heavy promotion of how useful the implementation was for websites.
We believe that Google are becoming better at understanding for themselves who the influencers are in each niche on the Internet, therefore removing the purpose of rel=”author”.
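For reference, this is roughly what the now-redundant markup looked like. A minimal, illustrative sketch – the profile URL and author name below are hypothetical placeholders:

```html
<!-- Deprecated authorship markup: linked a page's byline to the
     author's Google+ profile so Google could attribute the content. -->
<!-- Both the URL and the name are hypothetical examples. -->
<a rel="author" href="https://plus.google.com/+JoeBloggs">by Joe Bloggs</a>
```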
After an EU ruling, Google introduced the ‘Right To Be Forgotten’. This allows users to ask Google to remove irrelevant or outdated information about them that appears when searching for their name.
However, the implementation and practicality of this have been questioned by many: results are only removed from your local version of the search engine and remain available in others, and critics argue it runs against the free-speech protections of the US First Amendment.
Google began to promote modern websites that displayed content using up-to-date methods that were both more user and search engine friendly. I’m sure this caused massive rifts amongst large corporations that hadn’t updated their website practices and, as a consequence, dropped in the rankings overnight.
Warnings in the search results are displayed on URLs that are using out of date methods, which will deter users from clicking through to the website.
Pigeon was a further update, following Venice and other smaller updates, to how Google Local results are presented and identified. It brought the local and core algorithms closer together to display better local results for related search queries and, in turn, altered a lot of local results.
Google further backed its support for websites that encrypt their data by publicly announcing that websites sitting behind an HTTPS/SSL certificate will receive a small ranking boost.
We have since moved a few websites across to HTTPS and have seen only minimal increments, if any at all. However, Google has said that the signal will start out on a small scale and increase if the change proves a positive one.
One thing to be aware of if you’re planning on moving to HTTPS is to strictly follow Google’s guide (see below) and ensure that you have all the appropriate HTTPS tracking profiles set up in Webmaster Tools, otherwise you won’t be able to see Google’s data on your website.
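As a rough sketch of one step in such a migration – not Google’s guide itself – a site-wide permanent redirect from HTTP to HTTPS on an Apache server might look like this:

```apache
# .htaccess sketch: permanently redirect all HTTP requests to HTTPS.
# Assumes Apache with mod_rewrite enabled; test before deploying.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

A 301 (permanent) redirect is the type Google’s documentation recommends for site moves, as it signals that the HTTPS URLs are the new canonical versions.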
There was chatter amongst webmasters and the SEO community about either a small SSL update or a mini Panda, as ranking flux was witnessed. However, Google did not confirm or deny this speculation.
August 28th 2014 - Authorship removed
After the update on the 28th June that removed authorship photos, Google announced that authorship is to be completely removed – it will no longer be processed or read by Google.
By the 29th August, all authorship bylines, for example ‘by James Perrott’, had been completely removed by Google.
Panda 4.1 was announced to include an algorithmic component, which further intertwined the analysis of content structure, quality and potential duplication as part of the main ranking algorithm.
Google announced that it would be a ‘slow rollout’, which made the exact timing of this update a mystery, but it was estimated that between 3% and 5% of all queries were affected.
The one that everyone had been waiting for… Penguin 3.0 finally arrived!
After more than a year since Penguin 2.1, Google publicly announced that the latest Penguin algorithm had been released after several weeks of suspense. The update was much smaller and had less impact than everyone was anticipating, as less than 1% of US/English queries were affected.
This update initially appeared only to impact google.com and websites based in the US. However, it soon became clear that it was a very slow, worldwide rollout as movement began to be seen across the globe. This was the update that a lot of webmasters had been waiting for, allowing their websites and businesses to recover from the first Penguin update.
Google fought the illegal download niche before, more than two years ago, in August 2012.
Since the original update, there has been a lot of press about piracy firms being seized by the US government, such as Megaupload. This is a battle that will continue for as long as the Internet is alive, but Google is now doing its bit to aid the industries suffering because of illegal downloads.
This update further combated software and digital media piracy, causing huge drops for a lot of hugely popular illegal download websites.
Over Thanksgiving there was a lot of discussion about further fluctuations in the rankings with people within the community witnessing website recoveries.
Google did confirm that this was Penguin 3.0 still rolling out. There is no end date for this rollout and we’re continuing to see movement on websites via this algorithm.
We certainly saw a large number of algorithm updates in 2014, some stemming from public media pressure and government initiatives, such as the DMCA piracy update, the payday loan updates and the Right To Be Forgotten – all good updates. We’ve also seen Panda and Penguin become more aggressive and integrated with the main Google ranking algorithm. The main advancement in Penguin 3.0 was the ability to target specific pages within a website that have used underhand tactics to acquire links and manipulate Google’s search rankings.
Other useful updates and tools announced by Google are listed below – check them out:
I believe we’re going to see more of these updates next year. Panda and Penguin will become part of the core ranking algorithm, and Penguin will become more aggressive but easier to recover from, as it’ll crawl your website more frequently.
The introduction of the mobile-friendly tool indicates that Google is taking mobile much more seriously. This has already been proven by the introduction of ‘mobile-friendly’ labels in mobile search results. Ensuring your website is mobile friendly has to be a major priority when optimizing your current website and when looking at re-launching your site in the future.
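One of the basic checks behind the ‘mobile-friendly’ label is a responsive viewport declaration. A minimal example, placed in the page’s head:

```html
<!-- Tells mobile browsers to match the device width rather than
     rendering a zoomed-out desktop layout. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

On its own this won’t make a site mobile friendly – content still needs to be readable and tap targets usable – but without it a page will generally fail the mobile-friendly test.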
‘Good links’ are becoming more and more difficult to determine due to the high number of metrics that make up a link: anchor text, placement, quality of website, context, relevancy, trust, equity, etc. The world of guest posting will become smaller in 2015, with digital PR firmly taking its place.
Sign up for our monthly newsletter and follow us on social media for the latest news.