The 9 Laws of Advanced A/B Testing

Tom Smith 1 year ago

Whilst A/B testing focuses predominantly on the user's experience, testing everything large and small to squeeze an extra point onto your conversion figures, it is important to keep SEO at the forefront of your mind throughout.

Below we share nine issues you must consider when running an A/B test in order to steer clear of potential problems, avoid penalties and preserve hard-earned rankings.

  1. Only run as long as necessary

As with any test, an A/B test is only successful when it achieves statistical significance, i.e. it is highly probable that the observed difference is not down to chance. Whilst the amount of time required for a test to reach statistical significance will vary depending on a number of variables, one thing is certain: once you have gathered enough information you must end the test, period.

Once your data supports a winning variation, you should remove all variations and implement the winner on the original URL. If no clear winner is found within a set time, turn the test off and leave the original URL alone. Running a lengthy split test may start to appear manipulative to Google, which in turn risks a rankings penalty.
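As a rough sketch of what "statistical significance" means in practice, the standard check for a conversion-rate test is a two-proportion z-test. The function name and the visitor numbers below are purely illustrative:

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variation B's conversion rate
    significantly different from control A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Control: 120 conversions from 4,000 visitors; variation: 160 from 4,000.
z, p = z_test_two_proportions(120, 4000, 160, 4000)
print(f"z = {z:.2f}, p = {p:.4f}, significant at 95%: {p < 0.05}")
```

Once `p` drops below your chosen threshold (0.05 is conventional), the test has done its job and should be switched off.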

  2. Use 302s, not 301s

When running an A/B test, a user will typically land on the original URL only to then be redirected to a new URL variation (if they fall into one of the test groups) – this is standard practice as you typically want to preserve the original URL as your control.

Always use a 302 (temporary) redirect when redirecting users in the test group. A 301 redirect signals to Google that the original page has been removed and replaced by the test page; this is not suitable for A/B testing, as the test page is only temporary and will be deleted once the test has concluded. A 302 redirect ensures the original page does not pass any link juice to the temporary pages and maintains its indexation status during the testing period.
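At the HTTP level the difference is a single status line. A minimal sketch of building the raw response (the function name and URLs are illustrative, not from any particular framework):

```python
def build_redirect(variant_url, permanent=False):
    """Build a minimal HTTP redirect response (status line plus headers).
    For A/B test traffic, permanent should stay False so a 302 is sent."""
    status = "301 Moved Permanently" if permanent else "302 Found"
    return (
        f"HTTP/1.1 {status}\r\n"
        f"Location: {variant_url}\r\n"
        "Cache-Control: no-store\r\n"  # stop caches pinning visitors to one variant
        "\r\n"
    )

print(build_redirect("/landing-page-variant-b"))
```

Every web framework exposes the same choice through its redirect helper; the key is simply never to let the default slip to a permanent (301) redirect for test traffic.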

  3. Avoid A/B testing if it conflicts with the site's SEO

Whilst calls to action, button text and colours are all suitable testing variables, altering H1s, body text or other on-page ranking factors can have a negative effect on your pages' organic rankings.

Best practice is to focus your testing on underperforming pages. Making wholesale changes to your high-value pages is testing suicide, for the obvious reason that a failed test will lose you a large number of leads and conversions. Whilst testing H1 tag variations might raise your conversion rate by a fraction of a percent, that gain is worthless if you lose rank and, as a result, traffic. It is estimated that for every position lost in the rankings, your organic clicks will halve.

If in doubt, leave well alone: a high-value page that is already perfectly optimised for SEO is usually best left untouched.

  4. Avoid cloaking

Showing one version of your website to search engines and another to regular visitors is called cloaking, a long-standing black-hat technique. Whilst it may be tempting to implement when testing, perhaps with the idea of showing the original version to try to maintain organic rankings, cloaking is in direct breach of Google's webmaster guidelines and should never be implemented, regardless of whether you're running a test or not.

When running a test which displays multiple variations of a single webpage, make sure you’re not segmenting your traffic based on user-agent. Ensure that Googlebot has the same experience as that of a regular visitor to avoid any nasty penalties.
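One safe alternative to user-agent segmentation is to bucket visitors deterministically on a stable identifier such as a first-party cookie value, so every client, Googlebot included, goes through the same logic. A minimal sketch; the function name and variant labels are illustrative:

```python
import hashlib

def assign_variant(visitor_id, variants=("original", "variant-b")):
    """Deterministically bucket a visitor by a stable ID (e.g. a first-party
    cookie), never by User-Agent, so Googlebot is treated like any visitor."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always lands in the same bucket on every request:
print(assign_variant("visitor-123") == assign_variant("visitor-123"))
```

Because the split depends only on the visitor ID, there is nothing in the logic that could single out a crawler for special treatment.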

To ensure the original page remains the sole indexed page within the test group, just implement the next two strategies.

  5. Use noindex meta robots over a disallow robots.txt directive

To avoid duplicate content issues, you will want to tell Googlebot not to index the newly created test pages. This can be done either by adding a noindex directive within the head of each individual test page, or by adding a disallow rule for the test pages to the site's robots.txt file.

When A/B testing, however, you should always avoid listing test pages in your robots.txt file. A disallow rule stops Googlebot crawling the blocked pages via internal links, but it prevents crawling, not indexation: if inbound links to a blocked page are created, its URL can still end up in the index.

Tests do not occur in a vacuum, and inbound links can be created without your knowledge. For example, someone attempting to access the original page may be redirected to the test page; without thinking, they bookmark the page's URL or add it to their blog. That link can then be picked up by a search spider, causing the test page to be indexed.

Adding a noindex meta robots tag to the head of each test page will prevent this: no matter which path the robot arrived by, it reads the directive on the page itself, preventing indexation. The more specific directive is always followed, so a page-level noindex tag takes precedence over any site-wide rule, however the crawler discovered the page.
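The tag itself is a one-liner placed in the head of every test-page variant (never the original):

```html
<!-- In the <head> of each test-page variant only: -->
<meta name="robots" content="noindex">
```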

Meta robots should not be used in isolation, however. You may specify that all test versions of the page be noindexed, yet Google may mistakenly interpret the original page as a duplicate and throw all versions out of the index.

  6. Use rel="canonical"

You want to minimise the amount of repetitive content on your site. Pages of near-duplicate content could, at worst, result in a rankings penalty or, at best, cause Google to choose which page to index over the other. This could be very damaging to your previous SEO efforts, as the new test page could be indexed in place of your original page. Imagine spending months gaining quality links to a page, only for that work to be wasted through improper test planning!

To prevent this, indicate your preferred version of the page you are testing with the rel canonical tag. The tag must be placed on every test page, and every instance must point back to the original URL. This consolidates your rankings on the original page, maintaining your previous SEO efforts.

Using the rel canonical tag on all variations of the page helps Googlebot understand that the test URLs are near duplicates of the original URL and do not require indexation. Rel canonicals work in tandem with your noindex meta robots tags to ensure only the original page is indexed.
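In the head of each test-page variant, that combination looks like the following (the canonical URL here is illustrative; it should be the original page being tested):

```html
<!-- In the <head> of each test-page variant, pointing at the original: -->
<link rel="canonical" href="https://www.example.com/landing-page/">
<meta name="robots" content="noindex">
```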

  7. Avoid presenting the wrong version to Google

This point follows on from point one. Once you have achieved statistical significance, you might be tempted to serve the winning version to 100% of users for a period of time, just to reinforce the test results. Doing this will cause every user attempting to visit the original page to be redirected to the test URL.

This is never recommended, as it may cause Google to index the test page even if you follow correct meta robots and canonical practice. If the test page is indexed, it probably won't have the same ranking potential as the original page (fewer links, no direct internal links pointing to it). Worse, when you eventually remove the test page and implement the winning features on the original page, the indexed test page disappears and you lose rank as a result.

Once a test has reached statistical significance, implement it onto the original URL and delete all traces of the test pages from your site, 301 redirecting the test page URLs back to the original page.

  8. Get rid of old test pages and URLs

Once you have found your winning version, update the original page with it and erase all digital traces of the failed versions immediately. This will prevent Google from finding one of the versions at a later point and indexing it. 301 redirect the dead pages back to the winning version to ensure Google never finds any trace of the failed test URLs.
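If the site runs on Apache, the cleanup redirects might look like the following (the paths are illustrative); nginx and most application frameworks have equivalent rules:

```apache
# Permanently redirect retired test URLs back to the winning original page
Redirect 301 /landing-page-variant-b /landing-page
Redirect 301 /landing-page-variant-c /landing-page
```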

  9. You can't use A/B software for SEO campaigns

Whilst it is common to test on-page factors to try to increase conversions, you cannot A/B test your on-page attributes for SEO. As there is only one Googlebot, you cannot split its visits between two versions of a page; doing so will only cause Googlebot to see the pages as near duplicates and throw one (or both) out of the index. Even setting this aside, Googlebot considers the age of a page, its prominence within the site's internal linking structure and its current performance as part of its assessment. All these factors make it impossible to perform a traditional A/B test on the SEO elements of a page.

The above demonstrates the importance of always keeping SEO at the front of your mind. Even a task such as CRO, which may seem quite distinct from SEO, must always consider the impact it will have. The big takeaway is to always inform your SEO team or agency whenever you are conducting A/B testing or, better still, have them involved from the outset.
