A Practical Guide To A/B Testing Your Marketing

Let me explain A/B testing in the only way I know how… with ice cream.

Have you ever sampled multiple ice cream flavours before choosing the one? Well, I want you to think of yourself as our test subject in an A/B test.

You see, all flavours are technically ice cream. Each has an equal chance of giving you a brain freeze. And yet, one will always be tastier than the rest.

Not only is this a great way to sneak free ice cream, it’s also a great way to explain the concept of A/B testing.

The A in A/B testing is your control variant: your OG (original) strategy, or in this example, your ride-or-die fave flavour. Your control flavour, if you like. The B is your experimental variant, the one that is modified to create the comparison for testing.

What can you A/B test?

An easier question would be, what can’t you test?

Let’s start with where can you test.

The ability to A/B test spans multiple channels including, but not limited to: email marketing, your website, blog titles and paid advertising.

Although you can run your own tests manually, A/B testing works best with platforms that randomly split your audience and collect the metrics needed to determine a winner. Think platforms like Facebook and Google Ads, most ESPs (Email Service Providers) or your website using the likes of Google Optimize or VWO. All have varying degrees of testing functionality, but testing functionality nonetheless.
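If you ever do need to run a split yourself, the core mechanic is straightforward: bucket each user into A or B randomly but consistently, so the same person always sees the same variant. Here’s a minimal Python sketch, assuming you have some kind of user ID to key off (the function and test names are purely illustrative):

```python
import hashlib

def assign_variant(user_id: str, test_name: str) -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user ID together with the test name gives a stable,
    roughly 50/50 split without needing to store any state.
    """
    digest = hashlib.md5(f"{test_name}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same user always lands in the same bucket for a given test
print(assign_variant("user_42", "subject_line_test"))
```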

Now let’s talk about what you can test!

There are two main categories of A/B testing: User experience and Design.

User experience testing relates to changes to the layout, usability and overall experience of your content.

For example: changing the position of a button, the length of your copy, or what your copy says, with the goal of streamlining the user experience.

Design testing is about modifying the overall look and feel of your content. For example, trialling variants in colour, fonts, image content, music etc.

Why do we A/B Test?

Over time, you will generally get an idea of what your customers like and dislike. A/B testing simply speeds up this process!

It’s a way of taking the guesswork out of marketing. You don’t have to decide for yourself whether your audience would prefer chocolate or vanilla ice cream; you will know based on the winning majority.

Though that one should be obvious, who on earth orders vanilla!?!

Big brands such as Dell have experienced a conversion rate increase of 300% when testing dedicated landing pages against standard web pages (FinanceOnline).

Data doesn’t lie. Use your results as a compass and you will find yourself inching towards this kind of success one test at a time.

Sounds great, right? Let’s get into how you can do it in just 5 simple steps.

#1 Research – Focus on what your audience needs

Before you start testing anything, you should carry out extensive research on your customers’ behaviours. This is where all those fancy surveying and tracking tools come into play.

Such as, but again, not limited to:

  • SurveyMonkey (online customer surveying tool)
  • Eye tracking (a method of recording the movement and intensity of a user’s gaze while they are on a platform)
  • Heat maps (a visualisation showing where and how visitors interact with your experience)
  • Google Analytics (well, derr!)
  • Previous tests (a history of test insights: what worked, what didn’t)
  • Phone interviews (get on the blower, talk to your customers and hear it from the horse’s mouth)

These tools are amazing for researching and understanding your customer base before developing your hypothesis. Each can provide unique insight into how your customers interact with and interpret your current strategies, lending strong direction and credibility to the theories your hypothesis will be built on.

Before making any big-scale changes, always, always, always check that the behavioural data collected during your research matches your survey data.

Surveying tools are highly efficient. However, humans can be weird and unpredictable, meaning there is always a risk of outliers skewing your data. Watch out for experimental errors that can lead you astray from the path to conversions.

#2 Developing a Hypothesis

Who remembers writing a hypothesis in high school science? Well, it’s time to whip out the Bunsen burner, because we are going to take you back.

If you listened in school you might remember the basic formula of a hypothesis is…

“I believe X will result in Y because of Z.”
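To make that concrete, here’s a made-up example (the subject lines and numbers are purely illustrative): “I believe shortening our email subject line from ten words to five will increase our open rate, because mobile inboxes truncate long subject lines.”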

In the game of hypothesising for A/B testing, nothing really changes.

You are using the same hypothesis formula to test an experimental variant against your control, based on an assumption about your customers’ preferences.

However, remembering back to Step #1 Research, this assumption cannot be made out of thin air. It should be developed with deep knowledge, surveying and insight into customer satisfaction and experience.

It’s not really an assumption at all; it’s a theory based on cold, hard evidence.

Don’t Just Follow Others Blindly

Deciding on what hypothesis to test is like getting bangs!

Remember that time everybody got bangs. Like EVERYONE. Paris Hilton got bangs and bangs looked good on her. But then you got bangs, and like…they did not look good?!

That’s exactly like choosing what to A/B test.

Don’t blindly follow the leader. What’s working for your competitors, even what’s considered industry best practice, won’t always work for you. Like bangs, certain styles simply don’t work for everyone.

Don’t base your A/B testing merely on what experts and competitors believe or do. Ensure your testing is suited to your audience and their top pain points.

For example, there are plenty of articles out there telling you which button colour works well. Yet, if you blindly implement it just because someone said so, you may end up with an assault to the eyeballs, and your conversions could tank!

This is where you can reduce, reuse, recycle that research you did for your hypothesis. Develop an educated plan of what user experience or design elements are actually worth testing for your brand. Question how they will directly improve your conversions.

Did you know Google tested 41 shades of blue to get the optimum click colour!?

#3 Don’t Test All Your Best Practices At Once

When it comes to A/B testing, DO NOT put all your eggs in one basket. Make sure you are throwing your eggs strategically at specific customer subgroups.

being selective with A/B testing

By eggs, I obviously mean tests.

While executing an A/B test, avoid multivariate testing, especially when starting out. Get into a good rhythm first, get good at it, use your insights strategically and THEN try multivariate testing.

Testing all your hypotheses together will not allow you to draw proper conclusions from the data, as it will not be clear which variant actually caused the result.

To draw clear comparisons against your control variant, take your time and test each proposed variant individually.

And run each test for the full suggested time for that testing medium and desired action, e.g. 24 hours for an email or at least 7 days for a website change.
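How long is “long enough” ultimately comes down to sample size: smaller expected differences need more visitors before the result means anything. Here’s a rough Python sketch of the standard two-proportion sample-size formula (all the numbers are illustrative, not benchmarks):

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p_base: float, rel_uplift: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Rough number of visitors needed per variant to detect a relative
    uplift in conversion rate (standard two-proportion z-test formula)."""
    p_var = p_base * (1 + rel_uplift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_b = NormalDist().inv_cdf(power)          # desired statistical power
    p_bar = (p_base + p_var) / 2
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
          + z_b * sqrt(p_base * (1 - p_base) + p_var * (1 - p_var))) ** 2
         / (p_base - p_var) ** 2)
    return ceil(n)

# e.g. a 3% baseline conversion rate, hoping for a 20% relative lift
print(sample_size_per_variant(0.03, 0.20))  # ~13,900 visitors per variant
```

Divide the result by the daily traffic each variant will see and you get a realistic minimum test duration.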

We looooove quick wins!

To get started and get some scores on the board, prioritise quick-win tests first, e.g. subject lines, button CTAs and element colours.

Narrow your testing options down, distinguishing those that are relevant to your target audience and those with the potential to produce higher conversion rates.

For example, there is no point in A/B testing your button colour if your open rate is already super low. Start with your subject lines!

#4 Define success

Once you find your rhythm, A/B testing can be super addictive. So, how do you know when to stop?

A better question would be, how are you going to measure when a test has told its story?

You need to define success metrics.

For example, let’s say your variant is a modified version of your mailing list signup because you wanted to acquire more email subscribers. Your success metrics could be defined as something like…

“When X new customers subscribe to the mailing list”, and if you wanted to be more specific about the ‘how’, you could add “by clicking Y”.
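With made-up numbers, here’s one way you might turn that metric into a concrete pass/fail check, using a standard two-proportion z-test (the figures below are purely illustrative):

```python
from math import sqrt
from statistics import NormalDist

def variant_beat_control(conv_a: int, n_a: int, conv_b: int, n_b: int,
                         alpha: float = 0.05) -> bool:
    """Two-proportion z-test: did variant B's signup rate beat control A
    by more than random chance would explain?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided test: only call it a win if B is genuinely better
    return z > NormalDist().inv_cdf(1 - alpha)

# e.g. 120 signups from 4,000 visitors (A) vs 160 from 4,000 (B)
print(variant_beat_control(120, 4000, 160, 4000))  # True: B wins
```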

Ensure that your success metric isn’t so broad that it is unmeasurable or unreachable. You can’t be setting goals like “When X is a Billionaire, with a Ferrari and a Birkin bag”. Metrics should be organisationally focused, consider available resources and time, and reflect what is most beneficial and realistic for your brand at that point in time.

So, thinking back to that hypothesis you developed at the start: what result or outcome would your brand consider a win?

Defining success metrics is not only necessary to know when a result is satisfactory. It is also crucial to have a final outcome to properly measure the effectiveness of a test for future reference.

Knowing when a test has been successful is an extremely important step in A/B testing. Without a finish line, you will be running a pointless race, and quite possibly ending up right back where you started.

That’s not to say a test can’t be revisited as your brand changes and grows. That’s where documenting lends a helping hand.

#5 Document

Put simply: Create an archive of everything that has come and gone from your A/B testing.

You don’t have to manila-folder everything, but documenting tests and their results will be hugely beneficial in the long run.

Having strong documentation of past tests will help avoid retesting or repeating previous mistakes in future A/B testing.

Bonus! By building a library of tests and results, you’ll also have valuable data to aid your future site redesign.

Don’t just throw all your data and statistical analysis in a random document. Ensure results are organised and documented with clear relevance to the tests.

That said, your archive doesn’t have to be fancy-schmancy. A simple, well-formatted Excel document or free Trello board will suffice.
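If a spreadsheet feels too manual, even a tiny script can keep the archive tidy. A minimal sketch; the column names and entries are just a suggestion, so adapt them to whatever your tests actually track:

```python
import csv
from datetime import date

# Illustrative columns; adapt to your own tests
FIELDS = ["date", "channel", "hypothesis", "variant_a", "variant_b",
          "success_metric", "winner", "notes"]

with open("ab_test_log.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    if f.tell() == 0:  # brand-new file: write the header row first
        writer.writeheader()
    writer.writerow({
        "date": date.today().isoformat(),
        "channel": "email",
        "hypothesis": "Shorter subject line lifts open rate",
        "variant_a": "Our Practical Guide to A/B Testing Your Marketing",
        "variant_b": "A/B Testing, Explained With Ice Cream",
        "success_metric": "open rate after 24 hrs",
        "winner": "B",
        "notes": "50/50 split across full list",
    })
```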


Remember: though there should be an end to a test, there is no end to testing. A/B testing should be part of your business ritual. It’s an ongoing cycle, a strategy that changes, develops and improves with your brand.

And that’s that, our practical guide to A/B testing. And now I’m craving ice cream…


If CRO is on your mind, we have the program to make it a reality!

Time and again, we found our retailers trying to solve their CRO woes while running major campaigns or project roll-outs, putting out spot fires and kicking the can down the road.

The truth is, conversion rate optimisation takes focus, time and testing.

We’ve customised a 12-week program to audit, analyse, optimise and test your crucial conversion points. 

So, before you jump into your next major campaign, project or product launch, let’s get CRO-savvy. Reach out to us at hello@miacademy.com.au or click here to book your FREE 30-minute training exploration.

Till next time! Read some of our recent blogs.
