A/B testing
A/B testing, also known as split testing, is essentially data-driven decision making. We use the data gathered from running tests to inform the decisions we make and to confirm or reject our original hypothesis.
A/B testing takes the guesswork and subjectivity out of the decision making and uses data to validate the decisions, making it a valuable part of the UX and Conversion Rate Optimisation (CRO) process.
What is A/B testing?
At its simplest, A/B testing means taking two or more versions of a design (or anything that could add value to a digital product or service, e.g. copy, images, layout, forms) and measuring which version performs better: Design A or Design B.
The hypothesis should specify the key metric, such as conversions, click-through rates, bounce rates, session duration, page speed, sign-ups, or share rates. After the A/B test, we review the data to refine the designs. A/B testing is just a starting point, not the final solution.
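Once the test has run, you need a way to judge whether the difference in the key metric is real or just noise. As a rough sketch (not any particular tool's implementation), here's how two variants' conversion rates might be compared with a standard two-proportion z-test; the function name and the example numbers are illustrative, not real client data.

```python
import math

def conversion_lift(control_visitors, control_conversions,
                    variant_visitors, variant_conversions):
    """Compare two variants' conversion rates with a two-proportion z-test.

    Returns (rate_a, rate_b, p_value). A small p-value (commonly < 0.05)
    suggests the difference is unlikely to be chance alone.
    """
    rate_a = control_conversions / control_visitors
    rate_b = variant_conversions / variant_visitors
    # Pooled conversion rate under the null hypothesis of no difference
    pooled = ((control_conversions + variant_conversions)
              / (control_visitors + variant_visitors))
    se = math.sqrt(pooled * (1 - pooled)
                   * (1 / control_visitors + 1 / variant_visitors))
    z = (rate_b - rate_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return rate_a, rate_b, p_value

# Illustrative numbers: 10% vs 13% conversion over 1,000 visitors each
rate_a, rate_b, p = conversion_lift(1000, 100, 1000, 130)
```

With these example figures the p-value comes out below 0.05, so you'd have reasonable grounds to trust the variant's improvement; with smaller samples the same 3-point lift might not reach significance.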
How to A/B test
We start A/B testing with a hypothesis – ‘doing X should achieve Y’ – along with a control version (the existing design or layout) and one or more alternative versions.
Anything can be A/B tested, from completely redesigning and restructuring a sales funnel to something as simple as changing an image with the aim of improving click-through rate.
First, we aim to gain an understanding of the product or service, work out the pain points or problems, and look for areas of opportunity to improve it. Next, we create a hypothesis to test against and design one or more alternative solutions. Each solution is tested against the existing one based on a percentage split (50/50, 33/33/33, 20/80, etc).
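The percentage split is typically handled by assigning each visitor to a variant deterministically, so the same person always sees the same version. A minimal sketch of that idea, assuming visitors carry a stable identifier such as a cookie value (the function and IDs here are hypothetical):

```python
import hashlib

def assign_variant(user_id, splits):
    """Deterministically assign a visitor to a variant.

    `splits` maps variant names to percentages summing to 100,
    e.g. {"A": 50, "B": 50} or {"A": 80, "B": 20}. Hashing the
    user ID keeps each visitor in the same bucket on every visit.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100  # stable value in 0..99
    threshold = 0
    for variant, percent in splits.items():
        threshold += percent
        if bucket < threshold:
            return variant
    raise ValueError("split percentages must sum to 100")

variant = assign_variant("visitor-42", {"A": 50, "B": 50})
```

Hashing rather than random assignment matters here: a visitor who refreshes the page or returns tomorrow lands in the same variant, which keeps the measured metrics clean.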
If the new version performs better, great! If it performs worse, we review the data to understand why, then repeat the process until we’re all happy with the results.
Looking to A/B test something?
A/B testing engagement on social media
As well as webpages, email designs and CTA formatting, we can A/B test heading styles and designs on social posts, measuring which gets more engagement so you can keep using the most successful format going forward.
Testing things like this means that design decisions aren’t based on opinions. Say the marketing manager prefers one design and the CEO prefers another. The decision will be based on actual results, rather than personal preference. See our example below, which we A/B tested for our own podcast promotion posts.
A/B testing forms
We regularly design multiple versions of forms for our clients to see which performs better. The tests can either be carried out by us, or your own development team. See an example below of a form which was A/B tested for our client, DMFA. The one on the right (or below, if you’re reading this on mobile) achieved better results, so that one is now live on their site. As you can see, things like backgrounds, buttons, and visual differences can all be changed to compare one version against another.