
A/B Testing

A practical UX and optimisation method for making controlled, data-driven decisions by comparing real user behaviour across design variations.

How to use A/B testing to compare live variations, measure performance reliably, and optimise based on evidence rather than opinion.

07 September 2014 · 4 min read

Quick take

If you want to know what actually works, test one version against another.


What it is

A/B testing is a UX and optimisation method where two or more variations of a design are tested against each other to see which performs better.

Users are split into groups, with each group seeing a different version of the same experience.

Performance is measured using defined metrics such as conversions, clicks, or completion rates.

The focus is on behaviour and outcomes, not opinions.

The goal is to make data-driven decisions by identifying which variation delivers better results.

A/B testing is most useful when you need objective evidence of what improves outcomes in a live environment.

When to use it

Use this method when you want measurable improvement.

It is most useful when:

You are optimising an existing product
You have enough traffic to generate reliable data
You want to improve conversion or engagement
You are comparing specific design changes
You need evidence to support decisions

It is less useful when:

traffic is too low for meaningful results
the change is too large or complex
you are exploring early-stage ideas

A/B testing is often used in optimisation and live environments.

Key takeaway

Use A/B testing when the decision can be isolated into clear alternatives and measured with reliable behavioural data.

How to run it

Set up properly.

Before you start, be clear on the hypothesis you are testing, the variations being compared, and the success metrics.

Only test one meaningful change at a time.

Run the method.

A/B testing is controlled and data-driven.

Split users into groups. Show each group a different variation. Run the test for a defined period. Collect behavioural data. Ensure conditions remain consistent.

Avoid changing variables mid-test.
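One common way to keep the split consistent is deterministic hashing: the same user always lands in the same group for the life of the test. A minimal Python sketch, using only the standard library (the function and experiment names are illustrative, not from any specific tool):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a variant.

    Hashing the user id together with the experiment name means a
    user always sees the same variant for this test, while different
    experiments split the audience independently.
    """
    key = f"{experiment}:{user_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % len(variants)
    return variants[bucket]

# Repeated calls for the same user and experiment return the same variant,
# so the experience stays stable across sessions.
variant = assign_variant("user-42", "checkout-cta")
```

Random assignment per request would also split traffic, but hashing avoids storing per-user state and guarantees consistency between visits.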

Capture and make sense of it.

The value comes from measurable results.

After the test: compare performance between variations, assess statistical significance, identify the winning version, and apply learnings to future tests.

Use this to drive continuous improvement.
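For conversion-style metrics, one standard way to check whether a difference is reliable is a two-proportion z-test. A sketch in plain Python (the counts are illustrative, not real data):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / conv_b: number of conversions in each group
    n_a / n_b: number of users in each group
    Returns the z statistic and the p-value from the normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the error function (standard normal CDF).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts 150/2400 vs A's 120/2400: better, but is it reliable?
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
```

A p-value below 0.05 is a common (though arbitrary) threshold; in this example the p-value is around 0.06, so the apparent lift would not yet clear it.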

What to look for

Focus on:

Performance: which version performs better
Behaviour: how users interact with each variation
Metrics: conversion, engagement, or completion rates
Significance: whether results are reliable
Impact: size of the improvement

Where it goes wrong

Most issues come from:

testing too many variables at once
insufficient traffic or sample size
ending tests too early
unclear hypotheses
misinterpreting data

If the test isn’t controlled, the results are meaningless.
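Insufficient sample size and ending tests early are related: you can estimate up front how many users each variant needs before peeking at results. A rough Python sketch using the standard two-proportion formula (the baseline and lift values are illustrative):

```python
import math

def sample_size_per_variant(baseline: float, mde: float) -> int:
    """Approximate users needed per variant for a two-proportion test.

    baseline: current conversion rate, e.g. 0.05 for 5%
    mde: minimum detectable effect as a relative lift, e.g. 0.10 for +10%
    Assumes a two-sided alpha of 0.05 and 80% power (z = 1.96 and 0.84).
    """
    p1 = baseline
    p2 = baseline * (1 + mde)
    z_alpha, z_beta = 1.96, 0.84
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# A 5% baseline and a +10% relative lift needs roughly 31,000 users
# per variant, which is why low-traffic sites struggle with A/B tests.
n = sample_size_per_variant(baseline=0.05, mde=0.10)
```

Smaller expected lifts need dramatically more traffic, which is also why testing one meaningful change at a time matters.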

What you get from it

Done properly, this method gives you:

clear, data-driven decisions
measurable improvements
reduced reliance on opinion
ongoing optimisation opportunities

Key takeaway

It helps you prove what works.

Get in touch

If this sounds like something you need, we can help you run A/B tests that deliver measurable improvements and remove guesswork from decision-making.

No assumptions. No opinions. Just data that proves what works.

FAQ

Common questions

A few practical answers to the questions that usually come up around this method.

What is A/B testing in UX?

It is a method for comparing two or more versions of a design to see which performs better.

When should you use A/B testing?

Use it when optimising live products with sufficient traffic.

What can you test?

Layouts, content, CTAs, forms, and user flows.

How long should a test run?

Long enough to reach reliable, statistically significant results.

Does A/B testing improve UX?

Yes. It helps optimise based on real user behaviour.

LET'S WORK TOGETHER

Ready to improve your product?

UX, research and product leadership for teams tackling complex digital services. The work usually starts where things have become harder than they need to be: unclear journeys, inconsistent products, competing priorities, or teams trying to move forward without a clear direction. I help simplify the problem, shape the right next step, and turn complexity into something people can actually use.
