A/B test a campaign
Written by Abbey Garland

A/B testing, also known as split testing, is a way of working out which of two campaign options is more effective at encouraging opens or clicks.

In an A/B test you set up two variations of the same campaign and send them to a small percentage of your total recipients. Half of the test group is sent Version A, while the other half gets Version B. Whichever version gets the most opens or clicks, depending on your goal, is the winner; it is then sent to the remaining subscribers.

When an A/B test is in progress you can't change the campaign, but you can cancel it before it is sent to your entire list.
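Conceptually, the mechanics above come down to a random split. Here is a minimal Python sketch of how such a split could work; the function name, field names and default percentage are illustrative assumptions, not the platform's actual implementation:

    import random

    def split_for_ab_test(recipients, test_fraction=0.2):
        # Randomly pick the test group, then split it in half:
        # one half is sent Version A, the other Version B.
        shuffled = random.sample(recipients, len(recipients))
        test_size = int(len(shuffled) * test_fraction)
        test_group, remainder = shuffled[:test_size], shuffled[test_size:]
        half = test_size // 2
        return test_group[:half], test_group[half:], remainder

    recipients = [f"subscriber{i}@example.com" for i in range(1000)]
    group_a, group_b, rest = split_for_ab_test(recipients)
    print(len(group_a), len(group_b), len(rest))  # 100 100 800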

Set up your A/B test

You can turn an email campaign into an A/B test at any point before sending it. To get started:

  1. Click Clients in the top navigation, then select the relevant client.

  2. Click Campaigns.

  3. Click Create a campaign and name your new campaign.

  4. Click the A/B test toggle near the top of the page to open up the A/B test configuration window.

    The A/B test toggle

Choose your primary goal

When creating A/B tests it’s important to think about what you are trying to improve or optimise. Is your intention to get more people to read your email, drive more traffic to your website, or encourage recipients to click on a specific call to action? Your primary goal determines the winning metric for your A/B test.

The "Set up your A/B test" window, showing the goal options

Choose from one of three goals:

  • Increase open rate - when you are trying to get more recipients to open and read your email.

  • Increase click rate - when deeper engagement with the email is more important or you’re trying to drive more traffic to your website.

  • Increase visits to a specific URL - especially useful for emails with a clear and focused call-to-action (CTA). The version with the highest percentage of clicks on a specific link (rather than all links) will be the winning version.

    • For this kind of test, the specific link will be chosen just before the campaign is ready for sending.

Choose your goal and click Continue.
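Each goal maps to a different winning metric. As a rough illustration of the difference, here is a hypothetical Python sketch; the stats dictionary and its field names are assumptions, not the product's data model:

    def winning_metric(goal, stats):
        # stats: hypothetical raw counts for one version, e.g.
        # {"sent": 500, "opens": 140, "unique_clicks": 60, "target_link_clicks": 25}
        if goal == "open_rate":
            return stats["opens"] / stats["sent"]
        if goal == "click_rate":
            return stats["unique_clicks"] / stats["sent"]
        if goal == "specific_url":
            # Only clicks on the chosen link count, not clicks on all links.
            return stats["target_link_clicks"] / stats["sent"]
        raise ValueError(f"unknown goal: {goal!r}")

    print(winning_metric("open_rate", {"sent": 500, "opens": 140}))  # 0.28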

Decide what you want to test

Next, choose what you want to test in Version A and Version B.

An A/B test set up with increase click rate as the primary goal

We will recommend one of these options based on the primary goal you have selected, but you can choose any configuration you wish.

  • Different subjects - the campaign subject is highly visible in email inboxes and can influence whether or not a recipient opens your email.

  • Different senders - the From name and email address are prominent in the inbox and can also be influential in whether or not a recipient opens your email.

  • Different designs - once an email is opened, the body of the email takes over. Email design and content can impact whether a recipient reads to the bottom of your email or clicks through to your website.

Once you've made your choice, click Save.

Different subjects

For this test, campaign versions A and B are identical except for the subject line (and preview text, if you wish). A subject line test is recommended to increase open rates; however, varying the subject is also a valid tactic when the goal is increased clicks.

For example, you could test to see if a generic subject gets more opens than a longer subject line that's more specific or interesting.

An A/B test of two different subject lines

Alternatively, you could:

  • Test two completely different topics as the subject line, to see what content is of most interest to subscribers.

  • Add personalisation to identical subject lines to see if a first name greeting, for example, gets a better response.

  • See what kind of promotion works best by offering "Free Shipping" versus "15% Off".

  • Keep the subject identical and test two different preview texts to see what impact that has on engagement.

Different senders

Sender details are important because many people will not open an email if they don't recognise who it's from. With the From name test, you can use a different name and email address for Version A versus Version B, or just change one or the other.

The best approach depends on your relationship with the subscriber. Consider whether they're more likely to recognise an individual's name, your company name, or the name of the product your campaigns are about.

Different designs

This test compares different elements of the campaign itself, for example section titles, article length, calls-to-action, header images and more. You might even test two completely different designs to see which one gets the most clicks.

Campaigns for this type of test can be created using one of your saved templates or designed externally and imported (if that option is available to you). If you're doing a design test, follow the on-screen instructions to set up Version A first. You can’t begin to work on the design for Version B until the content for Version A is in place.

If you're using the same template and only making minor changes to content, save time by copying the content from Version A to create Version B. Click Design email for Version B and then select Copy version A.

The "Select how to design your email" window, with the "Copy version A" option selected

Choose recipients

When both versions of the campaign are set up, you'll be prompted to choose the subscriber list, or lists, to send to. Subscribers added to a list after the A/B test has started will not be sent the campaign; they'll have to be targeted separately.

Test before you start

It's important to test and check for errors before you start an A/B test because you can't make changes when it is in progress.

To double-check everything you've set up so far, we summarise it for you in a campaign checklist. If you need to change something just click the Edit buttons on the right of the snapshot.

The campaign checklist, showing an A/B test campaign

If you're testing email content using a template-based campaign, the checklist is where you can check the plain text version of your emails. What looks great as HTML might need a little work in plain text. Below the email design preview, in the campaign checklist, click Manage plain text version to see how your text version looks and make any necessary changes.
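To see why an auto-generated plain text version often needs attention, consider what a naive HTML-to-text conversion does to an email. This Python sketch is purely illustrative and is not the platform's actual converter:

    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        # Collect only the visible text; tags, link URLs and emphasis are dropped.
        def __init__(self):
            super().__init__()
            self.chunks = []

        def handle_data(self, data):
            if data.strip():
                self.chunks.append(data.strip())

    html = '<h1>Big Sale</h1><p>Click <a href="https://example.com">here</a> to save 15%.</p>'
    parser = TextExtractor()
    parser.feed(html)
    print(" ".join(parser.chunks))
    # Big Sale Click here to save 15%.  (the link URL and heading styling are gone)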

When you're satisfied that everything is good to go, click Test and define delivery. Run a Quick test to send yourself previews of both versions, or click Full test to run a fully automated design and spam test.

Prepare to send

Once all elements of your A/B test are in place, you’re ready to send. Click Prepare to send in the top right corner.

The next step is selecting the size of your test group, deciding how the winner will be chosen (only for tests where the goal is clicks on a specific link), and setting a length of time to run the test.

Test group size

Use the slider to define a test group, which should be a small subset of your recipients, say 20–30%. Recipients in the test group are selected randomly. Half of them will be sent Version A while the other half are sent Version B. The remaining recipients will be sent the winning version.

The A/B test settings window, where you can select what portion of your audience is sent the A/B test and how long the test should run

When you start a test we send all emails (Versions A and B) to the test group at the same time. The remaining recipients are held back until the winner is decided.

There’s a brief reminder of the way you’ve configured the A/B test. Click Change if you’ve made an error and want to redo the setup.

Once you’re done configuring the testing group split, click Continue to decide when to send the A/B test.

Choose how long to wait before deciding the winner

An A/B test can run for anywhere from one hour up to five days. After this time passes, the system looks at the data collected for Versions A and B and decides the winner, with reference to the goal of the test. The winning version is then sent to all remaining recipients. Alternatively, you can manually select a winner while the test is in progress.
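Put another way, once the window closes the decision is a simple comparison of the goal metric, with ties resolved in favour of Version A (as noted in the results section at the end of this article). A hypothetical sketch:

    def decide_winner(stats_a, stats_b, metric="opens"):
        # Compare the goal metric for each version once the test window ends.
        rate_a = stats_a[metric] / stats_a["sent"]
        rate_b = stats_b[metric] / stats_b["sent"]
        # A tie goes to Version A.
        return "B" if rate_b > rate_a else "A"

    print(decide_winner({"sent": 100, "opens": 30}, {"sent": 100, "opens": 30}))  # A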

Send now or schedule for later

You can start the A/B test immediately or select Schedule for later to pick a time and date.

The "When would you like to start sending" window, with the options to send immediately, or schedule for later

If you schedule the test to start later, we'll send you an email notification when it starts.

Cancel an A/B test

You can cancel an A/B test any time before a winner is selected. When you cancel an in-progress A/B test:

  • Reports are generated for both the A and B versions of the test.

  • If you're paying with email credits, you will not be refunded for unsent emails as a result of cancellation.

  • If you're on the basic monthly plan, only the emails sent before cancellation will count towards your monthly limit.

You can cancel an in-progress A/B test from two places. The first is the Overview or Campaigns section of your account: in the Sending section, to the right of the A/B test campaign you want to cancel, click Cancel.

The cancel link on the "Overview" page for an in-progress A/B test

The second is the test's results page. To cancel from there:

  1. Click Clients, then select the relevant client.

  2. Click Campaigns or Overview.

  3. Below "Sending", click the test you want to cancel.

  4. To the right, you will see a countdown timer. Below it is the option to Cancel test.

The A/B test countdown timer, with cancel link below it

Send an updated campaign

When you start the cancellation process, you will see a link to download a CSV file. This file contains the subscribers who have and haven't been sent a version of your campaign. If you made an error in your campaign and want to send an updated version, this data can help you to figure out who to send the updated version to.

To do so:

  1. When importing the CSV, set the "ReceivedABTest" column to "A new text field" to create a custom field.

  2. Create segments based on that custom field, to separate those who received the "A" version, those who received "B", and those who received "None" (see the sketch after these steps).

  3. Use the "Exclude segment" button when sending your updated campaign to filter your audience accordingly.
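If you'd rather split the exported file yourself, the bucketing in step 2 amounts to grouping rows by the "ReceivedABTest" column. A Python sketch, assuming the CSV also has an "Email" column (the exact layout of the export isn't documented here):

    import csv
    from collections import defaultdict

    buckets = defaultdict(list)
    with open("ab_test_recipients.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Each subscriber falls into "A", "B", or "None".
            buckets[row["ReceivedABTest"]].append(row["Email"])

    for version in ("A", "B", "None"):
        print(version, len(buckets[version]))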

Monitor A/B test results

While a test is in progress you can monitor results with the A/B test report, which looks like this:

The A/B results page, with a line graph that shows performance of both versions

The graph makes it easy to see which version of the campaign is winning, according to the performance metric you've chosen: open rate, total unique clicks, or total clicks on a selected link. Below the graph you can also see which version is winning for the two metrics you didn't select.

For some A/B tests there's a clear winner before the test time is up. On the report page, pictured above, you can see the time remaining and a link to manually select the winner. Click this to choose the winner yourself and end the test early.

After a winner has been decided, through test completion or manual selection, the remaining recipients will be sent the winning version and a full campaign report will become available. The A/B test results are included in the full report so you can revisit them when designing your next campaign.

If you let the test run to completion we'll send you an email notification as soon as it's finished. If the two versions are tied at the end of the test, version A will be sent to the rest of the list.

