Performing A/B tests on an email campaign message

Current version: 9.0

Creating successful email campaigns is highly dependent on how well you know your customers. When you create an email campaign, you base the content and layout of the message on an assumption of what appeals to your customers. However, your assumptions might not be accurate, and your recipients might not respond to your campaign in the way you expect them to.

With A/B testing, you can test and validate your assumptions on a small group of your customers before you send the final email campaign to the rest. This ensures that you always send the most appealing and relevant email campaigns to your customers.

The A/B testing functionality allows you to create two or more variations of the same email campaign and test which one receives the best response from the recipients. You run the test on a limited set of recipients and once the test result is clear, you can send the most successful variant to the rest of your recipients. This ensures that you always send the email campaign that statistically attracts the most customers and generates the most value for your business.

This topic describes how to:

  • Plan the A/B test

  • Create a message variant

  • Start an A/B test

  • Select and send the winning message

Plan the A/B test

The Email Experience Manager (EXM) guides you through the process of A/B testing, but it is a good idea to have a few things in place before you begin. When you know what you want to test and under which circumstances, the rest is straightforward.

How many recipients do you want to include in the test?

You can best determine the number of recipients to include in the A/B test based on your previous experiences with sending email campaigns. To get an indication of the number of your recipients who generally open and read your email messages, you can view reports from previous campaigns.

Two factors determine how many recipients to include:

  • The total number of recipients. If you have a small pool of recipients (for example, 2000), you should include up to 10 or 15 percent (200–300) of the recipients in the test to get a reliable result. If the total number of recipients is higher (for example, 50,000), you only need to include five percent or less to get good results.

  • Your confidence in your messaging. If you are confident in your email campaign messaging and just interested in quickly verifying it, you only need to include a small percentage of recipients in the A/B test.
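The guidance above can be sketched as a small calculation. This is an illustrative model of the article's rules of thumb, not anything EXM computes for you; the 5,000-recipient threshold and the exact percentage bands are assumptions chosen to match the examples in the text.

```python
def recommended_test_size(total_recipients):
    """Suggest a range for the A/B test group size.

    The thresholds and percentages are illustrative interpretations of
    the guidance in this article, not values enforced by EXM.
    """
    if total_recipients <= 5000:
        # Small pool: include 10-15 percent for a reliable result.
        low, high = 0.10, 0.15
    else:
        # Large pool: five percent or less is usually enough.
        low, high = 0.02, 0.05
    return int(total_recipients * low), int(total_recipients * high)

# A pool of 2,000 recipients suggests a test group of 200-300.
print(recommended_test_size(2000))   # (200, 300)
# A pool of 50,000 recipients needs a much smaller share.
print(recommended_test_size(50000))  # (1000, 2500)
```

If you are confident in your messaging, aim for the lower end of the returned range.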

Which part of the message do you want to test?

You can run an A/B test on either the subject line or any of the components in the content area of the message, for example, the background, text, links, or images. However, you should keep the tests simple and only test one or two elements of a message. This helps you to achieve clearer test results and to understand your recipients’ reactions to your message.

How many variants of the message do you want to include in the test?

You can include as many message variants as you like in an A/B test, but be aware that the more variants you add, the less clear the test results might be.

For example, if you test two variants of a message, each variant is sent to 50 percent of the test recipients. If, on the other hand, you include four variants, each variant is only sent to 25 percent of the test recipients, which can make the results less clear. If you want to test more than two or three variants, consider the number of recipients carefully and increase the size of the test group in proportion to the number of variants.
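The even split described above means each extra variant shrinks every variant's audience. A minimal sketch of that arithmetic (not an EXM calculation):

```python
def recipients_per_variant(test_size, variant_count):
    """Number of test recipients each variant receives when the test
    group is split evenly between the variants, as described above."""
    return test_size // variant_count

# With 400 test recipients, two variants get 200 each...
print(recipients_per_variant(400, 2))  # 200
# ...but four variants get only 100 each, weakening each result.
print(recipients_per_variant(400, 4))  # 100
```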

What time of day/week do you want to run the test?

Specify the day and time when you know from experience that your recipients are most likely to open and read your messages. Also, take into consideration the time it takes to dispatch an email campaign. For example, if you dispatch an email campaign to 500,000 recipients, it can take several hours before the dispatch is completed.

How much time should pass before you decide which variant is the most successful?

Use your experience with email campaigns to determine how much time your recipients need to respond to supply you with reliable test results. Depending on how confident you are in your campaign, you can give the recipients more or less time to react to the messages.

For example, if you need quick reassurance from an A/B test with simple variants, and your customers usually react quickly to your campaigns, you might give them six hours to respond. On the other hand, if you have doubts about which message variant is the most successful, you may want to give your customers up to 24 hours to respond and thereby achieve broader and more reliable results.

Do you want to select the more successful variant manually or automatically?

Selecting the most successful message variant automatically is the easiest option and often makes the most sense. With automatic selection, you decide which parameter determines the winner (the best value per visit or the highest open rate), and the successful message variant is automatically sent to the rest of the recipients at the specified time.

You can also select your preferred message variant manually. This gives you full control and lets you select the best message variant based on other parameters, for example, the click rate.

Create a message variant

When you run an A/B test, you test different variations of the same message. For each variation in the message that you want to test, you need to create a new variant of the original message. The original message is the first variant (Variant A), and any new message variants that you create are named Variant B, Variant C, and so on.

To create a new variant of a message:

  1. Open the message that you want to perform an A/B test on.

    Important

    A/B tests cannot be performed on automated messages or on these message types: Simple HTML message, Plain text message, or Existing page.

  2. You can choose to add a new variant or duplicate the original variant and all its content:

    • To add a new variant, on the Message tab, click Add variant. This creates a new message variant based on the same template as the original message but where no content is replicated. Choose this option if you want to test comprehensive differences between the message variations, such as larger components or the layout of the message.

    • To duplicate the original variant, click Actions and then Duplicate this variant. This creates a new variant of the original message by duplicating it and all of its content. Choose this option if you want to test minor differences between the variants, such as the subject line, images, or the name of a button.

    The message variants are displayed on individual tabs where you can easily toggle between the variants.

    Note

    You can create as many variants of the message as you like, but be aware that when you add more variants, you need a larger pool of recipients for the test results to remain clear and reliable. You should test a maximum of two to three variants.

  3. Make the necessary changes to the new message variant, for example, you can change the message layout, change the images, or add new components.

    Note

    If you base your A/B test message on an imported HTML template, only Variant A contains the layout. For the other variants, you must add the message layout manually.

  4. When you have created all the message variants for the test, click Save.

Start an A/B test

After creating the message variants for the A/B test, you need to determine how you want to send the messages.

To send an A/B test:

  1. Open the relevant message and on the Delivery tab, click the variants that you want to include in the delivery. For example, you might select variants A and B.

  2. In the Size of the test drop-down menu, select the percentage of recipients that you want to include in this test. This set of recipients is randomly generated from the full list of recipients.

  3. Under Select winner, specify how you want the winner to be chosen. Select one of the following options:

    • Automatic – the system selects the winning variant for you, based on either best value per visit or highest open rate. In the Automatically select winner after fields, specify how long you want the test to run before the winner is selected.

      If you click Best value per visit, the winning variant is the one that generates the most value on the website per visitor. The value is calculated according to the engagement value points that are set up on your website.

      If you click Highest open rate, the winning variant is the one that was opened most often.

    • Manual – you review the reports of the A/B test and select the winning variant based on your own criteria.

  4. Under Schedule delivery, select:

    • Send message now to start the A/B test immediately.

    • Schedule message delivery to schedule the delivery of the A/B test message for a later time. Specify the date and time for when you want to start the A/B test.

    • Schedule a recurring delivery to schedule the A/B test message to send recurrently at a certain interval. Specify the interval, the date, and time for when you want to send the recurring A/B test.

      Note

      When you schedule a recurring delivery of an A/B test message, a recurring message is only sent if the winner of the previously sent test message has been selected, either automatically or manually.

  5. If you want to send a notification to certain email addresses when the dispatch of the A/B test has completed, under Notification, select the check box and enter the relevant email addresses. If you chose to have the winning variant selected and sent automatically, the notification is sent when the final message has been sent.

  6. If you want to send the message in the recipients' preferred language, click the More link and under Multi-language, select Use preferred language.

  7. Click Start A/B test (or click Schedule test if you scheduled the A/B test).

    Important

    If you select Use preferred language in the Multi-language section, be sure to verify the default language in the Message delivery confirmation dialog that appears.

The A/B test starts and the status of the message changes to Sending. When the system finishes sending the message to the limited set of recipients, the Start A/B test button's label changes to Resume, but the button remains inactive until the winning variant is selected.

If you choose to schedule the A/B test, you can cancel the message up until the date and time it is sent, by clicking Cancel scheduling on the Delivery tab of the message.

Select and send the winning message

You can choose to select the winning message manually or have it sent automatically to the rest of the recipients in the recipient list. The winning message will not be sent to any of the recipients that were included in the A/B test.

Automatically select the A/B test winner

If you choose to have the winner selected automatically, EXM dispatches the winning variant to the rest of the recipients in the recipient list when the specified time has passed.

If the result of the A/B test is a tie, the system automatically selects Variant A as the winner and sends it to the rest of your recipient list.
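The automatic selection rule, including the tie-break in favor of Variant A, can be modeled in a few lines. This is an illustrative sketch of the behavior described above, not the EXM API; the metric values are hypothetical.

```python
def select_winner(scores):
    """Select the winning variant by the chosen metric (for example,
    open rate or value per visit). Illustrative model only.

    Sorting the variant names first means ties resolve to the earliest
    letter, so an even result selects Variant A, as described above.
    """
    return max(sorted(scores), key=lambda name: scores[name])

print(select_winner({"A": 0.21, "B": 0.27}))  # B
print(select_winner({"A": 0.25, "B": 0.25}))  # A (tie goes to Variant A)
```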

Manually select the A/B test winner

To select the winner manually, you should review the A/B test reports yourself and select the variant that you think is the best. Make sure that you give your recipients enough time to read and react to the message.

To select the A/B test winner manually:

  1. In the navigation menu, under Email campaigns, click In progress.

    In the list, you can see the status of the message. The messages with the Select winner status are ready for you to select a winner.

  2. Open the message and click the Delivery tab. Under Select winner, you can see the results of the A/B test so far.

    Each variant is listed with the current statistics:

    • Open rate – the percentage of recipients that have opened the message.

    • Click rate – the percentage of recipients that have clicked a link in the message.

    • Value – the number of engagement value points that the recipients have generated on the website after opening the message.

    Tip

    We recommend that you review the reports of the A/B tests. They can help you learn about your customers' preferences.

  3. Based on these results, determine which variant you would like to send to the rest of the message recipients and click Choose as winner.

  4. When you are ready to send the winning message to the rest of the recipients, click Resume.
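The statistics listed in step 2 are simple ratios over the test group. The sketch below shows one straightforward interpretation of those definitions; the exact calculations inside EXM may differ, and the figures used here are hypothetical.

```python
def campaign_stats(sent, opens, clicks, value_points):
    """Compute the per-variant statistics shown on the Delivery tab,
    per the definitions above. Illustrative only, not an EXM formula."""
    return {
        "open rate": opens / sent,    # recipients who opened the message
        "click rate": clicks / sent,  # recipients who clicked a link
        "value": value_points,        # engagement value generated after opening
    }

stats = campaign_stats(sent=200, opens=90, clicks=30, value_points=450)
print(stats)  # {'open rate': 0.45, 'click rate': 0.15, 'value': 450}
```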
