Performing A/B tests on an email campaign message
Creating successful email campaigns is highly dependent on how well you know your customers. When you create an email campaign, you base the content and layout of the message on an assumption of what appeals to your customers. However, your assumptions might not be accurate, and your recipients might not respond to your campaign in the way you expect them to.
With A/B testing, you can test and validate your assumptions on a small group of your customers before you send the final email campaign to the rest of your recipients. This ensures that you always send the most appealing and relevant email campaigns to your customers.
The A/B testing functionality allows you to create two or more variations of the same email campaign and test which one receives the best response from the recipients. You run the test on a limited set of recipients and once the test result is clear, you can send the most successful variant to the rest of your recipients. This ensures that you always send the email campaign that statistically attracts the most customers and generates the most value for your business.
A/B tests cannot be performed on automated messages or on these message types: Simple HTML message, Plain text message, or Existing page.
Planning the A/B test
The Email Experience Manager (EXM) guides you through the process of A/B testing, but it is a good idea to have a few things in place before you begin. Once you know what you want to test and under which circumstances, the rest is straightforward.
How many recipients do you want to include in the test?
You can determine the best number of recipients to include in the A/B test based on your previous experiences with sending email campaigns. To get an indication of the number of your recipients who generally open and read your email messages, you can view reports from previous campaigns.
The following factors indicate the number of recipients to include:
- The total number of recipients. If you have a small pool of recipients (for example, 2,000), we recommend that you include 10 to 15 percent of them (200 to 300 recipients) in the test to get a reliable result. If the total number of recipients is higher (for example, 50,000), you can get good results by including five percent or less. The arithmetic is illustrated in the sketch after this list.
- Your confidence in your messaging. If you are confident in your email campaign messaging and only want to verify it quickly, you need to include only a small percentage of recipients in the A/B test.
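The percentages above are guidelines rather than exact figures, but the arithmetic behind them is straightforward. The following sketch (the numbers and the test fractions are illustrative assumptions, not EXM settings) shows how the size of the test group scales with the total number of recipients:

    def test_group_size(total_recipients, test_fraction):
        # The test group is a fraction of the total audience, for example
        # 0.10-0.15 for a small list or 0.05 or less for a large one.
        return round(total_recipients * test_fraction)

    # Small list: 10-15 percent of 2,000 recipients is 200-300 test recipients.
    print(test_group_size(2000, 0.10))   # 200
    print(test_group_size(2000, 0.15))   # 300

    # Large list: 5 percent of 50,000 recipients is 2,500 test recipients.
    print(test_group_size(50000, 0.05))  # 2500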
Ask yourself the following questions before you start your A/B test:
- Which part of the message do you want to test?
You can run an A/B test on either the subject line or any of the components in the content area of the message, for example, the background, text, links, or images. However, we recommend that you keep the tests simple and only test one or two elements of a message. This helps you to achieve clearer test results and to understand your recipients’ reactions to your message.
- How many variants of the message do you want to include in the test?
You can include as many message variants as you like in an A/B test, but be aware that the more variants you add, the less clear the test results might be. For example, if you test two variants of a message, each variant is sent to 50 percent of the test recipients. If, on the other hand, you include four variants, each variant is sent to only 25 percent of the test recipients, which can make the results less conclusive. If you want to test more than two or three variants, consider the number of recipients carefully and increase the test group in proportion to the number of variants. The split is illustrated in a sketch after this list.
- What time of day/week do you want to run the test?
Specify the day and time when you know from experience that your recipients are most likely to open and read your messages. Also, take into consideration the time it takes to dispatch an email campaign. For example, if you dispatch an email campaign to 500,000 recipients, it can take several hours before the dispatch is completed.
- How much time should pass before you decide which variant is the most successful?
Use your experience with email campaigns to determine how much time your recipients need to respond to supply you with reliable test results. Depending on how confident you are in your campaign, you can give the recipients more or less time to react to the messages.
For example, if you need quick reassurance from an A/B test with simple variants, and your customers usually react quickly to your campaigns, you might give them six hours to respond. On the other hand, if you have doubts about which message variant is the most successful, you may want to give your customers up to 24 hours to respond and thereby achieve broader and more reliable results.
- Do you want to select the most successful variant manually or automatically?
Selecting the most successful message variant automatically is the easiest option. With automatic selection, you decide which parameter the winning variant is selected by (the best value per visit or the highest open rate), and the winning message variant is automatically sent to the rest of the recipients at the specified time. A sketch after this list illustrates the idea.
You can also select your preferred message variant manually. This gives you full control and lets you select the best message variant based on other parameters, for example, the click rate.
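To illustrate how the test group is divided among the variants, the following sketch (plain Python with illustrative numbers, not EXM defaults) splits the test recipients evenly and shows how the per-variant group shrinks as variants are added:

    def recipients_per_variant(test_recipients, variant_count):
        # Each variant is sent to an equal share of the test recipients.
        return test_recipients // variant_count

    # With 300 test recipients: 2 variants receive 150 each, 3 receive 100 each,
    # and 4 receive only 75 each, so more variants call for a larger test group.
    for variants in (2, 3, 4):
        print(variants, recipients_per_variant(300, variants))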
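If you let EXM select the winner automatically, the selection is based on the parameter you chose; manual selection lets you apply other criteria, such as the click rate. Purely as an illustration of the idea, and not of the EXM API, the following sketch picks the variant with the highest open rate from hypothetical test results:

    # Hypothetical results from the test send (not an EXM data structure).
    results = {
        "Variant A": {"delivered": 150, "opened": 45},   # open rate 0.30
        "Variant B": {"delivered": 150, "opened": 57},   # open rate 0.38
    }

    def open_rate(stats):
        # Opened messages divided by delivered messages for one variant.
        return stats["opened"] / stats["delivered"]

    # The variant with the highest open rate is sent to the remaining recipients.
    winner = max(results, key=lambda name: open_rate(results[name]))
    print(winner, round(open_rate(results[winner]), 2))  # Variant B 0.38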
Running the A/B test
To run an A/B test, you need two or more variants of the same message. You can select the winning email yourself, or allow EXM to automatically select the winner. For information on how to create variants, run an A/B test, and select a winner, see Walkthrough: Running A/B tests on an email campaign message.