A/B/n testing

A/B/n testing is a powerful tool to improve your digital presence across various use cases. It enables you to refine user experiences, optimize digital strategies and calls to action, boost engagement, and introduce new features with minimal risk. By testing different designs or content, you can identify what performs best against your business goal.

Component testing, or A/B/n testing in XM Cloud, lets you create and run an A/B/n test to compare different versions of a component on the same page. By randomly displaying these versions, known as variants, to your website visitors, you can statistically determine which one best achieves a goal, such as increasing page views.
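The random-but-consistent assignment of visitors to variants can be sketched as deterministic bucketing: hash a stable visitor identifier and map it onto the list of variants, so the same visitor always sees the same variant. This is only a conceptual illustration, not XM Cloud's actual assignment logic; the function names and hash choice here are hypothetical.

```typescript
// Conceptual sketch of variant bucketing (not XM Cloud's implementation).
// A stable visitor ID is hashed so each visitor consistently lands in the
// same bucket, while visitors overall spread evenly across variants.

function hashVisitorId(visitorId: string): number {
  // FNV-1a: a simple, fast hash that distributes string inputs uniformly.
  let hash = 2166136261;
  for (let i = 0; i < visitorId.length; i++) {
    hash ^= visitorId.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return hash >>> 0; // force unsigned 32-bit
}

function assignVariant(visitorId: string, variants: string[]): string {
  // Same visitor ID -> same bucket -> same variant on every page view.
  return variants[hashVisitorId(visitorId) % variants.length];
}
```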

Before you begin A/B/n testing a component, formulate a clear hypothesis about your expected outcome to guide the testing process. Start by deciding which component you want to test against your goal. For example, your hypothesis might be: "Changing the teaser design will increase page views." This sets a clear guideline for your A/B/n test, defines the specific variations you will experiment with on the component, and identifies the metric you will use to measure success.

A/B/n testing teaser designs: Variant A (Control) features the current design, and Variant B uses a slightly modified version.

To create the A/B/n test, select the component you want to test and compare at least two variants of that component. Best practice is to not change other elements on the page. This approach ensures that any observed changes in performance are due to the tested component. For example, to assess the impact of different teaser designs, you might compare the original teaser (Variant A or Control) with a variant (Variant B) that features a different layout and text. This side-by-side comparison will help you determine which variant leads to a significant increase in page views after you start the A/B/n test and publish the page.

A/B/n testing provides a data-driven approach to analyze the performance of different component variants. When running an A/B/n test in XM Cloud, there are two possible outcomes. If all variants perform similarly without any statistically significant differences, the test is inconclusive. On the other hand, if a variant shows a statistically significant improvement, it is declared the winner and can then be designated as the default component for your broader audience.
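To make "statistically significant" concrete, here is one common way to compare two variants' conversion rates: a two-proportion z-test. XM Cloud's own analysis engine may use a different statistical method; this sketch and its sample numbers are illustrative only.

```typescript
// Conceptual sketch: two-proportion z-test comparing a variant's
// conversion rate against the control's. Illustrative only; XM Cloud
// may apply a different statistical model.

function zScore(
  conversionsA: number, visitorsA: number, // Control
  conversionsB: number, visitorsB: number, // Variant B
): number {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  // Pooled proportion under the null hypothesis (no real difference).
  const pPool = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// |z| > 1.96 corresponds to p < 0.05 (95% confidence, two-tailed).
// Hypothetical data: Control converts 120/2400 (5%), Variant B 160/2400 (~6.7%).
const z = zScore(120, 2400, 160, 2400);
const variantBWins = Math.abs(z) > 1.96;
```

With these sample numbers, z exceeds the 1.96 threshold, so Variant B's improvement would be declared statistically significant; with equal conversion rates, z is 0 and the test is inconclusive.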

Note

To enable A/B/n testing, make sure you are using JSS 22.1 or later.

For a technical overview of how JSS enables A/B/n testing, refer to Page personalization and component A/B/n testing.
