1. Work with AI

Run an A/B/n test with an AI-optimized variant

Who wrote it best? Compare two variants of written content – the one you wrote, and the one optimized by AI:

  • You can use Sitecore AI capabilities to optimize the written content of a component.

  • With A/B/n testing, you can compare variants of a component to determine which one is more effective at achieving a particular business goal, such as increasing page views, decreasing bounce rate, or decreasing exit rate.

By displaying these variants to your website visitors, you can statistically determine over time which content best achieves the goal.
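Sitecore runs the statistics for you, but the underlying idea can be sketched with a standard two-proportion z-test. This is an illustrative example only (the function name and threshold are assumptions, not the platform's actual algorithm):

```python
import math

def z_test_winner(conv_a, n_a, conv_b, n_b, z_threshold=1.96):
    """Compare conversion rates of variants A and B with a two-proportion
    z-test; returns 'A', 'B', or None if not yet significant (95% level)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return None
    z = (p_b - p_a) / se
    if abs(z) < z_threshold:
        return None                                   # inconclusive: keep testing
    return "B" if z > 0 else "A"

# Example: 120/1000 conversions for A vs. 165/1000 for B
print(z_test_winner(120, 1000, 165, 1000))            # → B
```

The key point for test design: the smaller the difference between variants, the more traffic (and time) is needed before a winner can be declared.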

This walkthrough describes how to:

  • Optimize a text component

  • Configure the A/B/n test

  • Assign traffic to variants (optional)

  • Set automated actions based on the test outcome (optional)

  • Start the A/B/n test

  • View analytics

Optimize a text component

To optimize written content:

  1. In the page builder, open the relevant page, and select the component or the field where the written content you want to optimize is located.

  2. In the right-hand pane, click Content > Optimize content.

  3. Enter a text prompt, then click Generate or select one of the proposed actions.

  4. Review and optionally edit the optimized content on the page canvas. You can also navigate from the Original to the Optimized version of the content to compare them with each other.

  5. If needed, reiterate with additional prompts and intents until you are satisfied. Click History to access previous prompts and results.

  6. To test the optimized component against the original one, click A/B/n test. An A/B test is created, with A being the component with the original content item and B being the new variant of that component based on the optimized content item.

  7. Enter a name for the test and click Save.

  8. Optionally, add additional variants to the test. For example, you can build a new variant by using other prompts in the Optimize content tool.

Configure the A/B/n test

When you have all your variants, you can configure the A/B/n test.

To configure the A/B/n test:

  1. In the right-hand pane, click Configure.

  2. In the Configure test goal section, click the goal you want to track:

    • Increase page views to measure the views of one or more other pages of the site.

    • Decrease bounce rate to measure the percentage of visitors who enter a specific page on a website and then leave without continuing to view other pages.

    • Decrease exit rate to track how many visitors leave your site from a specific page after navigating through other pages.

  3. If you selected the Increase page views goal, you must specify the pages where you want to track the goal: in the Site tree section, select one or more pages from the drop-down menu.

  4. Click Save.
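The two rate-based goals follow standard web-analytics definitions, which can be expressed as simple ratios (an illustrative sketch; the function names are hypothetical):

```python
def bounce_rate(entries_on_page, single_page_sessions):
    """Share of sessions that entered on this page and left without
    viewing any other page."""
    return single_page_sessions / entries_on_page

def exit_rate(views_of_page, exits_from_page):
    """Share of all views of this page that were the last page of a session."""
    return exits_from_page / views_of_page

print(f"{bounce_rate(400, 180):.0%}")   # → 45% of entrances bounced
print(f"{exit_rate(1000, 250):.0%}")    # → 25% of views ended the session here
```

The difference matters when picking a goal: bounce rate only counts sessions that started on the page, while exit rate counts every session that ended there, however the visitor arrived.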

(Optional) Assign traffic to variants

By default, if your test includes only two variants, the traffic is split equally: 50% to variant A (with the original text) and 50% to variant B (with the optimized content). If needed, you can override this default and assign a specific percentage to each variant in the test.

Important

The total percentage shared among all variants must equal 100%.

To assign traffic to variants:

  1. In the A/B test settings dialog, click Assign traffic.

  2. Enter the traffic percentage for each variant as a whole number. To evenly distribute the total visitors among all of your variants, click Evenly distribute.

  3. Click Save.
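Because only whole-number percentages are accepted and the total must be exactly 100%, an even split over three or more variants cannot always be perfectly equal. One way such a split could be computed is sketched below (an assumption for illustration, not necessarily how the Evenly distribute button rounds):

```python
def evenly_distribute(n_variants):
    """Split 100% into whole-number shares; the remainder goes to the
    first variants so the total is exactly 100."""
    base, remainder = divmod(100, n_variants)
    return [base + 1 if i < remainder else base for i in range(n_variants)]

def validate_split(percentages):
    """Accept only non-negative whole percentages that sum to exactly 100."""
    return all(isinstance(p, int) and p >= 0 for p in percentages) \
        and sum(percentages) == 100

print(evenly_distribute(3))        # → [34, 33, 33]
print(validate_split([50, 50]))    # → True
print(validate_split([60, 50]))    # → False (totals 110)
```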

(Optional) Set automated actions based on the test outcome

By default, the A/B/n test is configured to assign all visitor traffic to the winning variant if there is one, or to keep running the test if the results are inconclusive.

You can change what happens when the A/B/n test reaches statistical significance, depending on the test outcome:

  1. In the A/B test settings dialog, in the Optional configuration section, click Automated actions.

  2. Select the action you want to happen automatically after your A/B/n test reaches statistical significance:

    • If there is a winning variant, you can Assign all the traffic to the winning variant or Assign all the traffic back to the control variant.

    • If the test results are inconclusive, you can Keep running the test or Assign all the traffic back to the control variant.

  3. Click Save.

Start the A/B/n test

When you're ready to make your A/B/n test live, you can start the test; its status then changes from Draft to Pending. When you publish the page that includes the A/B/n test, the test begins and its status changes to Live.

To start an A/B/n test:

  1. In the right-hand pane of the page builder, click Start.

  2. If your test does not meet all of the prerequisites, a dialog displays what you need to do. Click OK to dismiss the message, then review your test configuration, make the necessary changes, and try starting the test again.

  3. Click Continue. In the right-hand pane, the status of your A/B/n test is shown as Pending. While the test is pending, you can no longer configure its settings; it remains in this state until you publish the page.

  4. Click Publish. The page is now published and the A/B/n test status changes to Live.

While the A/B/n test is live, visitors will be assigned one of the variants according to the traffic distribution you configured.
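Honoring a traffic distribution while keeping each visitor's assignment stable across visits is commonly done with hash-based bucketing. The sketch below illustrates the general technique under assumed names; it is not Sitecore's implementation:

```python
import hashlib

def assign_variant(visitor_id, split):
    """Deterministically bucket a visitor into a variant according to a
    traffic split such as {'A': 50, 'B': 30, 'C': 20} (must total 100).
    Hashing the visitor ID keeps the assignment sticky across visits."""
    bucket = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16) % 100
    cumulative = 0
    for variant, pct in split.items():
        cumulative += pct
        if bucket < cumulative:
            return variant
    raise ValueError("split must total 100")

split = {"A": 50, "B": 50}
print(assign_variant("visitor-42", split))   # same visitor, same variant, every visit
```

Because the bucket is derived from the visitor ID rather than a random draw, no per-visitor state needs to be stored to keep the experience consistent.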

View analytics

Within 24 hours of starting the A/B/n test, you can view analytics to track how the variants are performing against one another. This lets you see which variant is leading or whether a winner has been declared.

To view analytics:

  1. In the top-left corner of the page builder, open Sites.

  2. On the Sites page, in the top header, click Analytics.

  3. On the Analytics page, click A/B Tests.

  4. On the A/B tests page, you can view the performance analytics of your variants.
