Preview and start a content test

Current version: 10.4

You can create a content test to evaluate alternative versions or combinations of your website's content and find out which pages, components, or combinations of components are the most effective with visitors.


This topic describes how to start a basic content test or a component test. If you want to compare the effects of multiple versions of a page, or two completely different pages (rather than two versions of the same page), you must create and run a page test.

After you edit a page, add personalization rules to components, or create variants of one or more components, you can start a test in the Experience Editor. This topic outlines how to preview a test, adjust the test parameters, and start the test.

Preview a test

To preview a test in the Experience Editor:

  1. Save and submit the item that you want to test.

  2. In the notifications at the top of the page in the Experience Editor:

    • If you are using a workflow, click Approve with Test:

      The Experience Editor notification bar with workflow commands
    • If you are not using a workflow and you have enabled test notifications, click Create a test:

      The Experience Editor notification bar with the Create a test link
  3. Enter a comment and click OK.

    The Preview and start test dialog box opens. On the Preview tab, you can preview the test and predict the outcome:

    The Preview and start test dialog box
  4. On the Preview tab, you can see the experiences that you have created and can test. To see a larger preview of an experience, select it in the carousel.

  5. To set a filter on the experiences that are displayed in the carousel, click the View all drop-down and select a row in the table to display only one type of variation:

    The Preview section of the dialog box
  6. In the My expected effect of changes section, select the effect that you expect your changes to have:

    The My expected effect of changes section

    You can select:

    • I expect a negative change in engagement value

    • I expect no significant change in engagement value

    • I expect a positive change in engagement value

    In a test with multiple experiences, your prediction applies to all of the experiences compared with the original.


    The I expect a negative change in engagement value option can be useful. For example, if a content author disagrees with a suggested change to the content, they can use this option to start a test that demonstrates the change reduces engagement value.

    Sitecore calculates a score that shows how accurate your predictions are. You can view the performance reports to see the test results and your score.

Adjust test parameters

You can adjust a number of test parameters. However, some options lower the quality of the test result. You must decide which trade-off between quality and time best suits your situation.

To adjust the test parameters:

  1. In the Preview and start test dialog box, on the Variables tab, in the Variables in test section, reduce the number of experiences by clearing the check box for one or more variables:

    The Variables in test section

    The number of experiences is the product of all enabled variations.

  2. In the Percentage of visitors exposed to test section, set the percentage of visitors exposed to the test. For example:

    • If you set the percentage to 40%, then 40% of visitors see one of the experiences in the test (the original version counts as one of these experiences).

    • The remaining visitors – 60% in this example – see the original version.

    The Percentage of visitors exposed to test section

    Sitecore calculates test results based on the visitors that you expose to the test.

  3. In the Statistics section, set the statistical confidence level required for the test to declare a winner. You can select 90%, 95%, or 99%. The default is 95%.

  4. On the Objective tab, in the Test objective section, in the Test objective field, select what the test measures. You can select:

    • Trailing Value/Visit – the engagement value based on page views occurring after the visitor has encountered the page being tested, divided by the number of site visits.

    • Any one of the goals that have been set up in the Marketing Control Panel.

    The Test objective section
  5. In the Select how to pick a winner field, specify the method for selecting a winner of the test:

    • Automatically select a winner based on the test objective – this is the default if the test objective is Trailing Value/Visit.

    • Automatically select a winner based on the test objective, unless it significantly decreases engagement value – this is the default for objectives that are goals. You cannot select this option for the Trailing Value/Visit objective.

      When you select this option, no winner is declared if the best experience (based on the goal) has decreased the trailing value/visit by more than 20%. Instead, the test continues.

    • Manually select winner – the user who created a test can select an experience in the Test Results dialog box and declare the experience as the winner of the test. When you choose a winner, consider the effect that the test had on the overall engagement value generated on the website.

  6. In the Duration of the test section, specify the minimum and maximum time for the test to run:

    • Minimum – select 3, 7, or 14 days.

    • Maximum – select 14, 30, or 90 days.
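The way these parameters interact can be sketched in plain code. The following Python sketch is illustrative only – none of the function names are Sitecore APIs, and the data shapes are assumptions – but it models the rules described above: the number of experiences is the product of the enabled variation counts, the exposure percentage decides who enters the test, and the guardrail rule declares no winner if the best experience lowers the trailing value/visit by more than 20%.

```python
import random
from math import prod

def experience_count(variation_counts):
    # The number of experiences is the product of the variation counts
    # (including the original) of all enabled variables.
    return prod(variation_counts)

def assign_visitor(exposure_pct, experiences, rng=random):
    # A visitor within the exposed percentage sees one of the test
    # experiences (the original version is one of them); everyone else
    # sees the original version.
    if rng.random() * 100 < exposure_pct:
        return rng.choice(experiences)
    return "original"

def pick_winner(results, guardrail=0.20):
    # results maps experience name to its goal-based objective score and
    # its trailing value/visit (engagement value after encountering the
    # tested page, divided by site visits). The best experience by the
    # objective wins, unless it decreased trailing value/visit by more
    # than the 20% guardrail relative to the original - then no winner
    # is declared and the test continues.
    best = max(results, key=lambda name: results[name]["objective"])
    baseline = results["original"]["trailing_value_per_visit"]
    if results[best]["trailing_value_per_visit"] < (1 - guardrail) * baseline:
        return None  # no winner; the test continues
    return best

# Two components, one with 3 variants and one with 2, yield 6 experiences:
print(experience_count([3, 2]))  # 6

# The best experience by the goal objective drops trailing value/visit
# from 10.0 to 7.5 (a 25% decrease), so no winner is declared:
results = {
    "original":  {"objective": 0.05, "trailing_value_per_visit": 10.0},
    "variant_b": {"objective": 0.08, "trailing_value_per_visit": 7.5},
}
print(pick_winner(results))  # None
```

This is a simplified model of the trade-offs only: lowering the exposure percentage or disabling variables reduces how quickly data accumulates per experience, which is why some options lower the quality of the test result.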

Start a test


After you start a test, you cannot modify it. For example, you cannot add or edit the test variations or the components that you are using in the test.

To start a test:

  1. In the Preview and start test dialog box, click Start test.


    If you start a content test on a version of a page that has a publishing restriction with a Publishable from date in the future, the test overrules the publishing restriction. When you publish the test, the version of the page is published regardless of the restriction, and the test becomes active immediately.

  2. To make the test go live and publish the test to your website, you must publish the page. In the Publish Item dialog box, select Smart publish. We recommend that you also select the Publish related items option.


    If you use a workflow that includes Auto Publish, you do not need to publish manually.
