View performance analytics in Sitecore Personalize
When your experiment has been live for 24 hours, you can start viewing how the variants are performing against each other on the Performance screen. For the data to populate the dashboard, you must perform the following:
- Add a goal to the experiment - you must assign a primary goal to the experiment. This determines the metric that Sitecore Personalize uses to choose the winning variant.
- Start the experiment - after you start an experiment, personalization is in effect and any of the variants can potentially be shown to guests. You can view how the variants are performing against each other within 24 hours, and see how the assigned goals are being met.
Additionally, when the experiment is live, you can immediately see its operational data.
For web and interactive experiments, you can also view the A/B Test Analysis report to explore test metrics such as average session duration, views, and searches.
To view performance analytics:
1. On the navigation menu, click Experiments. The experiment list displays.
2. To view experiments by status, click the Live, Paused, or Completed tab. To refine the list further, click Web, Interactive, or Triggered to display experiments by type.
3. Click the name of the experiment you want to view. The Performance screen displays.
Here's a partial image that shows the top of the Performance screen:

This table describes each screen item at the top of the Performance screen:

| Screen item | Description |
| --- | --- |
| Name | The name of the experiment you're viewing. |
| Status and Last updated details | The current status of the experiment, and the date, time, and user who last updated it. The current status determines whether a Start or Pause button displays, as detailed in the next row. |
| Start or Pause button | If the experiment has a live status, a Pause button displays. If the experiment has a draft or paused status, a Start button displays. |
| Preview or Test button | Click Preview to test and preview a web experiment. To test the API request for a web experiment, click Preview API. To test and configure the API response for an interactive experiment, click Test. The Test button displays only for interactive experiments (see the request sketch after this table). |
| Vertical ellipsis | Click to duplicate, delete, or view the history of the experiment. |
| Status and schedule banner | The status of the experiment and the duration it's scheduled to run. |
| Performance | The Performance screen is selected by default when you open an experiment with a live status. |
| Operational data | Click to view operational metrics for the experiment. |
| Summary | Click to view details of the experiment, including when it was created and who created it. |
| Analyze & Export | Click to see an in-depth, day-by-day analysis of variant performance that you can download. |
| Add dashboards | Click to view various analytics dashboards to better understand performance. This option is available to customers who have Sitecore CDP in addition to Sitecore Personalize. |
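If you want to exercise an interactive experiment outside the Test dialog, you can call the interactive API from code. The following is a minimal sketch in Python, assuming a Boxever-hosted callFlows endpoint and placeholder values for the client key, point of sale, browser ID, and experiment friendly ID; substitute the endpoint and credentials for your own Sitecore Personalize instance.

```python
import requests

# Assumed endpoint for Boxever-hosted Sitecore Personalize instances;
# confirm the host and API version for your own environment.
CALLFLOWS_URL = "https://api.boxever.com/v2/callFlows"

# All values below are placeholders.
payload = {
    "channel": "WEB",                           # channel the experiment targets
    "clientKey": "YOUR_CLIENT_KEY",             # from your tenant's API settings
    "pointOfSale": "example.com",               # point of sale the experiment runs on
    "browserId": "YOUR_BROWSER_ID",             # identifies the guest session
    "friendlyId": "my_interactive_experiment",  # friendly ID of the experiment
}

response = requests.post(CALLFLOWS_URL, json=payload, timeout=10)
response.raise_for_status()

# The body contains the response configured for whichever variant
# the guest was assigned to.
print(response.json())
```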
Here's a partial image of the middle part of the Performance screen:

This table describes each screen item on the middle part of the Performance screen:

| Screen item | Description |
| --- | --- |
| Winner, Test is Running, or Completed status banner | Displays whether the experiment has a winning variant, is completed, or is still running. If the test has a winner, the banner displays the name of the winning variant. The experiment must reach statistical significance before a winner is declared. The winning variant is measured against the primary goal. |
| Days running | The number of days the experiment has been running since it started. If you pause and restart the experiment, this number resets. |
| Total Unique Sessions | The number of current unique sessions with respect to the primary goal. |
| Primary Goal bar chart | The performance metric of each variant against the primary goal. The primary goal is the metric that the variants are measured against (see the sketch after this table for how a per-variant rate is derived). |
| All Goals By Session table | The performance of all variants against all goals, including the number of current unique sessions for each variant against each goal. |
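As an aside, here is a small sketch of how a per-variant conversion rate, the kind of figure the Primary Goal bar chart plots for a conversion goal, is typically derived from unique sessions and goal completions. The counts are invented for illustration; Sitecore Personalize calculates these values for you.

```python
# Invented session and conversion counts for illustration only;
# Sitecore Personalize calculates these figures for you.
variants = {
    "Control": {"sessions": 5000, "conversions": 400},
    "Variant B": {"sessions": 5100, "conversions": 459},
}

for name, stats in variants.items():
    rate = stats["conversions"] / stats["sessions"]
    print(f"{name}: {rate:.1%} of unique sessions converted")
```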
Here's a partial image of the lower part of the Performance screen:

This table describes each screen item on the lower part of the Performance screen:

| Screen item | Description |
| --- | --- |
| Breakdown by Goal line graph | Compares the performance of all variants against a goal. From the drop-down list, click the goal whose metrics you want to see on the line graph. The Breakdown by Goal graph and the Details table update with metrics for the selected goal. The y-axis displays the metric that the variants achieved, as measured against the selected goal. The x-axis displays the date range you selected from the Time filter. Hover over a data point to see its value. |
| Device filter | Click the device you want to filter by. The line graph refreshes to show metrics for the selected device only. This is set to ALL by default. |
| Time filter | Select the time span to filter results by. You can select Last 7 Days, Last 30 Days, or Select date to enter a custom date range. This is set to ALL by default. |
| Details table | Contains metrics on all variants, as measured against the goal you clicked in the Breakdown by Goal drop-down list. |
| Variant Name column | The name of each variant in the experiment. |
| Sessions column | The number of current unique sessions. |
| Selected Goal column | Contains the metrics of the goal you clicked in the Breakdown by Goal drop-down list. The ± symbol represents the margin of error associated with the value. |
| Goal Metric column | Shows a metric related to the goal you clicked in the Breakdown by Goal drop-down list. For example, a conversion goal shows the Uplift % column. The uplift percentage is the percentage difference in conversion rate between the control and the variant. |
| Confidence Index | An estimate calculated from the statistics of the selected goal. This estimate is only reliable when the minimum sample size is reached. The higher the confidence index, the higher the probability that the results of the experiment are accurate. A worked sketch of uplift, margin of error, and significance follows this table. |
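To show how the Uplift %, the ± margin of error, and statistical confidence relate, here is a minimal sketch using a standard two-proportion comparison. The counts are invented, and the normal-approximation formulas are a common textbook approach, not necessarily the exact calculation that Sitecore Personalize performs.

```python
from math import sqrt

# Invented counts for illustration; not real experiment data.
control_sessions, control_conversions = 5000, 400
variant_sessions, variant_conversions = 5100, 459

p_c = control_conversions / control_sessions  # control conversion rate
p_v = variant_conversions / variant_sessions  # variant conversion rate

# Uplift %: percentage difference in conversion rate between control and variant.
uplift = (p_v - p_c) / p_c * 100

# 95% margin of error for each rate (normal approximation), the kind of
# bound the ± symbol in the Selected Goal column represents.
z = 1.96
moe_c = z * sqrt(p_c * (1 - p_c) / control_sessions)
moe_v = z * sqrt(p_v * (1 - p_v) / variant_sessions)

# Two-proportion z-score: a standard way to judge whether the difference
# is statistically significant before declaring a winner.
p_pool = (control_conversions + variant_conversions) / (control_sessions + variant_sessions)
se = sqrt(p_pool * (1 - p_pool) * (1 / control_sessions + 1 / variant_sessions))
z_score = (p_v - p_c) / se

print(f"Control: {p_c:.1%} ± {moe_c:.1%}")
print(f"Variant: {p_v:.1%} ± {moe_v:.1%}")
print(f"Uplift: {uplift:.1f}%")
print(f"z-score: {z_score:.2f} (|z| > 1.96 suggests significance at the 95% level)")
```

With these invented counts the uplift is 12.5%, but the z-score of about 1.80 falls short of the 1.96 threshold, which illustrates why an experiment can show a positive uplift yet not have a declared winner until statistical significance is reached.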