Test a decision model variant
You can test a decision model variant to identify errors and verify performance. We recommend testing a decision model variant as you build it so you can see how long it takes to compile and run. It's best practice to test while the decision model variant is in the draft state, so you can still edit it if there are any errors. When you test the decision model variant, you select a guest to run it against. You can also make changes directly to the request JSON and see whether the change results in an error or produces a different offer or content.
You can view the output of any programmable by adding a print() statement to it.
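As a minimal sketch, a programmable decision body with a print() statement might look like the following. The guest shape, field names, and scoring logic are illustrative assumptions, not part of the product; print() is assumed to be the platform's logging call, stubbed here with console.log so the sketch runs standalone.

```javascript
// Stub for the platform-provided print() so this sketch runs on its own.
// In a real programmable decision, print() output appears on the Logs tab.
const print = (msg) => console.log(msg);

// Hypothetical guest context; the real shape depends on your configuration.
const guest = { id: "guest-123", lifetimeValue: 740 };

// Illustrative decision logic: pick an offer based on a guest attribute.
function scoreOffer(guest) {
  const eligible = guest.lifetimeValue > 500;
  print(`guest ${guest.id} eligible: ${eligible}`); // visible in the logs
  return eligible ? "premium-offer" : "standard-offer";
}

const offer = scoreOffer(guest);
print(`selected offer: ${offer}`);
```

Printing intermediate values like `eligible` makes it easier to see, on the Logs tab, why a particular offer was or was not returned.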
To test a decision model variant:
- Click the decision model variant that you want to test, and then in the decision canvas, click Test Canvas.
- In the Test Canvas dialog, you can test against the default guest's attributes, or click in the search box and choose Recent to select from a list of recently active guests or Bookmarks to select from a list of bookmarked guests. To select a specific guest, enter a unique identifier, such as an email address or browser ID, into the search box.
  Note: If there is a guest whose data you often use when configuring or testing code, click the star icon to bookmark that guest so you no longer have to search for them.
- The Request tab displays the guest context. Click Test Canvas.
  The decision model runs as it would in production, against the data of the selected guest. The output of the decision model variant, either an offer or content, displays on the Offers tab.
- Decide how you want to proceed, depending on the offers that were returned:
  - If the returned offers are what you expected, you can run the decision model variant against a different guest, or re-run it against the same guest with some of the guest data changed (the change is temporary and does not modify any attributes associated with the guest). If you want to place the variant into a silent test, change the status of the variant to Test.
  - If the returned offers are not what you expected, but no errors display on the Offers tab, examine the guest data and the entities of the decision model, such as the decision tables and programmable decisions, to understand the discrepancy between the returned offers and what you expected.
  - If you receive a No offers returned message, continue this procedure to troubleshoot your decision model variant and re-run the test.
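For example, suppose the Request tab showed a guest context like the hypothetical one below (the field names are illustrative; the actual shape depends on your configuration). You could temporarily change `loyaltyTier` from `"bronze"` to `"gold"` and re-run the test to see whether a different offer is returned, without modifying the guest's stored attributes:

```json
{
  "guest": {
    "id": "guest-123",
    "email": "guest@example.com",
    "attributes": {
      "loyaltyTier": "bronze",
      "lastPurchaseDays": 12
    }
  }
}
```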
- Click the Executions tab. The total time, in milliseconds, that the decision model variant takes to compile and execute displays, along with the number of milliseconds each component on the canvas takes to finish executing. Input components and knowledge sources are not included because they do not affect execution speed. You can identify any components that take longer than expected and experiment to see whether you can improve performance. The number of decision tables, programmable decisions, data systems, or AI Models can also affect overall performance.
- On the Logs tab, examine any errors that display. You can also use the log to check the JavaScript in a programmable decision, provided you included print statements in the programmable decision.
- Click the Full Response tab. The full response contains a JSON representation of the outputs of all the individual components on the decision canvas, as well as the final decision that was output.
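As a hypothetical illustration (the component names and fields below are invented for this sketch; the actual structure depends on your canvas), a full response might look something like:

```json
{
  "components": {
    "eligibilityTable": { "output": ["premium-offer", "standard-offer"] },
    "rankingProgrammable": { "output": "premium-offer" }
  },
  "decision": {
    "offer": "premium-offer"
  }
}
```

Comparing each component's output against the final decision can help pinpoint the stage at which an unexpected offer was selected or filtered out.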
You can use the Test Canvas feature to further troubleshoot the decision model variant.