Evaluating the Experiment

After the experiment has run until the required minimum number of visitors has participated, you have sufficient statistical confidence to evaluate its results.
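That required minimum can be estimated up front with a standard power calculation for comparing two conversion rates. The sketch below is plain Python using the normal-approximation sample-size formula; the baseline rate, detectable lift, significance level, and power are illustrative assumptions, not values from Data Workbench:

```python
from math import ceil, sqrt
from statistics import NormalDist

def min_visitors_per_group(p_baseline, lift, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-proportion z-test."""
    p1, p2 = p_baseline, p_baseline + lift
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = NormalDist().inv_cdf(power)           # desired power
    p_bar = (p1 + p2) / 2
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
          + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
    return ceil(n)

# Hypothetical: 10% baseline conversion, detect a 1.5-point absolute lift.
print(min_visitors_per_group(0.10, 0.015))
```

With these example inputs the calculation calls for several thousand visitors per group, which is why the experiment must run to its planned minimum before the results are read.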

Using Insight, compare the metrics or key performance indicators that were defined as part of the hypothesis to determine whether the experiment was a success (that is, whether the hypothesis was validated with the specified confidence).

In our example experiment, the hypothesis is confirmed if Visitor Conversion increases by at least 1.5%, the success criterion we defined earlier.

The following workspace example shows that the Conversion for the index2 test group was actually 1.8% higher than for the control group, exceeding the 1.5% criterion and confirming our hypothesis.
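The comparison itself is simple arithmetic on the counts the workspace reports. The sketch below, in plain Python with made-up visitor and conversion counts (not figures from the example workspace), computes the absolute lift between the test and control groups and a two-proportion z-test p-value for it:

```python
from math import sqrt
from statistics import NormalDist

def conversion_lift(conv_test, n_test, conv_ctrl, n_ctrl):
    """Absolute lift and two-sided p-value from a two-proportion z-test."""
    p_t, p_c = conv_test / n_test, conv_ctrl / n_ctrl
    p_pool = (conv_test + conv_ctrl) / (n_test + n_ctrl)      # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_test + 1 / n_ctrl))
    z = (p_t - p_c) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_t - p_c, p_value

# Hypothetical counts: test group converts at 11.8%, control at 10.0%.
lift, p = conversion_lift(1180, 10000, 1000, 10000)
print(f"lift = {lift:.1%}, p = {p:.2g}")
```

A small p-value here indicates the observed lift is unlikely to be chance variation, which is the "specified confidence" the hypothesis must meet.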

Summarizing the Experiment Results

Using Insight, you can create detailed reports to summarize and illustrate the results of your experiment.

You can then use these reports, as shown in the following example, to make recommendations that are backed by the visual evidence the reports contain:

Taking Action Based on the Results

After the results are clear, you are ready to act on them: make production-level changes to the tested pages, apply the same changes to other areas of your website, and fully document the test, its results, and the changes you have made.

Monitoring Your Actions

After the controlled experiment is complete and you have implemented the appropriate changes, continue to monitor the changes you made, for example by viewing validation metrics, creating control charts, and providing dashboard metrics.
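One common form of control chart for a conversion metric is a p-chart, where each day's conversion rate is checked against limits of p̄ ± 3·√(p̄(1−p̄)/n). A minimal sketch in plain Python, with hypothetical daily counts (Data Workbench builds these charts visually; this just shows the arithmetic behind the limits):

```python
from math import sqrt

def p_chart_limits(conversions, visitors):
    """Center line and 3-sigma control limits for a daily conversion p-chart."""
    p_bar = sum(conversions) / sum(visitors)    # overall conversion rate
    limits = []
    for n in visitors:
        margin = 3 * sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - margin), p_bar + margin))
    return p_bar, limits

# Hypothetical week of post-change traffic.
conv = [118, 124, 110, 131, 120, 98, 115]
vis = [1000, 1050, 980, 1100, 1020, 900, 1010]
p_bar, limits = p_chart_limits(conv, vis)
for c, n, (lcl, ucl) in zip(conv, vis, limits):
    flag = "" if lcl <= c / n <= ucl else "  <-- investigate"
    print(f"{c / n:.3f}  (limits {lcl:.3f}-{ucl:.3f}){flag}")
```

A day whose rate falls outside its limits is a signal that the improvement may be drifting and the hypothesis may need retesting.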

Always be prepared to retest your hypothesis if the changes you tested and made are no longer bearing out the original results.
