The A/B Test History report provides an overview of your A/B test program (experiences with A/B test allocation and either at least 2 variations, or 1 variation and a control group). The report helps you answer questions such as:
- What were the most successful A/B tests?
- How many tests resulted in increased conversions?
- How many tests have you run on your site?
- Are you running as many tests now as in previous quarters?
Downloading the Report
You can download the A/B Test History report by clicking the A/B Test History button in the A/B Tests dashboard tile.
Report Metrics
The report is a CSV file in which each row is an A/B test version (for example, if a running test was edited mid-test, it appears in 2 rows). Only versions that were served to at least 500 users and were active for at least 2 days are included.
Column | Description |
---|---|
Campaign Name | The name of the campaign the experience belongs to. |
Experience Name | The name of the experience. |
Start Date | The date the test version started. |
End Date | The date the test version ended, or Still Running if the test is still active. |
Days Running | The duration of the test version, in days. |
Users | The total number of users who were exposed to the test. |
Leading Variation | The name of the variation that performed best. |
Conclusive Results | Whether or not a winning variation was declared. |
Primary Metric | The primary metric defined for the A/B test. The primary metric affects the Probability to Be Best, Uplift, and Winner Declaration calculations. |
Primary Metric Uplift | The uplift of the leading variation over the control group in terms of the primary metric. |
Revenue Uplift | The uplift of the leading variation over the control group in terms of revenue. |
Report | A direct link to the report of the experience. |
Note: Only test versions that were active after January 1, 2018 are included. For customers using Dynamic Yield's EU data center, tests must have been active after January 28, 2019 to be included.
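Because the report is a plain CSV, you can analyze it with a few lines of code to answer the questions listed above. The following Python sketch is a minimal example, not part of the product: it assumes the file is saved as ab_test_history.csv, that the Conclusive Results column contains Yes/No values, and that Start Date is formatted YYYY-MM-DD. Verify these against your own export before relying on the output.

```python
import csv
from collections import Counter

# Hypothetical filename; use the path of your downloaded report.
with open("ab_test_history.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Count conclusive results. Note: each row is a test *version*,
# so a test that was edited mid-run is counted more than once.
total = len(rows)
conclusive = sum(
    1 for r in rows
    if r["Conclusive Results"].strip().lower() == "yes"  # assumed Yes/No values
)
print(f"{conclusive} of {total} test versions declared a winner")

def quarter(date_str):
    # Assumes a YYYY-MM-DD date string (Start Date is always a date,
    # since only End Date can read "Still Running").
    year, month, _ = date_str.split("-")
    return f"{year}-Q{(int(month) - 1) // 3 + 1}"

# Tally test versions started per quarter to spot trends in test volume.
per_quarter = Counter(quarter(r["Start Date"]) for r in rows)
for q in sorted(per_quarter):
    print(q, per_quarter[q])
```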