The A/B Test History report provides an overview of your A/B test program (experiences with A/B test allocation and at least 2 variations, or 1 variation and a control group). The report helps you answer questions such as:
- What were the most successful A/B tests?
- How many tests resulted in increased conversions?
- How many tests have you run on your site?
- Are there as many tests now as in previous quarters?
Downloading the report
You can download the A/B Test History report by clicking A/B Test History in the A/B Tests dashboard tile.
Report metrics
The report is a CSV file in which each row is an A/B test version (if a running test was edited mid-test, it appears in 2 rows). Only versions that were served to at least 500 users and were active for at least 2 days are included. The columns are described in the following table; a sketch for loading the file programmatically appears after the table.
Column | Description |
---|---|
Campaign Name | The name of the campaign the experience belongs to. |
Experience Name | The name of the experience. |
Start Date | The date the test version started. |
End Date | The date the test version ended, or Still Running. |
Days Running | The duration of the test version in days. |
Users | The total number of users that were exposed to the test. |
Leading Variation | The name of the variation that performed best. |
Conclusive Results | Whether or not a winning variation was declared. |
Primary Metric | The primary metric defined for the A/B test. The primary metric affects the Probability to Be Best, Uplift, and Winner Declaration calculations. |
Primary Metric Uplift | The uplift of the leading variation over the control group in terms of the primary metric. |
Revenue Uplift | The uplift of the leading variation over the control group in terms of revenue. |
Report | A direct link to the report of the experience. |
Note: Only test versions that were active after January 1, 2018 are included. For customers using Dynamic Yield's EU data center, tests must have been active after January 28, 2019 to be included.
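If you want to explore the export outside of Excel, the following is a minimal sketch using Python and pandas. The file name ab_test_history.csv and the "Yes" value for the Conclusive Results column are assumptions for illustration; the column names are taken from the table above.

```python
# A minimal sketch for inspecting the exported report programmatically.
# Assumes the download is saved as "ab_test_history.csv" (placeholder name)
# and that the column headers match the table above.
import pandas as pd

df = pd.read_csv("ab_test_history.csv", encoding="utf-8")

# Each row is one A/B test version; a test edited mid-run appears twice.
# The export only includes versions served to 500+ users for 2+ days.
print(f"{len(df)} test versions")

# Tests still running have "Still Running" in the End Date column.
running = df[df["End Date"] == "Still Running"]
print(f"{len(running)} currently running")

# Conclusive tests are those where a winning variation was declared
# (the "Yes" cell value is an assumption).
conclusive = df[df["Conclusive Results"] == "Yes"]
print(f"{len(conclusive)} conclusive tests")
```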
Executive A/B Test Program Analysis – Excel template
The A/B Test Program Summary Analysis template provides a visual layer to the raw A/B test history CSV file that you can download from the Dynamic Yield console. The template can help you share the success and impact of your A/B testing program within your organization, gain new and valuable insights, and identify further optimization opportunities.
To get started, download the A/B Test Program Summary template:
https://www.dynamicyield.com/
The template includes macros for some of the data crunching. When you open the file for the first time, you might see a message asking for confirmation that you trust the source of the macros. Click Enable Macros.
After the template opens, you might see a second message at the top of the table. Click Enable Content.
What’s in the A/B Test Program Summary template?
The template contains 4 sheets: the Summary sheet, which displays the dashboard; a sheet that holds the raw A/B test history data; and 2 additional sheets where the data manipulation and calculations behind the dashboard are performed.
The summary dashboard includes 5 areas that provide valuable insights:
- KPIs that summarize the success of your A/B testing history, along with benchmarks that show where you stand compared to hundreds of brands in the Dynamic Yield customer base (a sketch that computes these KPIs from the CSV follows this list):
- Tests: The number of A/B test versions (if you edited a running test mid-test, it appears as 2 versions).
- Running tests: The number of tests that are currently running.
- Avg. Monthly tests: A ratio of total tests over the total number of months since the first test launch.
- Learn rate: A ratio of tests with conclusive results over the total number of tests. This can give a high-level idea of whether your A/B tests were set up correctly in terms of personalization and optimization. For example, overly restrictive targeting rules or too many variations can make it more difficult for the system to reach any conclusions within a reasonable time.
- Win Rate: A ratio of tests with conclusive results and a winning non-control group variation over the total number of tests. Here, in addition to personalization and optimization criteria, you have an indication of the percentage of tests with a control group and whether your initial hypotheses are well-defined.
- Median Uplift: A median of all the A/B test primary metric uplifts. This is a general indication of the success of the variations against their control groups.
- A chart visualizing the volume and cadence of new tests launched.
- A section of notable tests in terms of revenue impact (generated and potential). Use this section to go back and learn what moved the needle.
- A reminder of recently launched tests and optional action items to apply in your Dynamic Yield console.
- A breakdown of some KPIs by the primary metrics defined in your tests.
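The KPI definitions above map directly onto columns of the A/B Test History CSV. The following is a minimal sketch of those calculations in Python, assuming the export is saved as ab_test_history.csv; the "Yes" value for Conclusive Results and the "Control Group" variation name are illustrative assumptions, as is the use of pandas rather than the template's own macros.

```python
# A sketch of the dashboard KPI definitions, computed directly from the CSV.
# Column names follow the report table; the "Yes" value and the
# "Control Group" variation name are assumptions for illustration.
import pandas as pd

df = pd.read_csv("ab_test_history.csv", encoding="utf-8")
df["Start Date"] = pd.to_datetime(df["Start Date"])

total = len(df)

# Avg. Monthly tests: total tests over months since the first test launch.
months = max((pd.Timestamp.today() - df["Start Date"].min()).days / 30.44, 1)
avg_monthly = total / months

# Learn rate: tests with conclusive results over total tests.
conclusive = df["Conclusive Results"] == "Yes"
learn_rate = conclusive.mean()

# Win rate: conclusive tests won by a non-control variation, over total tests.
won = conclusive & (df["Leading Variation"] != "Control Group")
win_rate = won.mean()

# Median uplift: median of the primary-metric uplifts (strip "%" if present;
# non-numeric cells are coerced to NaN and skipped by .median()).
median_uplift = pd.to_numeric(
    df["Primary Metric Uplift"].astype(str).str.rstrip("%"), errors="coerce"
).median()

print(f"Avg. monthly tests: {avg_monthly:.1f}")
print(f"Learn rate: {learn_rate:.0%}  Win rate: {win_rate:.0%}")
print(f"Median uplift: {median_uplift}")
```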
How to upload your data
- Download the A/B Test History CSV file from your site's console homepage.
- Open the template, and go to the Import Data Here tab. Select all the data and then click Delete. When the alert pops up, click Yes.
- Go to the File tab in the Excel top menu and then click Import.
- In the Import dialog, select CSV file, and then click Import.
- In the browse window, find your downloaded CSV file and click it. Then click Get Data. The Text Import Wizard appears.
- In Import Wizard Step 1, set the suggested file origin to Unicode (UTF-8), and then click Next.
- In Import Wizard Step 2, select Comma as the delimiter (and deselect Tab if necessary), and then click Next. (A pre-check sketch that verifies these settings appears after this procedure.)
- In Import Wizard Step 3, click Finish. The Import Data dialog appears.
Important: Make sure the location of the newly imported data is the first cell of the table (=$A$1), and then click OK.
Your underlying data is now updated.
- To complete the import process, go to the Calcs tab and locate the 2 pivot tables (both have blue headings). Right-click anywhere in the first pivot table range, and then select Refresh. Repeat this step for the second pivot table.
Your Summary sheet is now updated with the CSV data and is ready for your inspection.
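If the wizard import ever produces garbled characters or a single merged column, it can help to confirm the file matches the wizard settings above (Unicode (UTF-8) file origin, Comma delimiter) before importing. The following is a small pre-check sketch in Python; the file name is a placeholder.

```python
# A quick pre-check mirroring the wizard settings above: confirm the file
# decodes as UTF-8 and parses with a comma delimiter before importing.
import csv

with open("ab_test_history.csv", encoding="utf-8") as f:  # raises on bad bytes
    reader = csv.reader(f, delimiter=",")
    header = next(reader)
    rows = list(reader)

print(f"{len(header)} columns: {header}")
print(f"{len(rows)} data rows")

# Every row should have the same number of fields as the header
# (line numbers start at 2 because the header occupies line 1).
bad = [i for i, r in enumerate(rows, start=2) if len(r) != len(header)]
print("Mismatched rows:", bad or "none")
```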