A/B testing in Experience Email enables you to test two variations of a Dynamic Content or Recommendation block to determine which performs better. With A/B testing you can:
- Compare two variations to optimize email campaign performance.
- Dynamic Content: Test two variations of a Dynamic Content block with unique designs or messaging to determine which performs better with your audience.
- Recommendation: Test two variations of recommendation strategies (algorithm and filtering rules) and design templates to identify the best-performing configuration.
- Make data-driven decisions using performance metrics, such as Uplift and Probability to Be Best.
- Enhance engagement and boost conversions with personalized content.
Setting up your A/B test
For both Dynamic Content and Recommendation A/B tests, note the following:
- You can compare only 2 variations in your test block.
- Only one block can participate in the A/B test, whether it's Dynamic Content or Recommendation (you can still add other content blocks that aren't tested).
Dynamic Content block
- Add a Dynamic Content block to your email.
- Select A/B Test as your allocation method.
- Create 2 variations. Customize the design, content, and rules, and name your variations.
- Set the allocation for your variations if needed (the default is 50% each); the sketch after these steps shows how a percentage split maps recipients to variations.
- Preview your variations and make any adjustments, then click Save & Exit.
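To give a sense of what a percentage-based allocation does, here is a minimal Python sketch, assuming a deterministic hash of a hypothetical recipient ID so the same recipient always receives the same variation. Experience Email handles this assignment for you, so treat the sketch as conceptual, not as the product's actual logic.

```python
import hashlib

def assign_variation(recipient_id: str, split_a: float = 0.5) -> str:
    """Map a recipient to variation A or B by a stable hash.

    `recipient_id` and the 50/50 `split_a` are illustrative
    assumptions; the product performs the real allocation.
    """
    # Hash the ID into a stable bucket value in [0, 1).
    digest = hashlib.sha256(recipient_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 2**32
    return "Variation A" if bucket < split_a else "Variation B"

# The same recipient lands in the same bucket on every send.
print(assign_variation("recipient-123"))
```

Hashing (rather than random assignment per send) is what keeps a recipient's experience consistent across a campaign, which is why a 50/50 split still yields stable buckets.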
Recommendation block
- Add a Recommendation block to your email.
- Create your first variation.
- Click Create A/B Test. A second variation appears.
- For each variation, select the algorithm and configure its filtering rules, customize the design, and name the variation (the sketch after these steps shows how two such configurations might differ).
- Set the allocation for your variations if needed (the default is 50% each).
- Preview your variations and make any adjustments, then click Save & Exit.
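As a mental model for what each Recommendation variation captures, the sketch below represents two configurations as plain data. Every field name here (algorithm, filters, template, allocation) is a hypothetical stand-in for settings you choose in the UI, not an actual Experience Email schema.

```python
from dataclasses import dataclass, field

@dataclass
class RecommendationVariation:
    """Illustrative container for one variation's settings.

    Field names are hypothetical; configuration happens in the UI.
    """
    name: str
    algorithm: str                 # recommendation strategy
    filters: list[str] = field(default_factory=list)  # filtering rules
    template: str = "default"      # design template
    allocation: float = 0.5        # share of recipients (default 50%)

# Two variations differing in strategy and filtering rules.
variation_a = RecommendationVariation(
    name="A: bestsellers",
    algorithm="popularity",
    filters=["in_stock"],
)
variation_b = RecommendationVariation(
    name="B: personalized",
    algorithm="affinity",
    filters=["in_stock", "exclude_purchased"],
)
print(variation_a, variation_b, sep="\n")
```

Keeping everything except the strategy and filters identical between variations makes it easier to attribute any performance difference to the change you actually made.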
Post-code-generation restrictions
After the campaign code is generated:
- You can no longer change allocation methods or traffic percentages.
- Variations can't be added, paused, or removed.
- Content, design, and recommendation strategies can be edited, but changes might affect reporting accuracy.
Analyzing A/B test results
You can begin analyzing the test results in the performance report area as soon as the campaign is live. Report columns include the following metrics:
- Uplift: Indicates the percentage improvement in performance between the variations (see the worked example below).
- Probability to Be Best: Shows the statistical likelihood of one variation outperforming the other, with credible intervals.
Use these metrics to identify the winning variation and apply insights to future campaigns to improve performance.
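To make the two metrics concrete, the following sketch computes an uplift and a Bayesian probability-to-be-best from invented conversion counts. A Beta-posterior Monte Carlo estimate is one common way to obtain such a probability; the exact model behind Experience Email's report isn't specified here, so its numbers may be derived differently.

```python
import random

# Invented results for illustration: (conversions, recipients).
a_conv, a_n = 120, 5000  # Variation A
b_conv, b_n = 150, 5000  # Variation B

# Uplift: relative improvement of B's conversion rate over A's.
rate_a, rate_b = a_conv / a_n, b_conv / b_n
uplift = (rate_b - rate_a) / rate_a
print(f"Uplift of B over A: {uplift:.1%}")  # 25.0% with these numbers

# Probability to Be Best: draw conversion rates from Beta
# posteriors (uniform Beta(1, 1) priors assumed) and count how
# often variation B beats variation A.
wins, trials = 0, 100_000
for _ in range(trials):
    sample_a = random.betavariate(1 + a_conv, 1 + a_n - a_conv)
    sample_b = random.betavariate(1 + b_conv, 1 + b_n - b_conv)
    wins += sample_b > sample_a
print(f"Probability B is best: {wins / trials:.1%}")
```

The same posterior samples also yield credible intervals (for example, the 2.5th and 97.5th percentiles of the sampled rates), which is what makes a probability-to-be-best more informative than a raw rate comparison when sample sizes are small.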
Best practices for A/B testing
- Focus on key differences: Test clear differences between variations, such as content (Dynamic Content) or strategies (Recommendations), to better understand what drives performance.
- Limit the number of A/B tests in a campaign: Restrict campaigns to one A/B-tested block to ensure clean, actionable data.
- Monitor reports: Continuously review performance reports to refine future campaigns.
- Duplicate campaigns for changes: If significant changes are required after generating the code, duplicate the campaign rather than editing the original, to preserve data integrity.
FAQ
Can I add more than two variations for A/B testing?
No, the feature supports only two variations to simplify setup and ensure clear reporting.
What happens if I make changes after generating the code?
Content, design, and strategies can still be edited, but changes to allocation methods (Dynamic Content) or testing methods are restricted. Edits made after the code is generated may affect report accuracy.
What metrics are available in the performance report?
Both Uplift and Probability to Be Best are provided to help identify the best-performing variation.
What can be tested using the Experience Email A/B test?
- For Dynamic Content: Test different designs and messaging.
- For Recommendations: Test different recommendation strategies and design templates.