Tips & Tricks: Amplify Your A/B Test Roadmap for Recommendations
Date Sent: January 12th, 2021
Hi there,
Testing for optimization is a crucial component of the campaign lifecycle, but planning and managing the testing process can be an overwhelming undertaking. Having a roadmap, or even a basic outline, of next steps for testing can be incredibly helpful and dispel uncertainty about which tests your team ought to run and when. Continuously personalizing and optimizing experiences increases the likelihood that your campaigns will drive users to convert.
Start the new year off right and help your team to outline a testing roadmap for Recommendation and other types of campaigns with our A/B test ideas and pro tips.
Test Inspiration #1: Test for the Optimal Strategy
Testing which Recommendation Strategy is optimal to deploy on a certain page for a given experience is a great starting point for optimizing a Recommendation campaign. Run tests between variations that use Strategies with different algorithms to determine which one serves item results that drive more users to convert.
Test Setup Overview
Let's say you want to find the best Strategy to use for a Product Page widget. Start by creating (2) variations within an experience of the campaign. The variations should have identical design variables, with one exception: the Strategy. For example, you could choose a Strategy that uses the Similarity algorithm for the 1st variation and one that uses the Viewed Together algorithm for the 2nd variation. Then, define the A/B test settings. Your settings could be similar to the following:
- Allocation: Manual Allocation (i.e., a traditional A/B Test), with 50% of user traffic allocated to each variation
- Primary Metric: Add to Cart (Conversion) or Click Through Rate
- Variation Stickiness: Sticky for the User
- Attribution Start: Variation is Served and Clicked
- Attribution End: 1 day or 7 days
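To make the setup concrete, here's a minimal sketch of the settings above expressed as plain data. The field names and values are purely illustrative placeholders, not Dynamic Yield API objects.

```python
# Hypothetical sketch of the test setup described above (illustrative names only,
# not Dynamic Yield API objects).
product_page_widget_test = {
    "experience": "Product Page Recommendations",
    "variations": [
        {"name": "Variation 1", "strategy": "Similarity"},       # only the Strategy differs
        {"name": "Variation 2", "strategy": "Viewed Together"},  # all design variables identical
    ],
    "allocation": {"type": "manual", "split": [0.5, 0.5]},        # traditional A/B test
    "primary_metric": "Add to Cart (Conversion)",                 # or "Click Through Rate"
    "variation_stickiness": "sticky_for_user",
    "attribution": {"start": "variation_served_and_clicked", "end_days": 7},  # or 1 day
}
```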
Plan to run the test for at least 2 weeks or until one of the variations reaches a significance level of 95% and is declared the winner.
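If you ever want to sanity-check the significance math outside the platform, a standard two-proportion z-test is one way to do it. Below is a minimal Python sketch, assuming you can export the number of users and conversions per variation; the counts shown are made up for illustration.

```python
import math

def two_proportion_significance(conv_a, users_a, conv_b, users_b):
    """Two-sided z-test comparing the conversion rates of two variations."""
    rate_a, rate_b = conv_a / users_a, conv_b / users_b
    pooled = (conv_a + conv_b) / (users_a + users_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / users_a + 1 / users_b))
    z = (rate_b - rate_a) / std_err
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return 1 - p_value                          # "significance" as many A/B tools report it

# Made-up counts: Similarity vs. Viewed Together
significance = two_proportion_significance(410, 10_000, 476, 10_000)
print(f"significance = {significance:.1%}")  # call a winner only at 95% or higher
```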
Post-Test
Evaluate the results in the Experience Report and analyze the data in terms of both the Primary and Secondary metrics. Using the Strategy Report, compare the Strategy of the winning variation to other Strategies deployed on pages of the same Page Type across the site. Consider running a second test to compare the Strategy of your winning variation to a different Strategy that performs well on other pages of the same Page Type.
Pro Tip
If you realize that one of the Strategies you want to test needs to be modified while you're setting up your variations, you can edit the Strategy directly from the variation design window without needing to save and close out of the campaign. When you save your changes to the Strategy, you'll be automatically redirected back to the campaign.
Test Inspiration #2: Test the Widget Design Template
Once you determine the optimal Strategy to deploy with a given widget, you can then test for the best design template for the widget to use. Dynamic Yield offers several out-of-the-box (OOTB) Recommendation design templates, and you can test whether a certain widget template gets more user engagement than another. Run tests between variations that use different design templates but have identical design variables.
Test Setup Overview
Create (2) variations within an experience and choose a different design template for each, or modify one of your existing variations. Try to keep the design variables of the variations as close to identical as possible. In other words, use the same widget title, number of item slots, price detail setting(s), font colors, etc. for both variations. Select the same, or similar, A/B test settings as those outlined above for Test Inspiration #1. For example, your (2) variations could be tested on a Category page and differ only in the design template each uses.
Post-Test
When running any test to determine the best variation to serve long-term, the test duration should be at least 2 weeks or until the system declares a winning variation. After the test concludes, evaluate the results in the Experience Report. Consider testing individual design variables of the winning variation as your next test.
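How quickly a test reaches significance depends heavily on traffic and on the size of the lift you're trying to detect, so it can help to estimate the required sample up front. Here's a rough planning sketch using the standard sample-size formula for comparing two proportions at 95% confidence and 80% power; the baseline rate and lift are placeholders you'd replace with your own numbers.

```python
import math
from statistics import NormalDist

def users_per_variation(baseline_rate, relative_lift, alpha=0.05, power=0.80):
    """Rough sample size per variation to detect a relative lift in a conversion rate or CTR."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Placeholder numbers: 4% baseline CTR, aiming to detect a 10% relative lift
print(users_per_variation(0.04, 0.10))  # roughly 39,000-40,000 users per variation
```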
Pro Tip
Create a new test version each time you make a noteworthy change to your variation(s) so that you can accurately track and analyze results over time across the entirety of the campaign lifecycle. For example, let's say a test that you ran to identify the best long-term Strategy to use for a widget just concluded and you decide to run a new test to find the optimal widget template to use. Once you modify and save the variation that will use a different design template, select to create a new test version. This will enable you to see the results of the 2nd test independently from those of the 1st test in the Experience Report.
Within the Experience Report, you can view the data for specific tests that ran in the past by clicking the Current Version dropdown in the top left corner of the page and selecting Previous Versions. Then, in the Select Test Version window, click a previous test version to see the isolated report data for that particular test.
Test Inspiration #3: Test Price Details Design Variable
Showing users the Price Details of the item results in a widget could either encourage or discourage further product exploration. Perhaps users on your site tend to be more curious about products if they can see item prices right away, or perhaps price-sensitive users are less likely to click on products in a widget if the prices are shown. Get a better sense of whether it's advantageous to show the Price Details on a widget by testing the Price Details variable.
Test Setup Overview
Using Click Through Rate as your Primary Metric, configure a test between (2) variations within an experience. Use the exact same widget design template for both variations, and keep all of the design variables the same except for the Product Price variable. From the Design tab of the variation window, select Show for the price details in one variation and Hide in the other. Aside from that one setting, your variations should look identical.
Post-Test
Review the results in the Experience Report after the test concludes. Consider applying the winning Price Details setting (show vs. hide) to both variations and then testing a different design variable between your variations in your next test.
Pro Tip
Prevent your test(s) from being influenced by confounding variables by testing one change at a time. Use the same template design for both variations and keep all of the design variables the same for both variations with the exception of showing vs. hiding the Price Details, which is the specific variable that you're testing. This way, you can have greater confidence that the results of the test are conclusive and accurate.
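One lightweight way to enforce this discipline is to compare the two variation setups before launch and confirm that exactly one design variable differs. The sketch below works on plain dictionaries standing in for the design variables; the field names are hypothetical and not tied to any Dynamic Yield object.

```python
def differs_in_exactly(variation_a: dict, variation_b: dict, expected_field: str) -> bool:
    """Return True only if the two setups differ in the single field under test."""
    differing = {key for key in variation_a.keys() | variation_b.keys()
                 if variation_a.get(key) != variation_b.get(key)}
    return differing == {expected_field}

# Hypothetical design variables for the Price Details test
variation_a = {"template": "Carousel", "title": "You May Also Like", "slots": 4, "price_details": "show"}
variation_b = {"template": "Carousel", "title": "You May Also Like", "slots": 4, "price_details": "hide"}

assert differs_in_exactly(variation_a, variation_b, "price_details")  # safe to launch the test
```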
Until next time,
Ashley
Ashley Berman
Sr. Customer Education Manager