Shopping Muse serves as a conversational discovery chat, with multiple entry points designed to work together seamlessly. However, deploying Shopping Muse entry points in isolated campaigns or directly from the source code can be complex, making it difficult to test its impact and iterate on improvements. To address this, we recommend the following methodology for deploying and analyzing Shopping Muse and its various entry points.
Serving Shopping Muse with a Multi-Touch campaign
- In the Web Personalization app, click New Campaign.
- Select Multi-Touch.
- Give the campaign a name.
- Set the primary metric and define attribution logic.
- Create an entry point and define the insertion or trigger settings (a hypothetical custom-code example follows this list).
- For a quick start, use one of our predefined templates.
- Repeat the process until all entry points are ready.
- Save the campaign as a draft until Shopping Muse is fully production-ready.
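If you prefer to define an entry point's insertion with custom code inside a campaign variation, a minimal sketch might look like the following. This is purely illustrative: `openShoppingMuse` is a placeholder for whatever hook your integration exposes, and the selector, class name, and copy are assumptions, not platform APIs.

```typescript
// Purely illustrative sketch of a custom-code entry point for a campaign variation.
// openShoppingMuse is a placeholder, not a real API; wire it to your actual integration hook.
const openShoppingMuse = (): void => {
  /* placeholder: call your Shopping Muse open hook here */
};

const cta = document.createElement("button");
cta.className = "muse-entry-point"; // style via the campaign's CSS
cta.textContent = "Not sure what to pick? Ask our shopping assistant";
cta.addEventListener("click", openShoppingMuse);

// Insertion settings equivalent: place the CTA right after the page header (assumed selector).
document.querySelector("header")?.insertAdjacentElement("afterend", cta);
```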
Shopping Muse can transform how your users discover products, but only if they engage with it. To unlock its full potential, it's essential to effectively introduce, promote, and educate your users, which is only possible if enough entry points are deployed across the conversion funnel (read more on entry points).
Analyzing the Multi-Touch report
A Multi-Touch campaign enables you to create multiple variations, each featuring different touchpoints. This lets you iterate on and test various entry points and their calls to action, or compare them against a control group that doesn't have access to Shopping Muse.
All touchpoints are tracked and reported in the unified Multi-Touch report, which offers both:
- Tracking the click rate of each individual entry point: This helps you understand which entry points work well and which need to be improved (see the illustrative calculation after this list).
- Tracking overall performance: Across events (Add to Cart, Purchases), AOV, and revenue.
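For reference, the click rate here is simply clicks divided by impressions for each entry point. The snippet below is only an illustration with made-up numbers; the Multi-Touch report calculates these rates for you.

```typescript
// Made-up numbers for illustration; the Multi-Touch report computes these rates for you.
const entryPoints = [
  { name: "Homepage banner", impressions: 12000, clicks: 480 },
  { name: "PDP assistant button", impressions: 8000, clicks: 560 },
];

for (const ep of entryPoints) {
  const clickRate = (ep.clicks / ep.impressions) * 100;
  console.log(`${ep.name}: ${clickRate.toFixed(1)}% click rate`); // 4.0% and 7.0%
}
```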
A/B Testing
Shopping Muse will naturally attract only a subset of the general traffic, and those who do engage will likely have higher intent than the average user. A standard A/B test compares the effect on users who are potentially exposed to a Shopping Muse entry point against those who aren't. However, because only a small subset of the test group will actually interact with Shopping Muse, you are essentially measuring the overall impact of a very small group (users who had the chance to interact with Shopping Muse and chose to do so, versus users who didn't see any entry point and are therefore not eligible for Shopping Muse).
This poses a challenge: the experiment becomes vulnerable to outliers and might take a long time to reach statistical significance, if it does at all, even with a large amount of traffic. To measure the impact of Shopping Muse more effectively, we recommend the following (the sketch after this list illustrates how this dilution inflates the required sample size):
- Optimize engagement: Before testing, focus on driving more traffic to Shopping Muse by optimizing your entry points and, in turn, the open rate. Driving more users to Shopping Muse helps the test reflect the real value of the tool.
- Increase test duration and iterations: Prolong the test or employ sequential testing to accumulate more data over time, which helps to mitigate the influence of outliers and achieve statistical significance.
- Remove outliers: Ensure that the test focuses on relevant users by setting custom attribution eligibility in the A/B test settings. For example, set it so that only engagements from users with more than 2 page views are counted in the report.
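To see why dilution matters, here is a rough back-of-the-envelope sketch. All numbers are hypothetical, and the sample-size estimate uses a standard two-proportion approximation rather than anything the platform computes: if only 5% of the test group engages with Shopping Muse and engaged users convert 20% better, the group-level lift is only about 1% relative, which can require millions of users per arm to detect.

```typescript
// Hypothetical numbers: baseline conversion rate, share of the test group that
// actually engages with Shopping Muse, and the lift observed among engagers.
const baseline = 0.03;          // 3% conversion in the control group
const engagementShare = 0.05;   // only 5% of the test group opens Shopping Muse
const liftAmongEngagers = 0.20; // engagers convert 20% better

// The lift measured at the group level is diluted by the engagement share.
const dilutedRate = baseline * (1 + liftAmongEngagers * engagementShare); // ~3.03%

// Approximate sample size per arm (two-proportion z-test, alpha = 0.05, power = 0.8).
function sampleSizePerArm(p1: number, p2: number): number {
  const zAlpha = 1.96; // two-sided 5% significance
  const zBeta = 0.84;  // 80% power
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil((numerator * numerator) / ((p1 - p2) * (p1 - p2)));
}

console.log(sampleSizePerArm(baseline, dilutedRate));                        // on the order of millions per arm
console.log(sampleSizePerArm(baseline, baseline * (1 + liftAmongEngagers))); // orders of magnitude fewer for the undiluted effect
```

The exact figures are not the point; the takeaway is that raising the engagement share (better entry points, higher open rate) shrinks the required sample size dramatically.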
The Shopping Muse report
This report provides application performance data after users engage with Shopping Muse. It uses post-engagement attribution with a 1-day window and lets you track the following metrics:
- Essential e-commerce metrics: Includes key indicators such as Revenue, Add to Cart, Purchases, and others. These metrics are measured and tracked after a user meaningfully engages with Shopping Muse.
- Special Shopping Muse metrics: These metrics focus on specific interactions within the Shopping Muse app, such as Messages Sent, Product Clicks, and Similarity Clicks.
An engagement is defined as the user either clicking a product or sending a message within Shopping Muse after opening it.
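As a mental model only (a minimal sketch with hypothetical event shapes, not the report's actual pipeline), the logic can be thought of like this: an e-commerce event counts toward the report when it occurs within 24 hours after an engagement, and an engagement is a product click or a message sent after Shopping Muse was opened.

```typescript
// Minimal sketch with hypothetical event shapes; not the report's actual implementation.
type MuseInteraction = { type: "open" | "message_sent" | "product_click"; timestamp: number };
type CommerceEvent = { type: "add_to_cart" | "purchase"; revenue?: number; timestamp: number };

const ATTRIBUTION_WINDOW_MS = 24 * 60 * 60 * 1000; // 1-day post-engagement window

// An engagement is a product click or a message sent after Shopping Muse was opened.
function engagementTimes(interactions: MuseInteraction[]): number[] {
  const opens = interactions.filter((i) => i.type === "open").map((i) => i.timestamp);
  if (opens.length === 0) return [];
  const firstOpen = Math.min(...opens);
  return interactions
    .filter((i) => (i.type === "message_sent" || i.type === "product_click") && i.timestamp >= firstOpen)
    .map((i) => i.timestamp);
}

// A commerce event is attributed if it happens within 1 day of any prior engagement.
function isAttributed(event: CommerceEvent, interactions: MuseInteraction[]): boolean {
  return engagementTimes(interactions).some(
    (t) => event.timestamp >= t && event.timestamp - t <= ATTRIBUTION_WINDOW_MS
  );
}
```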