Although many platforms collect and display data about things like pageviews and clicks on your site, each has a unique way of calculating these numbers. For example, defining when a day starts or when a session ends can vary between platforms and cause discrepancies when comparing the data.
This article can help you investigate discrepancies between Dynamic Yield and other platforms so you can determine whether they stem from differences between the platforms themselves or represent a genuine issue that needs to be resolved.
This article uses Google Analytics to demonstrate the process of handling discrepancies, but the same ideas can be applied to other third-party analytics tools, such as Heap or Adobe Analytics (formerly Omniture).
Note: Due to different implementation types (Google Tag Manager, execution after the page is loaded, and so on), most analytics platforms tend to have small discrepancies. A discrepancy of up to 5% is typically accepted if you send events directly to your analytics platform, and up to 10% if there is a transfer medium in between (like a data layer).
Differing metrics on different platforms
Metrics can differ between platforms. For example, the Session metric differs because each platform serves different use cases. In Google Analytics, user identification is done by cookie only, while Dynamic Yield uses localStorage as a backup in case the user clears their cookies. Read more about Dynamic Yield's definitions of sessions and users.
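To illustrate why this matters, here is a minimal sketch of a cookie-first identifier with a localStorage backup. The function and key names are hypothetical; this is not Dynamic Yield's actual implementation:

```ts
// Hypothetical sketch: resolve a stable user ID from a cookie, falling back
// to localStorage if the cookie was cleared. Key names are illustrative.
function getStableUserId(): string {
  const match = document.cookie.match(/(?:^|;\s*)_uid=([^;]+)/);
  let id = match ? match[1] : localStorage.getItem('_uid_backup');
  if (!id) {
    id = crypto.randomUUID(); // brand-new visitor: mint a fresh ID (modern browsers)
  }
  // Write to both stores so the ID survives cookie deletion.
  document.cookie = `_uid=${id}; max-age=${60 * 60 * 24 * 365}; path=/`;
  localStorage.setItem('_uid_backup', id);
  return id;
}
```

A platform that relies on the cookie alone would count this visitor as a new user after a cookie wipe, while the fallback scheme keeps the same ID, so the two platforms can report different user (and therefore session) counts.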
Discrepancies in the Dashboard and Audience Explorer reports
Discrepancies in the dashboard or Audience Explorer reports can affect the results of all of your tests. If you encounter discrepancies between these reports and an analytics platform, use the following steps to isolate the cause.
Step 1: Check the entire site data
The most common causes of true discrepancies are implementation issues in Dynamic Yield or the analytics platform. To verify whether such an issue exists:
- Go to the Experience OS dashboard.
- View the number of purchases (or any other metric that appears at the top of the dashboard).
- Compare this number to the number provided by your analytics vendor.
If numbers are similar (up to ~5% discrepancy), implementation is not the cause of the discrepancy. Skip to Discrepancies in experience reports to investigate the specific campaign that has a discrepancy.
If numbers are significantly different (more than 5%), continue to step 2.
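As a sanity check on the comparison itself, here is a small sketch of the arithmetic. Using the larger of the two numbers as the baseline is our assumption, not a Dynamic Yield convention:

```ts
// Relative discrepancy between two counts, as a percentage of the larger one.
function discrepancyPct(dyCount: number, analyticsCount: number): number {
  return (Math.abs(dyCount - analyticsCount) / Math.max(dyCount, analyticsCount)) * 100;
}

// Example: 9,800 purchases in Dynamic Yield vs. 10,000 in the analytics platform.
const pct = discrepancyPct(9_800, 10_000); // 2 — within the ~5% tolerance
console.log(pct <= 5 ? 'Within tolerance' : 'Continue to step 2');
```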
Step 2: Compare results to your source of truth
If the discrepancy in the number of purchases is larger than 5%, compare the number of purchases in Dynamic Yield to the number of purchases in your CRM, shop system, or order system. This step ensures that Dynamic Yield data is compared against data you know to be accurate.
Note: Make sure that you look only at purchases that are relevant to the site or app measured in Dynamic Yield.
If numbers are similar (up to ~5% discrepancy), it means that Dynamic Yield reports are accurate, and the discrepancy is in the analytics platform.
If numbers are significantly different (over 5% discrepancy), there's an issue, most likely related to implementation, and you should review your Dynamic Yield and analytics implementations.
Discrepancies in experience reports
If there are no discrepancies in the dashboard and Audience Explorer, but there is a discrepancy in a specific experience report, follow these steps, depending on the analytics platform integration method you are using.
When the Google Analytics integration is enabled on the site and the campaign, Google Analytics events are fired whenever a variation or control is served (every impression).
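For orientation, such an event might look roughly like the following gtag.js call. The event and field names here are illustrative assumptions; the actual payload depends on your Google Analytics setup and the integration's configuration:

```ts
// Hypothetical shape of an impression event (gtag.js syntax).
// Names and values are examples only, not Dynamic Yield's actual payload.
declare function gtag(...args: unknown[]): void;

gtag('event', 'variation_impression', {
  event_category: 'Dynamic Yield',          // example category
  experience_name: 'Homepage Banner Test',  // example experience
  variation_name: 'Variation 1',            // example variation
});
```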
Step 1: Make sure the integration is turned on
- Go to Settings › Integrations (this screen is available only to the account admin)
- Make sure the integration is turned on
If the integration is not turned on, turn it on.
If the integration is turned on, verify that it's also turned on for the campaign with the discrepancy. Dynamic Yield enables you to turn the integration on and off per campaign, giving you more control over the events fired to your Google Analytics property. To verify it's turned on at the campaign level:
- Go to the campaign and click Edit.
- Click the Edit icon at the top of the screen to modify the campaign settings.
- Click Advanced Settings.
- Locate the Fire Google Analytics event option.
If the integration is turned off: Turn it on and save the campaign. From this moment onward, Google Analytics events will be fired.
If the integration is turned on: Continue to the next step.
Step 2: Verify that Google Analytics is tracking the pages the experience is running on
We recommend using Google's official Chrome extension for debugging Google Analytics, but if you're familiar with the browser's developer tools, you can check the network calls instead.
Go to a page on which the experience is running (for example, on which impressions are counted), and verify that:
- The Google Analytics script is implemented on the page. If it isn't, implement Google Analytics on this page to ensure the tracking event is detected.
- A Google Analytics event is fired upon a variation impression. If it isn't, contact Dynamic Yield Support.
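If you prefer the console to the extension, a few quick checks can confirm which tracking globals are present on the page. Which global applies depends on your Google Analytics flavor:

```ts
// Paste into the Chrome DevTools console on the page being checked.
const w = window as any;
console.log('Universal Analytics (analytics.js):', typeof w.ga === 'function');
console.log('gtag.js:', typeof w.gtag === 'function');
console.log('Tag Manager data layer:', Array.isArray(w.dataLayer));
```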
Step 3: Complete this step if you use the Google Analytics integration via Tag Manager
When you use the Google Analytics integration with Tag Manager, data is sent to a data layer instead of directly to Google Analytics. This adds an extra step that can fail.
- Make sure the data layer is loaded with all the necessary data from Dynamic Yield, and that this data remains consistently available until the start of the next page load. To check, type `dataLayer` in the Chrome browser console (see the sketch after this list).
- Set a proper benchmark. Reporting to Google Analytics through the data layer is asynchronous and affected by network quality, which makes Google Analytics reports less accurate by definition. Before comparing Dynamic Yield reports to Google Analytics reports, it's important to understand how inaccurate the Google Analytics reports are. To do so, compare the Google Analytics report to your source of truth (CRM or shop system). When you understand the baseline inaccuracy, you can compare Dynamic Yield reports against it. For example, if Google Analytics has a 7% discrepancy from the shop system, and there is a 7-10% discrepancy between Google Analytics and Dynamic Yield, you can attribute the discrepancy to the Google Analytics implementation.
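As mentioned in the first item, you can inspect the data layer directly in the console. A sketch of what that might look like; the 'dynamic yield' filter below is an assumption about how the entries are labeled, so adapt it to whatever your setup actually pushes:

```ts
// Dump the data layer and filter for entries that mention Dynamic Yield.
const dl: any[] = (window as any).dataLayer ?? [];
console.table(dl); // every push so far, in order

const dyEntries = dl.filter((entry) => {
  try {
    return JSON.stringify(entry).toLowerCase().includes('dynamic yield');
  } catch {
    return false; // skip entries that can't be serialized (e.g., DOM references)
  }
});
console.log('Dynamic Yield-related pushes:', dyEntries);
```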
Step 4: Build a comparable report
Note: Most Dynamic Yield reports focus on user data, while Google Analytics is more focused on session data.
Create reports in Google Analytics that correspond to the data you see in Dynamic Yield. There are two ways to approach this:
- Check the events directly under Behavior › Events.
- Create a segment for users who were part of this experiment and analyze them as a group. Make sure the segments are scoped by users and not sessions, to match how Dynamic Yield counts users. Also note that Dynamic Yield enables you to create attribution conditions based on other interactions (clicks or event triggers), and that the attribution window can be set to extend beyond the session in which the user engaged with the variation. If this is the case, create your segments to mimic the attribution settings of the Dynamic Yield experiment as closely as possible.
Step 5: Make sure the data is comparable
Before comparing results between Dynamic Yield and your analytics platform, make sure the data itself is comparable. Make sure the following items are aligned:
- Time zone
- Selecting the full test version. Make sure the selected time frame covers the entire duration of the version. In Dynamic Yield experiment reports, users are counted only once, at the moment they are first exposed to the test. If a user sees the test for a second time, they aren't included in a selected time frame unless it includes their first exposure (see the sketch after this list).
- Scope of tracking. Make sure that all the pages the experience runs on, and all the pages where conversions occur, are tracked by your analytics platform, and that you aren't looking at pages or conversions in the analytics platform that aren't tracked by Dynamic Yield.
- IP filters
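To make the time-frame rule from the list above concrete, here is a sketch of the counting logic: a user is included only if their first exposure falls inside the selected window. Types and field names are illustrative:

```ts
// Count a user only if their *first* exposure is inside [from, to].
interface Exposure {
  userId: string;
  timestamp: number; // e.g., Unix epoch milliseconds
}

function usersInTimeFrame(exposures: Exposure[], from: number, to: number): Set<string> {
  // Find each user's earliest exposure.
  const firstSeen = new Map<string, number>();
  for (const e of exposures) {
    const prev = firstSeen.get(e.userId);
    if (prev === undefined || e.timestamp < prev) {
      firstSeen.set(e.userId, e.timestamp);
    }
  }
  // Include only users whose first exposure falls inside the window.
  const counted = new Set<string>();
  for (const [userId, first] of firstSeen) {
    if (first >= from && first <= to) {
      counted.add(userId);
    }
  }
  return counted;
}
```

A session-scoped platform, by contrast, counts the same user again in every window in which they saw the test, which is another reason the raw numbers rarely match exactly.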