Although many different platforms may collect and display data about things like pageviews and clicks on your site, each platform has a unique way of calculating these numbers. Details such as when a day starts or when a session ends often vary between platforms and can cause discrepancies when comparing data.
This article will help you investigate discrepancies between Dynamic Yield and other platforms, and determine whether a discrepancy stems from the platforms themselves or represents a genuine issue that should be resolved.
This article uses Google Analytics to demonstrate the process of handling discrepancies, but the same general ideas apply to other third-party analytics tools such as Heap or Adobe Omniture.
Note: Due to different implementation types (Google Tag Manager, execution after the page is loaded, etc.), most analytics platforms tend to have small discrepancies. A discrepancy of up to 5% is typically accepted when events are sent directly to your analytics platform, and up to 10% when there is a transfer medium in between (such as a data layer).
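As a sketch, the acceptance thresholds above can be expressed as a small helper. The function names, return values, and threshold logic here are illustrative only, not part of any Dynamic Yield or Google Analytics API:

```javascript
// Relative discrepancy between two platforms' counts, as a percentage
// of the larger count.
function discrepancyPercent(countA, countB) {
  const base = Math.max(countA, countB);
  if (base === 0) return 0;
  return (Math.abs(countA - countB) / base) * 100;
}

// Classify a discrepancy against the commonly accepted thresholds:
// 5% for direct event reporting, up to 10% via a data layer.
function classifyDiscrepancy(countA, countB, viaDataLayer = false) {
  const threshold = viaDataLayer ? 10 : 5;
  return discrepancyPercent(countA, countB) <= threshold
    ? "acceptable"
    : "investigate";
}

// Example: 1,000 purchases in one platform vs. 1,040 in another (~3.8%).
console.log(classifyDiscrepancy(1000, 1040));       // "acceptable"
console.log(classifyDiscrepancy(1000, 1150));       // "investigate" (~13%)
console.log(classifyDiscrepancy(1000, 1080, true)); // "acceptable" via data layer (~7.4%)
```

The same helper applies to any metric at the top of the dashboard (purchases, pageviews, and so on), as long as both counts cover the same date range and scope.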
Discrepancies in the Dashboard or Audience Explorer Reports
The data in the dashboard or audience explorer reports could potentially impact the results of all of your tests. If you encounter discrepancies between these reports and an analytics platform, use the following steps to clarify the issue.
Step 1: Check the Entire Site Data
The most common cause of genuine discrepancies is an implementation issue in Dynamic Yield or the analytics platform. To verify whether there is such an issue:
- Go to the Dynamic Yield Dashboard
- View the number of purchases (or any other metric that appears at the top of the dashboard).
- Compare this number to the number in your analytics vendor.
If the numbers are similar (within about 5%), implementation is not the cause of the discrepancy. Skip to Discrepancies in Experience Reports to investigate the specific campaign that has a discrepancy.
If the numbers are significantly different (more than 5%), continue to Step 2.
Step 2: Compare Results to your Source of Truth
If the discrepancy in purchases is larger than 5%, compare the number of purchases in Dynamic Yield to the number of purchases in your CRM, shop system, or order system. This step ensures that Dynamic Yield is being compared against data that is known to be accurate.
Note: Make sure that you only look at purchases that are relevant to the site or app that is measured in Dynamic Yield.
If the numbers are similar (within about 5%), Dynamic Yield reports are accurate, and the discrepancy originates in the analytics platform.
If the numbers are significantly different, there is an issue, most likely related to implementation, and your Dynamic Yield setup should be reviewed.
Discrepancies in Experience Reports
If there are no discrepancies in the Dashboard and Audience Explorer, but there is a discrepancy in a specific experience report, follow these steps, depending on the analytics platform integration method you are using.
Once the Google Analytics integration is enabled for the site and the campaign, a Google Analytics event is fired whenever a variation or control is served (that is, on every impression).
Step 1: Make Sure the Integration is Turned On
- Go to Settings › Integrations (this screen is available only to account admins).
- Make sure the integration is turned on
If integration is not turned on, turn it on.
If the integration is turned on, verify that it is also turned on for the campaign with the discrepancy. Dynamic Yield lets you turn the integration on and off per campaign, giving you more control over the events fired to your Google Analytics property. To verify it is turned on at the campaign level:
- Go to the campaign and click Edit.
- Click the Edit icon at the top-right corner of the screen to modify the campaign settings.
- Click Advanced Settings.
- Locate the Fire Google Analytics event option.
If it's turned off: Turn it on and save the campaign. From this moment onward Google Analytics events will be fired.
If it's turned on: Continue to the next step.
Step 2: Verify that Google Analytics is Tracking the Pages the Experience is Running on
We recommend using the official Google Analytics Chrome extension for debugging, but if you are familiar with the browser developer tools, you can also inspect the network calls.
Go to a page on which the experience is running (that is, where an impression will be counted), and verify that:
- The Google Analytics script is implemented on the page. If not, implement Google Analytics on this page to ensure the tracking event is detected.
- A Google Analytics event is fired upon a variation impression. If not, contact Dynamic Yield Support.
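The first check can be sketched as a function over a window-like object, so the same logic can also run outside the browser; in a real debugging session you would inspect the global `window` in the Chrome console. The globals checked (`ga`, `gtag`, `dataLayer`) are those commonly set by the Google Analytics snippets:

```javascript
// Sketch: detect whether a Google Analytics script appears to be present
// on a page, given a window-like object.
function checkGoogleAnalytics(win) {
  const hasAnalyticsJs = typeof win.ga === "function";   // classic analytics.js
  const hasGtag = typeof win.gtag === "function";        // gtag.js
  const hasDataLayer = Array.isArray(win.dataLayer);     // Tag Manager / gtag
  return {
    scriptPresent: hasAnalyticsJs || hasGtag,
    dataLayerPresent: hasDataLayer,
  };
}

// Simulated page where only gtag.js is loaded:
const gtagPage = checkGoogleAnalytics({ gtag: () => {}, dataLayer: [] });
console.log(gtagPage); // { scriptPresent: true, dataLayerPresent: true }
```

Verifying that an event actually fires on a variation impression is best done in the Network tab of the developer tools (or with the extension), since it involves watching live requests rather than inspecting globals.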
Step 3: Google Analytics Integration via Tag Manager
You can use the Google Analytics integration with Tag Manager. In this case, data is sent to a data layer instead of directly to Google Analytics. This adds a step that might fail.
- Make sure the data layer is loaded with all the necessary data from Dynamic Yield, and that this data remains consistently available until the start of the next page load. To check this, type `dataLayer` in the Chrome browser console.
- Set a proper benchmark. Reporting to Google Analytics through the data layer is asynchronous and affected by network quality, which makes Google Analytics reports less accurate by definition. Before comparing Dynamic Yield reports to Google Analytics reports, it is important to understand how inaccurate the Google Analytics reports are. To do so, compare the Google Analytics report to your source of truth (CRM/shop system). Once you understand this baseline inaccuracy, you can compare the Dynamic Yield reports. For example, if Google Analytics has a 7% discrepancy from the shop system, and there is a 7-10% discrepancy between Google Analytics and Dynamic Yield, you can attribute the discrepancy to the Google Analytics implementation.
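The data-layer check above can be sketched as follows. The event name and fields used here (`dyVariationImpression`, `experienceId`, `variationId`) are hypothetical, for illustration only, and are not the documented Dynamic Yield payload:

```javascript
// Sketch: filter a dataLayer array for Dynamic Yield variation-impression
// entries. The "dyVariationImpression" event name is a placeholder.
function findDyImpressions(dataLayer) {
  return (dataLayer || []).filter(
    (entry) => entry && entry.event === "dyVariationImpression"
  );
}

// Simulated contents, as you might see after typing `dataLayer` in the console:
const dataLayer = [
  { event: "gtm.js", "gtm.start": 1700000000000 },
  { event: "dyVariationImpression", experienceId: 123, variationId: 456 },
];

const impressions = findDyImpressions(dataLayer);
console.log(impressions.length);         // 1
console.log(impressions[0].variationId); // 456
```

If no matching entries appear, or they disappear before the next page load starts, the data layer step is the likely source of the discrepancy.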
Step 4: Build a Comparable Report
Note: Most of the Dynamic Yield reports focus on user data, while Google Analytics is more focused on session data.
Create reports in Google Analytics that correspond to the data you see in Dynamic Yield. There are two ways to approach this:
- Check the events directly under Behavior › Events.
- Create a segment for users who were part of the experiment and analyze them as a group. When using this method, make sure the segments are scoped by users and not sessions. Also note that Dynamic Yield allows you to create attribution conditions based on other interactions (clicks or event triggers), and that the attribution window can be set to extend beyond the session in which the user engaged with the variation. In these cases, build your segments to mimic the attribution settings of the Dynamic Yield experiment as closely as possible.
Step 5: Make Sure Data is Comparable
Before comparing results between Dynamic Yield and your analytics platform, make sure the data is indeed comparable. Make sure the following items are aligned:
- Scope of tracking: All the pages on which the experience runs and in which conversions occur are tracked by your analytics platform, and you are not looking at conversions in the analytics platform that are not tracked by Dynamic Yield.
- IP filters
Step 6: Be Mindful of Different Definitions
Google Analytics is an analytics platform, while Dynamic Yield is a personalization platform that serves campaigns and runs A/B tests. As a result, there are several conceptual differences that might lead to differences in how metrics are presented.
Sessions: In Dynamic Yield, as opposed to Google Analytics, a new session does not automatically start at midnight. This ensures that late-night site visitors are tracked more accurately.
Users: In Google Analytics, user identification relies on the cookie only, while in Dynamic Yield, localStorage is used as a backup if the user clears their cookies.
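The session difference can be illustrated with a simplified model: both variants use a 30-minute inactivity timeout (an assumed value, for illustration only), but only the Google Analytics-style model also splits sessions at midnight (computed in UTC here for simplicity):

```javascript
// Simplified session counting. Timestamps are milliseconds since epoch.
const TIMEOUT_MS = 30 * 60 * 1000; // assumed 30-minute inactivity timeout
const DAY_MS = 24 * 60 * 60 * 1000;

function countSessions(timestamps, splitAtMidnight) {
  let sessions = 0;
  let last = null;
  for (const t of timestamps) {
    const newDay =
      last !== null && Math.floor(t / DAY_MS) !== Math.floor(last / DAY_MS);
    // A session starts on the first hit, after an inactivity gap,
    // or (GA-style only) on the first hit after a midnight rollover.
    if (last === null || t - last > TIMEOUT_MS || (splitAtMidnight && newDay)) {
      sessions++;
    }
    last = t;
  }
  return sessions;
}

// A visitor active from 23:50 to 00:10 UTC, with a hit every 10 minutes:
const hits = [
  Date.parse("2024-01-01T23:50:00Z"),
  Date.parse("2024-01-02T00:00:00Z"),
  Date.parse("2024-01-02T00:10:00Z"),
];
console.log(countSessions(hits, true));  // GA-style: 2 sessions (midnight split)
console.log(countSessions(hits, false)); // DY-style: 1 session
```

For late-night traffic, the GA-style model counts one extra session per visitor crossing midnight, which alone can explain a small, systematic gap in session counts between the two platforms.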