For more information on the experiment, see our Personalization Impact on SEO web page.
Understanding the risks during website optimization
From a technical point of view, there are four main risks to consider when optimizing websites:
Cloaking
Cloaking means presenting one version of a page to search engine bots and a different version to human visitors in order to manipulate organic rankings. It is one of the most serious SEO risks and a clear violation of Google's webmaster guidelines. Running optimization initiatives that serve specific variations to search engine user agents (such as Googlebot) and different ones to human visitors is considered cloaking.
At Dynamic Yield, as a strict principle, we treat Googlebot and other search bots exactly the same way we treat real human visitors. With Experience OS campaigns, your optimization or personalization initiatives will never be considered cloaking.
Wrong kind of redirect
When an experiment redirects visitors from the original URL to a variation URL, use a temporary (302) redirect rather than a permanent (301) one. A 302 tells search engines that the move is temporary and that the original URL should stay indexed, while a 301 signals a permanent move and can cause search engines to drop the original URL in favor of the test variation.
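For illustration, a redirect-based test should answer with a temporary status at the HTTP level. The URL below is a placeholder, not a real endpoint:

```http
HTTP/1.1 302 Found
Location: https://www.example.com/landing-page-variation-b
```

Because the status is 302 rather than 301, search engines keep the original URL in their index and do not transfer its ranking signals to the short-lived variation URL.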
URL and content duplications
URL Duplications: Sometimes, when running online experiments, the system creates duplicates of some of the site's URLs for different test variations. Generally speaking, from an SEO point of view, onsite duplications are a relatively minor risk in terms of an actual penalty, unless they were created as an attempt to manipulate organic rankings. Nevertheless, to resolve this potential risk, we recommend a fairly easy fix: implement a canonical link element on each duplicated URL, pointing back to the original. You can also block access to those duplicates with a simple robots.txt Disallow rule or a robots meta tag with a noindex value.
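As a sketch, the two fixes above look like this on a duplicated variation URL. The URL is a placeholder:

```html
<!-- Point search engines back to the original page, so ranking signals
     consolidate on the canonical URL rather than the test duplicate: -->
<link rel="canonical" href="https://www.example.com/original-page" />

<!-- Alternatively, keep the variation URL out of the index entirely: -->
<meta name="robots" content="noindex" />
```

Use one approach or the other per URL: the canonical tag consolidates signals onto the original, while noindex simply removes the duplicate from search results.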
Content Duplications: With onsite optimization and personalization, websites deliver different content variations tailored to different individuals. When the original content plays a big part in determining a web page's organic rankings, it's vital to keep this content onsite rather than replacing it entirely with dynamically generated content. Keeping the original content in the markup also acts as a failover for unsupported test groups or visitors.
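For instance, shipping the original, crawlable copy in the markup while a script swaps it at runtime for eligible visitors might look like this. The element ID and copy are illustrative, not part of any real implementation:

```html
<!-- The original content is present in the HTML, so search bots and any
     visitor outside a supported test group still see it. A
     personalization script may replace this block at runtime for
     visitors who qualify for a variation. -->
<div id="hero-copy">
  <h1>Original headline that carries the page's ranking signals</h1>
  <p>Original supporting copy, indexable by search engines.</p>
</div>
```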
Web loading performance
Site speed is a major user experience factor, and a slow page can be a deal-breaker for visitors. It has also been part of Google's organic ranking algorithms since 2010, alongside 200+ other ranking signals. Some users worry that optimization and personalization tools might slow down their page loading, and the truth is that sometimes they can. But there are ways to minimize the effect.
First, the Dynamic Yield script loads asynchronously, so it does not block page rendering. In addition, we run on Amazon Web Services and deliver content directly through its CDN servers.
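As a generic illustration of asynchronous loading (the script URL below is a placeholder, not Dynamic Yield's actual endpoint):

```html
<!-- The async attribute lets the browser fetch the script in parallel
     while it continues parsing the page, so the download itself does
     not block rendering. The src below is a placeholder. -->
<script src="https://cdn.example.com/personalization.js" async></script>
```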
We can also deliver content through your company's own CDN if necessary. Lastly, it's important to note that our uptime is close to 100%. We power more than 10B page views a month for some of the world's most demanding customers, who, like you, wouldn't tolerate latency on their sites.
- The results observed on search results pages hold for all Dynamic Yield implementation methods.
- Google acknowledges the use of A/B testing as a conversion optimization solution.
- You should be testing all the time and improving your site and overall user experience.