How to avoid breaking your SEO when doing A/B tests
“[…] we’re glad you’re testing! A/B and multivariate testing are great ways of making sure that what you’re offering really appeals to your users.”
– Google
We do A/B tests to improve the quality of our website and, ultimately, its conversion rate. But we don’t want to lose traffic during the test or as a result of it. Low-value pages, slower load times and wasted crawl budget may cause a drop in the search results. Or do they?
Here we pinpoint the reasons why A/B tests could be bad for SEO, and then give five rules to avoid these pitfalls.
The bottom line: CRO and SEO aren’t meant to bite each other, but to work smoothly together for better results.
Why could an A/B test be bad for SEO?
- A/B testing means having at least two different versions of a page. This creates a risk of duplicate content issues.
- Redirect loops or the wrong type of redirect (a permanent 301 instead of a temporary 302) may occur.
- Badly set up tests can result in a bad user experience (conflicting messages, bugs, etc.).
- The site speed may go down because of extra code, redirects, modifications, etc.
- A/B tests may win on conversions, but lose on SEO, if the SEO impact is not evaluated (e.g. changing a headline or copy, removing a sidebar, etc.).
The 5 rules to keep CRO from negatively impacting SEO
1. Do not allow CRO and SEO activities to interfere with each other
- By this, we mean that you shouldn’t do ‘SEO things’ that might interfere with CRO results. For instance: never change anything on a page while a test is running; it will mess up your test results. If you do need to make a change to a page, stop the experiment, update the page, and start a new test.
- If you want to see the impact of an SEO change, try running it as a test variation first; see this helpful article from Moz.
2. Follow the guidelines communicated by Google
- No cloaking: show Googlebot the same content and test logic that regular users get.
- Use rel="canonical" on the test variants, pointing at the original URL.
- Use 302 (temporary) and not 301 (permanent) redirects, as shown in the sketch below.
- Only run the experiment as long as necessary.
Read more here.
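To make the redirect rule concrete, here is a minimal sketch of server-side test bucketing with a temporary redirect, assuming a Node.js/Express setup; the route, the cookie name and the variant URL (/landing-b) are illustrative, not prescribed by Google:

```ts
import express from "express";

const app = express();

app.get("/landing", (req, res) => {
  // Returning visitors keep their bucket via a cookie, read from the raw
  // header here to avoid extra middleware; new visitors are split 50/50.
  const match = /ab_bucket=(a|b)/.exec(req.headers.cookie ?? "");
  const bucket = match ? match[1] : Math.random() < 0.5 ? "a" : "b";
  res.cookie("ab_bucket", bucket, { maxAge: 7 * 24 * 3600 * 1000 });

  if (bucket === "b") {
    // 302, not 301: a permanent redirect would tell Google that /landing-b
    // is the new home of the page, and it could replace the original
    // in the index.
    return res.redirect(302, "/landing-b");
  }
  res.send("control version of the page");
});

app.listen(3000);
```

Because the same logic runs for every visitor, Googlebot included, this also respects the no-cloaking rule.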
3. Make sure the control version is the one that stays indexed
Experimenting with test versions shouldn’t cost you rankings, so the test versions should never be the ones that get indexed.
It’s very important that the original (or control) version remains the canonical one indexed by Google, because we don’t want to influence the ranking of that page during the test. Once we have the outcome of the test, we can be sure that (from a user experience point of view) the winning version is the best one, and only then let that page become the canonical.
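As a sketch of what this looks like in the variant’s markup (continuing the hypothetical Express example above; example.com and the URLs are placeholders), the variant page itself declares the control URL as canonical, so Google consolidates all signals on the original page:

```ts
import express from "express";

const app = express();

app.get("/landing-b", (_req, res) => {
  // The rel="canonical" link tells Google that /landing is the page to
  // index, so the variant never competes with the original in the results.
  res.type("html").send(`<!doctype html>
<html>
  <head>
    <link rel="canonical" href="https://www.example.com/landing">
    <title>Landing page (variant B)</title>
  </head>
  <body>variant version of the page</body>
</html>`);
});

app.listen(3000);
```

Once the test has a winner, that winning version can be promoted to the canonical page, as described above.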
4. Communicate to all teams about your tests
- Make sure to communicate ahead of time, not only when you start running the test.
- Have a centralized calendar to which A/B tests can be added, alongside marketing campaigns, development releases, content publications, industry events, etc.
Sharing your intentions saves a lot of time for teams who would otherwise be trying to figure out why the website suddenly ‘changed’, why users are reporting bugs, why rankings shifted, etc.
Things to align with the teams involved:
- SEO team: will changes impact SEO? Are there any SEO plans that might impact the test?
- Marketing team: what about banner campaigns? Traffic mix changes? Upcoming events?
- Content team: will major publications change the traffic mix? Will the traffic during the test still be representative?
5. Work with experts in both CRO and SEO
CRO and SEO are different disciplines, often practiced by different teams in the company. To keep the one from having an undesirable impact on the other, it’s important to work closely together, and:
- Take SEO into account from the moment you plan new tests.
- Make sure you stay informed about the latest Google updates and guideline changes.
- Be prepared to handle any unforeseen issues with CRO + SEO.
- Put together the best possible team.
If you follow the recommendations above, you’ll make sure that your A/B tests have no negative impact on your site’s performance in the search results.