A/B Tests

A/B testing is an experiment that compares two variants, A and B. It’s used in website design, marketing campaigns, ads, and content strategy to test different versions of a specific asset. The goal of A/B testing is to identify the variant that performs better. For example, if you’re adding a call to action (CTA) to your website, it’s worth A/B testing different CTA designs until you find the one that generates the best outcome.

Why should you conduct A/B tests?

A/B testing, also known as split testing, is one of the most valuable tools you have because it means you don’t always have to make the right decision about design or word choice the first time around. It’s also an important tool for conversion rate optimization (CRO).

Suppose you design a nifty text-and-icon CTA for your newsletter and expect it to generate dozens of clicks to your website. If the CTA turns out to be underperforming, you can run an A/B test:

Variant A: Text-and-icon CTA

Variant B: Large, single-color CTA button

Result: The large, single-color CTA button gets 10% more clicks

A split test typically won’t tell you why an asset underperforms, but it will tell you which of the two variants performed better during the test.
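
To make that comparison concrete, here is a minimal sketch in Python of how you might compare two CTA variants by click-through rate. The counts are hypothetical placeholders, not data from the example above:

```python
# Minimal sketch: compare two CTA variants by click-through rate (CTR).
# The counts below are hypothetical; substitute your own test data.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a fraction of impressions."""
    return clicks / impressions

variant_a = {"clicks": 120, "impressions": 4000}  # text-and-icon CTA
variant_b = {"clicks": 132, "impressions": 4000}  # large single-color button

ctr_a = ctr(**variant_a)
ctr_b = ctr(**variant_b)
lift = (ctr_b - ctr_a) / ctr_a  # relative improvement of B over A

print(f"Variant A CTR: {ctr_a:.2%}")
print(f"Variant B CTR: {ctr_b:.2%}")
print(f"Relative lift: {lift:+.1%}")  # +10.0% in this hypothetical case
```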

What elements can you A/B test? 

It’s easier to answer this question in reverse: there isn’t much you can’t hold up to scrutiny with a split test.

Here are some examples of the types of variables you can A/B test:

Calls to Action (CTAs)

People go to your website to read your blog, purchase products, download media files, and subscribe to your newsletter. 

  • Test different versions of CTA text to identify which generates more conversions.
  • Test your CTA button in different locations on your website.
  • Test several CTAs on a page versus one large CTA. 

Funnels

Your goal with a funnel is to get people from point A to point B. If it takes too long for customers to advance through the funnel, you may see a large number of drop-offs; a simple way to measure this is sketched after the list below.

  • Test your original funnel against one with fewer steps.
  • Test a version of your funnel with fewer on-page distractions, like product offers or extraneous images.
  • Test different versions of copy in your funnel. Version A might be written in a familiar first person, while version B is formal and third person.
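
A quick script can compute the drop-off between each step of a funnel. This is a minimal sketch; the step names and user counts are hypothetical, and in practice you would pull them from your analytics tool:

```python
# Minimal sketch: measure drop-off at each step of a funnel.
# Step names and user counts are hypothetical placeholders.

funnel = [
    ("Landing page view", 10_000),
    ("Sign-up form started", 4_200),
    ("Sign-up form completed", 2_100),
    ("Onboarding finished", 1_300),
]

# Compare each step with the next one to find the biggest leaks.
for (step, users), (_, next_users) in zip(funnel, funnel[1:]):
    drop_off = 1 - next_users / users
    print(f"{step} -> next step: {drop_off:.1%} drop-off")

overall = funnel[-1][1] / funnel[0][1]
print(f"Overall funnel completion: {overall:.1%}")
```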

Advertising

If you’re not A/B testing your ads, you could be leaving money on the table.

  • Test different headlines on your paid campaigns.
  • Test different ad designs.
  • A/B test variations of the landing page your ads direct users to.

Social Media

A/B tests can help you increase likes and retweets. 

  • A/B test different social media icon-design styles, sizes, and on-page placement.
  • Tweak your social media voice until you land on a style your audience responds to positively.

Mobile

Entire countries are “mobile first,” meaning most of their online activity occurs on a mobile device rather than a PC. As a result, there is little tolerance for mobile websites that are noisy or badly designed.

  • A/B test the placement of links that take users to desired locations.
  • Test different versions of your mobile site to see which one users stay on longer or return to more often.

Does A/B testing work?

Here are some examples of how A/B testing worked for real companies. 

Gekko is an accounting platform for self-employed professionals. The company’s Mixpanel dashboard surfaced a large number of drop-offs in its onboarding funnel, so the product manager A/B tested different versions of the funnel’s content and design. The new version of the funnel outperformed the original, and funnel completions increased by 58%.

Yummly is a searchable recipe database whose editors use key metrics to drive the site’s design and content. Before implementing a recent update, the product manager set out to test what impact the redesign would have on new-user registrations. She presented one user cohort with the original design and a second cohort with the updated design. A week later, Mixpanel Insights revealed that the conversion rate on the new design was 11% higher than on the control version.

How to choose variables to test

A/B testing can be a bit addictive. Who doesn’t like seeing clicks and conversions soar? A good rule of thumb is to test when you’re launching a new campaign, introducing a new feature, or upgrading your site, before you roll out the change.

1. Choose a variable to test. Many variables will present themselves for testing, but if you want to measure performance or confidence level for an independent variable, make sure your split test isolates that variable and changes nothing else.

Here are some examples of independent variables you can A/B test:

Asset            Test
Email            Subject line A vs. Subject line B
CTA button       1 large button vs. 3 small buttons
Form fields      Landing page with a form field vs. one without
Offers           A free ebook vs. a free trial

2. Identify your goal.

  • If you’re split testing email subject lines, your goal is to increase your email open rate.
  • For a CTA button, you can quantify your click-through rate (CTR).
  • You can also look at whether certain elements hurt conversions. Suppose your team is debating whether to include a form field on your landing page, and half the team thinks it will drive down conversions. An A/B test can quantify conversion rates for a landing page with a form and a landing page without one.
  • Free offers are a great way to attract new customers or subscribers, but which freebies would appeal to your audience? A/B testing different offers serves two purposes: it reveals what types of offers drive subscriptions for your specific business, and it lets you measure the statistical significance of the difference between two freebies. For example, if you run a movie review website, you can calculate how many more subscriptions free movie tickets drive compared with a free ebook of your all-time favorite movie reviews.

3. Create your test variants, versions A and B.

4. Identify the user groups for your split test. Randomly assign users so the two groups are comparable; one common approach is sketched below.
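
One common way to split users is to hash each user ID into a stable bucket, so every user always sees the same variant. Here is a minimal sketch of that approach; the salt string is a hypothetical label for this particular test:

```python
# Minimal sketch: deterministically assign users to test groups.
# Hashing the user ID gives each user a stable bucket, so they always
# see the same variant. The salt value is a hypothetical test label.

import hashlib

def assign_variant(user_id: str, salt: str = "cta-test") -> str:
    """Return 'A' or 'B' based on a stable hash of the user ID."""
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # bucket in the range 0-99
    return "A" if bucket < 50 else "B"  # 50/50 split

print(assign_variant("user-1842"))  # same user always gets the same variant
```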

5. At the end of the testing period, review the results.

  • If there’s a clear leading asset, implement that change. 

6. Repeat.

  • You can continually set new goals and tweak subsequent A/B tests. 

What you need to know about A/B testing

A/B testing sounds easy: just present groups of users with variant A and variant B, then go with the version that produces the better outcome, right? Not so fast. Better results do not always mean statistically significant results.

If the results of an A/B test are statistically significant, the observed difference is unlikely to be due to random chance. Calculating the statistical significance of your A/B test results gives you numerical evidence that the result is reliable and meaningful.
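
A common way to check significance for conversion-style metrics is a two-proportion z-test. Here is a minimal sketch using only the Python standard library; the conversion counts are hypothetical:

```python
# Minimal sketch: two-proportion z-test for an A/B result.
# Counts are hypothetical; plug in your own test data.

from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z score, two-sided p-value) for the difference in rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

z, p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=160, n_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")
# A p-value below 0.05 is a common (if arbitrary) threshold for significance.
```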

Another thing you should know is that split testing has limitations. As noted earlier, A/B tests yield the best results when you’re comparing two variants, like two email subject lines. Multivariate testing is a way to test an asset that has multiple variables, like the 12 combinations of a landing page with three headlines, two CTAs, and two images. The goal is to identify the best-performing combination of variables out of all of the possible combinations.
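
To see how quickly multivariate tests grow, this short sketch enumerates every combination for a landing page like the hypothetical one above:

```python
# Minimal sketch: enumerate all combinations for a multivariate test.
# Three headlines x two CTAs x two images = 12 landing-page variants.

from itertools import product

headlines = ["Headline 1", "Headline 2", "Headline 3"]
ctas = ["Start free trial", "Request a demo"]
images = ["hero-photo", "product-screenshot"]

variants = list(product(headlines, ctas, images))
for i, (headline, cta, image) in enumerate(variants, start=1):
    print(f"Variant {i}: {headline} | {cta} | {image}")

print(f"{len(variants)} combinations to test")  # 12
```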
