Two Big Truths of online marketing are that 1) small changes, like switching a green button to red, can increase user clicks, sometimes by a lot… and 2) nobody can reliably predict which small changes will produce big results.

So smart online marketers are constantly testing tweaks to their email messages. You should too, and it’s actually pretty easy, using the A/B testing tools offered by the major email service providers.

In A/B testing, you create two groups from your email subscriber list, send a different version of the message to each group and then track which one performs better. The versions might differ in just one specific way, so you can measure the effect of one detail at a time (like green button vs. red). Or the messages might be very different in look and feel, a messier approach that can tell you which pitch has a better emotional appeal overall.
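To make the mechanics concrete, here's a minimal sketch in Python of the split-and-assign step (the subscriber addresses and the button-color detail are made up for illustration; in practice your email provider handles this for you, as described under "How to test"):

    import random

    subscribers = ["ann@example.com", "bob@example.com", "cho@example.com",
                   "dee@example.com", "eli@example.com", "fay@example.com"]

    # Shuffle first so the groups are random samples, not alphabetical blocks.
    random.shuffle(subscribers)

    # Split the list in half: the first half gets version A, the rest version B.
    midpoint = len(subscribers) // 2
    group_a = subscribers[:midpoint]
    group_b = subscribers[midpoint:]

    # For a controlled test, the versions differ in exactly one detail,
    # so any gap in performance can be credited to that detail.
    versions = {"A": {"button_color": "green"}, "B": {"button_color": "red"}}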

Goals and metrics for email A/B testing

You’re experimenting to see what moves the needle on one or more of these key email metrics, compared to current performance (a quick calculation sketch follows the list):

  • Open rates – the percentage of recipients who open your email message.
  • Click-through rates – once the email is open, the percentage of users who click on anything.
  • Unsubscribe rates – ouch, the percentage of users you’ve annoyed to the point that they click the unsubscribe link in your message.
  • Conversion rates from email click-throughs – the ultimate metric. Once they click through to your site, what percentage of users fill out a form or otherwise turn into leads?
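To see how these rates fall out of raw campaign counts, here's a quick sketch in Python (all the numbers are invented for illustration, and click-through rate is computed against opens to match the definition above):

    # Hypothetical counts from one version of a campaign.
    delivered = 10_000
    opens = 2_200
    clicks = 330
    unsubscribes = 40
    conversions = 40   # form fills or other lead events after click-through

    open_rate = opens / delivered                # share of recipients who opened
    click_through_rate = clicks / opens          # of those who opened, share who clicked
    unsubscribe_rate = unsubscribes / delivered  # share you annoyed away
    conversion_rate = conversions / clicks       # the ultimate metric

    print(f"Open rate:        {open_rate:.1%}")          # 22.0%
    print(f"Click-through:    {click_through_rate:.1%}")  # 15.0%
    print(f"Unsubscribe rate: {unsubscribe_rate:.1%}")    # 0.4%
    print(f"Conversion rate:  {conversion_rate:.1%}")     # 12.1%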

What to test

  • The from line: for instance, does a message from a person’s name outperform one from a company name? Metric: open rates.
  • The subject line, the biggie for trying to improve open rates. There are many options to test here, starting with how you word the call to action: Does a hard sell (“50% off now!”) do better than a funny/intriguing subject (“Don’t forward this email…”)? Do specific words (“sale” vs. “limited time offer”) make a difference? A question vs. a statement? A short subject line vs. a long one? How about personalizing with the user’s name vs. no name?
  • Design of the email, including the size and color of the font, the placement or choice of images, the wording, color or size of the buttons – or maybe a plain text link performs better. Metric: click-through rate.
  • Message content. Headlines, calls to action and overall length of the message are variables to test for click-throughs. Could a P.S. line be a winner?
  • Audience segments. Different segments, such as new vs. long-time customers or men vs. women, may respond quite differently to the same message.
  • Time of day, day of week and frequency. In general, open rates for email campaigns tend to rise as the week goes on, but your mailing list may just love Mondays, so test. How many messages you send in a month can affect your results, too.

How to test

The major email vendors offer tools that make A/B testing as simple as filling in a form with your A and B choices for a subject line, for instance, or selecting A and B send dates from a calendar. They also handle the chore of randomly splitting your mailing list into two groups to receive each version.

A general rule of thumb from email vendor ExactTarget is to send your tests to 5% of your list if you have more than 50,000 subscribers, or 10% if you have 50,000 or fewer, then send the winning version to the rest of the list 24 hours later.
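That rule of thumb is easy to turn into arithmetic. Here's a sketch in Python (the 50,000 cutoff and the 24-hour wait come straight from the guideline above; splitting the test slice evenly between A and B is an assumption on my part):

    def test_group_sizes(list_size):
        """Size an A/B test per the 5%/10% rule of thumb."""
        pct = 0.05 if list_size > 50_000 else 0.10
        test_total = int(list_size * pct)
        half = test_total // 2             # assumed even split between A and B
        remainder = list_size - 2 * half   # gets the winning version 24 hours later
        return half, half, remainder

    # A big list tests 5%; a smaller one tests 10%.
    print(test_group_sizes(80_000))   # (2000, 2000, 76000)
    print(test_group_sizes(20_000))   # (1000, 1000, 18000)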

Important caution: If you have a small list (for instance, under 500 subscribers), your results may not be statistically significant; in other words, the samples may be too small to give you confidence that the results weren’t skewed one way or the other by chance. (Try this statistical significance calculator from HubSpot to get a feel for this.)
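If you'd rather check significance yourself than eyeball a calculator, one standard approach (my choice here, not necessarily what HubSpot's calculator uses) is a two-proportion z-test. A self-contained Python sketch with invented numbers:

    import math

    def two_proportion_p_value(successes_a, size_a, successes_b, size_b):
        """Two-sided p-value for whether versions A and B truly differ."""
        pooled = (successes_a + successes_b) / (size_a + size_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / size_a + 1 / size_b))
        z = (successes_a / size_a - successes_b / size_b) / se
        # Normal-approximation p-value via the error function.
        return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

    # On 250 recipients per version, 12% vs. 14% open rates could easily be chance...
    print(two_proportion_p_value(30, 250, 35, 250))      # ~0.51, not significant
    # ...but the same 12% vs. 14% on 5,000 per version almost certainly is not.
    print(two_proportion_p_value(600, 5000, 700, 5000))  # ~0.003, significant

A common convention is to treat a p-value under 0.05 as significant, which is why small lists so often fail to produce a clear winner.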

When to test

Always and forever. Why miss a chance to improve your metrics? Send at least one small test with every message.