4 Examples of A/B Testing in Email Marketing

What is A/B Testing?

Katie Kinsley
4 min read · Jul 9, 2021

A/B testing, also known as split testing, is essentially an experiment that compares two versions of something (in this case, an email) to discover which version performs best for your audience, based on measurements like opens, clicks, or conversions.

In order to have a valid A/B test, you must first form a hypothesis for your test. You may, for example, want to improve the open rates of your emails. Your subject lines currently follow a consistent format like “Monthly Newsletter,” which a reader may overlook. Your internal team suggests trying something like “You Won’t Want to Miss Out!” which conveys a sense of urgency. In this example, you could split your audience 50/50 to test which subject line earns the higher open rate.
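Most email platforms will randomize this split for you, but if you ever need to do it by hand, a minimal sketch in Python might look like the following. The addresses and the fixed seed are placeholders, not real data.

```python
import random

# Hypothetical list of subscriber addresses pulled from your email platform.
audience = ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]

# Shuffle first so the split is random rather than, say, alphabetical.
random.seed(42)  # fixed seed only so this example is reproducible
shuffled = list(audience)
random.shuffle(shuffled)

# Split the shuffled list 50/50 between the two subject lines.
midpoint = len(shuffled) // 2
variant_a = shuffled[:midpoint]   # receives "Monthly Newsletter"
variant_b = shuffled[midpoint:]   # receives "You Won't Want to Miss Out!"

print(f"Variant A: {len(variant_a)} recipients, Variant B: {len(variant_b)} recipients")
```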

Below are four types of A/B tests that you can implement in your email marketing program.

Test #1: Subject Lines

For this test, you’ll want to send the same exact email with two different subject lines. You can vary the length of the subject line, make it more or less formal, or even add in emotion.

When you write with emotion, you’re changing your message to focus on urgency, curiosity, inspiration, etc. You can decide how you want your audience to interpret the subject line.

Analyze: Which one performs better? If your audience responds to inspiration, but ignores urgency, that’s valuable insight for future emails.
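Before you declare a winner, it’s worth checking that the gap in open rates is bigger than random noise. Here’s a rough sketch of that comparison using a two-proportion z-test from statsmodels; the open counts and list sizes are made up for illustration.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results: opens out of recipients for each subject line.
opens = [412, 468]         # variant A opens, variant B opens
recipients = [2500, 2500]  # recipients per variant

z_stat, p_value = proportions_ztest(count=opens, nobs=recipients)

print(f"Open rate A: {opens[0] / recipients[0]:.1%}")
print(f"Open rate B: {opens[1] / recipients[1]:.1%}")
print(f"p-value: {p_value:.3f}")

# A small p-value (commonly below 0.05) suggests the difference is unlikely
# to be chance; otherwise, treat the test as inconclusive and keep testing.
```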

Test #2: Copy Length

After you’ve determined which subject lines resonate with your audience, it’s time to start testing the inside of your emails. How much copy is enough?

According to HubSpot, the ideal length of a sales email is between 50 and 100 words. That’s short! If you’ve read this far, I’ve already surpassed 200 words.

Please take their advice with a grain of salt. There are different types of marketing tactics and different types of consumers in the world. What works best for sales doesn’t necessarily translate to a fundraising email campaign.

Start this test by creating one email with long-form copy (think about 200–250 words) and test it against short-form copy (between 50 and 150 words). You’ll have to decide on the goal of your email before you can decide whether short or long copy works best for your organization. Make sure that your copy takes your reader down a path and converts them with a call-to-action (CTA). If you don’t have a goal for your email, why are you sending it?

Test #3: CTA

If you tested different copy lengths with your last email campaign and didn’t come to any conclusion, it may be time to test the CTA itself.

What kind of CTA does your email campaign have? A button, hyperlinked text, a graphic — or all of the above? Is your CTA happening too soon or not at all in the message?

For this test, you’ll want to try it a couple of times, varying one element at a time (a quick tracking sketch follows the list below).

  • Hyperlinked Text: Is your copy compelling enough for the reader to click?
  • Hyperlinked Graphics: Are you receiving clicks on an image of the “Donate” button? If not, you’ll want to try to test that element against an HTML button.
  • HTML Button: Does the color of your button factor into how many clicks it receives? Try testing various colors.
  • Multiple Buttons: Do you have more than one button in your email? Track which placement receives more clicks.
  • Too Many Options: Does your email campaign contain all of the elements above? Try removing one element from each campaign to see what gets the best response from your audience.
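However you vary the CTA, the key is being able to tell the clicks apart. One common approach is to tag each element with its own link label or utm_content value and then tally clicks per tag from your platform’s click report. A rough sketch with made-up field names:

```python
from collections import Counter

# Hypothetical click export: one row per click, tagged with the CTA that was
# clicked (for example via a utm_content value or link label in the report).
clicks = [
    {"recipient": "a@example.com", "cta": "header_button"},
    {"recipient": "b@example.com", "cta": "inline_text_link"},
    {"recipient": "c@example.com", "cta": "header_button"},
    {"recipient": "d@example.com", "cta": "footer_image"},
]

# Count clicks per CTA element to see which placement earns the most clicks.
clicks_by_cta = Counter(row["cta"] for row in clicks)

for cta, count in clicks_by_cta.most_common():
    print(f"{cta}: {count} clicks")
```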

Test #4: Timing

A post discussing split testing wouldn’t be complete without discussing the timing of your email campaigns, and this one isn’t any different. What day of the week and at what time do you send your emails?

Email is designed to be timely and effective. If you’re not reaching your audience when they’re actually in their inbox, they could miss out, especially if your email has a deadline to meet.

Start by looking over historical email data and record the day of the week each email was sent and how each one performed on opens, clicks, and (if you have it) conversions. If, for example, your conversion rate is better on Tuesdays but your open rate is better on Thursdays, then you may want to send solicitation emails on Tuesday and informational emails on Thursday.
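If your platform lets you export past campaign results, grouping them by weekday makes those patterns easy to spot. A sketch using pandas with invented numbers:

```python
import pandas as pd

# Hypothetical export of past campaigns: send date plus topline metrics.
campaigns = pd.DataFrame({
    "sent_at": pd.to_datetime(["2021-05-04", "2021-05-13", "2021-06-01", "2021-06-10"]),
    "recipients": [2500, 2400, 2600, 2550],
    "opens": [500, 600, 520, 640],
    "conversions": [40, 30, 44, 28],
})

# Tag each campaign with the weekday it was sent and compute the rates.
campaigns["weekday"] = campaigns["sent_at"].dt.day_name()
campaigns["open_rate"] = campaigns["opens"] / campaigns["recipients"]
campaigns["conversion_rate"] = campaigns["conversions"] / campaigns["recipients"]

# Average open and conversion rates by weekday.
summary = campaigns.groupby("weekday")[["open_rate", "conversion_rate"]].mean()
print(summary.sort_values("open_rate", ascending=False))
```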

Conclusion

There are a lot of options to consider when you’re deciding whether A/B testing is right for your organization. Above all, you want to make sure that whatever you’re testing can lead to a clear conclusion. By staying focused on the data behind the decisions and the outcomes each test produces, you’ll be able to reach that conclusion with confidence. A/B testing is invaluable when it comes to improving your email marketing.

Have you tried A/B testing in your email marketing? What worked for you and what didn’t? Let me know in the comments below!
