How to Conduct Systematic and Effective Email A/B Testing

Introduction

As marketers, we are always looking for techniques that make our product stand out amid fierce competition. We draft customized emails, send newsletters, land in our clients' inboxes every morning, and more. Our approaches differ widely, too, but there is one thing we consistently fail at:

Figuring out how our audience will perceive the content that we create!

No matter how hard we try to put ourselves in their shoes, confirmation bias will always get the best of us. So, how can you be assured that your target audience will appreciate all your early-morning hustles and late-night efforts? 

Through email A/B testing.

Sounds unfamiliar? In about ten minutes, by the time you are through with this article, you will understand all there is to know about email A/B testing. A strong statement to make? Let's see how we both perform!

A/B testing

Source: bluetent.com

What is email A/B testing?

Covering the obvious question first – what does email A/B testing even mean?

Also known as split testing, email A/B testing is the process of creating two versions of the same email, changing one or more variables between them, and sending each version to a separate audience subset to analyze which one performs better.

Too much to grasp in one sentence? Let's try to make it easier for you.

Have you seen those posts and polls on LinkedIn and Instagram that show two different UIs and ask you to choose one?

Which UI is more accurate?

Source: LinkedIn

Sometimes only one element is different, but you can still have a preference between them. Maybe it's the minimalistic one? Possibly it's the colorful one? Depending on your choices, even a slight difference can make you like one UI more than the other. This is a practical example of A/B testing.

In the case of email A/B testing, it is all about pitting two emails against each other to see which one works better for your target audience. By sending out two variations of the same email to your mailing list, you can identify the best-performing elements and use them to design that one ultimate email campaign.

Should marketers care about email A/B testing?

Think about this: you are a product developer. Would you ever release your product without running trials and conducting tests? No, right? Then, as a marketer or content creator, why would you send out email campaigns without testing whether they work?

With email A/B testing, you can: 

  • Increase the open rate and click-through rate of your emails
  • Enhance the success rates of your email campaigns by increasing the conversion rate
  • Run simultaneous tests for different variables
  • Gather data on your customers' preferences and apply it in upcoming email campaigns
  • Get creative with your variables and come up with challenging tests to maintain a competitive advantage

And these are just some benefits of email A/B testing.

Now ask yourself: should you care about email A/B testing? You already know the answer.

What exactly are we testing?

To conduct systematic and effective email A/B testing, you should be familiar with all the email components that you can test. This will give you much-needed food for thought while crafting every element of your email, from the design to the send time, and you can come up with plenty of elements of your own, too.

To give you a kick-start, here are the ten most common components to test:

  • From name

One of the main elements that defines an email before it is even opened is the sender's name. While there is a lot of scope for experimentation, try to mention who you are or which organization you belong to. For example, Trello uses just its name to send account-specific updates, but for newsletters, "Taco from Trello" seems to work better for them.

Email sender's name sample

Email sender's name sample 2

Source: Gmail Inbox

Note: Do not stylize too much, or your mail may look spammy.

  • Subject line

The subject line is the most commonly tested variable for increasing open rates. Try different positioning, tones, lengths, and writing styles, and note what works best. Personalizing with names or using emojis (without overdoing them) also boosts open rates.

Email subject line sample

Gmail subject line sample 2

Source: Gmail Inbox

You can also alternate between short and long subject lines. Subject lines with offers are also a hit. For example, this edX email uses the subject line as a hook.

Email subject line sample 3

Source: Gmail Inbox

  • HTML vs. plain text

If you are used to sending plain-text emails, try experimenting with HTML-formatted designs to see how they work with your mailing list.

Plain text email

HTML email

Source: Litmus

You can also run a few tests to determine which segment of your audience prefers a certain layout so that you can email them accordingly.

  • Copy

Human attention spans are reportedly down to around eight seconds, shorter than a goldfish's, so you need to know which kind of copy can hook your subscribers. Experiment with tone and positioning. Common variables to change in the copy category are:

  1. Body copy
  2. Headlines
  3. Button copy

  • CTA

To drive action and generate useful data, your CTA must work for your readers. Try testing these five elements of your CTA:

  1. Button vs. Text
  2. Placement
  3. Colour
  4. Copy/Length
  5. Shape

For example, Airbnb sent this CTA copy in its early days.

 

Airbnb CTA

Source: Sendx.io

Now, their CTA copy looks like this.

Airbnb CTA sample 2

Source: Sendx.io

  • Social media button

As a marketer, you should know whether your subscribers actually explore your social media pages from your emails. Proper email A/B testing can help you gather solid data on this.

  • Personalization

One of the most common and most successful ways to personalize an email is to include the recipient's name. Other personalization factors, such as purchase history, previous emails, and subscriber status, can also be tested.

  • Length

It is also great to play around with the length of your copy. Try different lengths for different demographics, define an ideal length for common emails, and understand how your subscribers interact with your emails.

  • Images

Images can either increase or decrease your click-to-open rate. So, figure out what images can do for you and test the type of images, their placement, quality, and so on. The screenshots below show the same email with and without an image; they are an example of email A/B testing.

 

Images in an email sample

Images in an email sample 2

Source: Sendx.io

  • Time of sending

While most email marketers focus on content during email A/B testing, it is equally important to schedule your emails appropriately. Instead of relying on generic industry data, run your own tests and analyze when your emails are most often opened. For example, travel companies mostly prefer sending emails on weekends, while B2B companies opt for Monday and Tuesday. But to each its own, so run your own A/B email tests.

By tweaking each of these variables across multiple A/B tests, you can truly strike a chord with your subscribers and give your email marketing campaigns a significant boost.

Steps to run your first email A/B test

Most email campaign software comes with built-in tools for email A/B testing. Commonly used options include:

  • Mailchimp
  • Campaign Monitor
  • SendPulse
  • ActiveCampaign

If your email campaign software does not have built-in tools, you can set up the test manually. Randomly split your current mailing list into two parts and send one version of the email to each list. Then export your analytics data to a spreadsheet and compare the results by hand.
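
If you go the manual route, here is a minimal sketch of the random split in Python, assuming your subscribers live in a plain-text file with one address per line (the file names are just placeholders for whatever your email tool exports and imports):

```python
# A minimal sketch of the manual 50-50 split. The file names and the
# one-address-per-line format are assumptions; adapt them to your email tool's export.
import random

with open("subscribers.txt") as f:
    subscribers = [line.strip() for line in f if line.strip()]

random.shuffle(subscribers)           # shuffle so the split is random, not alphabetical
midpoint = len(subscribers) // 2
group_a = subscribers[:midpoint]      # will receive version A
group_b = subscribers[midpoint:]      # will receive version B

with open("group_a.txt", "w") as f:
    f.write("\n".join(group_a))
with open("group_b.txt", "w") as f:
    f.write("\n".join(group_b))

print(f"Version A goes to {len(group_a)} subscribers, version B to {len(group_b)}.")
```

Import the two resulting files back into your email tool as separate segments, send one version to each, and you have a manual A/B test without any built-in feature.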

The quickest steps to get started with email A/B testing are:

Identify your problems and benchmarks

As a marketer, you already know your clients and their preferences. Based on those parameters, study your email content, look at previous email campaign statistics, and try to figure out where your conversion funnel is leaking. Also, set some benchmarks to monitor in each campaign; common ones include email open rates, click-through rates, and conversion rates. A proper analysis helps you set realistic goals and pick the right variables for successful email A/B testing.
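
If you would rather compute these benchmarks yourself from raw campaign counts, a quick sketch looks like this; the numbers are purely illustrative:

```python
# Quick sketch with illustrative numbers: compute common benchmark metrics
# from a past campaign's raw counts before you start testing.
delivered = 5000      # emails delivered (hypothetical)
opens = 1100          # unique opens (hypothetical)
clicks = 240          # unique clicks (hypothetical)
conversions = 35      # sign-ups or purchases attributed to the email (hypothetical)

open_rate = opens / delivered            # opens relative to everyone who received the email
click_through_rate = clicks / delivered  # clicks relative to everyone who received the email
click_to_open_rate = clicks / opens      # clicks relative to those who actually opened
conversion_rate = conversions / delivered

print(f"Open rate:          {open_rate:.1%}")
print(f"Click-through rate: {click_through_rate:.1%}")
print(f"Click-to-open rate: {click_to_open_rate:.1%}")
print(f"Conversion rate:    {conversion_rate:.1%}")
```

Keep your definitions consistent between campaigns; for instance, click-through rate here is measured against delivered emails, while click-to-open rate is measured against opens.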

Set a clear goal 

Having a clear goal is indispensable for good email A/B testing. If you are a beginner in email marketing and your subscribers are not even opening your emails, then enticing them with engaging content is a good place to start. You can gradually raise your goals to cover conversion rates, click-through rates, and other engagement metrics.

Define your hypothesis

Before conducting an email A/B test, define your hypothesis. For example, suppose you have observed that subject lines with emojis get more opens. Your hypothesis can then be: “Subject lines with emojis get more opens.” Now make the subject line your testing variable and test this hypothesis. Note that the purpose of setting a hypothesis is not to be proven right; it is to check the accuracy of your assumption.

Define your sample size

Try to conduct email A/B testing with a large sample size so that your test generates more data; more data translates to more accurate insights. You should have a minimum of 100 people on your mailing list, split 50-50 for the test. If you have a huge mailing list, you do not need to test on the entire list; just use a sample big enough to yield insightful and accurate data.

 A/B testing working sample

Source: Microsoft
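
The 100-subscriber minimum above is a rule of thumb. For a more rigorous estimate, a standard two-proportion power calculation (a common statistical approach, not something specific to any email tool) can tell you roughly how many subscribers each group needs to reliably detect a given lift in open rate:

```python
# Rough sketch, assuming a standard two-proportion z-test power calculation.
# Returns how many subscribers each group needs to detect a change from p1 to p2.
from math import sqrt, ceil
from scipy.stats import norm

def sample_size_per_group(p1, p2, alpha=0.05, power=0.8):
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance threshold
    z_beta = norm.ppf(power)            # desired statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Example: detecting an open-rate lift from 20% to 25% needs roughly 1,100 people per group.
print(sample_size_per_group(0.20, 0.25))
```

The smaller the lift you want to detect, the larger the sample you need, which is why bigger lists tend to give more trustworthy A/B test results.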

Gather all your tools

The internet is a treasure trove of tools ready to be used. Look for the email A/B testing tools you may need, and check which A/B testing features your basic email marketing service already includes. Combined with your other marketing techniques, on-point email marketing can do wonders for your future campaigns.

Determine the test variable

The best practice is to choose one test variable at a time. This way, any difference in performance can be attributed to that single change, so you can cleanly test your hypothesis. We have already listed ten components to test in an email A/B test, so build your hypothesis around any one of them and get started.

 

A/B testing working sample 2

Source: Weebly

Create email variations

Based on your hypothesis, create variations of your control email. The 'control', or default version, is the original email you would have sent if you were not running an A/B test. Keeping a control is vital because confounding variables can affect the validity of your test; a recipient being on vacation with no cellular connectivity is one example. Always keep a control version of your email to test against.

Run the email A/B test

Once you have satisfactorily gone through every step and formulated emails for email A/B testing, it's time to run the test by hitting those mailboxes.

Analyze the result against your hypothesis

After the campaign has run for at least a few hours, and ideally a full day, look at the metrics for conversion rate, click-through rate, and email open rate. Most importantly, check whether your initial hypothesis holds true. Pay attention to what works for your audience.
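
To check whether the gap you see is more than random noise, you can run a simple two-proportion z-test on the open counts. The counts below are hypothetical, and statsmodels is just one convenient way to run the test:

```python
# Hypothetical numbers: did version B's open rate differ from version A's by more than chance?
from statsmodels.stats.proportion import proportions_ztest

opens = [210, 252]   # opens for version A and version B (hypothetical)
sent = [1000, 1000]  # emails delivered to each group (hypothetical)

stat, p_value = proportions_ztest(count=opens, nobs=sent)

rate_a, rate_b = opens[0] / sent[0], opens[1] / sent[1]
print(f"Open rate A: {rate_a:.1%} | Open rate B: {rate_b:.1%} | p-value: {p_value:.3f}")

# A common convention: p < 0.05 suggests the difference is real rather than noise.
if p_value < 0.05:
    print("Statistically significant difference between the two versions.")
else:
    print("No significant difference yet: consider a bigger sample or a longer run.")
```

The same check works for click-through or conversion counts; just swap in the relevant numbers.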

Get, set, repeat

You may not notice significant changes immediately, but you can certainly see which variation performs better. Then repeat the steps and test a new variable. Remember, consistency is key.

Best practices and tips to conduct effective email A/B testing

  • Have a large sample size so you can draw statistically significant conclusions from your A/B test results.
  • Pick a variable or testing metric that aligns with the goal of your email campaign.
  • Only include active subscribers to avoid discrepancies in the results.
  • Follow a clear hypothesis and do not make random alterations to the campaigns.
  • Along with your newsletters and promotional emails, do not forget to test transactional and automated emails, too.
  • Focus on the elements that matter most for a particular campaign; your A/B testing pattern should change with the nature of your campaign.
  • Share your test report with those who handle other marketing channels for better clarity.
  • Experts suggest testing only one variable at a time for better insight into its impact on subscriber engagement.
  • Wait at least a few hours, or even a day, after sending both versions before comparing the gathered data.
  • Do not expect significant changes immediately; keep testing frequently, altering different elements every time.
  • Even if the results do not align with your assumptions, trust the empirical data and not your instincts.
  • Do not just gather data; build on your learnings to improve performance.

Conclusion

Now you know all about email A/B testing, just as we promised. So, it's time to dive in and start improving your marketing skills with it.

As you start understanding your email metrics, you can expand into web analytics, analyze audience preferences, compare read rates between variations, understand what works for your audience, and skyrocket your revenue.

Skeptical about taking your first step alone? Fractional CMO has got your back! Get in touch with us right away to explore the best marketing tactics for your brand.

Start growing with Fractional CMO today!