How to Optimize Your Email Campaign through A/B Testing

Practice makes perfect, and testing makes decisions better. If you make testing a habit in everything you do, your decisions keep improving.

We know this principle well, but we don't always follow it, and that leaves our campaigns riding on a lot of ifs and buts.

If you manage a marketing automation team and your job is to deliver the best possible results from email marketing campaigns, you have to learn to experiment with your emails.

We are talking about A/B testing your email automation campaigns: the practice that tells you what works, what doesn't, and why.

Why A/B Testing

You may have collected genuine sales leads from various sources, prepared a solid email draft with a strong subject line and body copy, and even invested time in segmenting your email lists. Yet the results still fall short of your targets.

Often the reason is that you never ran multiple sample tests and compared their results: open rates, page views, click-through rates, and actual sales conversions.

Unless you create a hypothesis for your email marketing goals and test it rigorously against actual results, you cannot decide what works best. To analyze the resulting statistics properly, treat A/B testing as a project in its own right and keep running it until you get dependable results.
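
To make "test it rigorously against actual results" concrete, here is a minimal sketch in Python of a two-proportion z-test, one standard way to check whether a variant's open or click rate beats the other's by more than chance. The counts below are illustrative, not taken from any real campaign.

```python
from math import sqrt
from scipy.stats import norm

def ab_significance(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is variant B's rate really different from A's?"""
    rate_a = conv_a / total_a
    rate_b = conv_b / total_b
    # Pooled rate under the null hypothesis that both variants perform equally
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))  # two-tailed test
    return rate_a, rate_b, p_value

# Hypothetical results: 5,000 sends per variant
rate_a, rate_b, p = ab_significance(900, 5000, 1010, 5000)
print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, p-value: {p:.3f}")
# A p-value below 0.05 is the conventional (if arbitrary) threshold for
# treating the difference as real rather than random noise.
```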

A/B testing has become a standard practice and business strategy across the web. This article walks you through the mechanics of A/B testing and the steps needed to do it right.

A/B testing can tell you –

  • If your email recipients find answers to their questions
  • If your email intends to answer some of their common questions
  • What kind of recipients prefer to receive your resources by email
  • If your email subject line is engaging enough for someone to respond
  • If your email delivery time, frequency and tone of the message are correct

Such experiments demand additional work and effort, but they save you far more time by reducing errors and rework.

If executing A/B tests manually sounds difficult, platforms like GetResponse can make things easier. All you need to do is set up the email in a few clicks and create multiple versions of it, including content, subject lines, and other fields such as the date and time of execution. With such ready-to-use solutions, you can easily monitor CTAs, email length, delivery time, message delivery rate, opens, clicks, and other important metrics. You can even decide how long an A/B test should run and whether the best-performing variant should continue automatically.
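
If you would rather run the split yourself instead of through a platform, the core mechanic is simply a random, even assignment of recipients to variants. A minimal sketch using only the Python standard library and hypothetical addresses:

```python
import random

def split_recipients(emails, seed=42):
    """Randomly assign each recipient to variant 'A' or 'B'.

    Shuffling with a fixed seed keeps the assignment reproducible,
    which helps if you need to audit or re-run the test later.
    """
    shuffled = emails[:]  # copy so the original list stays untouched
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {"A": shuffled[:midpoint], "B": shuffled[midpoint:]}

groups = split_recipients(["a@example.com", "b@example.com",
                           "c@example.com", "d@example.com"])
print(len(groups["A"]), len(groups["B"]))  # 2 2
```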

Factors to Consider during A/B Testing

1) Create a Good List of Hypotheses First

A hypothesis is a testable statement that helps you uncover what happened and why. It replaces assumptions with research and points you toward a potential solution. A well-constructed hypothesis should yield valuable insights from your email experiments. You may read this blog post by Sana Rusonis to learn how to create a hypothesis for A/B testing.

The individual components of a hypothesis look like –

If _____ [Variable] _____, then _____ [Result] _____, because _____ [Rationale] _____.

Here are a couple of sample hypotheses if you are targeting the email audience:

  1. “The click-through rate will be higher on Free Download emails for customers who have already subscribed to our newsletters.”
  2. “The click-through rate will be higher on Invitation emails for prospects who have attended our previous events, seminars, and webinars.”

And two more sample hypotheses if you are targeting resource placement:

  1. “Adding a summary of the free download material next to the Download button or link will improve the click-through rate and lead conversions.”
  2. “Adding call-to-action text, a helpline number, or a simple inquiry form to the invitation email will help people respond easily and quickly.”

Here are a few guidelines to consider while you write a hypothesis.

  • Be clear about the problem you want to address or the metric you want to improve through the experiment.
  • Consider both qualitative and quantitative data.
  • Create a strong hypothesis so that you can act on it.
  • Avoid bringing multiple variables into a hypothesis statement.
  • Treat every test as unique and important, since each one adds to your learning curve.
  • Write a hypothesis for every outcome so that you know the result – positive or negative.
  • Your hypothesis should serve as a reference for future experiments, so document it alongside the past, present, and planned email marketing campaigns it relates to.

A well-formed hypothesis is the key to starting the A/B testing process successfully, and the objective of the test is to confirm or refute that hypothesis. Once you learn to use statistics diligently and write data-driven hypotheses, you can expect your tests to produce dependable outcomes.
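
Since the guidelines above call for documenting every hypothesis as a reference for future experiments, one lightweight option is to keep each one as a structured record. A sketch with hypothetical field names, following the If/Then/Because template:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Hypothesis:
    """One testable statement in If [Variable] / Then [Result] / Because [Rationale] form."""
    variable: str   # the change you make, e.g. a new CTA placement
    result: str     # the outcome you expect
    rationale: str  # why you expect it
    created: date = field(default_factory=date.today)
    outcome: str = "pending"  # update to "supported" or "refuted" after the test

h = Hypothesis(
    variable="add a summary next to the Download button",
    result="click-through rate and lead conversions improve",
    rationale="recipients can judge the material's value before clicking",
)
print(h)
```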

2) Focus on the Subject Line

To recipients, the subject line effectively is the email, and you can test it in various ways. For example, a personalized subject line may perform better than a generic one. Simply addressing the reader as 'you' personalizes the subject, while avoiding first-person pronouns keeps the focus on the reader rather than on you.

Short subject lines generally work better than long sentences. However, there is no universal character limit for a subject line, so determine it based on your target audience, campaign type, and industry vertical.

3) Test Your Email Body Content

This is where most of us fall through the cracks. You need to spend time and effort drafting email body copy that truly engages your customers. What works and what flops in an email depends largely on what you add to, or leave out of, your draft. Here are a few ways to structure an email body and measure its performance.

Text-based emails vs. emails combining text and visuals

If most of your recipients read email on a smartphone or tablet, text-based emails tend to work better. A Marketing Land report notes that mobile email consumption continues to surge, with 61% of brand emails viewed on smartphones and tablets. However, emails that combine visuals and text can be more appealing.

Therefore, in your next email campaign, try out both and compare the opens and click-through rates. You can test further by altering the placement, size, and number of visuals.

Hard-sell emails vs. soft-sell emails

Hard-sell emails target the audience directly to drive quick returns, while soft-sell messages nurture leads into prospects and prospects into customers. In an A/B test, soft-sell emails generally outperform hard-sell ones, because people appreciate free information related to their interests. A hard-sell message works only when it carries enough substance for readers to make a quick decision with confidence.

Testing different tones of voice

If you have customers all around the world, try out different tones of voice and colloquialisms for different audience segments. Split your email list by demographics and send two different copies to see which gives you the best results.

4) Examine Your Call-to-Action (CTA) Strategy

The CTA is what the net result of any email campaign boils down to. With A/B testing, you can refine the action words and offers that support your CTA plan.

For action words, you may try these variations:

  1. Sign up vs. Schedule a Free Demo
  2. Click here vs. Request a Call
  3. Book an Appointment vs. Let us call you today

You can also play with call-to-action placement and test the results. For instance, putting a CTA button in the top right corner, in the middle of the content, or at the end of the text may yield different results. The more variables you add to your A/B testing, the more alternatives you have during campaign execution.
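
One caveat on adding variables: every extra variant splits your list thinner, and smaller groups need more recipients to reach statistical significance. A back-of-the-envelope sketch using the standard two-proportion sample-size formula; the baseline rate and lift below are illustrative:

```python
from scipy.stats import norm

def sample_size_per_variant(base_rate, lift, alpha=0.05, power=0.8):
    """Recipients needed per variant to detect an absolute `lift`
    over `base_rate` at the given significance level and power."""
    p1, p2 = base_rate, base_rate + lift
    p_bar = (p1 + p2) / 2
    z_alpha = norm.ppf(1 - alpha / 2)  # two-tailed significance
    z_beta = norm.ppf(power)
    n = ((z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / lift ** 2)
    return int(n) + 1

# Detecting a 2-point lift over a 10% click-through rate:
print(sample_size_per_variant(0.10, 0.02))  # ~3,841 recipients per variant
```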

5) Try Out Different Timings

You may have noticed that your emails get a better response on certain days of the week, and perhaps at certain times of the day. If you observe carefully, you will find a pattern in how people respond to your emails.

Your A/B testing may include these variations:

  1. Day of the week – Monday vs. Friday
  2. Time of day – 10 a.m. vs. 6 p.m., depending on business hours
  3. Season – international vs. regional holidays
  4. Frequency – once a week vs. twice a week vs. twice a month
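
To test timings like these, you again split the list, but by send slot rather than by content. A minimal sketch with hypothetical slot labels; the actual scheduling is left to your email platform:

```python
import random

SLOTS = ["Mon 10:00", "Mon 18:00", "Fri 10:00", "Fri 18:00"]

def assign_send_slots(emails, slots=SLOTS, seed=7):
    """Shuffle recipients, then deal them round-robin so every send
    slot receives an equal-sized, randomly composed group."""
    shuffled = emails[:]
    random.Random(seed).shuffle(shuffled)
    return {slot: shuffled[i::len(slots)] for i, slot in enumerate(slots)}

schedule = assign_send_slots([f"user{n}@example.com" for n in range(12)])
for slot, group in schedule.items():
    print(slot, "->", len(group), "recipients")  # 3 per slot here
```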

The key to running a successful A/B testing campaign is how fearlessly you do it. Don't restrict yourself to minor changes in subject lines or word tweaks in your call-to-action button; do more. The more creative you are and the bigger the changes you make, the higher the impact you can anticipate.

Once you have run enough tests and gathered the results, the next step is to apply the winning email templates to subsequent campaigns and monitor the difference in results. If the delta grows large over time, consider repeating the A/B test with a fresh set of data.
