
Key variables to A/B test in your email marketing

Are your email bulletins often a shot in the dark? As you hit send, do you feel confident your messaging will engage a high proportion of your audience and prompt them to become “active participants” in whatever it is you want them to do or know? Is there any science to your approach?

Data is a communicator’s best friend. It gives you greater visibility into how well your messages connect with a target audience and into which decisions are likely to yield results. Data can help identify who cares about your messages, which topics interest your subscribers most, which techniques are most effective at eliciting a response and, most importantly, which tactics drive meaningful audience engagement. Take a look at our public sector comms benchmarking report for average performance metrics and tips for improving your comms results.

In the context of email marketing, A/B testing allows you to compare the performance of two versions of an email bulletin, each sent to a sample of the target audience. The version that achieves the higher engagement rates – opens and clicks – is deemed the “winning” bulletin and is then sent to the remainder of the audience for maximum engagement with your message. A/B testing can be done manually or automated using the GovDelivery Communications Cloud, and should be used to refine different elements of an email (a rough sketch of the manual flow follows the list below). For example, test these variables to see what resonates best with your audience:

  • Subject lines
  • Message content
  • Images/video
  • Message length
  • Format (templates)
  • Landing pages
  • Call-to-action buttons
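
If it helps to picture what the GovDelivery automation is doing, here is a minimal, hypothetical Python sketch of that flow: send two variants to small random samples, compare open rates, then send the winner to everyone else. The send_bulletin and get_open_rate functions are placeholders for whatever your email platform provides, not real GovDelivery API calls.

    import random

    def run_ab_test(subscribers, variant_a, variant_b,
                    send_bulletin, get_open_rate, sample_fraction=0.10):
        """Send each variant to a small random sample, then send the winner to the rest.

        send_bulletin and get_open_rate are placeholder callables supplied by the caller.
        """
        shuffled = random.sample(subscribers, len(subscribers))   # random order, no repeats
        sample_size = int(len(shuffled) * sample_fraction)

        group_a = shuffled[:sample_size]                   # first sample gets variant A
        group_b = shuffled[sample_size:2 * sample_size]    # second sample gets variant B
        remainder = shuffled[2 * sample_size:]             # everyone else waits for the winner

        send_bulletin(variant_a, group_a)
        send_bulletin(variant_b, group_b)

        # After a suitable waiting period, compare engagement and send the winner on
        winner = variant_a if get_open_rate(variant_a) >= get_open_rate(variant_b) else variant_b
        send_bulletin(winner, remainder)
        return winner

With sample_fraction set to 0.10, this mirrors the “test and send” approach Central Bedfordshire describe below: 10% of the database gets each version and the winning version goes automatically to the remaining 80%.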

Many of our clients are using A/B testing to optimise their email marketing communications, including Central Bedfordshire Council, who have used it to re-engage subscribers ahead of GDPR and for other messaging. Here’s a peek at what they’ve discovered through trial and improvement.


A/B testing examples in local government email communications

By Alan Ferguson, Web Manager, Central Bedfordshire Council 

We have been using two types of A/B (split) testing for our email marketing:

  • Test and send – two versions of the email each go to 10% of your database, and the winning email then goes automatically to the remaining 80%.
  • 50/50 test – as it sounds, one version of the email goes to 50% of the database and the other to the remaining 50%.

This has allowed us to test a few things:

  • Emojis in the subject line
  • Different subject lines completely
  • Different templates

Emojis in the subject line

Overall, this has been pretty inconclusive. Sometimes the emoji version wins and sometimes it doesn’t! But given that test and send offers the option at no extra effort, why not use it? You will always end up sending the winning version.

In this example, the version with the emoji had a fractionally higher open rate, but at 0.03% the margin is very tight (see the significance check sketched below).

[Image: A/B test 1 results]
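
For a sense of when a gap like that actually means something, here is a small, hypothetical Python check using a two-proportion z-test. The subscriber counts and open figures are illustrative only, not Central Bedfordshire’s real numbers; the point is that a 0.03 percentage point difference is well within normal random variation at typical list sizes, while a difference of around two percentage points (as in the next example) can clear the bar.

    import math

    def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
        """Z-statistic for the difference between two open rates."""
        p_a, p_b = opens_a / sent_a, opens_b / sent_b
        p_pool = (opens_a + opens_b) / (sent_a + sent_b)                   # pooled open rate
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))  # standard error
        return (p_a - p_b) / se

    # Illustrative numbers only: a 0.03 percentage point gap on 10,000 recipients per group
    print(two_proportion_z(opens_a=3603, sent_a=10000, opens_b=3600, sent_b=10000))
    # ~0.04, far below the ~1.96 needed for 95% confidence

    # A ~2 percentage point gap on the same group sizes
    print(two_proportion_z(opens_a=3800, sent_a=10000, opens_b=3600, sent_b=10000))
    # ~2.9, comfortably significant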

Different subject lines

We reversed the information in the subject line of this press release and sent it to subscribers in that area. The winning email led with the name of the town and had an open rate almost 2% higher. In this instance, front-loading the subject line with keywords/personalisation clearly helped to increase engagement. Given that some inboxes only show the first 30-50 characters of a subject line, getting the right words in at the start is important.

[Image: A/B test 2 results]

New template test

Exactly the same subject line but two different templates – existing versus new. The new template had a higher open rate and click rate.

[Image: A/B test 3 results]

If you’d like to learn more about using A/B testing to improve the impact of your communications, check out our A/B testing toolkit.