A/B Testing As a Way to Increase Engagement … and Inventiveness

With references to Airbnb and Mailchimp + tips and best practices

The Mixternal Comms Playbook
Jun 24, 2020
Image: detail from Funny Alphabet (ca. 1850)

A/B testing (or split testing) is the practice of running two versions of the same idea simultaneously to determine which one gets the better engagement rate. Version A versus Version B gives you the A/B in A/B testing.

  • No need to limit yourself to two choices: A/B/C testing is a reasonable venture, but anything beyond that dilutes the test’s effectiveness.

Do you prefer Coca-Cola in a bottle (A) or in a can (B)?

Do you sign up for my newsletter if I prod (A) or I plea (B)?

A. Subscribe to my newsletter

B. Please please subscribe to my newsletter, please!

One of the most common A/B tests in internal communications is the email newsletter subject line. The idea is to test two headline ideas on a segment of your population, say 20% of employees, to determine which line earns the higher open rate. The winning subject is then dished out to the remaining 80% of recipients.
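
If you want to see what that split looks like mechanically, here’s a minimal sketch in Python (the addresses and the function name are illustrative, not from any particular email tool): shuffle the audience, carve off the test segment, and divide it evenly between the two subject lines.

    import random

    def split_for_subject_test(recipients, test_fraction=0.2, seed=42):
        """Split recipients into two test halves plus a holdout.

        Half of the test segment gets subject A, half gets subject B;
        the holdout later receives whichever subject wins.
        """
        pool = list(recipients)
        random.Random(seed).shuffle(pool)  # randomize to avoid ordering bias

        test_size = int(len(pool) * test_fraction)
        group_a = pool[: test_size // 2]
        group_b = pool[test_size // 2 : test_size]
        holdout = pool[test_size:]
        return group_a, group_b, holdout

    # 1,000 employees -> ~100 get subject A, ~100 get subject B,
    # and the remaining ~800 get the winner.
    employees = [f"employee{i}@example.com" for i in range(1000)]
    a, b, rest = split_for_subject_test(employees)
    print(len(a), len(b), len(rest))  # 100 100 800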

Mailchimp provides several examples of what its clients learn from A/B testing:

  • What day of the week gets better open rates?
  • Does a subject line with an incentive or a teaser work best?
  • Does including your company name in your subject line increase engagement?
  • Is it better to use your name as the from name, or your company’s name?
  • Does the time of day a campaign is sent affect the click rate?
  • Are recipients more likely to click a linked image or linked text?
  • Do recipients prefer a campaign that contains a GIF or one with static images?
  • What time of day is best to send the email?
  • Should you include an emoji in the subject line?

“If you double the number of experiments you do per year, you’re going to double your inventiveness.” — Jeff Bezos

A/B testing is one of the easiest ways to determine which editorial option will resonate most with your employees. You can (and should!) A/B test lots of things, such as:

  • Headlines
  • Lead images
  • Newsletter subject lines
  • Calls to action (e.g., “click here” vs. “learn more”)
  • Colors (e.g., an orange button vs. a red button)
  • Enticements to draw bigger crowds to events (e.g., serving lunch vs. giving out swag)

Tips and Best Practices for A/B Testing

  • Ensure the data from your A/B testing solution is reliable
  • Conduct a trial run (an A/A test) before starting to validate your setup
  • Test one variable at a time
  • Conduct one test at a time
  • Adapt the number of variations to your audience size
  • Wait for statistical reliability before acting (see the sketch below)
  • Let tests run long enough
  • Know when to end a test
  • Measure multiple indicators
  • Take note of other marketing actions running during a test
  • Segment your tests

The list above comes from AB Tasty, a very geeky and engrossing website if you’re into A/B testing. (You’re welcome.)
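
That “wait for statistical reliability” tip is worth making concrete: before declaring a winner, check that the gap between the two open rates is bigger than chance alone would produce. Here’s a minimal sketch of the standard two-proportion z-test in Python, using only the standard library (the function name and the numbers are illustrative):

    from math import erfc, sqrt

    def open_rate_z_test(opens_a, sent_a, opens_b, sent_b):
        """Two-proportion z-test on open rates.

        Returns the z statistic and a two-sided p-value; a p-value
        below ~0.05 is the usual bar for calling a winner.
        """
        p_a = opens_a / sent_a
        p_b = opens_b / sent_b
        pooled = (opens_a + opens_b) / (sent_a + sent_b)  # rate under "no difference"
        se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
        z = (p_a - p_b) / se
        p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value
        return z, p_value

    # Subject A opened by 48 of 100 recipients, subject B by 36 of 100.
    z, p = open_rate_z_test(48, 100, 36, 100)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p = 0.086 here: not yet a clear winner

A 12-point gap on a 100-person test segment sounds decisive, but the test says it could still be noise; with a larger test segment, the same gap would clear the bar.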

Here’s a case study from Airbnb, which used A/B testing on a customer’s booking flow — from searching to contacting hosts to receiving a reply to actual booking. It’s nerdy, but worth a review.

Lastly, don’t become complacent. Over time, your employees will get used to the winning A or B idea. Check in again in six or twelve months with another A/B test to make sure the tactic you’ve been using is still a winning one.
