12 A/B Testing Examples To Inspire Your Own (Real Case Studies)
Experiments aren’t just for wacky scientists! A/B testing enables you to test different versions of your marketing creations to see which one gets the best results.
Gone are the days of guessing what will connect with your audience. With A/B testing, you can create emails, landing pages, pop-ups and more based on what converts.
Should we go with red or yellow banners? Test it, and find out. Should we use an emoji in the subject line or not? Try both and see which version gets the most opens!
There are so many testing possibilities, it can be hard to know where to start. We compiled these 12 A/B testing examples to help you! Let’s dive into what A/B testing is, see some real-life case studies, and inspire you to run tests of your own.
What A/B testing is, and why you’ll love it
A/B testing, also known as A/B split testing, is a method of comparing two versions of a variable (such as a web page, app screen or email) on a sample of your audience to see which one performs better.
You don’t have to read your audience’s minds to get results. With A/B testing, you can make data-driven decisions about your digital marketing campaigns.
There’s a whole heap of things you can test, including:
- Emails
- Pop-ups
- Forms
- Landing pages
- Websites
You can test text and copy, color designs, layouts, subject lines and more to find what catches your audience’s attention. Let’s jump into some real-life examples. If you’d like a more in-depth explanation of A/B testing, take a look at our ultimate guide below.
9 A/B testing examples for email
Here are some A/B email tests that we’ve run ourselves!
For emails, MailerLite’s A/B testing feature compares two versions and sends them to a chosen sample size, usually 50% of the email list. The ‘winner’ is automatically chosen after a defined time frame (but you can also select the victorious email manually, if you prefer).
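To picture the mechanics, here’s a simplified sketch in TypeScript of how a winner gets picked once the test window closes. It’s an illustration, not the actual code behind the feature, and the version names and numbers are invented:

```typescript
// Simplified sketch of split-test winner selection (illustrative only).
// Assumes you already have per-version engagement counts after the test window.

interface VersionStats {
  name: string;
  sent: number;
  opens: number;
  clicks: number;
}

type WinnerMetric = "opens" | "clicks";

// Pick the version with the highest rate for the chosen metric
function pickWinner(versions: VersionStats[], metric: WinnerMetric): VersionStats {
  return versions.reduce((best, v) =>
    v[metric] / v.sent > best[metric] / best.sent ? v : best
  );
}

// Example: results collected 2 hours after sending to a 50% test sample
const winner = pickWinner(
  [
    { name: "A (emoji)", sent: 5000, opens: 1500, clicks: 300 },
    { name: "B (no emoji)", sent: 5000, opens: 1350, clicks: 360 },
  ],
  "clicks"
);
console.log(`Winner: ${winner.name}`); // Winner: B (no emoji)
```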
Side note: Please remember that every A/B test is specific to the audiences and goals of the test. These examples are meant to inspire your own tests. The results are not best practices.
1. Emojis vs. no emojis in the subject line
Emojis in emails can be a sticky subject. Some people avoid them at all costs, while others think they’re the best thing since sliced bread! We fall into the latter category, but we weren’t sure whether our subscribers felt the same way.
Looking at our A/B tests over the past two years, it seems our subscribers initially preferred subject lines without emojis, but in recent months emojis have become more popular.
Emojis in 2020
When we announced the new features in our website builder, we wanted to do it with a bang! But would our subscribers be more hyped with a partying emoji, or without one?
Interestingly, it was the emoji-less subject line that won the test, and it was sent automatically to the rest of our email list.
Emojis in 2021
We just couldn’t let the party emoji go, though, and it wasn’t long before we wanted to try it again. This time, to introduce our new and improved HTML builder! We pitted the two subject lines against each other, and here is what we found:
Victory to emojis! This time, people seemed to be much keener on the emoji subject line, with the winning open rate being significantly higher.
So, what can we conclude about emojis?
The short (and slightly annoying) answer is… it depends! Every audience is different, and open rates will also depend on factors such as the subject line copy and length.
If in doubt, trial the same subject line with and without an emoji, and repeat this A/B test until you see a clear pattern.
– Megan, Senior Content Writer
Check out the article below to learn the best practices for using emojis in subject lines.
2. Longer vs. shorter subject lines
We weren’t always sure whether our subscribers liked long, elaborate subject lines, or something short and punchy. If you glance back at the emoji case studies, the longer subject lines led the pack. But we also found quite a few instances where shorter subject lines took over.
When talking about the design elements of a perfect email, we trialed a more descriptive subject line (Anatomy of the perfect email: 11 design elements) with something a little simpler (11 design elements of the perfect email).
This time, we set the A/B test to select the winner by clicks rather than opens, because Apple’s Mail Privacy Protection inflates open rates and makes them less reliable. This metric also helped us see who was motivated enough by the subject line to actually click through to the article.
The shorter subject line was the victor. But you might notice that the open and click rates for the longer version were actually higher. This is because people will continue opening and engaging with the test email, even after the defined time period, which can change the results after a winner has been chosen.
Bonus tip 💡
For the most accurate results, allow at least 1-2 hours of testing before selecting a winner. You also need to choose a sending window with the best chances of high open rates, both during and after the test.
For example, our 2021 data shows that open rates peak between 10 AM and 12 PM, so you could run the test at 10 AM for 1 hour, then send the winning version at 11 AM to make the most of this window.
Speaking of the best time to send an email, we also tested a shorter subject line to promote the article linked above, and here’s what we found…
Quick and concise was the order of the day, just as in this A/B test about creating landing pages that convert.
To learn more about crafting a subject line that drives higher conversions, take a look at our ultimate guide below.
3. Images and GIFs at the beginning of the email
We can all agree that images and GIFs are an important design element of any newsletter. But does their positioning in the email influence conversions?
We wanted to find out, so we A/B tested whether images and GIFs at the beginning of the email could increase click rates.
When A/B testing emails, I prefer to select a winner based on clicks so I can get a better sense of overall engagement.
– Jonas, Content Team Lead
GIFs in email
Email GIFs are a dynamic way to catch people’s attention, so naturally, we inserted one at the beginning of our campaign to promote our new pop-up builder! But we were also intrigued to see how much of a difference it would make, so we created a test variation without the GIF at the beginning.
Take a good look at both emails below and place your bets—which version got the highest clicks?
To our surprise, the email without the GIF at the beginning got much higher click rates.
This made us wonder if we should be including graphics at the beginning of the newsletter at all, or whether a plain text email was the way forward. We decided to test it again with an image at the beginning of the email.
Images in email
This time, we were promoting an article on 116 newsletter ideas. We created one plain text version of the email, and another with a cover image from the article.
You know the drill—take a look and see if you can guess which version won the day…
The email without the image at the beginning had much higher click rates!
Why did the campaigns without GIFs and images at the beginning have higher click rates?
There are a couple of possible reasons for this:
- Bright images and dancing GIFs could be a distraction, stopping people from scrolling down and converting
- If the graphics take more time to load, readers might not get to the content quickly enough to convert
To learn more about designing show-stopping emails, check out our ultimate guide below.
4. Starting the subject line with a question
Subject lines are just a smattering of words, but they have a big impact on email engagement, so we wanted to get it right. But what types of wording are most effective?
One idea we tried was opening with a question to make it feel more personal to the reader.
For example, when sharing an article about responsive pop-up design, we compared How to optimize pop-ups for mobile 📱 with Are your pop-ups optimized for mobile? 📱
The question version got the higher open rate. We saw something similar when we shared some of our favorite email marketing resources.
Again, the losing version picked up more opens and clicks after the testing window, but during the trial period itself, it was the question that got the most engagement.
Bonus: 10 more email test ideas
There are so many more things you can experiment with in your email marketing strategy, including:
- Personalization: Does including the subscriber’s name in the subject line increase opens?
- Call-to-action (CTA) button: Which button color or text gets the most clicks?
- Email design: Which colors in newsletters get the most engagement?
- Preheader text: Which copy will best complement the subject line?
- Layout: Should I position blocks side by side or above each other?
- Email text: Which tone in my email content drives the most engagement?
- Email length: Is it better to have 3 or 4 sections in my newsletter?
- Type of promotion: Free delivery or lowered pricing?
- Testimonials: Does the social proof from Customer A or Customer B increase conversions?
- Sender info: Should I use my first name or my company name to increase engagement?
3 A/B testing case studies for a landing page, an ad and a pop-up
You can test other types of marketing assets to optimize conversions, including landing pages, ads and pop-ups.
With MailerLite, you can also split test landing pages. For other assets like digital ads or pop-ups, you can run the test yourself: create two versions, compare them over a set period, measure their performance, and keep the winner for the long term.
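If you’re running a test yourself, the key is to assign each visitor a variant at random and keep that assignment consistent between visits. Here’s a minimal sketch in TypeScript for a pop-up test; the storage key and element IDs (popup-a, popup-b) are hypothetical and would need adapting to your own site:

```typescript
// Minimal sketch of a DIY pop-up split test (hypothetical names throughout).
// Each visitor is randomly assigned a variant once, then sees it on every visit.

type Variant = "A" | "B";

function getPopupVariant(): Variant {
  const stored = localStorage.getItem("popupVariant");
  if (stored === "A" || stored === "B") return stored;
  const variant: Variant = Math.random() < 0.5 ? "A" : "B";
  localStorage.setItem("popupVariant", variant); // remember the assignment
  return variant;
}

// Reveal the assigned variant; tag conversions with it in your analytics tool
const variant = getPopupVariant();
document.querySelector(`#popup-${variant.toLowerCase()}`)?.classList.remove("hidden");
```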
Here are 3 examples to get you started.
1. Long vs. short landing pages (data36)
Tomi Mester from data36 decided to improve a landing page for his online course using A/B testing. He compared the original version with a longer landing page that answered common FAQs, had more information about the course, and included four embedded videos.
Image credit: Tomi Mester, data36
Tomi found that, despite being much longer, the new version had double the conversion rate, with over 99% statistical significance! It seems that people who wanted to buy were ready to go the extra mile and read all the relevant information before going to checkout. If you’re interested in learning more, you can read the full case study.
A note about statistical significance
In A/B testing, statistical significance measures how likely it is that the difference between two versions is real, rather than due to chance or a sampling fluke. The higher the statistical significance, the more confident you can be that the difference is genuine. You can use an online calculator, such as this one from Neil Patel, to calculate your own.
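If you’d like to see what those calculators are doing, here’s a minimal sketch of the standard two-proportion z-test in TypeScript. The visitor and conversion numbers are made up for illustration:

```typescript
// Two-proportion z-test: how confident can we be that B really beats A?
// Returns a two-sided confidence level between 0 and 1.

function significance(visitorsA: number, conversionsA: number,
                      visitorsB: number, conversionsB: number): number {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  // Pooled rate under the assumption that there is no real difference
  const pooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  const z = Math.abs(pA - pB) / se;
  return 2 * normalCdf(z) - 1;
}

// Standard normal CDF for z >= 0 (Abramowitz & Stegun approximation)
function normalCdf(z: number): number {
  const t = 1 / (1 + 0.2316419 * z);
  const d = Math.exp((-z * z) / 2) / Math.sqrt(2 * Math.PI);
  const poly = t * (0.31938153 + t * (-0.356563782 + t * (1.781477937 +
               t * (-1.821255978 + t * 1.330274429))));
  return 1 - d * poly;
}

// Example: 4.1% vs 5.6% conversion on 2,000 visitors each (~97% confidence)
console.log(`${(significance(2000, 82, 2000, 112) * 100).toFixed(1)}% confidence`);
```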
2. Personalized banner (Sony)
Wanting to drive more conversions from their banner ads, Sony decided to test more personal language, redesigning the banner with the copy: “Create your own VAIO laptop”.
Image credit: Sony
They compared it with a more promotional ad (below) to see which one would get the higher click-through rate and more additions to the shopping cart.
Image credit: Sony
The more personal call to action led to a 6% increase in clicks and a 21.3% increase in shopping cart additions. If you’re planning to test banner ads, you can go deeper by reading the full case study.
3. Exit-intent pop-ups (Crossrope)
Fitness ecommerce brand Crossrope wanted to gather email list signups before their website visitors left. They first created a pop-up (below) that appeared when people moved towards the browser bar.
Image credit: Crossrope
With this pop-up, they were able to convert 7.65% of people who were leaving. They decided to take this idea further with a fullscreen pop-up on their blog.
Image credit: Crossrope
With this new version, they were able to convert 13.71% of website visitors who came to their blog! This suggests that, when done right, a larger pop-up can catch attention and increase conversions. If you’re planning to test pop-ups, read the case study to learn more.
New! A/B split testing for automation
People using a Growing business or Advanced plan can take advantage of the A/B split testing step for automation workflows. This allows you to add split tests at different points of your automation sequence.
The A/B split testing step can be used to create up to 3 versions of an email or delay step. Test workflow subject lines, email content, CTAs, delays and send times to uncover the secret sauce to a perfect workflow.
In this example, we use the A/B split testing step to find out the best time to trigger an automated email.
Implement A/B testing in your workflows to identify factors that make your subscribers click, discover the key elements of high engagement, refine your email automations, and pave the way to highly optimized email marketing campaigns.
How to run an A/B test and measure the results in MailerLite
First off, even if you don’t have access to an A/B testing tool, you can still run general tests on your content to see what resonates.
It’s best to test one thing at a time, so that you can easily interpret the results.
– Erin, Content Writer
For example, you could select a landing page element, such as the text or color of a CTA button, try it for a period of time, then examine the conversion rate in Google Analytics and tweak accordingly.
Use the right A/B testing tool
Did you know that you can run A/B tests with MailerLite? It lets you split test your email campaigns and landing pages to find the most effective version!
Create a free account
Here’s how to use MailerLite to A/B test your newsletters, landing pages and workflows.
A/B testing emails
With MailerLite, you can send two versions of your email with differences in one of the following:
- Subject line
- Email content
- Sender name and email address
To start a campaign, click the menu on the Create campaign button in your dashboard and select A/B split campaign.
After creating your campaign versions, you can select the size of your test group, how long the test will run for, and whether the winner will be selected by opens or clicks.
Since the changes to Apple Mail’s Privacy Protection in September 2021, we recommend using clicks as your winning metric.
For a step-by-step guide on setting up an email A/B test, check out the video tutorial below.
A/B testing landing pages
You can test up to five different landing page versions. The testing is based on unique visitor traffic, rather than page views. You can decide how to distribute visitors between different versions (we recommend dividing them equally).
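Under the hood, splitting unique visitors between versions comes down to weighted random assignment. Here’s a simplified sketch of that idea in TypeScript (an illustration, not MailerLite’s actual implementation):

```typescript
// Assign a new visitor to one of several page versions by weight (illustrative).
function assignVersion(weights: Record<string, number>): string {
  const total = Object.values(weights).reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  for (const [version, weight] of Object.entries(weights)) {
    if ((r -= weight) < 0) return version;
  }
  return Object.keys(weights)[0]; // fallback for floating-point edge cases
}

// Equal split across three versions, as recommended
console.log(assignVersion({ A: 1, B: 1, C: 1 }));
```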
To get started:
- Go to the Sites tab and create a new page or select an existing one
- Click the button labeled Create split test
- Select Add your first version
- After you’ve saved it, you can continue adding new versions to compare it with
- Once you activate the ON slider, the test has started and won’t stop until a winner is chosen. The split test can only be activated once
To learn more about A/B testing landing pages, you can watch the video tutorial below.
A/B testing automation steps
With A/B testing for automation, you can test up to 3 versions of Delay and Email steps. There are many ways you can use the A/B split test step. For example, you can:
- Test different automation email subject lines: Find out which version yields the most opens/clicks
- Test different sender names: See if subscribers resonate more with a person or a brand
- Test emails with different content: Find out what kind of newsletter engages more subscribers
- Test different delay times: Find the appropriate time to send follow-up emails that get more conversions
To add a split test to your automation workflow:
1. Open your automation in the workflow editor.
2. Click the + icon and select A/B testing.
3. In the sidebar, give your split test a name and use the slider to set how you’d like to distribute traffic between the versions.
4. If needed, add a third path by clicking + Add path C.
5. Click each + icon to add the steps you want to test. You can choose to test Delay steps, Email steps, or a combination of both.
6. Once you’re happy with your split test, click the single + icon at the bottom to continue building your workflow.
Start your own A/B testing campaign now
You should now have all the tools and information you need to start testing! As you get started, remember to…
- Test one thing at a time so that you can clearly see how each element impacts the results
- Select a winner by click rates rather than open rates for your emails, especially if you’re comparing the content
- Consistently test and adjust to keep up with changing tastes and preferences
By keeping an eye on the data, you’ll be able to make the right decisions to enhance user experience, reduce bounce rates and most importantly, drive conversions!