
A/B Test Your Email Campaigns

Learn how to create and send an A/B test campaign

Written by Ira
Updated over 2 weeks ago

A/B testing helps you identify which subject line, sender name, sender email address, or email content drives the highest engagement. Send two variations to small test groups, then automatically send the winning version to the rest of your audience.

This article explains how A/B testing works in Omnisend and walks you through setting up, analyzing, and optimizing your tests.


Before You Begin

  • A/B testing is available on all Omnisend plans.

  • You must have at least 10 contacts subscribed to email to use A/B testing in campaigns. Non-subscribed contacts cannot receive marketing emails. Learn about subscribed vs. non-subscribed contacts.

  • 2,000+ subscribers are recommended for statistically reliable results.

  • A/B test campaigns cannot have a booster.

  • You cannot A/B test different send times. Both versions are sent simultaneously. To test timing, use A/B test segments instead.

  • If you pause a test, you can resume it for up to 7 days from the start date.

💡 When sending to iOS devices, use clicks as the winning metric. After the iOS 15 update, open rates on Apple devices are less reliable. Learn about iOS 15 email privacy.

How A/B Testing Works

A/B testing helps you determine the most effective version of your email by comparing different elements, such as:

  • Subject line;

  • Sender name;

  • Sender email address;

  • Email content.

During campaign setup, you can define what percentage of your audience will receive Version A and Version B. Once the test period ends, Omnisend automatically identifies the winning version based on performance and sends it to the remaining recipients.

💡 To ensure reliable results, we recommend postponing the winning version for 24 hours. This allows enough time for opens and clicks to accumulate and ensures your campaign is sent at a consistent time across all versions.
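If it helps to see the mechanics spelled out, here is a minimal, purely illustrative Python sketch of that flow. The function names, audience size, and open rates are invented for the example; this is not Omnisend's internal implementation.

    import random

    def split_audience(recipients, test_share=0.20, seed=42):
        """Randomly assign recipients to Version A, Version B, and the remainder."""
        shuffled = recipients[:]
        random.Random(seed).shuffle(shuffled)
        n = int(len(shuffled) * test_share)
        return shuffled[:n], shuffled[n:2 * n], shuffled[2 * n:]

    def pick_winner(stats_a, stats_b, winning_metric="open_rate"):
        """Compare the chosen metric for both versions at the end of the test window."""
        return "A" if stats_a[winning_metric] >= stats_b[winning_metric] else "B"

    audience = list(range(10_000))                      # hypothetical recipient IDs
    group_a, group_b, remainder = split_audience(audience, test_share=0.20)
    print(len(group_a), len(group_b), len(remainder))   # 2000 2000 6000

    # After the test window (for example, 24 hours), the better version
    # goes to everyone in the remainder group:
    winner = pick_winner({"open_rate": 0.21}, {"open_rate": 0.18})
    print(winner)                                        # A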

Set Up an A/B Test Campaign

Step 1. Create the Campaign

To start setting up your A/B test campaign, go to Campaigns → New Campaign → locate Email A/B test and click Create A/B test.

Step 2. Set Up A/B Version Details

After clicking the A/B test option, you'll enter the setup wizard where you can configure your A/B campaign settings. You can test the following elements:

  • Subject line;

  • Sender name;

  • Sender email address;

  • Email content.

💡 You can test all of these elements at once or only some of them. For example, you might change the sender name and email content while keeping the subject line and sender email address the same.

Once you're finished configuring both versions, click Create email for Version A to proceed.

Step 3. Design Email Content

Choose a template to design the email content for Version A.

This step works the same way as in regular campaigns: you can select a pre-built layout, use a saved template, or start from scratch. You’ll also be able to change the template later, before sending.

After selecting a template, edit the content of your email. Once you're done, click Finish Editing.

The content for Version A will be saved, and you'll return to the setup wizard. From there, you can reopen the editor, preview the email, or send a test version if needed.

Repeat the same process for Version B. Once both versions are complete, click Next step to continue.

Step 4. Select Recipients

After finalizing your content, choose the recipients for your A/B campaign. You can send it to:

  • All subscribers, or

  • Specific segments.

The A and B versions will be sent to randomly selected contacts within the chosen audience. Once your audience is selected, click Next step to proceed.

Note: To run an A/B test, your selected segment must contain at least 10 subscribed contacts. If the segment is too small, A/B testing won’t be available.

Step 5. Configure A/B Settings

In this step, define how your A/B test will run and how the winner will be determined:

  • Set test group size. Decide what percentage of your total recipients should receive Versions A and B.

    • For example, you can send Version A to 20% and Version B to another 20%. The remaining 60% will receive the winning version (see the sketch after this list).

  • Choose the winning metric. Select how the winner will be determined:

    • Open rate – Ideal for subject line tests;

    • Click rate – Recommended for testing content or sender details (more reliable for iOS devices).

  • Define the test duration. Choose how long to wait before sending the winning version.

    • Minimum: 1 hour;

    • Maximum: 7 days;

    • Recommended: 24 hours.
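
To make these settings concrete, here is a small hypothetical sketch showing how they translate into group sizes. The field names and the 10,000-recipient audience are invented; the duration and metric limits mirror the values listed above.

    from dataclasses import dataclass

    @dataclass
    class ABTestSettings:
        test_share: float = 0.20           # share of recipients per test version
        winning_metric: str = "open_rate"  # "open_rate" or "click_rate"
        test_duration_hours: int = 24      # minimum 1 hour, maximum 7 days

        def validate(self):
            if not 1 <= self.test_duration_hours <= 7 * 24:
                raise ValueError("Test duration must be between 1 hour and 7 days")
            if self.winning_metric not in ("open_rate", "click_rate"):
                raise ValueError("The winner is determined by open rate or click rate")

    settings = ABTestSettings(test_share=0.20, winning_metric="click_rate")
    settings.validate()

    total_recipients = 10_000                                  # hypothetical audience
    per_version = int(total_recipients * settings.test_share)  # 2,000 each for A and B
    gets_winner = total_recipients - 2 * per_version           # remaining 6,000 get the winner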

After configuring your A/B settings, click Next step to continue.

Step 6. Send or Schedule

This is the final step of your A/B campaign setup. Before proceeding, review all elements of your campaign:

  • Subject lines;

  • Sender names;

  • Selected recipients;

  • A/B testing configuration.

Once everything is set, you can:

  • Send the campaign immediately, or

  • Schedule it for a later date and time.

When you click Send, Versions A and B will be delivered to the test groups according to your selected A/B settings.

If your sender email address isn’t verified, you’ll see a prompt to verify it before sending. Click the Verify button to complete verification.
Learn more about verifying your sender email address.

Edit A/B Test Settings

After your A and B versions are sent, there’s still time to manage the test before the winning version is delivered. During this period, you can either:

  • Stop A/B testing, or

  • Manually select the winner.

Go to Reports → Campaigns and find your campaign in the Campaign performance list, or navigate to the Campaigns tab and click the three dots next to your campaign.

What happens if you stop testing?

  • The campaign will be paused and can be resumed within 7 days.

  • You will not be able to segment customers based on the results of the A/B test.

  • However, you can still view recipient details for each batch in the Activity tab under Reports.

Analyze A/B Test Reports

After sending your A/B test campaign, you can view performance data under Reports → Campaigns → Campaign performance, then select your campaign.

The A/B test campaign report provides a detailed breakdown of key performance metrics, helping you evaluate the results and optimize future campaigns based on data-driven insights. The report includes:

Overview

This section displays the overall performance of your A/B test campaign, including:

  • Emails sent – Total number of emails delivered during the test.

  • View rate – Percentage of recipients who opened the email.

  • Click rate – Percentage of recipients who clicked on any link in the email.

  • Unsubscribe rate – Percentage of recipients who unsubscribed.

  • Spam rate – Percentage of emails marked as spam.

  • Revenue – Total sales attributed to the campaign.
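
These percentage metrics are, roughly speaking, ratios over the emails sent. The sketch below uses made-up counts, and Omnisend's exact definitions (for example, unique versus total opens, or sent versus delivered) may differ.

    # Hypothetical counts for one campaign; each rate is a ratio over emails sent.
    emails_sent = 4_000
    opened, clicked, unsubscribed, marked_spam = 880, 260, 12, 3

    view_rate = opened / emails_sent * 100                # 22.0%
    click_rate = clicked / emails_sent * 100              # 6.5%
    unsubscribe_rate = unsubscribed / emails_sent * 100   # 0.3%
    spam_rate = marked_spam / emails_sent * 100           # 0.075%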

A/B Test Results

This section compares the performance of Versions A and B. Metrics include:

  • Open rate;

  • Click rate;

  • Unsubscribe rate;

  • Spam rate;

  • Revenue.

First 24-Hour Performance

This section highlights early performance indicators, such as:

  • Views;

  • Clicks;

  • Device breakdown.

Early results can provide useful insights into campaign effectiveness and help guide future adjustments.

Contact Activity

The Contact activity tab shows detailed recipient-level actions for the campaign, including who viewed, clicked, or unsubscribed.

FAQ

Can I send the A and B versions at different times?

Unfortunately, no. Both the A and B batches are sent simultaneously. However, you can use A/B test segments to split your contacts and schedule emails at different times.

Does Omnisend have A/B testing for automation?

Yes. You can add the A/B test block anywhere in your automation workflow.

What if I need more contacts?

To run an A/B test, you need at least 10 subscribed contacts, meaning contacts who have opted in to receive email marketing from you. Learn about subscribed vs. non-subscribed contacts.

Should I use Open Rate or Click Rate to determine the winner?

  • Use Open Rate if you're testing elements that determine whether someone opens your email, such as subject lines or sender names.

  • Use Click Rate if you're testing elements that drive actions after opening, such as email content, images, CTAs, or offers.

Why does the “loser” version show better results after the winner was already selected?

A/B test metrics continue to update in real time even after a winner is declared. This means the “loser” version can appear to outperform the winner later on, as more recipients open or click.

The winner is always selected based on data available at the end of the test period (e.g., 2 or 24 hours). Engagement that happens afterward doesn’t affect the winner selection but is still reflected in the final reports. For example, if you choose a winner after 2 hours:

  • Version A has a 20% open rate, Version B has 18% → Version A is sent to the rest of your audience.

  • Over the next few days, if more recipients open Version B, the final metrics may show Version B outperforming Version A.

💡 Tip: To reduce this gap, use a longer test window (like 24 hours) to allow more data to accumulate before the winner is selected.
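
The sketch below replays that example with invented numbers: the winner is locked in from the snapshot taken when the test window ends, while the reported metrics keep moving afterwards.

    # Open rates at the moment the 2-hour test window ends; the winner is fixed here.
    snapshot_at_2h = {"A": 0.20, "B": 0.18}
    winner = max(snapshot_at_2h, key=snapshot_at_2h.get)   # "A" goes to the remaining audience

    # Days later the report shows updated numbers. B has caught up, but the
    # winner chosen at the snapshot does not change retroactively.
    final_report = {"A": 0.24, "B": 0.27}
    print(winner, final_report)                            # A {'A': 0.24, 'B': 0.27}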


Looking for assistance? Hop on a chat with our Support Team or email us at [email protected].
