Use A/B testing for Signup Forms to compare two Popup versions and identify which drives more signups. Test copy, visuals, field count, or timing – then automatically send the winner to remaining visitors.
This guide shows you how to set up, run, and analyze A/B tests for Signup Forms in Omnisend.
Before You Begin
A/B testing is available for Popup and Flyout forms only.
When testing forms with Teasers, note that Teaser views are not counted in results if the Teaser is set to appear before the form (or both before and after). Instead, A/B test a form with a Teaser set to appear only after the form, or test different Teaser text.
For fair comparisons, use the same targeting settings (e.g., test both versions on /products).
Benefits
With A/B Testing Forms, you have the flexibility to experiment with various aspects of your popup forms. Here are some popular use cases to get you started:
Different Copywriting. Test variations in the text and messaging of your forms.
Different Visuals. Experiment with images, colors, and graphics.
Look and Feel. Adjust both the visual elements and copywriting for a comprehensive makeover.
Display Settings. Test different triggers or timing for form display.
Single-Step vs. Multi-Step Forms. Explore whether a one-step or multi-step form is more effective.
Contact Information. Experiment with the amount of information you collect, such as email only, email plus SMS, email plus name, etc.
Promotions. Compare different promotions to see which resonates with your audience.
Form with a Teaser vs. Different Style or Text Teaser. Test whether changing the teaser style or text impacts the conversion rates of your form.
Wheel of Fortune vs. No WoF. Test whether adding gamification affects form submissions.
Setup Process
Step 1: Pick Your Form
Go to Forms → Create form → select the Popup or Flyout you want to test.
💬 Learn more: Create a Popup Signup Form.
Step 2: Create Two Versions
In your form's Behavior tab, click A/B Test. You'll see two copies:
Control Version (Version A): Your original form – leave unchanged.
Experimental Version (Version B): Make your changes here (e.g., add a field, change headline, swap image). Find more ideas below.
You can adjust settings anytime before starting the test by clicking the form name.
Step 3: Decide on Traffic Split
Set what percentage of visitors sees each version (e.g., 50/50 or 80/20).
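Conceptually, the split is a weighted random assignment of each new visitor to one of the two versions. A minimal sketch of the idea (illustrative only, not Omnisend's actual implementation):

```python
import random

def assign_variant(split_a: int = 50) -> str:
    """Assign a visitor to Version A or B.

    split_a: percentage of visitors who should see Version A (0-100).
    """
    return "A" if random.uniform(0, 100) < split_a else "B"

# With an 80/20 split, roughly 80% of assignments land on Version A.
counts = {"A": 0, "B": 0}
for _ in range(10_000):
    counts[assign_variant(split_a=80)] += 1
```

An uneven split (e.g., 80/20) is useful when you want to limit how many visitors see an experimental change, at the cost of needing more time to gather enough data for Version B.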
Step 4: Start Your Test
Once you finalize edits and traffic split, click Start A/B Test to enable the form.
Visitors will now see either Version A or Version B based on your split percentage.
Step 5: Keep an Eye on Results
Track results in Reports → Forms → select your A/B Test form.
Focus on the signup rate (percentage of visitors who submitted the form).
Step 6: Choose the Winner
After collecting sufficient data (typically 1–2 weeks or 100+ signups per variant), choose the version with the higher signup rate. That form becomes your active Popup.
💡 Low-traffic sites may need 3–4 weeks for reliable results.
Step 7: Merge Your Data
Once you select the winning version, all metrics and signups from both variants merge automatically into it. No data is lost.
Ideas to Test
Single-step vs. multi-step – Does splitting email + name into two steps improve completion?
Email-only vs. email+SMS – Which fields maximize signups?
Headline variations – "Get 15% off" vs. "Join 10,000+ subscribers."
Visual themes – Minimalist vs. colorful design.
Teaser text – "Unlock your discount" vs. "Subscribe for exclusive deals."
Promotions – 10% off first order vs. free shipping over $50.
Gamification – Wheel of Fortune vs. standard form.
💬 Learn more: Multi-Step Signup Forms | Teaser Signup Forms.
Adjustments During A/B Testing
Before the test starts, you can edit the main form or variants. Once the test is running, avoid major changes to ensure accurate comparisons.
Plan your updates in advance—and let the test run its course to deliver meaningful insights.
FAQ
How can I ensure fair and consistent comparisons when testing forms with different targeting settings?
To ensure that your comparisons are fair and consistent, it's a good practice to A/B test forms on the same page or location of your website. For example, if you want to test different targeting settings, use the same page, such as /products, as the testing ground.
What happens if I delete a form during an A/B test?
Deleting the form also deletes its A/B test, since the two are linked.
Can I cancel an A/B test that hasn't started yet?
Yes, if you cancel a Forms A/B test before it starts, it will revert to the original form configuration. No data will be lost, and you can make adjustments as needed.
Can I cancel an A/B test after it starts?
No. Once started, you must choose a winner to conclude the test.
Should I make small or large changes during testing?
Make smaller, focused changes (e.g., one headline variation). Large changes make it harder to identify what caused performance differences.
How can I switch between forms during the A/B test setup?
Click the form title field → Select the form version you'd like to edit.
Can I change the names of the forms being tested in A/B tests?
Absolutely! You can rename your forms to help you identify and distinguish them during testing. Select the version you want under A/B testing → click Edit.
Can I have more than two variations in an A/B test?
Currently, the A/B testing feature supports only two variations (A and B). It's not possible to add more variations for a single A/B test.
How long should I run an A/B test?
The duration of an A/B test depends on factors like website traffic and the significance of results. Typically, you should run the test until you see statistically significant changes rather than just a small percentage difference.
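Omnisend reports each version's rates in the dashboard; if you want to check significance yourself, a standard two-proportion z-test on signup counts is one common approach. A sketch (illustrative numbers, standard-library only):

```python
from math import sqrt, erf

def z_test_signup_rates(signups_a, views_a, signups_b, views_b):
    """Two-proportion z-test: is the difference in signup rates significant?

    Returns (z, p) where p is the two-sided p-value.
    """
    p_a, p_b = signups_a / views_a, signups_b / views_b
    pooled = (signups_a + signups_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 4.0% vs. 3.0% signup rate over 5,000 views each:
z, p = z_test_signup_rates(200, 5000, 150, 5000)
```

A p-value below 0.05 is the conventional threshold; with low traffic, a 1-percentage-point gap can take weeks to reach it, which is why small stores should run tests longer.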
How can I test different discounts effectively?
To test different discounts, apply specific tags in the audience management of each form. Use separate automations for segments based on these tags to track and analyze the effectiveness of different discount offers.
Why are my A/B test view counts unequal when the split is 50/50?
This is expected if one version has a Teaser enabled and the other doesn't. Forms with Teasers only count a view when the Teaser is clicked, while forms without Teasers count a view when displayed. The traffic split (50/50) is still working correctly; the difference is in how views are measured. To compare versions accurately, ensure both have the same Teaser settings.
I'm testing in incognito mode, but I only see one form version. Why?
If you only see one variant, try these steps:
Check your traffic split – if it's not 50/50, you're more likely to see one version.
Test from a different browser or device.
Clear all browser data (even incognito can retain some data).
Once a visitor sees one version, they'll always see that version to maintain test consistency.
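The "always see the same version" behavior is typically achieved by pinning the visitor's first assignment (e.g., via a cookie). A deterministic hash-based sketch of the same idea, assuming a stable visitor ID (hypothetical mechanism, not Omnisend's actual code):

```python
import hashlib

def sticky_variant(visitor_id: str, split_a: int = 50) -> str:
    """Deterministically map a visitor to Version A or B so repeat
    visits always show the same version."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100  # stable bucket in 0-99
    return "A" if bucket < split_a else "B"

# The same visitor always gets the same variant:
first = sticky_variant("visitor-123")
again = sticky_variant("visitor-123")
```

This is also why clearing browser data or switching devices can show you the other version: the tool no longer recognizes you as the same visitor.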
What's the difference between Interaction Rate, Submit Rate, and Signup Rate?
Interaction Rate: Percentage of viewers who clicked any form element (fields, buttons). Close button clicks don't count.
Submit Rate: Percentage of viewers who clicked submit, including both new and existing subscribers.
Signup Rate: Percentage of viewers who became new subscribers (only counts new Contacts, not existing ones).
If an existing Contact submits your form, they count toward the Submit Rate but not the Signup Rate.
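All three metrics share the same denominator, views. A worked example with illustrative numbers:

```python
def form_rates(views, interactions, submits, new_contacts):
    """Compute the three form metrics as percentages of views."""
    return {
        "interaction_rate": 100 * interactions / views,
        "submit_rate": 100 * submits / views,
        "signup_rate": 100 * new_contacts / views,
    }

# 1,000 views; 300 visitors clicked a field, 120 submitted,
# and 90 of those submitters were new Contacts:
rates = form_rates(1000, 300, 120, 90)
```

Here the Submit Rate is 12% but the Signup Rate is only 9%, because 30 submitters were already Contacts.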
If you run into any challenges, contact our 24/7 Support Team via in-app chat or [email protected].