Optimize the engagement of your Marketing Campaigns with A/B testing. A/B testing (sometimes also referred to by marketers as 'split testing') allows you to send different versions of your Single Sends to an initial subset of your contacts.
When recipients interact with the emails sent during an A/B test, we will compare the engagement metrics and automatically choose the winning version of your campaign according to the A/B test criteria you set (opens or clicks).
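The winner-selection logic described above can be sketched as follows. This is an illustrative Python sketch, not SendGrid's internal implementation; the variation names and engagement counts are made up:

```python
def pick_winner(variations, criterion="opens"):
    """Choose the variation with the highest engagement rate.

    `variations` maps a variation name to its delivered/opens/clicks
    counts; `criterion` is "opens" or "clicks", mirroring the A/B test
    criteria you set in the UI.
    """
    def rate(stats):
        engaged = stats["opens"] if criterion == "opens" else stats["clicks"]
        return engaged / stats["delivered"]

    return max(variations, key=lambda name: rate(variations[name]))

# Hypothetical results for two subject-line variations:
results = {
    "Subject A": {"delivered": 500, "opens": 120, "clicks": 30},
    "Subject B": {"delivered": 500, "opens": 145, "clicks": 28},
}
print(pick_winner(results, criterion="opens"))   # → Subject B
print(pick_winner(results, criterion="clicks"))  # → Subject A
```

Note that the same test data can produce different winners depending on the criteria, which is why choosing opens versus clicks up front matters.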
To set up an A/B test on an existing Single Send:
When you are A/B testing your emails, you want to optimize for a specific metric. Determine whether you want to optimize your Open Rate by testing the Subject Line, or your Click Rate by testing the Email Content.
You can test up to 6 different variations for each A/B test.
Select the Subject Line A/B test to optimize the Open Rate of your Single Send, since the subject line is usually all a recipient sees before opening your email.
High open rates show the strength of a subject line. Once you find a subject line that works well, you will potentially see other engagement metrics improve as well.
Select the Email Content A/B test to optimize the Click Rate of your Single Send, since the recipient will not see this content unless they open your email.
High click rates mean that you have compelling content and calls to action (CTAs).
Enter the different versions of your email where you would normally edit that piece of content in your Single Send.
Subject Line Testing
For subject line testing, you will find multiple input boxes in the sidebar where you would normally find your subject line, one for each subject line variation.
Email Content Testing
For email content testing, you will see additional tabs at the top of the content area, one for each email content variation. The number of tabs you see will depend on how many versions you have decided to test.
Make edits to each of your email content variations by selecting one of the tabs.
A/B Testing Tip - Adding Variations
To isolate what makes your best-performing variation work, we recommend making only one change per variation rather than many. This way, you can point to a direct cause for the differences in your stats.
Choose a percentage of your contact list that will participate in the A/B test. Each variation of the email will be sent to the same number of contacts within the participating portion of your list.
If you select opens as your winning criteria, Twilio SendGrid automatically selects the winning variation based on how many recipients open your email.
If you select clicks as your winning criteria, Twilio SendGrid automatically selects the winning variation based on how many recipients click links and engage with the content in your email.
You can set your A/B test duration between 30 minutes and 24 hours.
While you can test your email variations for up to 24 hours, emails are sent only to the subset of contacts you've chosen to participate in the A/B test during the test duration you set. The rest of your contacts will be sent only the winning variation of your A/B test email after the test duration has completed.
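As an arithmetic illustration of this schedule (the times are hypothetical, and the 30-minute/24-hour bounds come from the rule above): if the test starts at 9:00 AM with a 2-hour duration, the winning variation goes out to the remainder of the list at 11:00 AM.

```python
from datetime import datetime, timedelta

MIN_DURATION = timedelta(minutes=30)  # minimum test duration
MAX_DURATION = timedelta(hours=24)    # maximum test duration

def winner_send_time(test_start, duration):
    """Return when the winning variation is sent to the remaining contacts."""
    if not MIN_DURATION <= duration <= MAX_DURATION:
        raise ValueError("duration must be between 30 minutes and 24 hours")
    return test_start + duration

start = datetime(2023, 6, 1, 9, 0)  # hypothetical 9:00 AM test start
print(winner_send_time(start, timedelta(hours=2)))  # 2023-06-01 11:00:00
```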
A/B Testing Tip - Setting the Test Duration
Be mindful of your test duration with respect to the timeliness of your Single Send content.
For example, if you have a one-day sale that happens the day of your Single Send, you should set the A/B test duration to less than 24 hours so that your remaining contacts still have time to get the final email and participate in your one-day sale.
When a winning variation is determined, based on your criteria and test duration, you will be notified that a winner was chosen and which variation won. SendGrid will then automatically send the winning email variation to the rest of your list.