How to Make Split Testing Integral to Your Digital Advertising Strategy

September 12, 2018

You’ve heard about the value of testing. You may even incorporate testing every now and then into your digital advertising campaigns. But, is it a routine part of your “doing business” as a digital marketer?

If you’re just now hopping onto the testing bandwagon or want to make it a regular thing but don’t know where to start—we get it. There are so many features, products, and opportunities out there that it can be a struggle to understand how to test and what impact testing can have on your ad account performance.

In this article, we hope to shed some light on ad campaign A/B (“split”) testing and provide a simple, effective split testing framework.

When to Test


To have the most impact, you should incorporate testing at each stage of your ad campaign. Be sure to always perform testing at the project level. Although it may feel like all of these A/B tests can gobble up time and budget, this isn’t necessarily the case—some platforms make split testing easy and let you see results quickly without eating up your entire campaign budget.

To make things even easier for you and your team, plan to complete your testing campaign in three phases:

  • Targeting and delivery
  • Creative
  • Messaging


Let’s take a look at these phases, and the most common A/B test scenarios for each one.

Targeting and Delivery


Objective test (traffic versus conversions)

With targeting and delivery, a common use case is measuring traffic versus conversions. This is a good test to conduct if:

  • Your main objective is striking a balance between volume and cost
  • You sometimes struggle with increasing your conversion volume
  • You optimize for link clicks and aren’t confident about switching to conversions, since it may negatively impact your KPIs


Test setup

To set up this type of test, create two campaigns: one with a ‘traffic’ objective and the other with a ‘conversion’ objective. Make sure both campaigns have identical creative, the same number of ads, and the same targeting.

Conduct the split test, making sure you’re allocating enough budget for each ad to deliver at least one conversion per day. Run the campaign for a set period of time (e.g., one to two weeks) or until you can clearly see which campaign performs best.
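To size that budget, you can work backward from your expected cost per conversion. As a rough sketch (the function name and dollar figures below are hypothetical, not from this article): if each ad should deliver at least one conversion per day, the daily test budget needs to cover roughly one expected CPA per ad.

```python
def min_daily_budget(num_ads, expected_cpa):
    """Daily budget needed so each ad in the test can deliver
    at least one conversion per day, assuming a known average
    cost per acquisition (CPA)."""
    return num_ads * expected_cpa

# Hypothetical example: 2 campaigns x 3 ads each, at an
# expected $12 cost per conversion
print(min_daily_budget(num_ads=6, expected_cpa=12.0))  # 72.0
```

This is only a floor, of course—noisy CPAs or low conversion rates may call for more headroom.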

Optimization window

What if your campaigns are generating conversions, but you want to make sure you’re optimizing toward the best-quality ones? You can test the optimization window, typically one day versus seven days.

The optimization (conversion) window tells Facebook’s algorithms which conversion data to consider when deciding whom to show your ad.

Test setup

Create two campaigns or one campaign with two ad sets. Make sure that all segments that are part of the optimization window are identical, and, like the traffic versus conversions test, that you’re allocating enough budget for each ad to deliver at least one conversion per day. Run the campaign for a set period of time (e.g., one to two weeks) or until you can clearly see which campaign performs best.
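“Clearly see which campaign performs best” can be made more rigorous with a simple significance check. A minimal sketch (hypothetical numbers, standard library only): a two-proportion z-test comparing the conversion rates of the two campaigns.

```python
from math import sqrt, erf

def z_test(conv_a, vis_a, conv_b, vis_b):
    """Two-proportion z-test: is campaign B's conversion rate
    significantly different from campaign A's?"""
    p_a, p_b = conv_a / vis_a, conv_b / vis_b
    p_pool = (conv_a + conv_b) / (vis_a + vis_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / vis_a + 1 / vis_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results after two weeks: 40 conversions from
# 2,000 visits vs. 70 conversions from 2,000 visits
z, p = z_test(conv_a=40, vis_a=2000, conv_b=70, vis_b=2000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference
```

If the p-value stays above your threshold after the planned run time, the honest conclusion is “no clear winner yet,” not “pick the one that’s slightly ahead.”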

Ad formats (static versus video)

Does this sound familiar? You’ve always used static images for your conversion campaigns, reserving video for brand awareness. But, you recently noticed that static images limit your scale, so you’re looking to identify a new best practice. Video it is!

With ad formats, you can A/B test single formats or a combination.

Test setup

  • Single format: Create one campaign, including different formats in different ad sets, making all other segments identical. Run the split test at the ad set level.
  • A combination of formats: Create one campaign that assigns different format combinations to different ad sets, for example:
  Ad Set | Link Type
  1      | Static link ad
  2      | Video link ad
  3      | Static link ad and video link ad

Targeted audience (interest versus lookalike)

Although you should always target combinations of audiences, you can A/B test to identify some best practices. For example, if you’re looking to identify another set of audiences that’ll bring value to your campaigns, or reduce audience overlap without including a long list of exclusions, testing can help.

Test setup

Depending on your reporting preferences, you can set up your test at the campaign or ad set level. Create one campaign with several ad sets targeting:

  Ad Set | Audience
  1      | Interests
  2      | 1% lookalike of your most valuable users
  3      | Campaign (pixel) lookalike
  4      | 1% website custom lookalike audience

Make sure your audience size is sufficient to deliver. All other segments must be identical.

Creative


For many advertisers, creative can be challenging, especially if you don’t have an in-house creative team. Still, there are a few things you can test to improve results, without the need for a ton of resources.

  • Video length: A/B test different video lengths, for example, five seconds versus 10 seconds or longer, to determine which one drives better results.
  • Opening frame: Test different opening video frames. Your ad needs to capture attention as fast as possible, so identify which opening frame does the job.
  • Aspect ratio: Test different ratios, for example, landscape versus vertical video.


Messaging


Different messages can drive different results, and determining the messaging that has the most appeal to your audience can be a quick win. Try testing:

  • Subject matter: Test different stories to communicate your value proposition.
  • Character limit: Test copy length, e.g., shorter ad text versus longer.
  • Emoji: Depending on the product you’re advertising, it may be worth including an emoji in your test ad copy and seeing how it performs against an ad that doesn’t use one.


Wash, Rinse, Repeat


By making testing a basic part of your digital campaign life cycle, you’ll be able to continuously identify best practices, adapt to publisher changes, and scale your account. If you’d like to learn how to implement an A/B test strategy into your accounts, speak to your Marin team. If you’re new to Marin, contact us to learn more.

Jana Christoviciute

Marin Software
