
Marketing on Facebook: Building and Incorporating an A/B Testing Framework

March 15, 2018

Marketers who run the most successful ad studies, and come away with the highest-quality, most consistent learnings, tend to do a few important things:

  • Establish a foundation by refining and outlining business goals
  • Review and document baseline performance and past results
  • Methodically execute campaign flights against the outlined A/B testing framework


This is where the concept of a “scope of test” framework can help you succeed. In this second article in a three-part series, we unpack this framework so that you can better understand it. Using a real-world example, we’ll review how a scope of test used ad studies to help reveal what creates the most relevance, and in turn, how that improves ROAS.

While our focus in this post is more specific to testing for relevance and quality, we’re happy to support other forms of ad studies for advertisers.

A/B Test (Ad Studies) Background and Guidelines


Ad studies test the impact of different audience types, delivery optimization techniques, ad placements, creative, budgets, and more, on mutually exclusive audiences. Once completed, these studies help you understand ‘what works’.[1]

Audiences are split into ‘cells’, ensuring that someone in one cell isn’t in another. Because the split cleanly compares one variable against another (for example, News Feed Desktop placement versus News Feed Mobile placement), the results are statistically valid. Each cell is exposed to a unique variation of the test variable, so you can determine which variation performs best.
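To make the cell mechanics concrete, here's a minimal sketch of how a mutually exclusive split can be produced with deterministic hashing. This is an illustration of the concept only, not Facebook's actual assignment mechanism; the function and study names are hypothetical.

```python
import hashlib

def assign_cell(user_id: str, study_name: str, num_cells: int = 2) -> int:
    """Deterministically assign a user to exactly one test cell.

    Hashing the user ID together with the study name yields a stable,
    roughly uniform split, so a user placed in cell 0 can never also
    land in cell 1 for the same study.
    """
    digest = hashlib.sha256(f"{study_name}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % num_cells

# Example: split a small audience across two placement cells
cells = {0: [], 1: []}
for uid in ["u1", "u2", "u3", "u4", "u5", "u6"]:
    cells[assign_cell(uid, "feed_desktop_vs_mobile")].append(uid)
```

Because the assignment depends only on the user and study, re-running the split always reproduces the same cells, which is what keeps the exposure groups mutually exclusive for the life of the test.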



When you’re creating an ad study, it’s important to follow a few test guidelines:

  • Define KPIs
  • Determine confidence level: tests with larger reach, longer schedules, or higher budgets tend to deliver more statistically significant results
  • Select only one variable per test
  • Avoid launching segmented A/B tests (e.g., one round of testing that’s then used to determine winners and losers)

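To make the confidence-level guideline concrete, here's a small sketch of a two-proportion z-test comparing the conversion rates of two cells. The numbers are invented for illustration; in practice you'd pull real cell-level results, and a larger reach (bigger `n`) makes a true difference easier to detect.

```python
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test on the conversion rates of two test cells.

    Returns (z, p_value); a small p-value suggests the difference
    between the cells is statistically significant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: cell A (desktop feed) vs. cell B (mobile feed)
z, p = z_test_two_proportions(conv_a=120, n_a=10_000, conv_b=150, n_b=10_000)
significant = p < 0.05  # 95% confidence level
```

With these sample numbers the difference falls just short of 95% confidence, which is exactly the situation where extending the schedule or budget (and therefore the sample size) would sharpen the result.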

Once you meet these guidelines and recommendations, you can create a scope of test in support of planned ad studies. It includes KPIs, schedules, and other details; builds a process for implementing the ad studies; and acts as a compass for tracking results and winners.

Scope of Test


A scope of test has several sections:

  • Historical Scenario
  • Understanding the Baseline and Goals
  • Summary of Insights
  • Summary of Opportunities for Optimization
  • Summary of Scope of Test (Phases and Rounds)


A phase is the umbrella for rounds and serves as a proof of concept. For example, “Phase 1” could be “Testing Placement Optimized Ad Sets.” (We’ll review this in more detail in our next article.)

There should be at least two rounds within a phase to establish a pattern in the data—a tiebreaker to determine winners and losers.
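The two-round rule above could be captured in a small helper that only declares a phase winner when every round agrees. This is a hypothetical sketch, not part of any Facebook or Marin tooling; cell names and ROAS figures are invented.

```python
def phase_winner(rounds):
    """Declare a phase winner only if every round agrees.

    `rounds` is a list of per-round results, each mapping a cell
    name to its KPI value (e.g. ROAS). Returns the winning cell
    name, or None when the rounds disagree and more testing is
    needed to break the tie.
    """
    winners = [max(r, key=r.get) for r in rounds]
    return winners[0] if len(set(winners)) == 1 else None

# Hypothetical phase with two rounds of ROAS results
round_1 = {"cell_a": 2.4, "cell_b": 3.1}
round_2 = {"cell_a": 2.6, "cell_b": 3.3}
winner = phase_winner([round_1, round_2])  # → "cell_b"
```

Requiring agreement across rounds is what turns a single lucky result into an established pattern in the data.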

To help define KPIs, review historical data and establish a benchmark, both from an insights perspective and an Opportunities for Optimization perspective. Be sure to include summaries of both within the Understanding the Baseline and Goals section of the scope of test.

If no historical data is available, run a few campaigns against what you believe are the most relevant audiences, using the resulting data as the benchmark.
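Establishing that benchmark can be as simple as aggregating a KPI such as ROAS across the exploratory campaigns. A minimal sketch, with invented campaign figures:

```python
def benchmark_roas(campaigns):
    """Aggregate ROAS across exploratory campaigns to use as a baseline.

    `campaigns` is a list of dicts with total 'revenue' and 'spend'.
    Pooling revenue and spend (rather than averaging per-campaign
    ratios) weights each campaign by how much was actually spent.
    """
    revenue = sum(c["revenue"] for c in campaigns)
    spend = sum(c["spend"] for c in campaigns)
    return revenue / spend

# Hypothetical exploratory campaigns
baseline = benchmark_roas([
    {"revenue": 5200.0, "spend": 2000.0},
    {"revenue": 3100.0, "spend": 1500.0},
])
```

Each subsequent ad study's cells can then be judged against `baseline` rather than against nothing.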

With benchmarks in hand, the Summary of Scope of Test incorporates the KPIs that will determine the success of each study.

In our real-world example below, we review a retail advertiser’s scope of test. For the purposes of this article, we’ve condensed a lot of the summaries. If you’re a Marin customer, reach out to your account manager or Customer Engagement Team for more details on a scope of test and how to implement one.

Historical Scenario


The retail advertiser is using Conversion and Product Catalog sales objectives. They’ve set a goal to optimize their campaigns, with the broader challenge to drive ROAS improvements.

The advertiser is targeting men and women without segmenting the genders into separate ad sets, using a custom conversion to track results. They’re also running Dynamic Ads using a ‘Purchase’ event for tracking conversions.

Within the prospecting conversion campaigns, the advertiser’s targeting focuses on a Lookalike Audience built from past purchasers (180-day lookback), as well as interests gleaned from Page Insights. Placement optimization incorporates all available placements for Carousel ads.

For retargeting business goals, they’re running Dynamic Ads, targeting people who’ve engaged with products but haven’t added them to cart, as well as those who have added to cart but haven’t completed the purchase. Both dynamic audiences use a 30-day lookback.

Understanding the Baseline and Goals


In partnering with this advertiser, a Marin Software Customer Engagement Manager first outlined the Understanding the Baseline and Goals section of the Scope of Test, which provided the benchmarks and helped set the KPIs.

Here’s some of the content included within Understanding the Baseline and Goals:

Prospecting Campaigns

  • Men and women targeted in the same ad sets
  • Lookalike audience based on purchasers from the past 180 days
  • Targeting framed around relevant Facebook interests
  • Optimized placements (Facebook Feed, Mobile Feed, Instagram, Messenger, etc.)
  • Custom Conversion
  • Auto Bid


Retargeting Dynamic Ads Campaigns

  • Men and women targeted in the same ad sets
  • 30 days viewed but not added to cart
  • 30 days added to cart but not purchased
  • Conversion: Purchase event
  • Auto Bid


On average, prospecting campaigns post a Relevance Score of 3; their dynamic retargeting campaigns post a Relevance Score of 5. In our test, the team reviewed 90 days of recent campaign insights.

In our next article, we’ll review these campaign insights and how they can be analyzed and acted on for future growth.

[1] Split Testing

Irakli Iosebashvili
