Standing Out From the Noise: Measuring Breakthrough in Video Testing

How to test if your video ads will be memorable in real-world conditions with Video Breakthrough Testing

 

When was the last time you saw an ad on its own? Just one ad, with nothing else competing for your attention?

 

While there are certainly ways to serve a single ad to a target audience, we usually encounter ads in blocks: a 2-4 minute commercial break on television, 45-90 seconds of preroll ads on YouTube, and so on. Of course, competing for your audience’s attention in these noisy environments is a key part of effective advertising.

 

Ads that people ignore can’t persuade, which is why we’re excited to discuss a new Grow Progress offering: the Video Breakthrough test. 

 

A Video Breakthrough test enables you to measure how effectively your ad breaks through the noise of modern advertising. This is a powerful addition to the ad testing toolbox, allowing you to rigorously quantify another dimension of ad performance—how well the ad stands out in a cluttered viewing environment.

 

What’s a Video Breakthrough Test?

 

Normally, when you test an ad with Grow Progress’ Rapid Message Testing tool, we randomize respondents into two conditions: a treatment group that sees your ad and a placebo group that sees an unrelated ad. The difference in responses between the two groups then gives an estimate of the causal effect of your ad, just like in a medical trial. 

 

In a Video Breakthrough test, we add a clutter reel condition in which respondents see your ad sandwiched between several unrelated ads. This mimics the more challenging attention environment of the real world.

 

 

By comparing this new clutter reel condition with the placebo, we get a causal estimate of your ad’s impact when surrounded by other creatives. 

 

Here’s the key idea: The difference between these two treatment effect estimates (with and without clutter) quantifies breakthrough. That’s what makes this design powerful: It measures whether your ad still achieves its goals when competing for attention, not just whether people remember it.

 

For example, we would expect an ad that’s quite persuasive but not very effective at cutting through the noise to have a large effect when shown solo, but to lose steam in the clutter reel condition. In contrast, for an ad that’s extremely effective at grabbing attention, we’d expect nearly the same effect in the clutter reel as on its own.
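To make the arithmetic concrete, here’s a minimal sketch of the three-arm design using simulated survey responses. The group sizes, baseline rate, and lifts are all made-up illustration values, not figures from any actual Grow Progress test:

```python
import random

random.seed(42)

def simulate_responses(n, base_rate, lift):
    """Simulate binary outcomes (e.g., favorability) with a given treatment lift."""
    rate = base_rate + lift
    return [1 if random.random() < rate else 0 for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

n = 5000
placebo = simulate_responses(n, base_rate=0.40, lift=0.00)  # unrelated ad
solo    = simulate_responses(n, base_rate=0.40, lift=0.08)  # your ad, shown alone
clutter = simulate_responses(n, base_rate=0.40, lift=0.05)  # your ad in a clutter reel

solo_effect    = mean(solo) - mean(placebo)      # standard treatment effect
clutter_effect = mean(clutter) - mean(placebo)   # treatment effect under clutter
breakthrough_gap = solo_effect - clutter_effect  # what the ad loses to clutter
retention = clutter_effect / solo_effect         # share of solo impact retained

print(f"solo effect:    {solo_effect:+.3f}")
print(f"clutter effect: {clutter_effect:+.3f}")
print(f"retention:      {retention:.0%}")
```

Because both conditions are compared against the same placebo group, each estimate is causal, and the gap between them isolates how much impact clutter costs the ad.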

 

Traditional attention metrics like ad recall are helpful, but they’re indirect—a memorable ad might not actually change minds or behaviors. The Video Breakthrough test answers a different question: Does your ad still move the needle on what matters (purchase intent, brand favorability, vote choice) when it’s surrounded by other content? 

 

It’s the difference between measuring ‘did they notice it?’ and ‘did it still work in a crowded environment?’

 

Case Study: Do These Ads Break Through?

 

To demonstrate how this works in practice, we ran a large-N experiment comparing two very different ads across both the standard and clutter reel conditions. We measured two outcomes per ad, for a total of four success questions.

 

The first creative was from Starbucks’ recent ‘Not My Name’ campaign, and the second was a political ad aimed at improving confidence in U.S. election administration.

 

So, did both ads succeed at breaking through? The broad answer is yes! Even under the clutter reel exposure condition, both ads continue to have positive effects on both of their target outcomes:

 

 

That said, the two ads don’t break through equally well. The Starbucks ad, on average, retains roughly 74% of its original impact (averaging over its two success questions). The election administration ad retains just 51% of its original impact, driven by a particularly large decrease on the system confidence success question.

 

Why did the Starbucks ad hold up better in clutter? The creative differences are telling. The Starbucks ad uses rapid-fire visual progression, constantly changing scenes, and high production polish to maintain viewer engagement. 

 

The election administration ad does less to compete for attention: a single speaker delivers information-dense content at a steady pace over calm background music. Our open-ended responses validate this pattern: when surrounded by other ads, viewers simply understood and retained less of the political ad’s message.

 

This highlights how the Video Breakthrough test fits into a broader toolkit—the quantitative breakthrough metrics tell you what happened, while more qualitative methods like recall open-ends help explain why.

 

Breakthrough Testing in the Larger Attention Measurement Toolbox

 

The Video Breakthrough test is an exciting addition to the larger set of tools we’re making available to quantify attention. By modifying the attention environment, the design allows us to focus on whether we’re capturing enough attention to move our key outcomes, not just intermediate measures of attention. 

 

It’s worth noting that the differences between standard exposure and clutter reel effects tend to be fairly small, which is why we recommend well-powered tests with larger sample sizes. Further, like all ad testing, a Video Breakthrough test works best as part of a broader attention-measurement strategy.
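To see why sample sizes need to grow, note that a smaller detectable difference drives the required sample size up quadratically. Here’s a back-of-envelope normal-approximation calculation for two proportions (our own rough sketch, not Grow Progress’s planning method; the 45% baseline rate and effect sizes are assumptions for illustration):

```python
import math
from statistics import NormalDist

def n_per_arm(delta, p=0.45, alpha=0.05, power=0.80):
    """Approximate respondents per arm to detect a difference `delta`
    between two proportions near `p` (two-sided test, normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    return math.ceil(2 * (z_a + z_b) ** 2 * p * (1 - p) / delta ** 2)

# Detecting an 8-point solo effect vs. a 3-point breakthrough gap
print(n_per_arm(0.08))  # hundreds of respondents per arm
print(n_per_arm(0.03))  # thousands of respondents per arm
```

Shrinking the detectable difference from 8 points to 3 points multiplies the required sample size by roughly (8/3)², which is why breakthrough comparisons demand noticeably larger tests than standard message tests.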

 

As we work with clients to integrate these tools into their workflows, we’re excited to share learnings and best practices from across our space. We hope this introduction to the Video Breakthrough test was as fascinating for you as it is for us!

 

Want to see how your ads do in a Video Breakthrough test? Reach out for a walkthrough or to explore how teams like yours are using our platform.