The Blank Test Approach

Test traffic (for Google Analytics and other tracking tools), sandbox environments (for Adjust tracking, etc.) and A/A tests (for A/B testing tools) are common methods that help us validate that our tools work properly and that the data we see is processed correctly. But what about advertising platforms? Advertising platforms are pay to play. We can validate conversions and clicks (through Google Analytics), but how can we evaluate the "influence" our creatives may have beyond clicks? One way is to check the assisted conversion types that many platforms report and that Google Analytics cannot measure (Facebook's one-day view, AdWords' view-through conversions). Another is to measure assisted conversions in Google Analytics and correlate them with the assisted conversions reported by the advertising platforms. But is that enough?

Assisted Conversions Evaluation

Since we consider our team data-driven, we decided to give this topic a shot and came up with an unconventional approach that we call the "blank test" approach. The method is simple: we placed "empty" banners to establish a baseline for clicks and for the allocation of assisted conversions, on the assumption that empty banners would perform worst (and could therefore serve as a benchmark) compared to our branded banners. Our first try was on a display network where we had little traction, with the following settings:

  • Hypothesis: The group of empty banners will generate fewer assisted conversions than the group of branded banners, because branded banners influence users more. Besides validating the hypothesis, after the test we will have a proper benchmark for how much a branded banner influences users, at least within this specific advertising platform.
  • Placements:
    We had two types of banners: the first was "empty" placements (literally a plain white creative and an empty light-grey creative), while the second was two branded creatives. The branded banners had been tested before and were strong candidates for traction due to their funny and engaging content (see below).
  • Audience:
    We used two types of audience: a low-performing one (new cookies) and a high-performing one (visitors who did not convert), so that we could identify potential differences between the two groups. Also, neither group consisted of actual customers, so the "damage" to our brand would be minimal.
  • Rotation:
    We wanted to avoid the auto-optimization that most networks apply, so we set the campaign to "even" rotation to serve all placements equally. This can be verified from the number of impressions each creative receives, and in our case it worked properly.
  • Budget:
    Since this was not a campaign expected to generate conversions, we did not spend much money, but it was still a considerable amount, enough to generate traction. We always keep a budget for testing and experimentation; otherwise we could not afford these kinds of tests.
  • Time:
    We randomly chose a time period of one month. That month was outside any seasonal period, so the results are not negatively biased (at least not intentionally).
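The "even rotation" setting above can be sanity-checked numerically with a chi-square goodness-of-fit test against a uniform split of impressions across the creatives. This is a sketch only; the impression counts below are made up for illustration, not the actual campaign data.

```python
# Chi-square goodness-of-fit check: did the network serve all four
# creatives (two blank, two branded) roughly equally often?
# The impression counts are hypothetical, purely to show the calculation.
impressions = [25120, 24890, 25230, 24760]

expected = sum(impressions) / len(impressions)  # uniform expectation
chi2 = sum((obs - expected) ** 2 / expected for obs in impressions)

# Critical value for df = 3 at alpha = 0.05 is 7.815; a statistic below
# it gives no evidence that the network deviated from even rotation.
evenly_rotated = chi2 < 7.815
print(f"chi2 = {chi2:.2f}, evenly rotated: {evenly_rotated}")
```

If the statistic exceeds the critical value, the network's auto-optimization is likely still interfering and the creative comparison is no longer apples to apples.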
Blank Test Concept

Branded Banner Variation #1

Blank Banner Variation #1

Branded Banner Variation #2

Blank Banner Variation #2
Here are the results:

Blank Test Results
Empty banners generate better results than branded banners!

The table above shows that assisted conversions are only one of the problems that emerged. The empty banners generated more clicks, a better CTR and more assisted conversions for both the low- and the high-performing audience. There were arguments that the experiment was not set up properly, that this kind of experimentation does not exist, and that blank banners make people curious and encourage them to click more. All of these assumptions may be valid, but there are many counter-arguments as well.

This post does not intend to promote blank banners as an effective performance-campaign concept. My main intention is to encourage everyone to run more tests, even "bold" ones, because testing can lead you to the right path even when that is a hard choice. Testing and cross-checking data across platforms and tools generally requires time and effort, but it is a great process for making sure that your tools are working properly and that you are not spending money on advertising that is not delivering.
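One way to cross-check a surprising result like this is to ask whether the CTR gap between blank and branded banners could simply be noise. Below is a minimal sketch of a two-proportion z-test; the click and impression counts are hypothetical (the real figures are in the table above, which is not reproduced here).

```python
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# Hypothetical numbers: blank banners (a) vs branded banners (b).
z = two_proportion_z(540, 100_000, 400, 100_000)
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
significant = abs(z) > 1.96                 # alpha = 0.05
print(f"z = {z:.2f}, p = {p_value:.2g}, significant: {significant}")
```

If the difference survives a test like this, "blanks outperform branded" stops being an anomaly you can wave away and becomes a finding that demands an explanation, whether that is curiosity clicks, placement fraud, or a measurement problem in the platform itself.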


