This is the first in a series of experiments and A/B tests we're running. Each test has a budget of $10,000. We're publishing the results on this blog, as transparently as possible.

I've been creating and optimizing Facebook campaigns since 2008. I've personally helped hundreds of advertisers with their campaigns - and therefore "seen it all". But the results of this test REALLY amazed me. So trust me, you want to stick around for the full post.

Let's go!

One campaign has $3333 in total budget. The other campaign has $6666. Everything else (targeting, bidding strategy, bid, optimization goal, placements, ads) is the same. This is all done through the split testing feature in Facebook Ads, to make sure it's a proper A/B test (with the audience split into two separate buckets).

To be completely transparent, this initial A/B test was run many months ago. But the results amazed me so much that I decided to keep experimenting with this before sharing.

But let's start with the initial results of the simple first A/B test, where one campaign had twice as much budget as the other.

Campaign A: $111 in daily budget for 30 days = $3333 in total.
Campaign B: $222 in daily budget for 30 days = $6666 in total.

Any idea how the results turned out? I think a common guess would be that results would be pretty similar, but that Campaign B would generate twice as many conversions - since we spent twice as much on it.

Nope.

Campaign A had a CPA of $20, generating 166 conversions over the 30 days.
Campaign B had a CPA of $14, generating 476 conversions over the 30 days.

A 30% lower CPA in Campaign B. But why? Everything was the same except the budget - and 30% is A LOT.
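If you want to sanity-check those numbers yourself, here's a small sketch that recomputes the CPAs and the relative difference from the spend and conversion figures above:

```python
# Recompute CPA (cost per acquisition) for both campaigns from the
# spend and conversion numbers reported in the test.
campaigns = {
    "A": {"spend": 3333, "conversions": 166},
    "B": {"spend": 6666, "conversions": 476},
}

cpa = {name: c["spend"] / c["conversions"] for name, c in campaigns.items()}

print(f"CPA A: ${cpa['A']:.2f}")  # ≈ $20.08
print(f"CPA B: ${cpa['B']:.2f}")  # ≈ $14.00

# Relative CPA reduction of Campaign B vs Campaign A.
print(f"B is {(1 - cpa['B'] / cpa['A']):.0%} cheaper per conversion")  # ≈ 30%
```

The math holds up: $6666 / 476 ≈ $14 per conversion, versus roughly $20 for Campaign A - about a 30% reduction.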

So I kept experimenting.

I'm not going to bore you with everything I did to reach these results. So let's fast-forward and arrive at why the results went this way, and how you can utilize this insight.

Based on this first test, it was clear that statistical significance and Facebook's learning phase mattered. Campaign B reached the phase where the optimization algorithms had learned enough from the conversion data much faster than Campaign A, and could therefore zero in on an even higher-converting part of the specified audience. When this shift happened, CPA started to drop drastically.

Key takeaway #1: Don't make too hasty decisions. Let the campaign and ad sets go through the learning phase, and let Facebook's algorithms do their magic.

There's nothing revolutionary about this first key takeaway, but I think many marketers are too nervous and don't take it into consideration enough.

BUT it doesn't end here. Another major finding is about to be revealed.

The experimentation continued.

What happens if you group a number of campaigns and give them one daily budget?

See, this is not something that can be done in Facebook Ads Manager. But since we're a Facebook Marketing Partner with our own developer team - we can do this.

The theory is: group all prospecting campaigns and give them one budget. Then group all retargeting campaigns and give them another budget.

We also tried (in another ad account) grouping campaigns with a similar goal and giving them one daily budget.

In a third ad account, we grouped all campaigns and set a daily budget for the whole account.

Then, our algorithm automatically adjusted the daily budget of each specific campaign, once per day.

The results?

Since we could give our algorithm even more data to optimize on, the daily re-allocation of budget between campaigns enabled a new level of efficiency in marketing spend.

The minute a campaign started to show signs of decreasing performance, a chunk of its daily budget was moved to another campaign with more promising performance - and this happened on a daily basis.
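To make the idea concrete, here's a hypothetical sketch of what daily re-allocation within a campaign group could look like. The post doesn't disclose the actual algorithm, so the performance score (conversions per dollar, i.e. inverse CPA) and the proportional split are my own illustrative assumptions:

```python
# Hypothetical sketch: split a group's daily budget in proportion to each
# campaign's recent performance. The scoring rule (1 / CPA) and the
# proportional split are assumptions for illustration only.

def reallocate(group_budget, campaigns):
    """Return tomorrow's daily budget per campaign, weighted by
    conversions-per-dollar from the most recent data."""
    scores = {
        name: (c["conversions"] / c["spend"]) if c["spend"] else 0.0
        for name, c in campaigns.items()
    }
    total = sum(scores.values())
    return {name: group_budget * s / total for name, s in scores.items()}

# Yesterday's numbers for two prospecting campaigns sharing a $300/day budget:
yesterday = {
    "prospecting_1": {"spend": 150, "conversions": 10},  # CPA $15
    "prospecting_2": {"spend": 150, "conversions": 6},   # CPA $25
}

budgets = reallocate(300, yesterday)
print(budgets)  # the better-performing campaign gets the larger share
```

Run daily, a rule like this gradually shifts spend toward whichever campaigns in the group are converting most efficiently - which is the behavior described above.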

Key takeaway #2: Grouping campaigns and giving them one daily budget increases scalability and delivery.

As you can all guess by now, the results were fantastic - especially after a week or so, when statistical significance started to increase and Facebook's algorithms and our algorithms were working in symbiosis to increase delivery.

And of course, we had to put this out into the world. So, in mid-January of 2020, we launched this feature in our platform, and you're welcome to try it free for 14 days.

Below is a short video tutorial on how the specific feature works.

I know I'm biased here, and that I of course have an incentive in having you (as a reader) become a user of our platform. But trust me, marketer to marketer, this is something you definitely need. It will help you save time and increase scalability and delivery.

You don't even have to take my word for it - simply sign up for a risk-free trial (it takes 1 minute) and get going.

(Pssst... we're currently running experiments where we optimize budgets between Facebook Ads and Google Ads. The initial results are promising - and we'll launch this feature in the near future as well!)