This is a potentially infinite topic, so my focus here is on a few top-line principles that can be applied across all channels and that are key to driving customer acquisition at scale. These are things I have learned in over 10 years managing international and global campaigns for some of the biggest brands in tech. If a team can follow this generic framework, it sets a strong base for achieving excellence.
1. Volume equals efficiency
Volume and efficiency are two sides of the same coin, so we must never, ever look at them in isolation. This is critical for understanding performance: reducing CPA by 15% while also reducing volume by 15% is not really an improvement (and it often happens naturally; see the next point).
2. The law of diminishing returns
This is particularly true for paid channels, but broadly true across all of them: the more you spend on a channel, the less efficient it tends to be, and vice versa, the less you invest, the more efficient it can be.
This can be offset by operational efficiencies and by data/learning used for optimisation, but to grow volume you either need to target less relevant audiences or pay more for those audiences (increase bids etc.), as the toy model below illustrates.
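To make the principle concrete, here is a toy model (my own illustrative sketch, not calibrated to any real channel) where average CPA grows with spend; `base_cpa`, `base_spend` and `alpha` are made-up parameters:

```python
# Toy model of diminishing returns: average CPA rises as spend grows.
# All figures are illustrative assumptions, not channel benchmarks.
def avg_cpa(spend, base_cpa=40.0, base_spend=10_000.0, alpha=0.3):
    """Average CPA at a given spend level; alpha > 0 encodes diminishing returns."""
    return base_cpa * (spend / base_spend) ** alpha

for spend in (10_000, 20_000, 40_000, 80_000):
    cpa = avg_cpa(spend)
    print(f"spend {spend:>6,} -> avg CPA {cpa:6.2f}, volume ~ {spend / cpa:,.0f}")
```

In this toy world an 8x increase in spend only buys roughly 4x the volume: volume still grows, but efficiency steadily erodes.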
3. Goal setting: the One Metric that Matters (OMM)
This is a principle borrowed from Lean Analytics: we can only optimise to one metric. The moment we optimise to multiple metrics we need to make trade-offs, and decisions become sub-optimal. This also has a cultural effect: teams work better when optimisation is focused on a single metric.
Going back to point one, we mustn't forget the other side of the coin: our goals always need to be set in terms of both volume and efficiency. We either maximise volume for a given efficiency level, or we maximise efficiency for a given budget level.
In certain organisations you are given both a set budget and a set level of efficiency. Unfortunately that is an over-constrained problem that can't be solved: budget, CPA and volume are tied together, so fixing two of them determines the third. We either accept a higher CPA or accept having budget left over (and reallocated to other channels), as the sketch below shows.
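A quick back-of-envelope (all figures are made up) shows why the three numbers can't all be set independently:

```python
# volume = budget / CPA ties the three together: fix any two and the
# third is determined. The figures below are illustrative assumptions.
budget = 100_000
target_cpa = 50.0
implied_volume = budget / target_cpa
print(f"Implied volume: {implied_volume:,.0f} conversions")

# Demanding, say, 3,000 conversions at this budget AND this CPA is
# unsolvable: 3,000 * 50 = 150,000 > 100,000. Either CPA must rise,
# or budget is left over once the achievable volume is bought.
```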
4. Statistical significance
Every decision on every test should be based not just on data, but on robust data. There are numerous sample-size calculators available online, and when possible it's good to build one in-house.
Generally no decision should be made unless you can be 80% or 95% confident.
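As a sketch of what such a check can look like, here is a standard two-proportion z-test, implemented with the Python standard library only (the traffic and conversion numbers are illustrative):

```python
import math

def ab_confidence(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns the confidence that the
    observed difference in conversion rate is not just noise."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return 1 - p_value

# Illustrative numbers: 2.00% vs 2.45% conversion rate on 10k users each.
conf = ab_confidence(conv_a=200, n_a=10_000, conv_b=245, n_b=10_000)
print(f"Confidence: {conf:.1%}")  # act only above your 80% or 95% threshold
```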
5. The three axes of optimisation
When optimising and deciding what to test, remember that you are optimising along three separate but related axes: impact (money), speed (time to learn) and effort (work to implement).
When deciding what to test, prioritise for impact (money), then consider how much time that learning requires. If test A has 50% more impact than test B but takes twice as long to run, then perhaps you are better off running tests B and C before you run test A. All things being equal, prioritise for speed.
If two tests have the same impact and take the same amount of time, consider how much work they require to implement and prioritise accordingly.
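One way to encode this prioritisation in a backlog (the scoring rule and every figure here are my own illustrative assumptions, not a standard):

```python
# Rank tests by impact delivered per week of learning time, breaking
# ties with implementation effort. All figures are hypothetical.
tests = [
    {"name": "A", "impact": 150, "weeks": 4, "effort_days": 3},
    {"name": "B", "impact": 100, "weeks": 2, "effort_days": 2},
    {"name": "C", "impact": 100, "weeks": 2, "effort_days": 5},
]

ranked = sorted(tests, key=lambda t: (-t["impact"] / t["weeks"], t["effort_days"]))
for t in ranked:
    print(t["name"], f"{t['impact'] / t['weeks']:.1f} impact/week,",
          f"{t['effort_days']}d effort")
```

Note how test A, despite the highest absolute impact, ranks last once time is factored in, matching the B-and-C-before-A example above.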
6. A/B and multivariate testing
A/B testing is standard practice, but it needs to be implemented well. Make sure you isolate the variable being tested: everything else must remain constant.
As the goal is to maximise speed of learning, we can evolve the A/B test. We could run an A/B/C test, but this won't really save us time. A better solution is what I call the A1/A2/B1/B2 test: two ads (A and B), each duplicated, with each copy pointing to a different landing page (1 and 2). The four cells let you read the ad test (A vs B) and the page test (1 vs 2) from the same traffic.
Pro tip: counter-intuitively, running a test with two different ads pointing to two different pages can make sense. In this case you are not technically isolating a variable, but since the user experience spans both the ad and the landing page, you are effectively comparing experience A against experience B.
For advanced-level speed testing, look into the multi-armed bandit technique, which keeps reallocating traffic towards the better-performing variants while the test is still running.
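A minimal Thompson-sampling sketch of the idea (assuming a simple setup with one Bernoulli "arm" per ad and conversion as the reward; everything here is illustrative and uses only the Python standard library):

```python
import random

# Beta(successes + 1, failures + 1) posterior per ad, starting uniform.
arms = {"ad_A": [1, 1], "ad_B": [1, 1]}

def choose_ad():
    """Sample a plausible conversion rate per ad and serve the highest."""
    return max(arms, key=lambda ad: random.betavariate(*arms[ad]))

def record(ad, converted):
    """Update the chosen ad's posterior with the observed outcome."""
    arms[ad][0 if converted else 1] += 1

# On each impression: ad = choose_ad(); serve it; then record(ad, converted).
# Traffic drifts towards the stronger ad while the test is still running,
# which is what makes bandits faster and cheaper than a fixed 50/50 split.
```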
7. No need for continuous optimisation
Are you always testing new ads? This is often seen as a must-do best practice, but in reality it worsens results. Initial testing is important in order to find a champion ad. Once that ad has been found, it is likely that other ads will perform worse, thus lowering your average CTR and conversion rate. Sure, you will eventually find a better ad, but at what cost?
Best practice, once you have a strong champion ad, is to wait until ad fatigue kicks in and then try to find a better-performing one (this also saves resources, which is one of the axes we are optimising for).
8. Three layers of optimisation
Hygiene: ensuring that all campaigns are live and running at the right budget (this might sound basic, but if you have 100+ campaigns running it requires work).
Ongoing standard optimisation: things like audience reviews, cookie pools, creative and landing page testing etc. Establish best practices and the frequency at which these need to happen.
(automate as much of the above as possible)
New initiatives: create a monthly roadmap of new performance-enhancing activities such as beta tests, new technologies, full restructures and new channels.
9. Granularity vs data
Granularity of targeting increases accuracy, but the resulting scarcity of data per segment decreases learning speed. Set up all your campaigns keeping this trade-off in mind.
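A back-of-envelope sketch of the trade-off (all figures are illustrative assumptions):

```python
# Splitting the same traffic across more granular segments slows each
# segment's test roughly in proportion. Hypothetical numbers throughout.
daily_conversions = 400
needed_per_variant = 2_000   # e.g. from a sample-size calculator

for segments in (1, 5, 20):
    per_segment_daily = daily_conversions / segments
    days = needed_per_variant * 2 / per_segment_daily   # 2 variants per test
    print(f"{segments:>2} segments -> ~{days:,.0f} days per readable A/B test")
```

Twenty segments make each test twenty times slower to read at the same traffic level, so only add granularity where the accuracy gain is worth the wait.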
10. Budget allocation
Once you have established your attribution methodology and your KPIs, stick to them. A conversion is worth the same regardless of which channel it comes from (if you measure the value of conversions, then measure channels in terms of ROI).
If two channels have different ROIs, move budget from the least efficient to the most efficient one. All channels should end up with a similar ROI: this is simple logic, and the sketch below makes it concrete.
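As a sketch of that logic (the channel names, curves and figures are invented, and `roi()` is a stand-in for whatever marginal ROI your attribution gives you), a greedy loop that shifts budget until the ROIs converge:

```python
# Move budget in small steps from the lowest-ROI channel to the highest
# until the two converge. Channels, curves and figures are illustrative.
budgets = {"search": 60_000.0, "social": 40_000.0}

def roi(channel, budget):
    # Stand-in diminishing-returns curve per channel, not real data.
    revenue_potential = {"search": 250_000.0, "social": 220_000.0}
    return revenue_potential[channel] / (budget + 50_000.0)

step = 1_000.0
for _ in range(1_000):
    best = max(budgets, key=lambda c: roi(c, budgets[c]))
    worst = min(budgets, key=lambda c: roi(c, budgets[c]))
    if roi(best, budgets[best]) - roi(worst, budgets[worst]) < 0.01:
        break
    budgets[worst] -= step
    budgets[best] += step

print(budgets)  # total spend is unchanged; the ROIs are now roughly equal
```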
11. Global vs local
For most programmatic media, channel expertise and economies of scope outweigh the benefits of localisation. When possible, maximise for these two benefits. Once the team operates efficiently and consistently at scale, one can add a layer of localisation through closer collaboration with the markets. First set a strong base of best practices.
12. Document best practices
The beauty of performance-based media is that it's so data-driven. This implies that there is often a single best way of doing things. Agree on what that way is and document it, then repeat the process for all possible areas.
These documents should not be a diktat but a brainstormed set of rules that are constantly improved by the team whenever new techniques are found. Once all processes are agreed, optimisation efforts shift more towards the speed-of-learning axis.
Documenting optimisation best practices doesn't just remove uncertainty: it also helps when onboarding new team members, who can quickly learn the processes. These docs are not there to duplicate content from the AdWords learning centre and the like, but to supplement it with the team's best practices.
13. Creative management
Keep a database of all creatives used; if possible, have them on a wall as well.
This helps not only to know what's happening now, but also to remember nine months later what has already happened and to spot trends. (Keep in mind that qualitative insight is key here, but that's a whole other topic.)
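For the database part, even a lightweight schema goes a long way. One possible shape for an entry (every field here is my own suggestion; adapt freely):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# One possible shape for a creative-log entry; all fields are suggestions.
@dataclass
class Creative:
    creative_id: str
    channel: str
    launch_date: date
    headline: str
    ctr: Optional[float] = None          # fill in once results are robust
    conv_rate: Optional[float] = None
    notes: str = ""                      # the qualitative insight lives here
    tags: list = field(default_factory=list)  # e.g. ["discount", "video"]
```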