Marketing Performance (Not Performance Marketing)
Performance marketing is a type of marketing where payment or evaluation is tied to a measurable outcome. That outcome could be an impression, a click, a lead, a sale, recurring revenue, or any number of other metrics.
The word “performance” here just means “measured.” It could just as easily be called “accountable marketing” or “marketing where someone asks if it worked,” but those don’t fit on a conference badge.
It’s not a separate discipline. It’s not a magical framework. It’s the practice of checking whether the thing you did actually did the thing you wanted it to do. The fact that we need a special term for this tells you more about marketing than it does about performance.
Why measure the performance of marketing campaigns?
You measure performance when you want to know whether something worked. That sounds obvious, but most marketing operates in a space where “worked” is loosely defined as “someone saw it” or “people said nice things about it on Slack.”
Performance measurement forces a different question: did this activity move the number we said we cared about? If you spent £10,000 on ads, did you get £10,000 worth of outcome? If you published 47 blog posts, did anyone read them, and if they did, did they do anything that mattered to the business?
Measurement clarifies intent. It separates the things that sound good in a deck from the things that actually produce results. That’s not to say unmeasured marketing is bad. Brand work, creative campaigns, and long-term positioning all have value. But if you can’t articulate what success looks like in advance, you’re not doing performance marketing. You’re doing something else, which is fine, but call it what it is.
Who needs performance marketing?
Anyone doing marketing with a specific business goal and a budget to achieve it. That’s the prerequisite. If you have neither of those things, measurement becomes academic. You’re tracking numbers for the sake of tracking numbers, which is theatre, mostly.
Small teams with tight budgets need it more than anyone. When every pound matters, you can’t afford campaigns that “build awareness” without a mechanism to know whether awareness is actually built. Large teams with big budgets need it too, but they can survive longer without it. They have the luxury of waste.
The moment someone asks, “Is this working?” is the moment you’re in performance marketing territory. Before that question gets asked, you’re operating on faith, taste, or tradition. Those are all fine ways to make decisions, but they’re not performance.
How performance marketing works
You define an outcome. You do marketing activity intended to produce that outcome. You check whether the outcome happened. If it did, you do more of the thing. If it didn’t, you stop doing the activity or change the tactics until it produces the outcome. You iterate and get better.
This is not complicated. The difficulty is in the execution and the iteration.
Defining the outcome requires honesty about what you’re actually trying to accomplish. “Engagement” isn’t an outcome. “Leads” isn’t an outcome unless leads turn into revenue at a known rate. “Awareness” isn’t an outcome unless you can show that awareness produces something the business values.
Rand Fishkin points out that attribution modelling (the thing most marketers think is performance measurement) is effectively dead due to privacy changes and platform restrictions. What remains is simpler: did the thing you wanted to happen actually happen after you spent money trying to make it happen? The simplest answer is usually the best one, and it’s usually a yes or a no.
The measurement part is where most efforts break down. Not because measurement is hard; it’s easier than ever. Measuring the right thing requires deciding what the right thing is, and that’s uncomfortable. It means admitting that some activities don’t matter, and they’re usually the fun ones. It means saying no to campaigns that sound good but produce nothing measurable.
Then comes the action phase. This is where performance marketing separates from performance theatre. If the data says something isn’t working, you stop doing it. If the data says something is working, you do more of it. Most teams skip this step. They measure everything, report on everything, and change nothing.
When to start measuring marketing performance
Day one. Not because you’ll have meaningful data on day one (you won’t), but because the discipline of measurement changes behaviour immediately. When you know you’ll have to report on whether something worked, you think harder about whether it will work before you start.
Avinash Kaushik’s Digital Marketing and Measurement Model puts business objectives first. The first question is what the business is trying to accomplish, not what marketing is trying to accomplish. Only once that’s answered do you ask how marketing can help.
That question forces clarity. If you can’t answer it, you’re not ready to measure performance because you don’t know what performance looks like. You’re just collecting numbers.
Early measurement is also when you establish what normal looks like. You need a baseline. Without a baseline, every number exists in a vacuum. A 2% conversion rate means nothing if you don’t know whether 2% is good, bad, or exactly what you should expect given your product, price, and audience.
The mistake is waiting until you have “enough data” to start measuring. You never have enough data. You always wish you’d started tracking things earlier. Start now. Track the basics. Add sophistication later when you’ve earned it through volume and repeatability.
What to track
Revenue, where possible. If your marketing can be tied to revenue, tie it to revenue. Everything else is a proxy for revenue, and proxies are useful only when the direct thing can’t be measured.
Sales leads, for example, are only a useful proxy if you can show they turn into revenue at a predictable rate. Lead volume without lead quality is vanity. If you’re generating 500 leads a month but closing two deals, the problem isn’t lead volume. And for the record, quality is subjective too.
Cost per outcome. Not cost per click, unless a click is the outcome. Not cost per impression, unless impressions are what you’re buying. Cost per the thing that matters. If you’re spending £5,000 a month and generating ten sales, your cost per sale is £500. That number tells you whether the channel is viable.
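To make that arithmetic concrete, here’s a minimal sketch using the numbers above. The figures and the function are illustrative only; swap in your own spend and whatever outcome you’re actually buying.

    # Cost per outcome: total spend divided by the outcome you actually care about.
    # Figures reuse the example above (illustrative only).
    def cost_per_outcome(spend: float, outcomes: int) -> float:
        return spend / outcomes

    monthly_spend = 5_000   # £ spent on the channel this month
    sales = 10              # the outcome that matters, not clicks or impressions
    print(cost_per_outcome(monthly_spend, sales))  # 500.0, i.e. £500 per sale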
Time to outcome. How long does it take to produce the desired result? This matters more than most people think. A channel that converts at 5% but takes six months to close might be worse than a channel that converts at 2% but closes in a week, depending on your cash flow and patience.
Channel contribution. Which channels are present in the journey to conversion? Fishkin argues that provable attribution is largely impossible and that most companies would do better by trusting their gut and measuring lift rather than trying to attribute every conversion to a specific touchpoint.
That’s correct. Attribution models exist to make paid channels appear better than they are. What you want to know is simpler: when we turn this channel on, do we see more conversions? When we turn it off, do we see fewer? That’s contribution. Or, more bluntly: are we spending more than we’re making over 30, 60, or 90 days?
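If you want to put a number on that on/off check, something as simple as the sketch below will do, assuming you can pull daily conversion counts for the period before and after the change. It’s a rough comparison, not a causal model, and the figures are placeholders.

    # Rough contribution check: average daily conversions before vs. after
    # turning a channel on. Seasonality and other launches will muddy this,
    # but it answers the basic question without any attribution model.
    def lift(before: list[int], after: list[int]) -> float:
        base = sum(before) / len(before)
        new = sum(after) / len(after)
        return (new - base) / base

    before = [12, 9, 11, 10, 13, 8, 12]   # daily conversions, channel off (placeholder)
    after = [15, 14, 12, 16, 13, 15, 14]  # daily conversions, channel on (placeholder)
    print(f"{lift(before, after):+.0%}")  # +32% in this made-up week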
The tools you actually need
A spreadsheet, to start. Before you buy anything, track everything in a spreadsheet for 90 days. Manual work clarifies thinking. You’ll discover that half the things you thought you needed to track don’t matter, and the things that do matter are simpler to measure than you imagined.
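When the manual totals start taking too long, a few lines over the same data are enough. The column names below are an assumption about how you might lay the sheet out, not a required format.

    # Summarise a simple tracking sheet exported as CSV.
    # Assumed columns: month, channel, spend, sales, revenue.
    import csv
    from collections import defaultdict

    totals = defaultdict(lambda: {"spend": 0.0, "sales": 0, "revenue": 0.0})
    with open("marketing_tracker.csv", newline="") as f:
        for row in csv.DictReader(f):
            t = totals[row["channel"]]
            t["spend"] += float(row["spend"])
            t["sales"] += int(row["sales"])
            t["revenue"] += float(row["revenue"])

    for channel, t in totals.items():
        cost_per_sale = t["spend"] / t["sales"] if t["sales"] else float("inf")
        return_ratio = t["revenue"] / t["spend"] if t["spend"] else 0.0
        print(f"{channel}: cost per sale £{cost_per_sale:,.0f}, return {return_ratio:.1f}x")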
Analytics for your website. Google Analytics will do. It’s free, it’s comprehensive, and most companies only use 5% of its capabilities. If you’re not using 90% of what GA offers, you don’t need a more sophisticated tool. You need to use the tool you have.
Ad platform dashboards. If you’re running paid media, the platform gives you a dashboard. Use it. Don’t pay for a third-party dashboard until you’ve maxed out what the platform provides and its limits are genuinely getting in your way. Most teams never get there.
A CRM, used with discipline. The CRM isn’t about the software; it’s about the habit of logging every interaction so you can look back and see what worked. If your sales team doesn’t log activity, the CRM is useless, regardless of which one you bought. Use a Google Sheet instead.
That’s it. That’s the stack. Everything else is optimisation for teams who’ve already mastered the basics. If you’re not measuring religiously with the tools above, buying more tools won’t help. It’ll just give you more dashboards to ignore.
Common tracking mistakes
Tracking everything and acting on nothing. Measurement without action is pointless. The fix isn’t better tracking; it’s using the tracking you have to make different decisions.
Confusing activity with outcome. Blog posts published, emails sent, and ads run are activities. They’re not outcomes. Outcomes are the things that happen because of the activities: traffic, leads, sales, retention. If your performance report is a list of things you did rather than things that happened, you’re reporting on the wrong metrics. Of course, someone’s personal development KPIs might include producing more output, but that’s not the same thing.
Optimising too early. You need volume before optimisation matters. If you’re running one ad to 50 people, split testing creative is pointless. The sample size is too small. You’re optimising noise. One practitioner observed that testing a campaign for just a day or two leads to premature conclusions. Wait until you have enough data that patterns emerge, then optimise the patterns.
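To see why 50 people is noise, put rough error bars on the conversion rate. The sketch below uses a normal approximation, which is crude but makes the point; the 4% rate is just an assumed example.

    # Approximate 95% margin of error on a conversion rate.
    # With tiny samples the interval swallows any realistic difference
    # between two ad variants, so the "winner" is mostly luck.
    from math import sqrt

    def margin_of_error(conversions: int, visitors: int) -> float:
        p = conversions / visitors
        return 1.96 * sqrt(p * (1 - p) / visitors)  # normal approximation

    for visitors in (50, 500, 5000):
        conversions = round(visitors * 0.04)        # assume a true 4% rate
        moe = margin_of_error(conversions, visitors)
        print(f"{visitors} visitors: 4% ± {moe:.1%}")

At 50 visitors the error bar is wider than the conversion rate itself; only at a few thousand does it get tight enough to compare variants with a straight face.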
Blaming the measurement when the marketing doesn’t work. The data didn’t lie to you. The campaign just didn’t work. This happens more often than anyone wants to admit. Sometimes the product isn’t good enough. Sometimes the price is wrong. Sometimes the audience doesn’t want what you’re selling. Measurement reveals this. It doesn’t fix it.
Setting goals you can’t influence. If your goal is “increase brand awareness,” but you have no way to move awareness through your available channels and budget, the goal is aspirational, not operational. Pick a goal you can actually affect with the resources you have. If that means the goal is smaller and less impressive, so be it. A small goal you hit is better than a large goal you miss while pretending you’re “building towards it.”
Performance and brand
They’re not opposed. They’re sequential. Brand creates the conditions that make performance easier. Strong brands convert better, retain customers longer, and acquire them more cheaply. But brand work that never leads to performance is just expensive art.
Attribution is simultaneously every marketer’s dream and nightmare, with platforms like Google and Meta often attributing success to themselves rather than revealing true performance drivers. The solution isn’t to stop measuring brand work. It’s to measure brand work the way you’d measure anything else: did the number you said you cared about move after you did the thing?
If you run brand campaigns, measure brand lift. Survey awareness before and after. Track branded search volume. Monitor direct traffic. Watch retention rates. These are all proxies, yes, but they’re measurable proxies. You can plot them over time. You can correlate them with activity.
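One way to do the plotting and correlating is sketched below: monthly brand spend against monthly branded search volume. The data is made up and correlation is not attribution, but a proxy that moves with activity is exactly the kind of signal this paragraph is asking for.

    # Correlate monthly brand spend with branded search volume (placeholder data).
    from statistics import correlation  # Python 3.10+

    brand_spend = [0, 0, 4000, 6000, 6000, 2000]          # £ per month
    branded_searches = [810, 790, 980, 1150, 1190, 950]   # searches per month
    print(f"spend vs. branded search: r = {correlation(brand_spend, branded_searches):.2f}")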
The mistake is treating the brand as immeasurable and therefore exempt from accountability. Everything is measurable if you’re willing to define what success looks like in advance. If you can’t define success, you’re not doing brand marketing. You’re just making things and hoping someone notices.
What this looks like in practice
A company decides to grow revenue by 20% this year. They have £50,000 in marketing budget. They currently get 60% of revenue from organic search, 25% from direct traffic, and 15% from paid ads. The cost to acquire a customer through paid ads is £180. The average customer value is £450. That’s a 2.5x return, which is acceptable but not great.
They decide to test LinkedIn ads because their ideal customers are on LinkedIn. They allocate £5,000 to a 60-day test. The success metric is cost per lead under £50: leads from LinkedIn close at roughly 15%, so £50 per lead works out to about £330 per customer. That’s pricier than their existing £180 CAC, but still comfortably under the £450 average customer value, so the test is worth running.
Day 30: cost per lead is £75. Not good enough. They review the creative, the targeting, and the landing page. Targeting is too broad. They narrow it. Creative is fine. The landing page has a 28% conversion rate, which is acceptable. The problem is cost per click is too high, which means they’re not reaching the right people.
Day 60: cost per lead is £52. Close, but still not hitting the target. They have two choices: kill the channel, or bring the effective cost per customer down by improving the close rate. They look at the leads. The quality is good. Three have closed already, which is faster than typical. They decide to run another 30 days with refined targeting.
Day 90: cost per lead is £48. They’re in. They can scale spend to £10,000/month and watch the metrics for another 60 days to make sure it wasn’t a fluke. If it holds, LinkedIn becomes a permanent channel.
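The decision logic in that story fits in a few lines. The cost-per-lead figures are the ones above; the implied cost per customer is just cost per lead divided by the 15% close rate, and the verdict is nothing more than the threshold agreed before the test started.

    # The LinkedIn test, reduced to its decision rule (figures from the example above).
    def review(cost_per_lead: float, close_rate: float,
               customer_value: float, target_cpl: float) -> str:
        cac = cost_per_lead / close_rate      # cost to land one customer
        value_to_cac = customer_value / cac
        verdict = "scale" if cost_per_lead <= target_cpl else "fix or kill"
        return (f"CPL £{cost_per_lead:.0f}, implied CAC £{cac:.0f}, "
                f"return {value_to_cac:.1f}x -> {verdict}")

    for day, cpl in [(30, 75), (60, 52), (90, 48)]:
        print(f"day {day}: {review(cpl, close_rate=0.15, customer_value=450, target_cpl=50)}")

Days 30 and 60 come out as “fix or kill”; day 90 crosses the threshold, which is exactly the call the team made.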
That’s performance marketing. Clear goal, defined success metric, test, decision criteria established in advance, and action taken based on results. No dashboards with 47 metrics. No reporting decks with insights that lead to no decisions. Just: did it work, yes or no, and what are we doing about it?
The only test
If you had to cut your marketing budget in half tomorrow, do you know which half to cut? If the answer is yes, you’re doing performance marketing. If the answer is no, or if the answer is “I’d cut the half that looks least impressive in the board deck,” you’re not.
Performance marketing shows value. Not data that shows the team is busy. Data that shows marketing contributed to an outcome the business cares about, and that the contribution was worth more than it cost.
Most marketing fails this test. That’s fine. Not everything needs to pass. But if nothing you’re doing can pass this test, you’re not in performance marketing. You’re in communications, or brand, or creative, which are all real jobs with real value. They’re just not performance. Did the activity contribute to the outcomes you wanted? If yes, keep doing it. If not, stop. If the answer is partial, keep at it and improve it incrementally. That’s called growth.