When We Killed Our Top Channel: A Growth Team Post-Mortem
Two years into my first real growth role at a San Francisco SaaS company, I killed our top acquisition channel. On purpose. The channel was producing roughly 38% of our new signups. On the Friday we shut it down, it had a week-over-week CAC that looked perfectly healthy and a volume trend that was up and to the right.
Six people in the room thought I was making a career-ending decision. Two of them told me so directly. One of them sent me a very polite email that I have kept in a folder ever since.
I am writing this because almost every post-mortem I read is about channels that failed. This is the opposite story: a channel that was “working” and still needed to die. I believe the decision was correct. I also made three mistakes in how I killed it that cost the team real money and trust. This is the honest version of both.
The Channel
For confidentiality reasons I will describe it in generic terms. It was a partnership with a large aggregator site — the kind of domain that ranks for thousands of commercial queries and points “recommended vendors” to partners like us. We paid a monthly placement fee, plus a revenue share on any signup that could be attributed to the aggregator’s URL parameters.
On the dashboard, it looked great:
- 38% of new signups in a typical month
- Blended CAC about 20% below our other paid channels
- Signup conversion rate roughly 2.3x our organic search average
If you stopped there — and most of our exec team did — you would have called this the crown jewel of our acquisition mix.
What I Saw That Made Me Look Harder
The thing that stopped me was a product metric, not a marketing one. Users who came through the aggregator had a 30-day retention rate of about 18%. Users from direct, organic search, and content had retention rates between 52% and 71%.
I had seen numbers like that before — segment differences in retention happen. But 18% is not a segment difference. 18% means the users were fundamentally different. They were not our customers. They were somebody’s customers briefly passing through our funnel.
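The segmentation behind that comparison is simple to sketch. Here is a minimal Python version; the field names (`source`, `active_on_day_30`) and the toy rows are illustrative, not our actual event schema or numbers:

```python
def retention_by_source(users):
    """Fraction of each source's signups still active on day 30."""
    counts, retained = {}, {}
    for u in users:
        src = u["source"]
        counts[src] = counts.get(src, 0) + 1
        if u["active_on_day_30"]:
            retained[src] = retained.get(src, 0) + 1
    return {s: retained.get(s, 0) / counts[s] for s in counts}

# Toy cohort: aggregator-sourced users mostly gone by day 30,
# organic users mostly still around.
users = [
    {"source": "aggregator", "active_on_day_30": False},
    {"source": "aggregator", "active_on_day_30": False},
    {"source": "aggregator", "active_on_day_30": False},
    {"source": "aggregator", "active_on_day_30": False},
    {"source": "aggregator", "active_on_day_30": True},
    {"source": "organic", "active_on_day_30": True},
    {"source": "organic", "active_on_day_30": True},
    {"source": "organic", "active_on_day_30": False},
]
print(retention_by_source(users))
```

The whole point of the check is the spread between sources, not any single number; anything this lopsided deserves a closer look.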

I pulled two more data sets before deciding.
Support tickets. Users from the aggregator opened 4.1x more tickets per user in their first 14 days. The tickets were largely “I do not understand what this product does” and “how do I cancel.” These were not frustrated customers. These were confused visitors who had been marketed to like they were prospects.
Revenue attribution, properly windowed. When I matched first-touch to first-paid-invoice with a 60-day window instead of our default 30-day one, the aggregator's share fell to about 12% of revenue, far below its 38% share of signups. The gap between signup attribution and revenue attribution told the real story. People were signing up because the aggregator put us in front of them with inflated claims, and they were leaving when the product did not match the claim.
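A toy version of that windowing comparison looks like this. The data and field names are invented to mirror the pattern, not the real figures: aggregator users either convert fast or never, organic users convert slowly and stick, so widening the window shrinks the aggregator's apparent share.

```python
def revenue_share_by_source(users, window_days):
    """Share of attributed revenue per source, counting only users whose
    first paid invoice lands within `window_days` of first touch."""
    totals = {}
    for u in users:
        paid_day = u["first_paid_day"]  # None = never converted
        if paid_day is not None and paid_day <= window_days:
            totals[u["source"]] = totals.get(u["source"], 0) + u["revenue"]
    grand = sum(totals.values()) or 1
    return {src: amt / grand for src, amt in totals.items()}

# Hypothetical rows, shaped to show the effect.
users = [
    {"source": "aggregator", "first_paid_day": 10, "revenue": 50},
    {"source": "aggregator", "first_paid_day": None, "revenue": 0},
    {"source": "organic", "first_paid_day": 40, "revenue": 200},
    {"source": "organic", "first_paid_day": 55, "revenue": 200},
]
print(revenue_share_by_source(users, 30))  # aggregator dominates
print(revenue_share_by_source(users, 60))  # aggregator share collapses
```

Running the same share calculation at two windows and watching a channel's share move this much is the tell: the default window was flattering the fast-converting, fast-churning source.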
By the time I had all three data sets on one screen, the decision was obvious. We were paying to pollute our funnel with users who would churn quickly, overload support, and make every downstream metric look worse.
The Internal Case
Getting buy-in was harder than seeing the problem. The CEO liked the signup number. The CFO liked the CAC number. The sales team liked that aggregator-sourced leads sometimes converted to paid, even if they churned fast.
I built a one-page memo with three charts.
- Blended LTV by source. Aggregator users had LTVs roughly 35% of our average. So “CAC 20% below blended” was a number we were comparing to the wrong denominator.
- Net revenue contribution. After cost, after expected churn, after support cost, the channel’s real contribution was roughly 4% of net new revenue — not 38% of signups.
- Counterfactual signup volume. Based on organic demand and waitlist overflow, we estimated 60–70% of the aggregator’s best-retaining users would have found us through other channels anyway within 90 days. The channel was less incremental than it looked.
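The second chart in that memo was back-of-envelope arithmetic, and it is worth showing how little machinery it takes. Every number and parameter name below is invented for illustration; the model is deliberately crude (no discounting, flat support cost per signup):

```python
def channel_net_contribution(signups, paid_rate, ltv, cost_per_signup,
                             support_cost_per_signup):
    """Expected lifetime revenue of a signup cohort, minus acquisition
    and support costs for the whole cohort."""
    revenue = signups * paid_rate * ltv
    costs = signups * (cost_per_signup + support_cost_per_signup)
    return revenue - costs

# A channel can dominate signup counts yet contribute little net revenue
# once low LTV and heavy support load are priced in.
aggregator = channel_net_contribution(1000, 0.05, 300, 9, 4)
organic = channel_net_contribution(600, 0.08, 850, 0, 1)
print(aggregator, organic)
print(aggregator / (aggregator + organic))  # small net share, big signup share
```

With these made-up inputs the aggregator supplies the majority of signups but a single-digit share of net contribution, which is the shape of the argument the memo made.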
The meeting took 40 minutes. The CEO signed off. I was wrong about two things in that memo. Both would cost us.
Mistake One: I Underestimated the Short-Term Dashboard Hole
The day after we killed the channel, our top-of-funnel dashboard looked like someone had sabotaged it. Signups dropped 34% on the first Monday. The week-over-week trend for the first three weeks was ugly enough that three people — the same three who had warned me — came to ask if we should turn it back on.
I knew the drop was coming. I had not prepared the organization to see it and stay calm. I should have written a one-pager that said “here is what the next 6 weeks will look like, here are the metrics I expect to see degrade, here is the metric that will tell us whether the decision was right.” Without that prebrief, every week of the dashboard looked like a crisis.
Lesson: if you are going to cause a visible short-term regression, pre-commit the organization to a recovery curve before you pull the trigger. Announcing “I will own the consequences” is not enough. People will still get nervous. Decision fatigue will creep in.
Mistake Two: I Did Not Communicate With the Sales Team Early Enough
Sales was tracking a pipeline that partly depended on aggregator-sourced leads. When we killed the channel, their top-of-funnel collapsed and nobody on sales had been warned. I heard about it through a Slack message with seven frowny-face emojis and the phrase “we need to talk.”
They were not wrong. I had made a decision that affected their quota without giving them lead time. A two-week heads-up and a joint plan for how to rebuild their pipeline from other sources would have cost me nothing and saved a month of tension.
Lesson: when you make a marketing decision that changes the shape of someone else’s team’s work, you owe them the conversation before the change, not after. Even when you are sure, even when you are right, even when the data is on your side. The political cost of skipping that conversation is higher than the time it would have taken.

Mistake Three: I Did Not Set Up a Proper Before/After Measurement
This one bothers me the most. I killed the channel, waited, watched, and within 10 weeks our retention curve had improved materially. Our CAC-to-LTV ratio looked better. Support ticket volume dropped. Every lagging metric moved in the right direction.
But I did not have a clean counterfactual. The company was growing. Seasonality was in our favor. A product release that quarter was strong. How much of the improvement was the channel kill, and how much was the other stuff? I cannot tell you, to this day, with confidence.
Lesson: before any major change, write down exactly which metrics you will look at, over which time window, and what values will count as “decision correct” versus “decision incorrect.” Pick the metrics before you see the data. Do not let yourself cherry-pick in retrospect. I cherry-picked in retrospect, and even though I believe the decision was right, my evidence is softer than it should be.
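Writing the plan down can be as lightweight as a checked-in file. A minimal sketch of what I should have committed before the kill; the metric names, baselines, and thresholds here are all illustrative:

```python
# A pre-registered measurement plan, written BEFORE the change ships.
# All names and numbers are hypothetical examples.
plan = {
    "change": "kill aggregator channel",
    "window_days": 90,
    "metrics": {
        "d30_retention": {"baseline": 0.41, "success_at_least": 0.50},
        "tickets_per_user_14d": {"baseline": 1.8, "success_at_most": 1.0},
        "net_new_revenue": {"baseline": 100_000, "success_at_least": 96_000},
    },
}

def evaluate(plan, observed):
    """Compare observed values against the pre-committed thresholds.
    Returns {metric_name: True/False} with no room for retrospective
    cherry-picking."""
    results = {}
    for name, spec in plan["metrics"].items():
        value = observed[name]
        if "success_at_least" in spec:
            results[name] = value >= spec["success_at_least"]
        else:
            results[name] = value <= spec["success_at_most"]
    return results

observed = {"d30_retention": 0.55, "tickets_per_user_14d": 0.9,
            "net_new_revenue": 97_000}
print(evaluate(plan, observed))
```

The value is not the code; it is that the thresholds exist in writing before the data does, so "decision correct" is defined in advance rather than argued afterward.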
What the Post-Mortem Meeting Looked Like
Six months after the kill, I ran a formal post-mortem. The format was simple.
Three sections.
- What went well. Our retention numbers were substantially healthier. Support cost per user was down materially. Our internal conversations about channel quality were permanently different.
- What went poorly. The three mistakes above. Plus: I had not sunset the aggregator partnership gracefully. The partner was embarrassed and a little angry, and the industry is small. That relationship was damaged in ways that cost us again later.
- What I would do differently. Pre-brief the organization with a recovery curve. Talk to sales two weeks before. Document the measurement plan in writing, before acting. Plan the partner exit with the same care I planned the internal decision.
Nobody in that post-mortem meeting was surprised by any of it. They had all felt it. But writing it down, owning it in front of the team, made it a lesson rather than a private regret. That is the whole point of running post-mortems on your own decisions. The bad ones teach you more than the good ones, and pretending otherwise is how you stop growing.
The Rule I Took Away
Any marketing channel that looks great on signup metrics and ugly on retention is not a marketing channel. It is a leak. You can fill it with budget forever and wonder why your product does not feel like it is working. The answer is that your marketing is bringing you the wrong people, very efficiently.
Killing channels that look healthy is the hardest thing to do in marketing. It is also, in my experience, the single highest-leverage decision most growth teams are avoiding. If you have a channel that looks great on signup count and terrible on 90-day retention, that channel is costing you more than it brings. Even if the dashboard disagrees. Especially if the dashboard disagrees.
Written by
Marcus Webb
Marketing strategist with 12+ years of experience. I test tools so you do not waste money on software that does not deliver.