Attribution Part 2
Now that we've covered the hard, technical parts of marketing attribution, we can have a frank conversation about common marketing ignorance, and how to become enlightened.
Howdy friends. Happy March! Spring is here. The last month has felt like a journey. The weather out east has been treacherous. Snow became something called "icecrete," which led to too many days below 20°F, which led to a wet trip to CA.
Mentally and physically, February was exhausting. But this last week, it all turned a corner. More sleep, a stable schedule, sunshine, a long run in warmth, time to think, read, process, and build.
It doesn't take much to cure the heart.
Today, we are going to go deeper into attribution. If you need a reminder of our journey so far:
Attribution Part I: The Basics
Attribution Part 1.5: Web Attribution
→ Attribution Part 2: Ignorance to Enlightenment
→ Attribution Part 3: Advanced Methods
… but before we do that, I wanted to share two questions I got from a reader last week:
Question #1: I understand this is a personal problem, so I don't mind no response, but do you have any resources where you learned this technical engineering side with CC/LLMs? I'm talking step-by-step basics that you recommend.
Answer #1: Yes! Here is what I would do: 1) Go take the command line crash course, 2) go look up and teach yourself basic HTML and JavaScript. Python would also be fine, but frankly, I think you'll get a lot more value out of the language of the web. This site is great ("Learn X the hard way"). Blaze through it. The goal is learning how things work, not how to write code.
After those two things, go look up Claude Code - it will then make a lot more sense.
Question #2: I have the opposite problem, where the creative and strategic side thrives. Once it's time to get down to technical, I feel like my brain just shuts off. But I think you're right, and I am TERRIFIED to be left behind in this new marketing order.
Answer #2: You're right to be scared. Not trying to sugarcoat it, but I think lots of people can be creative with enough time and space. You have perhaps the advantage of knowing how to pull that out of yourself. Engineering and technical folks will learn how to be creative, but I think that quality exists in more humans than we'd like to admit. Non-technical folks need to learn how to get more technical. There are diminishing returns, though - you're not going to become a software engineer, nor is that needed. The CEO of Anthropic predicts software creation will be dead as a skill in less than a few years. So don't become a software engineer. Become data and engineering fluent.
If you have questions, just respond to me in this email or on LinkedIn! Whatever floats your boat. I'll leave you a nifty button here in case you feel so compelled to use Substack directly.
Okay, a last note. One of you amazing fans DM'd me on LinkedIn, and I accidentally deleted your request. I think your name was Eduardo, and I think this was your LinkedIn profile pic. If this is you, and you did in fact reach out to me, I'm really sorry. Ping me again by responding to this email, and we can connect.
Let's kick off with a reminder of where we've been and where we are going.
In Part II (today), we'll cover:
What data can you get from mobile attribution today
How people talk about attribution incorrectly
Telemetry Bias
Type 1 and Type 2 Marketing Errors
What problems are you really trying to solve?
What attribution data can tell you and what it canât
Then in Part III, we'll cover the most fun part - what can you do about it? Methods for making sense of attribution data to answer business problems.
HDYHAU
First, last, and multi-touch models
MMM
Incrementality
Probabilistic modeling
Let's dive in!
What data are you actually getting?
Everyone thinks they are getting to one ultimate truth in marketing attribution. In fact, everyone wants some version of a chart like this:
Why? Most people think of marketing as a pure math problem. If I spend money on channels, it must be driving the conversions I'm seeing. Every dollar spent leads to X conversions.
There are many problems with this that we'll explore today, but the first is that you really don't get enough data to be this scientific. As we discussed in Attribution Part 1, on iOS, you get the IP address and the IDFA if the user agrees. On Android, you'll get UTMs, referrer, IP, and GAID. And as we discussed in Attribution Part 1.5, on the web, you know the UTMs of an incoming link, the referrer, and the IP address.
When you consider that browsers block 30-40% of traffic through URL stripping, ad blockers, and SDK detection, and that just 15% of people on iOS typically opt in to ATT, even if you implement all your "attribution telemetry" perfectly, you're missing more than half the story if your customer base is split evenly between web and a native app. At best, you're missing 30% of the data if you're web-only.
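To make that gap concrete, here is a rough back-of-envelope sketch in Python. The rates are the approximate figures cited above (30-40% of web traffic blocked, ~15% ATT opt-in); they are illustrative assumptions, not measured constants.

```python
# Rough estimate of the share of conversions you can deterministically
# attribute. The rates below are the approximate figures cited in the
# text, not measured constants.

WEB_BLOCK_RATE = 0.35  # ~30-40% of web traffic stripped or blocked
ATT_OPT_IN = 0.15      # ~15% of iOS users opt in to ATT (IDFA)

def measurable_share(web_fraction: float) -> float:
    """Fraction of conversions with usable attribution telemetry,
    given the fraction of your audience arriving via the web."""
    web_coverage = 1 - WEB_BLOCK_RATE   # UTMs/referrer survive blocking
    ios_coverage = ATT_OPT_IN           # deterministic IDs need opt-in
    return web_fraction * web_coverage + (1 - web_fraction) * ios_coverage

# A 50/50 web + native iOS audience: you see well under half the story.
print(f"{measurable_share(0.5):.0%}")  # -> 40%
```

With a web-only audience (`measurable_share(1.0)`), the same sketch gives roughly 60-70% coverage, consistent with the "at best" claim above.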
All of these things can be used to link an intent (somebody seeing an ad or clicking a link). But intent doesn't always translate to meaning. How many times have you clicked on a link after you saw an ad elsewhere? How many times have you googled the name of a very specific brand that you've already bought and love, and just happened to click the paid ad link at the top because it's first?
Does somebody clicking on a branded link for your website mean that Google is the thing that "drove them"? Would they have clicked on your link anyway if they were already a paid customer?
We are teasing at the more problematic part of this situation, which is what you cannot measure. You can't reliably and deterministically measure how many times somebody stared at your subway poster in the MTA, you can't measure the stunt you pulled off in the NYT, you can't measure how - over 10 years - people have been repeating your slogan in their heads because they've heard it in all your placements. You can't measure how a TV commercial evoked an emotional response 6 months ago.
The problem with marketing is that ads have led us to believe it is a purely scientific endeavor, when in fact it's much more complicated and nuanced than that.
If you can only deterministically measure roughly 50%, couldn't you argue that you might as well flip a coin?
Incorrect ways of talking (and thinking) about attribution
This leads me to something I've wanted to get off my chest for a while: how people talk quite incorrectly about attribution. It's happened at nearly every company I've served in the last 10 years. Every founder - and sadly many, many CMOs - commits the same mistakes and perpetuates their own ignorance. What do they do, exactly?
They believe marketing "attribution" is just a manifestation of that chart above. That it's all about which channels are driving success.
They believe that if they see this chart, they will then know where to "double down".
They believe attribution is deterministic. That everything can be tracked.
They believe that you can have certainty if you just wave a magic wand and instrument telemetry correctly.
In the best of cases, this way of thinking, talking, and acting on attribution data can waste time and point people in the wrong direction. In the worst cases, it can distract a team for months and leave them wandering the plains of the marketing world without a compass.
You see, marketing attribution is part science and part judgment. It's like navigating the open ocean 100 years ago, guided by the stars. There is science to knowing where you are and what the stars are telling you, but there are things you can't see or predict deterministically.
Telemetry bias
Let's talk about some of those things. The first, and biggest, is telemetry bias.
In a nutshell, telemetry bias is bias created by your valiant efforts to measure things. If you don't measure everything (very likely), you'll only know the results of what you do measure. But what happens if you fail to measure an important signal? Then your data is telling you the wrong story. You might as well have flipped a coin.
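A tiny, hypothetical sketch of how this plays out: suppose word of mouth actually drives half your conversions but emits no telemetry at all. The instrumented channels quietly absorb all the credit. The numbers here are made up purely for illustration.

```python
# Hypothetical telemetry bias: word of mouth drives half of all
# conversions but emits no telemetry, so instrumented channels
# absorb 100% of the reported credit. Numbers are illustrative.

true_conversions = {"google": 300, "linkedin": 200, "word_of_mouth": 500}

# Your dashboard only sees channels that fire telemetry.
captured = {ch: n for ch, n in true_conversions.items()
            if ch != "word_of_mouth"}

total = sum(captured.values())
reported_share = {ch: n / total for ch, n in captured.items()}

# Google reports as 60% of conversions; its true share is 30%.
print(reported_share)
```

The dashboard is internally consistent and completely wrong: nothing in the reported numbers hints that half the story is missing.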
Countless times, I've worked with a company to audit their tech stack and the work of their growth team. Pretty early on, we find broken or inaccurate data capture. Yet their program was still working well, largely because of luck, intuition, and the VC-driven mob mentality to spend more to drive growth.

The point is, even if you're the best company in the world, with the best engineers, which everyone in Silicon Valley seems to think they have, you will miss something. Your telemetry data will be incomplete, incorrect, or just plain wrong at some stage.
But because we don't know how to account for this in our understanding of attribution, we simply ignore it.
Type 1 and Type 2 marketing errors
Many people are familiar with type 1 and type 2 errors. These can be similarly applied to marketing.
A type 1 error ("false positive") in statistical hypothesis testing happens when you reject the null hypothesis even though it's true.
A type 2 error ("false negative") happens in the opposite scenario: you fail to reject the null hypothesis even though it's false.
Because of telemetry bias and a failure to acknowledge the role of the unexpected, unplanned, and unknown, people often commit marketing type 1 and type 2 errors.
Let's expand this with an example. A founder sees the beautiful attribution chart above and sees that Google is driving all their conversions. Google must be a good channel. But in reality, they are just bidding on branded keywords that users already know, are searching for, and clicking on.
The founder thinks Google is driving results for them. But in reality, something else is. Maybe the LinkedIn post, word of mouth, or the billboard. False positive.
Now, let's think of the opposite.
A CMO wants to run a brand campaign, but they know that brand campaigns are notoriously hard to measure. Last year, they ran one, and since they didn't see an immediate uptick in conversions, they felt it was a failure. They also didn't ask anyone at checkout how they heard about the brand. If they had, they would have known that it was word of mouth and out-of-home placements. False negative.
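The founder's false positive is easy to simulate. Assume, hypothetically, that every buyer was actually driven by word of mouth, but 70% of them enter the site through the paid branded ad simply because it sits at the top of the results page:

```python
import random

random.seed(42)

N = 10_000                # simulated buyers, all driven by word of mouth
P_CLICK_BRANDED_AD = 0.7  # chance a decided buyer enters via the paid ad

# Each buyer's last touch is the paid ad (if they clicked it) or direct.
last_clicks = ["google_paid" if random.random() < P_CLICK_BRANDED_AD
               else "direct" for _ in range(N)]

credited = last_clicks.count("google_paid") / N
print(f"Last-click credits Google with {credited:.0%} of conversions "
      "it actually drove 0% of.")
```

The last-click report will show roughly 70% of revenue "driven by" a channel that, in this toy world, caused nothing.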
These situations happen all the time in marketing.
Going back to the basics: What problems are you solving?
One of my frequently stated lines is: "What problems are you trying to solve?" Especially in marketing technology. Because people so quickly want to jump to solutions and tools. They think that buying something will magically solve their problems and make them better.
But in reality, tools are just meant to solve problems.
In marketing, attribution also has a purpose. Its purpose is to help you understand what is working and what's not. But it isn't a panacea. And like most tools, it has to be wielded correctly with context.
Thatâs why I developed this Problem / Solution matrix for marketing attribution.
This lays out the problems companies should strive to solve at various company sizes and stages, and how they should consider marketing attribution as a salve.
What attribution data can tell you, what it can't, and what you should do about it
In the early stages of a company with fewer channels, low volume, and good telemetry, it can be very precise in confirming what's working
It canât give you a math equation for success
In some cases, with the right telemetry, it can tell you what people did. Did they see the ad? Did they click the link?
But activity does not always mean causation. People clicking links does not necessarily mean "this is the reason they bought".
Correlation doesnât mean causation.
In later stages, advanced methods can help you build confidence in channels and programs
One way I like to think of it is like rays of different wavelengths shooting at different stacked filters. Different surface areas to capture traffic. People are drawn to these surface areas randomly or on purpose. When it's on purpose, that means deterministic attribution is at play.
When it's random (or seemingly random), you might not understand or capture what caused the purchasing behavior, but you did capture their final entry point to the funnel.
What you should do about it …
Instrument telemetry quickly and easily. DM me, use my Marketing Telemetry Bot, or read my other articles and give it to Claude Code.
Don't rely on instrumentation alone. Use gut, instinct, and basic survey data (an HDYHAU, which we'll cover in Part 3!)
Learn to develop hypotheses and stories about what is happening in marketing, and then use data to try to prove or disprove those hypotheses, rather than relying solely on what you measure to tell you whatâs happening.
If you got this far - especially after these long Attribution reads - you must be really committed to marketing. I appreciate you, and while it's still possible you might be Gary, the claw bot I built in my Mac mini, I am still grateful. These articles are a labor of love, and it's scary to put things out there. Appreciate all the encouragement, kudos, and shares. If you were looking for a way to be helpful, the #1 thing would be to like this or other articles you enjoy and share them with friends.
Additional Reading
Every week, I share some things I've been reading. Please share your recommendations!
The 2028 Global Intelligence Crisis by Citrini & Alah Shah ⚠️ must read ⚠️
The missing variable in most content strategies by Carilu Dietrich
A new look at free to paid conversion by Kyle Poyar
Subscribe to The Growth Stack Mafia
Become a paying subscriber of the mafia to get access to this post and other subscriber-only content.
A subscription gets you:
All my content, access to the archive
2 x technical marketing briefs per month, where we explore fundamentals
1 x founder support article per month, which is a good read on how to be a better founder and operator
Interviews and chats with heroic founders, leaders, and operators
Dope Memes, 90s nostalgia
Dog photos
Always learning. Thank you!
Oh, and I got another dawg - that makes THREE! Paw Attribution…