Here is a mantra I want you to carry with you in your professional career: we are not data accountants.
I know that might sound like a strange thing to say, so let me start with something more personal. Have you ever sat in a meeting defending a number you knew was wrong because someone expected precision you couldn’t deliver? Have you felt that particular kind of stress that comes from numbers that won’t reconcile, and a low-level sense that somehow this is your fault?
If that hit a bit close to home, you are absolutely not alone. I have these conversations regularly, and I’ve seen what that pressure does to people who really do care about doing good work. The problem is not you! The problem is a way of thinking about analytics that was always going to fail you.
What is a “data accountant”? It’s a phrase I’ve been using for a while now. A data accountant is an analyst or marketer who has been trained to treat analytics as a reconciliation exercise: every session must be accounted for, every click must match up, every conversion number must be exact. The job becomes defending the numbers rather than understanding what the numbers mean.
How Marketers Became Data Accountants
At some point, the marketing industry got sold an idea: that good analytics means precise analytics. That rigour equals exactness. That if your numbers don’t add up perfectly, something is broken, and it’s probably your job to fix it.
I understand how this happened. When analytics tools first became widely adopted, the data felt clean. Reports matched up. Numbers reconciled. The illusion of completeness was easy to maintain because the gaps were not visible yet. What we didn’t fully appreciate at the time was that the data was never complete. It just used to be less obviously incomplete.
That illusion is gone now. Between ad blockers, cookie consent refusals, browser privacy protections, and the simple reality of how people move between devices throughout their day, you could be doing everything right technically and still be missing a significant chunk of what’s actually happening. That is not a configuration problem you can solve. That is the current reality of analytics, and it is not going to go back to the way it was.
And yet the expectation of precision remains. Organizations still evaluate analysts on whether the numbers match. Clients still ask why the GA4 sessions don’t reconcile with the ad platform numbers. Stakeholders still expect reports that account for every row. The cultural expectation calcified long before the technical reality made it impossible to meet, and we’ve been living in that gap ever since.
What I often see is analysts and marketers caught in a painful loop: one more tool, one more configuration change, one more attribution model, and finally the numbers will be right. I’ve seen talented people spend weeks chasing a discrepancy that, even if it were fully resolved, would not have changed a single recommendation they made. The promise that perfect data is just around the corner is one of the more expensive beliefs in our industry, because it keeps people focused on the wrong problem.
What the Data Accountant Mindset Can Cost You
The data accountant mindset is not just stressful; it’s also counterproductive, and you should be honest about what it’s actually costing you.
First: the goal is impossible. You will never have complete data, and incompleteness is not a problem you can engineer your way out of. This means the data accountant is in a position where failure is structurally guaranteed, regardless of how hard they work or how skilled they are.
Second, and this is the part I want you to sit with for a moment: even if you could achieve more precision, it often wouldn’t change the recommendation. Think about your last few reports. Would knowing you had exactly 12,476 sessions versus “approximately 12,000” have changed what you told your client or your team? In most cases, the answer is no.
(That being said, I want to be clear that there are contexts where precision does actually matter. A/B testing and conversion rate optimization work can sometimes hinge on small differences, and I’m not dismissing that. But most day-to-day analytics reporting is not CRO work, and it’s the default culture of counting and reconciling that I’m pushing back on here.)
Third, and this one comes from a reader who put it more clearly than I could: the data accountant mindset makes analytics feel scary, overwhelming, and intimidating to the people you’re reporting to. That is the opposite of what we want!
A reader named Kira Rodriguez wrote to me after the newsletter version of this piece went out. She’d been a data accountant in her previous role, and she described the experience that might feel familiar to you: the stress, the impossibility of the task, and the realization that “being extra precise didn’t actually meaningfully change our decisions.” Kira used the mindset shift in her job interviews and found it helped her come across as more senior, and more importantly, showed the people she was speaking with that analytics didn’t have to feel overwhelming.
Marketers have been pretending to be data accountants for years, and it’s a tough habit to break. But naming the habit is the first step to changing it.
From Counting to Reasoning
Let’s walk through what this shift from counting to reasoning would actually look like.
Instead of asking “how many sessions did we have this month,” we ask “is organic traffic trending up or down, and do we understand why?” Instead of obsessing over whether a conversion number is exactly right, we look at whether the ratio of effort to outcome is improving over time. We use percentages instead of raw counts when comparing across time periods or channels. We round large numbers. We say “approximately” when it’s appropriate. We focus on what the data is telling us, not whether every row reconciles.
Let me show you what this looks like as a before-and-after:
Data accountant version: “We had 4,312 sessions this month, which is down from 4,587 last month. Conversion rate was 2.34%.”
Reasoning version: “Organic traffic is down about 6% month over month. That’s consistent with the changes we’re seeing from increased presence in AI Mode and AI Overviews on Google Search, and it’s to be expected. Our conversion rate rose slightly because traffic decreased while conversion counts held steady. This tells us that the traffic we are getting is qualified.”
Same data, completely different conversation. The second version is doing something the first one isn’t: it’s making sense of the numbers rather than just reporting them. The person on the other side of that report doesn’t need to worry about whether 4,312 is the right number. They need to know what it means.
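If you want to bake this habit into your reporting workflow, the rounding and percentage framing can even be automated. Here is a small sketch (using the hypothetical session counts from the example above; the function names are my own, not from any particular library) that turns exact counts into the hedged, trend-focused language a reasoning-oriented report would use:

```python
def mom_change(current: float, previous: float) -> float:
    """Month-over-month change as a percentage (negative means a decline)."""
    return (current - previous) / previous * 100

def hedged_count(sessions: int) -> str:
    """Round to the nearest thousand and hedge, instead of reporting an exact count."""
    return f"approximately {round(sessions, -3):,}"

# Hypothetical numbers from the before-and-after example
change = mom_change(4312, 4587)
print(f"Organic traffic is down about {abs(round(change))}% month over month.")
print(f"Sessions this month: {hedged_count(4312)}.")
```

The point isn’t the code itself; it’s that a report template built this way makes the hedged, directional framing the default, so precision theater never makes it onto the page in the first place.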
This is also the framing I use when I think about the difference between data-driven and data-informed work. This “counting to reasoning” shift is the practical expression of that philosophy in your day-to-day reporting.
One thing I want to be clear about: reasoning from imperfect data requires more analytical discipline, not less. What we’re doing is understanding what the data is actually telling us, accounting for context, recognizing when a change is signal versus noise, and then communicating that clearly to people who aren’t in the data every day like we are. It’s harder work than reconciling rows, but it’s the work that we need to be doing.
Having This Conversation With Clients and Stakeholders
Just being aware of this mindset shift isn’t enough on its own. You also need language to bring the people around you along with you, and that can be where this gets tricky.
The issue is that your clients and stakeholders have been trained to expect precision as well. That isn’t something they made up on their own; they absorbed it from the same industry culture we all did. So when you show up with a different framing, you’re not just changing how you report. You’re asking them to update a mental model they’ve held for years, and that takes some patience!
The approach I’ve found that works is to lead with confidence rather than apology. There is a significant difference between “sorry, the data isn’t perfect” and “here’s what the data is telling us, and here’s why this is the right way to read it.” Leading with an apology can invite doubt, while leading with confidence builds trust. When you can explain clearly and calmly why complete data isn’t available, and then redirect immediately to what you can know with confidence, you come across as more credible, not less.
Before I have this conversation with a client, I also think about where their organization actually sits in terms of analytics maturity. A conversation about imperfect data lands very differently with a team that already has solid measurement practices in place than it does with stakeholders who are still in the early stages of working with data at all, and it’s important to keep that in mind as you have these conversations.
All this being said, this isn’t a switch you can flip overnight, and you won’t change hearts and minds in a single meeting. But you can get there one conversation at a time.
The Commitment
Let me come back to the mantra: we are not data accountants.
I want you to hold that not as a disclaimer, but as a commitment. A commitment to doing the harder, more valuable work of reasoning from imperfect data instead of pretending the data is complete. A commitment to being honest with your clients and stakeholders about what the numbers can and can’t tell you. A commitment to building the kind of trust that comes from useful analysis rather than precise-looking reports.
Measurement is not getting easier. Privacy legislation is tightening, tracking is becoming more limited, and the gap between the data we have and the data we’d theoretically like to have is not getting smaller. The time to build this practice is now.
This post started out as a note in my newsletter, The Huddle. If this resonated with you, sign up here to get more insights.