Ever been in a meeting where someone pulls up a dashboard, points at a number, and says: “The data says we should do X”? What happens at that point? Typically, the conversation just stops. The data has spoken. Decision made!

The term “data-driven” sounds great in theory, but in practice, when data is driving, we stop thinking. The data doesn’t know your business context, your competitive landscape, or the conversation you had with your biggest customer last Tuesday. It doesn’t have judgment. It’s a useful tool, but it isn’t a decision-maker.

I now use the term “data-informed” instead, and I think the distinction matters (it’s why I wrote this post). It’s more than a word swap: it’s a fundamentally different relationship with your analytics, and it changes how you make decisions, build reports, and talk about performance with your team.

What Data-Informed Decision Making Actually Means

I think of this like blindly following your Google Maps directions. If the directions send you down an abandoned road, and you don’t know it’s abandoned, you won’t realize the mistake until it’s too late. Blindly following data is the same thing: yes, that data will take you somewhere, but there’s no guarantee it’s somewhere useful or the place you actually wanted to go. A data-driven approach means you follow whatever directions you’re given. A data-informed approach keeps you involved, ensuring you won’t be directed to drive off a cliff.

Data-informed decision making means you use data as one input into human decisions, not as a replacement for them. You still put in the work to collect good data and set up your analytics thoughtfully. And in fact, data-informed requires more scrutiny, not less, because you can’t hide behind false precision. But you’re also honest about the limitations of what comes out.

Here’s what I mean in practice:

|            | Data-Driven                     | Data-Informed                                                          |
|------------|---------------------------------|------------------------------------------------------------------------|
| Philosophy | Data makes the decision         | Data informs human decisions                                           |
| Assumption | Data is accurate and complete   | Data is directional and imperfect                                      |
| Risk       | False precision, gaming metrics | Requires more thoughtful setup                                         |
| Approach   | "The data says X, so we do X"   | "The data suggests X. Here's what that might mean given what we know." |

Remember: marketing analytics data is directional, not authoritative. It points you toward insights. It doesn’t hand you answers.

And this extends to AI, too. The temptation is to believe you can just feed your data into AI and it’ll give you all the answers. But AI can only work from what you tell it. If your data is garbage going in, AI just helps you make bad decisions faster.

If you want data to actually help you make better decisions, it starts with building a measurement strategy that aligns with business goals. That means defining what you’re trying to learn before you start collecting, and being honest about what your analytics setup can and can’t tell you.

From "Is Your Data Right?" to "Is Our Data Right?"

One of the most damaging side effects of a data-driven culture is how it warps team dynamics around reporting and incentives. When organizations treat metrics as authoritative, people start optimizing for the number rather than the outcome the number was supposed to represent.

More than once I’ve been told that someone’s performance review depends on hitting a specific metric target. In the Universal Analytics days, it was often bounce rate: people needed a very low bounce rate to be considered successful at their job. Unfortunately, bounce rate was incredibly easy to fake. The idea behind watching bounce rate made sense, but the actual implementation fell far short of the goal.

The same dynamic plays out in reporting. When you present data as if it’s perfect and completely accurate, you set yourself up for the inevitable moment when someone notices the numbers don’t match their back-end system or their gut sense of what happened. And when that happens, the conversation can turn adversarial.

Changing how data is presented can help improve this relationship. Instead of presenting data as something you produced and they evaluate, frame it as a shared responsibility. Instead of “is your data right?”, ask “is our data right?” This can help pivot the conversation from blame to collaboration.

This ties into something deeper about where your organization stands in analytics maturity. At certain maturity levels, I often see data being used to support conclusions people have already reached. That’s not data-driven or data-informed. It’s data-justified, and it really doesn’t serve anyone well.

The data-informed mindset can break that pattern because it starts from a position of honesty. You’re not claiming the data proves anything. You’re saying here’s what the data suggests, here’s the context we need to consider, and here’s what we might do about it. That kind of transparency is uncomfortable at first, especially if your team is used to reports that project absolute certainty. But over time, it will build far more trust than pretending your numbers are perfect.

I often have this conversation with clients who are nervous about admitting data limitations to their leadership. I believe that the alternative is worse. If you present data as perfect and it turns out not to be, you lose credibility in a way that’s very hard to recover from. If you’re upfront about limitations from the start, you build trust and people actually listen more carefully to the insights you do present with confidence.

(And sometimes leadership just doesn’t want to hear it no matter what you do, and that’s a bad situation all around. I don’t have many great suggestions if you’re in that spot, other than to recognize that you really cannot change things and to decide what you want to do about that.)

What Data-Informed Reporting Actually Looks Like

So what does this look like when you sit down to build a report? Let’s walk through some of the practical changes.

Answer questions, not metrics. I think about reports using what I call the Jeopardy method, where everything should be in the form of a question. So instead of a page titled “Sessions by Channel,” the page asks “Which channel did the best job of driving visitors to our website?”

This isn’t just a case of changing the words on the page. When you frame a report as an answer to a question, you have to think about what question the people you’re reporting to actually have. And that forces you to use their language, not yours. If your boss says “I want to increase inbound phone calls,” then your report should talk about phone calls, not conversion events. The terminology matters because it determines whether people engage with the data or glaze over.

Stop presenting false precision. I’ve previously written about practical techniques for reporting with imperfect data in detail, but here are the highlights: round your numbers, use percentages instead of absolutes, and show the slices of the pie rather than the exact counts. When you present a number with two decimal places, you’re implying a level of accuracy that your analytics almost certainly can’t support.
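As a quick illustration of the rounding and percentages idea, here’s a minimal Python sketch. The channel names and session counts are made up for the example; the point is that reporting whole-number percentage shares communicates the same story without implying accuracy the tracking can’t support.

```python
# Hypothetical channel session counts from an analytics export.
# The raw values look precise, but analytics data is directional,
# so we report rounded percentage shares instead of exact counts.
sessions = {
    "Organic Search": 4273,
    "Paid Search": 1891,
    "Direct": 1544,
    "Email": 612,
}

total = sum(sessions.values())

for channel, count in sessions.items():
    share = round(100 * count / total)  # whole-number percentages only
    print(f"{channel}: ~{share}% of sessions")
```

The tilde in the output is deliberate: it signals to the reader that these are approximate shares, not exact figures.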

Set the stage before the data. The first page of every report we build includes a glossary, a “what you need to know” section, and a callout box that flags known data issues. Getting ahead of those questions prevents the “your data is wrong” spiral before it starts.

Give context, not just numbers. A data-informed report doesn’t just show you what happened. Instead, it connects the dots to explain why it matters and what you might consider doing about it. There’s a real tendency, especially for people newer to analytics, to just show as much data as possible because complexity feels like it justifies the investment. In my experience, the opposite is true. Give people something simple and actionable, not something complex and overwhelming.

Making the Shift

Data-informed doesn’t mean data-casual. It doesn’t mean you stop investing in good analytics or stop caring about data quality. It means you put in the work to set things up right and you’re honest about the limitations of what comes out the other side.

The real payoff of this mindset shift is trust. When you help people make better decisions with imperfect data, you build the kind of long-term relationship where they actually listen to your recommendations.

The shift from data-driven to data-informed isn’t about lowering your standards. It’s about raising your honesty. And in my experience, that’s a trade worth making every time.

Frequently Asked Questions

What’s the difference between data-driven and data-informed?

Data-driven implies that data makes decisions for you. For example, the data says X, so you do X. Data-informed means using data as one input to human decision-making, acknowledging that analytics data is directional rather than perfectly accurate. In my experience, data-informed means you put in the work to set things up right and you’re honest about what comes out.

Is data-informed better than data-driven?

I’d argue data-informed is better than data-driven, because it actually requires more rigor, not less. When you acknowledge that data is directional rather than authoritative, you can’t fall back on false precision. Instead, you have to understand what the data is actually telling you and combine it with business context and judgment to make good decisions.

What does data-informed decision making mean?

Data-informed decision making means using data as a guide while combining it with business context, professional judgment, and experience. Think of it like driving a car: data gives you the dashboard instruments, but you still need a human behind the wheel who knows where you’re going and can react to current conditions.

What are the problems with being data-driven?

The biggest problems I see with being data-driven are false precision, misguided incentives, and the removal of human judgment from decisions. When organizations tie performance to specific metrics without questioning whether those metrics are accurate or meaningful, people optimize for the number rather than the outcome.


Dana enjoys solving problems that haven’t been solved before. With her 20+ years of experience in digital marketing and teaching, she has a knack for distilling complex topics into engaging, easy-to-understand instruction. You’ll find Dana sharing her knowledge at digital marketing conferences around the world, teaching courses, and hosting a technology column.
