Why Doers and Leaders Still Struggle to Align on Data and How to Fix It
Published: July 2025
By Amy Humke, Ph.D.
Founder, Critical Influence
You can have a powerful model, a slick dashboard, and a warehouse full of data, and still fail to move the needle. Why? Because no one's aligned.
I've spent the last decade building analytics systems, predictive models, and self-service dashboards for organizations. And I've watched some of the best technical work fall flat because of a gap that doesn't show up in the data: the divide between doers and leaders.
This isn't a tech problem. It's a trust and translation problem. If you're not building bridges between strategy and execution, between those requesting and producing the data, you're not solving the real issue. Below are the most persistent fault lines I've encountered and what it takes to close them.
1. The Blind Spot: Business Context Missing from the Data
One of the quiet killers of data projects is hidden context. Models drift not just because the data changes, but because the business changes and no one tells the data team or flags the change anywhere the models can pick it up. I've seen models underperform after a sudden campaign launch, a shift in process, or a policy tweak that never made it into the data. It's not that the model is broken; it's operating in the dark.
This happens when data teams are structurally excluded from the front lines of the business. They're not looped in on marketing experiments, staffing changes, or new KPIs. They're asked to explain outcome variations without visibility into what's driving those changes. It's like forecasting the weather from a sealed room.
Organizations need to build data awareness into their operational rituals to counter this. That means aligning on which initiatives are in flight, what metrics matter, and how those things intersect. It also means documenting and surfacing business logic clearly, a foundation for turning alignment into action. Shared visibility isn't a nice-to-have; it's a core requirement for any analytics function that wants to be predictive instead of reactive.
This also underscores why transactional, after-the-fact updates aren't enough. Even minor shifts in business activity can derail analysis unless they're surfaced early. That means data teams being in the room when strategies form, not just when metrics get assigned. If data teams are flying blind, the dashboard isn't the problem. The blind spot is.
2. Silver Bullets, Shiny Objects, and the Myth of Plug-and-Play
A dangerous narrative still lingers in executive spaces: hiring a few data scientists or installing the right platform will magically deliver insight and impact. I've seen this play out as dashboard dumping: teams demanding "just build it" without defining the question or the decision it's meant to inform. The assumption is that technology will compensate for the lack of clarity. It never does.
Data work is fundamentally human work. Most of the time is spent wrangling messy inputs, navigating ambiguity, and aligning definitions before you can even model or visualize. Yet many still see data science as a vending machine: input requests and output decisions as if the insights are like diamonds in the ground just waiting to be picked up and delivered. This mindset leads to wasted resources, demoralized teams, and tools no one trusts. When projects fail, blame is often shifted to the tech or the team, rather than the underlying misunderstanding of what good data work requires.
What's needed is not more tools but more shared understanding. Organizations must invest in data literacy across the organization, not just within the analytics team. Leaders don't need to become coders, but they do need to understand the basics: that data science is iterative, that not all questions have clear answers, and that outcomes improve when you slow down to define success upfront. This reframing is critical to move from disappointment to actual data-driven transformation.
3. Us vs. Them: The Culture War Undermining Data Work
When misalignment festers long enough, it becomes something more insidious: a cultural rift. The "us vs. them" dynamic emerges when leaders feel like analysts are difficult or overcautious, and data teams feel like leaders are reckless or uninterested in the truth. This breeds distrust, disengagement, and siloed behavior. People stop talking unless they have to. They hoard knowledge. Collaboration dies.
In some cases, this is worsened by insecure leadership that sees questioning as insubordination. In others, analysts retreat into jargon to protect themselves from criticism. Regardless of who starts it, the effect is the same: a slow erosion of psychological safety. Without it, no one will ask better questions, admit uncertainty, or challenge assumptions. That's when teams get locked into performative cycles, delivering just enough to meet expectations without ever solving the real problem.
Too often, alignment is performed rather than practiced. Leaders nod at metrics they don't fully grasp. Analysts use complex terminology to sound credible instead of clear. Everyone wants to seem competent and confident, so no one asks the questions that matter. This creates a culture of false consensus, where misunderstanding gets mistaken for agreement. Clarity requires more than communication; it requires courage. The courage to say, "I don't get it." The courage to admit when a dashboard isn't helping. Real alignment doesn't come from everyone nodding. It comes from the willingness to pause, probe, and actually check for understanding.
Rebuilding this trust isn't about a workshop or a one-off strategy session. It's about modeling curiosity from the top, making it safe to challenge ideas, and explicitly rewarding bridge-building behavior. Reclaim the middle ground, the translator space between the technical and the strategic, where mutual respect and alignment are actively cultivated. That middle ground is where real momentum begins.
Kickoffs that prioritize cross-functional brainstorming help reinforce this. They create the space to voice uncertainties and surface context before assumptions harden into specs. When everyone contributes, everyone owns the outcome, and that shared ownership can be the difference between symbolic buy-in and actual progress.
4. Misaligned Timelines, Expectations, and Definitions of Done
Ask a data scientist how long a project will take, and they'll likely say: "It depends." Ask a business leader, and they might say: "Can we launch in two weeks?" This disconnect isn't just annoying; it's operationally destructive. When leaders push for speed without recognizing the uncertainty and iteration involved, teams skip steps. Data quality gets ignored. Assumptions go untested. And the final product ends up half-baked.
Worse still, there's often no shared understanding of what success looks like. A model might be statistically sound but useless to the business. A dashboard might be visually clean but answer the wrong question. These are not technical failures; they're alignment failures. They stem from a rush to build before the real goal is clarified, or from a lack of fluency in the messy realities of data work.
Solving this starts with slowing down at the beginning. Co-create the problem definition. Define what "good" looks like in both business and technical terms. Use structured project frameworks that ground every stage in context, rigor, and use-case relevance. Most critically, protect space for the data team to challenge deadlines or request clarification without it being read as resistance.
5. Who Owns the Fix? The Authority Gap in Data Quality
Data quality is often treated as a technical detail, something to be managed quietly, behind the scenes, by the same people delivering insights and dashboards. But here's the reality: in many organizations, human glue is the only thing holding data pipelines together. Analysts and engineers are patching around upstream errors, formatting issues, and broken logic because fixing root causes requires business policy change, and they don't have the authority to demand it.
This creates a dangerous illusion for leaders. The numbers are showing up in the dashboards, so things must be working. But behind that illusion is a crumbling foundation: undocumented workarounds, duplicated cleaning logic, brittle joins, and assumptions that quietly grow stale. Leaders often deprioritize data quality not because they don't care, but because they don't see the risks. And the doers rarely escalate those risks because doing so feels like resistance, not contribution.
The fix requires breaking the cycle of silent patching. That means naming the technical debt, quantifying its business impact, and elevating it to leadership as a priority, not just a nuisance. It also means involving doers in governance conversations, and recognizing that quality is not the opposite of speed; it enables sustainable speed. If your analysts spend more time cleaning than analyzing, or if your metrics definitions change every quarter, the issue isn't effort, it's architecture. And no amount of dashboard polish will make up for that.
The Fix Is Human: Trust, Fluency, and Structure
This points to a simple but often overlooked truth: data work succeeds when humans are aligned, not just systems. You can't dashboard your way out of a trust issue. You can't model your way past a broken relationship. The hard part isn't the math, it's the meaning.
The solutions aren't mysterious. You need better translation (that's where Business Analysts shine). You need consistent rituals that surface context early and often. You need a culture where it's safe to say "I don't know" and where curiosity isn't punished. You need to tie data work directly to business priorities, not as an afterthought, but from day one. And you need to raise the floor on data fluency for everyone, not just the specialists.
Translators, whether business analysts, analytics product leads, or hybrid operators, often hold alignment together. However, many organizations treat them as lucky accidents rather than strategic infrastructure. If you're serious about bridging the data divide, these roles must be recognized, resourced, and protected. Otherwise, cross-functional clarity becomes a matter of chance, not process.
And finally, you can't build a data culture if using the data is optional. If insights never show up in performance reviews, strategic meetings, or promotion decisions, people will take the hint that data isn't required; it's cosmetic. That demoralizes the doers and misleads the leaders. Culture isn't what you say. It's what you reinforce. It's what you reward.
If we want analytics that drive action and models that influence behavior, we must start with how we behave. Real alignment isn't technical. It's deeply human.