Data-Informed, Not Data-Driven: Rethinking Metrics for a More Human L&D

Jeff Bezos once said, “When the data and the anecdotes disagree, the anecdotes are usually right.” It’s a jarring statement in an age obsessed with analytics, but it captures something essential about learning and development.

L&D has always been asked to prove its worth. For years, we’ve responded with dashboards that count completions, track seat time, and celebrate satisfaction scores. They’re quantifiable and familiar. They also say little about whether learning is improving performance, readiness, or confidence.

Now, as AI and analytics take a larger role in how we design and measure learning, the pressure to prove impact has only intensified. But we don’t need more data. We need better judgment in how we use it.

The future of measurement in L&D depends on being data-informed: using information to guide decisions while keeping people and context in view. That means treating data as insight and balancing the precision of analytics with the perspective that only human experience can provide.

When Metrics Miss the Point

Data can create a comforting sense of certainty. When numbers look clean, we assume they’re telling the truth.

But metrics like completion rates or hours logged can’t tell us what happens once a learner returns to their real work. They don’t capture behaviour change or their influence on outcomes. What they do offer is a sense of control: a signal to the organisation that we’re tracking something.

This is how many learning teams end up managing the appearance of impact rather than its reality. A well-designed dashboard might satisfy an executive’s curiosity, but it often leaves deeper questions unanswered. Did performance actually improve? Did learning close the gaps the organisation cared about? Are people working differently, or just finishing training faster?

Worse, metrics don’t just describe performance; they shape it. When you measure the wrong thing, people optimise for the wrong goal. If L&D teams are rewarded for completion rates, they’ll design shorter, easier courses. If executives focus on seat time, programs will stretch to meet it. If satisfaction surveys are the benchmark, learners will get higher-quality entertainment and fewer educational challenges.

The more we measure activity without meaning, the further we drift from the work that matters. Measurement becomes performance in itself, detached from the outcomes the organisation needs.

People experience what numbers only approximate. When the two don’t line up, it’s often the human experience that reveals what the data missed. So how do we bridge this gap between data and experience? It starts with changing how we approach metrics.

Turning Data into Dialogue

Being data-informed requires curiosity. It means asking what the numbers might be suggesting, not just what they confirm.

Consider a sales enablement team that sees low participation in a new product training. A data-driven response would push reminders and escalate compliance. A data-informed response would investigate.

Why didn’t sellers engage? Were they too busy closing deals? Did the content fail to connect with real scenarios? Were managers reinforcing it in the field?

In one company I worked with, a similar pattern revealed that high-performing sales teams were skipping formal training entirely because they had built informal peer sessions that worked better. The top performers were running 30-minute scenario workshops in their team meetings, practising objection handling with real customer situations rather than theoretical cases. The insight wasn’t to force compliance. It was to learn from what those teams were already doing right and scale their approach.

That’s the difference between measuring learning and understanding it.

When L&D uses data to start conversations, it deepens its relationship with the organisation. Metrics stop being a defence mechanism and start becoming a shared language for problem-solving.

Why Algorithms Can’t Replace Context

This conversational approach to data works precisely because no algorithm can replace human discernment. Data shows what is happening, but only people can interpret why.

This is where judgment becomes essential. Skilled L&D leaders know how to interpret context before acting. They understand the nuance of timing, audience, and culture. They can see when a number looks strong but the behaviour behind it is weak, or when an unplanned change in learner behaviour signals real progress.

For example, an operations team might show declining completion rates after moving to a new microlearning approach. A surface read would flag a problem. Deeper analysis might show that fewer formal modules are needed because learning now happens in shorter bursts inside the workflow. The drop signals efficiency, not disengagement.

Judgment allows us to see the difference. Analytics create visibility. Humans decide where to look next.

As analytics platforms and AI-enabled systems advance, it’s easy to assume algorithms will give us perfect insight. But even the smartest systems can misread context. An algorithm can show that engagement dropped after a platform change, but it can’t tell you that employees are frustrated by too many logins or content that feels repetitive.

Learning is personal. It’s built on trust, curiosity, and belonging. While these qualities can be assessed through qualitative methods, they’re not reducible to simple metrics. Yet they drive every learning outcome that matters.

Three Shifts Toward Data-Informed Practice

Anchor Metrics in Organisational Language

Start measurement conversations with the desired organisational outcome. Executives care about performance improvement, cost reduction, or risk mitigation. Frame your metrics around those outcomes.

Instead of reporting that 90 percent of managers completed a course, share that time-to-readiness for new leaders dropped by 25 percent. Show how a specific program reduced customer escalations by 40 percent or improved operational consistency enough to cut error rates in half.
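
As a minimal sketch of what that reframing can look like in practice, the snippet below derives a time-to-readiness figure from completion records joined to manager sign-offs, rather than reporting completions alone. The datasets, column names, and readiness milestone here are hypothetical:

```python
# Hypothetical sketch: compute time-to-readiness from two HR datasets
# instead of counting completions. All data and column names are invented.
import pandas as pd

# One row per new manager: when they were hired
training = pd.DataFrame({
    "employee_id": [101, 102, 103],
    "hire_date": pd.to_datetime(["2024-01-08", "2024-02-12", "2024-03-04"]),
})

# One row per new manager: when their own manager signed off on readiness
readiness = pd.DataFrame({
    "employee_id": [101, 102, 103],
    "ready_date": pd.to_datetime(["2024-03-15", "2024-04-02", "2024-05-20"]),
})

merged = training.merge(readiness, on="employee_id")
merged["days_to_ready"] = (merged["ready_date"] - merged["hire_date"]).dt.days

# The outcome-framed metric: median days from hire to demonstrated readiness,
# which can be compared before and after a programme change.
print(f"Median time-to-readiness: {merged['days_to_ready'].median():.0f} days")
```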

When metrics reflect organisational performance, they gain relevance and influence.

Combine Data with Experience

Every dataset needs a human interpreter. Combine analytics with qualitative insights from learners, managers, and stakeholders.

When data shows a pattern, test it through interviews and observation. What did learners experience? What did managers notice? What’s happening in the environment that might explain the trend?

Quantitative data tells you what’s happening. Qualitative data explains why. The strongest measurement models weave the two together, using numbers to identify trends and stories to illustrate meaning.

Treat Reporting as Storytelling

Most L&D teams report to prove effort. The best ones report to create alignment.

Instead of sending executives spreadsheets full of metrics, tell a story. Start with the problem the organisation faced, describe the learning intervention, show how performance shifted, and end with what you learned. Support that narrative with data but lead with meaning.

Executive leaders make decisions based on outcomes they can connect to organisational results. When you translate learning impact into a clear narrative – problem, solution, outcome – you make it easier for them to see value and invest in what’s next.

What Data-Informed Looks Like in Real Teams

In one healthcare system, the L&D team replaced traditional course tracking with outcome tracking. They measured how often newly trained clinicians followed safety protocols and how that influenced patient outcomes. The numbers improved, but what mattered most was the confidence staff reported in applying their training. Both views, quantitative and qualitative, completed the picture.

In a global software company, the learning analytics group worked directly with Finance to link onboarding data with time-to-productivity metrics. The shift changed how leadership viewed L&D investments. Learning became part of the company’s growth engine rather than an expense line.
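
A sketch of the kind of linkage that group might have built, with invented data and field names, and a correlation used purely as a starting point for conversation rather than proof of causation:

```python
# Illustrative only: join onboarding records to Finance's ramp-up figures
# and check whether faster onboarding tracks with faster productivity.
import pandas as pd

onboarding = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5],
    "days_to_complete_onboarding": [10, 25, 14, 30, 12],
})

finance = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5],
    "days_to_full_productivity": [95, 160, 110, 170, 100],
})

linked = onboarding.merge(finance, on="employee_id")

# A positive correlation suggests the two move together; it prompts the
# follow-up questions, it does not answer them.
r = linked["days_to_complete_onboarding"].corr(linked["days_to_full_productivity"])
print(f"Correlation between onboarding time and ramp-up time: {r:.2f}")
```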

These examples share one pattern: none began by collecting more data. They started by asking better questions.

A More Human Approach to Measurement

Modern L&D leaders face a flood of data. The temptation is to measure everything. The opportunity is to interpret wisely.

Being data-informed means knowing which signals matter most at a given moment. It’s about reading patterns, noticing exceptions, and using that knowledge to guide next steps. It’s less about quantity and more about clarity.

Data stewardship means using information responsibly, interpreting it wisely, and sharing it transparently. When learners share their progress and feedback, they’re giving us insight into how they grow. How we collect and interpret their data affects trust.

The teams that do this well treat data as an input to strategic thinking. They know when to trust the numbers and when to listen to the people behind them. They balance analysis with intuition and precision with empathy.

If the goal of L&D is to improve performance and enable people to do their best work, then measurement must reflect both sides of that equation. Analytics can show what’s changing. Judgment explains why it matters. Together, they build the credibility our field has always sought.

Data-informed decision-making relies on evidence but keeps perspective. It pairs data with experience, not as a replacement, but as a partner. When those come together, we move closer to a learning culture that earns trust through results people can feel and see.


Written for OEB 2025 by Tracie Cantu.

Join Tracie for her Learning Café at OEB25 titled: “Data-Informed, Not Data-Driven: Reframing L&D Metrics for Human and Business Impact.”
