14 August 2019

Learning Measurement: Consider Lagging and Leading Indicators

Dani Johnson
Co-founder & Principal Analyst


  • In 2019, we had the opportunity to talk with several learning leaders about their practices and how those practices affect their organizations' overall business goals.
  • While each L&D function has (and measures) impact differently, our interviews with learning leaders helped us identify several patterns.
  • This is the fourth in a series of 7 articles highlighting these patterns. A huge thanks to forMetris for sponsoring this research!

Measuring what and how people learn is multidimensional, hard to quantify, and even more difficult to prove. It’s an age-old problem, and one that has yet to be solved. L&D functions with a sole focus on proving their impact to their organizations will undoubtedly disappoint. The world changes fast, and L&D needs to develop the ability, in essence, to see around corners.

How? Currently, most L&D organizations use metrics as a way to report out what they have done instead of using them to determine what they should do next. The latter requires a different set of metrics applied in a different way: we’re talking about using leading indicators along with lagging ones. What’s the difference?

Lagging indicators

Lagging indicators are typically “output” oriented: easy to measure, but hard to improve or influence. A lagging indicator usually follows an event; its importance lies in its ability to confirm that a pattern is occurring. In business generally, and in L&D functions specifically, lagging indicators are often used to provide some sort of report card: to tell the market (or whoever controls your budget) how well you have been doing.

Lagging Indicators:
Used to confirm long-term trends. Significant changes in the organization generally occur before trends in the market. Examples:

  • ROI (Return on Investment)
  • Development Spend per Employee
  • Return on Assets
  • Operating Income Growth

Not surprisingly, lagging indicators are used extensively within organizations, and especially within L&D functions, in part because organizations are often hyper-focused on efficiency metrics and what Wall Street thinks.

In fact, the metric we (unfortunately) hear most often is Return on Investment (ROI), the granddaddy of lagging indicators. ROI is a classic lagging indicator because it measures impact only after the fact. ROI offers the organization nothing but a report card – whether or not some initiative or course delivered value. There is little to nothing that an L&D function can do with that information. It is, in essence, a dead metric.
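To see why ROI is a report card rather than a steering tool, it helps to remember what the calculation actually contains. A minimal sketch, using the standard net-benefit-over-cost formula and entirely made-up numbers:

```python
# Hypothetical figures for illustration only: a course that cost $50,000
# and is credited (after the fact) with $65,000 of estimated benefit.
program_cost = 50_000
estimated_benefit = 65_000

# Standard ROI formula: net benefit divided by cost.
roi = (estimated_benefit - program_cost) / program_cost
print(f"ROI: {roi:.0%}")  # → ROI: 30%
```

Notice that every input is only knowable after the program has run, and the single number says nothing about *which* part of the program worked or what to change next.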

This is not to say that lagging indicators don’t serve a purpose. They do. But because they are backward-looking, they often provide an incomplete picture of what is happening, and even more important, of what needs to happen moving forward.

Leading indicators

Leading indicators, on the other hand, are easier to influence but harder to measure. While lagging indicators measure what has already happened, leading indicators are predictive in nature. They are frequent and formative, and provide information that can help the organization adjust or change. By their nature, they do not necessarily show the impact that L&D functions have had, but they do offer crucial information to help those functions adjust (and because of that, enable them to have impact). Leading indicators are used much less often in L&D functions.

Leading Indicators:
Predictive in nature. Frequent and formative. Offer valuable information that can help organizations adjust or change. Examples:

  • Employee / manager engagement
  • Intention to change
  • Improved awareness
  • Actions implemented after a course
  • Popularity of a topic on an LXP

Interestingly, what is considered a lagging indicator in one case can be a leading indicator in another. For example, for L&D functions that want to measure their own effectiveness, course completions and assessment scores can be seen as lagging indicators – a report card on a specific course or initiative. For the organization, on the other hand, course completions and assessment scores may be leading indicators – predictors of workplace behaviors or of key business performance measures.

Leading indicators become particularly valuable to L&D functions as they move away from measuring the course or the initiative and instead focus on the conditions they are creating for learning, as we discussed here. Monitoring leading indicators gives the L&D function an idea of what is working, what is not working, and what could be shifted in order to affect the business outcome.

If we return to the example on innovation that we introduced here, we see that many of the measures we use for monitoring the learning conditions or environment can be seen as leading indicators.

Leading Indicators for Measuring Conditions – Example for Innovation


While it would be nearly impossible (and certainly not advisable) to tie innovation to any one of the metrics in the image above, understanding how they are trending and then correlating those trends with innovation KPIs helps the L&D function know which way to move.

In planning their measurement and analysis strategies, L&D functions should pay attention to both leading and lagging indicators, and be aware of how those indicators will be received and used by the rest of the organization.

Questions to ask:

  • How many of our metrics are lagging indicators, which tell us how we have done, versus leading indicators, which tell us what we should do next?
  • Are we able to make decisions about what we need to do from our metrics?
  • To what extent are we identifying which of our metrics act as leading indicators for organizational KPIs?
  • How often can we correlate our data with data from the organization?


Written by

Dani Johnson

Dani is Co-founder and Principal Analyst for RedThread Research. She has spent the majority of her career writing about, conducting research in, and consulting on human capital practices and technology. Her ideas can be found in publications such as Wall Street Journal, CLO Magazine, HR Magazine, and Employment Relations. Dani holds an MBA and an MS and BS in Mechanical Engineering from BYU.
