One of the most interesting discoveries we made in the Learning Impact study is how more evolved organizations make a clear distinction between metrics and data. While they are often confused, there are some key differences between the two.
Successful organizations can only measure so many things well, and what they measure ties to their definition of success.1 While data in and of itself is directionless and meaningless, metrics carry significant meaning and are intentionally created and tracked to show progress. Many organizations are data-rich but metric-poor: they have yet to determine which metrics tie to their definition of success. We found that evolved organizations are doing two things differently:
- Making their own metrics
- Intentionally choosing data sources
Choosing Data Sources
As a part of this study, we examined learning metrics commonly referred to in the literature. Interestingly, the data sources for these metrics fell into five broad categories:
- Learning / LMS data
- Surveys, evaluations, assessments
- HR / HRIS data
- Observations, conversations, meetings
- Business data, latent system data
During our interviews, we asked which of these data sources was most commonly used. Not surprisingly, learning data was used most, followed by surveys. It makes sense that L&D functions tend to leverage the data sources they wholly control: it takes little collaboration or permission to issue surveys, evaluations, or assessments, or to access data that lives within their own systems.
Comparing data from one part of the function to data in another part can (sometimes) help L&D understand how its work affects its own metrics and goals, but it does little to build a story around what L&D does for the larger organization, and even less to help it understand how it needs to adjust to have greater impact.
Depending on the measurement and analytics strategy of the L&D function, some or all of these sources could be accessed; however, to some degree, using each of these data sources taxes organizations: measurement is not free. The image below shows these five data sources, gives examples of some of the types of data available, and identifies the tax on the organization.
Interestingly, many L&D functions default first to LMS data and then to surveys, evaluations, and assessments (Kirkpatrick, anyone?) to gather information. These types of information are often geared toward one course or initiative (rather than thinking more broadly about conditions), and the second method in particular often requires a fairly heavy lift from the organization: lots of people hours are needed from L&D, from the people taking the survey, sometimes from the IT department, and from those who will crunch the data.
Anecdotally, when looking at the five methods based on how frequently we see them used and the resources required to get good metrics from them, you get something like the image below.
More evolved L&D functions recognize the tax that measurement and analytics can put on the organization, and appear to be finding and using data that already exists. Business systems, web analytics, and other latent system data can provide all kinds of information to the L&D function without requiring a significant lift by the organization.
How leaders are doing it
One learning analytics leader in a large communications organization realized fairly quickly that learning impact data would be better if he mined existing systems for existing data rather than only surveying employees. This leader began by identifying the KPIs important to the business lines (as it turns out, many of them had the same concerns and needed to move the needle on the same thing), and then identified the data sources that would provide him information to measure against.
He then worked with the IT and analytics teams to set up recurring data feeds instead of static reports or one-time data pulls. “It’s only slightly more difficult to put a feed into place than it is to ask them to do a one-time data pull,” this leader said. “In the long run, we’re causing less work for the broader organization.”
The data feeds made it possible to create a much more agile learning team, as they were able to adapt to the needs of the organization and the changing KPIs.
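The difference between a one-time pull and a recurring feed can be sketched in a few lines: instead of asking data owners for a fresh export each time, the learning team parameterizes one query and runs it on a schedule. A minimal illustration in Python, using an in-memory SQLite database as a stand-in for a business system; the `business_kpis` table and the KPI names are invented for this sketch:

```python
import sqlite3

def pull_kpi_feed(conn, as_of):
    """Return {kpi_name: value} for one snapshot date.

    Running this on a schedule (e.g. monthly) turns a one-time pull
    into a recurring feed with no new ask of the data owners.
    """
    rows = conn.execute(
        "SELECT kpi_name, value FROM business_kpis WHERE snapshot_date = ?",
        (as_of,),
    ).fetchall()
    return dict(rows)

# Hypothetical stand-in for a business system's database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE business_kpis (snapshot_date TEXT, kpi_name TEXT, value REAL)"
)
conn.executemany(
    "INSERT INTO business_kpis VALUES (?, ?, ?)",
    [
        ("2024-01-01", "first_call_resolution", 0.72),
        ("2024-02-01", "first_call_resolution", 0.75),
    ],
)

# Each scheduled run picks up the latest snapshot automatically.
jan = pull_kpi_feed(conn, "2024-01-01")
feb = pull_kpi_feed(conn, "2024-02-01")
print(jan["first_call_resolution"], feb["first_call_resolution"])
```

The design point mirrors the leader's observation: the feed costs only slightly more to set up than a single pull, but every subsequent refresh is free to the broader organization.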
A final note: one thing that surprised us during some of our interviews was the territorial nature of some data owners. L&D functions expressed frustration at not being able to get KPI data and metrics from other business functions because the organization didn’t foster a culture of trust and respect.
While more evolved organizations tend to view data and metrics less as “ours” and “theirs” and instead look at all sources of data that can help them, the organization, and the individual make decisions, this is still a major challenge for many. We think that as L&D functions begin to show impact and help other business functions perform, this will become less of a challenge. Where it remains a challenge, L&D should start where they are and slowly win over the leaders of other functions. And yes, we know: this is easier said than done.
Questions to ask:
- To what extent are we relying on existing data and metrics from our systems rather than making metrics that would better serve business goals?
- Do we understand the sources of our data?
- How often are we leveraging data sources beyond those entirely controlled by L&D?
- Are we aware of the measurement tax we’re imposing on the organization?
1 Jeff Bladt and Bob Filbin, “Know the Difference Between Your Data and Your Metrics,” Harvard Business Review, March 2013.
2 Google Dictionary.
3 Bernard Marr, “How Much Data Do We Create Every Day? The Mind-Blowing Stats Everyone Should Read,” Forbes, May 21, 2018.