Events

Learning Impact: Roundtable #1 Mindmap

Posted on Friday, March 15th, 2019 at 9:08 PM    

Recently, several leaders came together to talk about the ins and outs of learning impact – how we measure it, and how we achieve it. The mindmap below is a summary of that meeting. Click into the box to explore the map!

 


Learning Impact: Anything New?

Posted on Friday, March 8th, 2019 at 1:55 AM    

Introduction

If you have been following our Learning Impact project, you know that the main premise of this research is that we’re evaluating “learning” in organizations all wrong. Therefore, in conducting a fairly in-depth review of existing literature on the topic, we were not at all surprised about the state of learning impact – only somewhat disappointed.

We looked at over 50 academic and business articles, reports, and books for this literature review, which has given us a decent understanding of the known world of evaluating learning. This short article will summarize:

  • What we saw
  • What we learned
  • Overall impressions

Learning Impact: Anything New?

Word cloud of the learning impact literature: Most prevalent words in literature reviewed.
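
The frequency tally that underlies a word cloud like this is straightforward to reproduce. Below is a minimal, hypothetical Python sketch – the sample texts and stop-word list are illustrative, not drawn from the actual literature we reviewed.

```python
# Hypothetical sketch: counting the most prevalent words across a set of
# article texts, the raw input for a word cloud. The stop-word list and
# sample sentences are illustrative only.
import re
from collections import Counter

STOP_WORDS = {"the", "and", "of", "to", "a", "in", "is", "that", "for", "on"}

def top_words(texts, n=10):
    """Return the n most common non-stop-words (length > 2) across texts."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOP_WORDS and len(w) > 2)
    return counts.most_common(n)

sample = [
    "Learning evaluation models measure training impact.",
    "Impact of learning on business results.",
]
print(top_words(sample, 3))  # 'learning' and 'impact' each appear twice
```

In the actual review, the input would be the full text of the 50+ articles rather than two toy sentences.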

What we saw

What we hoped to see in the literature were new ideas – different ways of defining impact for the different conditions we find ourselves in. And while we did see some, the majority of what we read can be described in one word: same. Same trends and themes, based on the same models, with little variation. We have highlighted four of those themes below.

Models, models, models!

Much of the literature focused on established evaluation models. These articles generally fell into three main categories: use cases, suggested improvements, or critiques.

By far, the most common model addressed in the research is Kirkpatrick, although, given the number of articles written on how to use it effectively, applying it well still appears to be a challenge. It was frequently called the “industry standard” (including by Kirkpatrick Partners themselves).1

Articles on Kirkpatrick appeared to be fairly passionate, either for or against. While many authors doubled down on it, we also ran across several articles, like this one or this one, that offer what we consider to be fair critiques of the model, including that there is a lack of empirical evidence for existing learning evaluation models and their tie to business results.

Other models, including Phillips’ Chain of Impact, Kaufman’s 5 Levels of Evaluation, Brinkerhoff’s Success Case Method, and Anderson’s Return on Expectation, among others, were also explored. In total, we looked at over 20 models. They are summarized in the table below.

Evaluation Model Summary

 

Each entry below lists the model, its author(s) and year, incredibly simplified steps, and where to read more.
Kirkpatrick 4 Levels of Training Evaluation

Kirkpatrick

Year: 1976. Termed “the industry standard” by many of the articles we read, Kirkpatrick’s four levels are used widely to determine learning effectiveness.

  • Reaction
  • Learning
  • Behavior
  • Results
“The Kirkpatrick Model”
Kaufman’s 5 Levels of Evaluation

Kaufman, Keller, Watkins

Year: 1994. Kaufman’s model adapts Kirkpatrick’s original model to include 5 levels and is used to evaluate a program from the employee’s perspective.

  • Input and process
  • Acquisition
  • Application
  • Organization Output
  • Societal outcomes
“What Works and What Doesn’t: Evaluation Beyond Kirkpatrick”
Success Case Method

Robert Brinkerhoff

Year: 2006. Particularly effective in assessing important or innovative programs. It focuses on looking at the extremes: the most successful and least successful cases, examining them in detail.

  • Needs assessment
  • Program plan and design
  • Program operation and implementation
  • Learning
  • Usage and endurance of training
  • Payoff
“Success Case Method”
Chain of Impact

Jack Phillips

Year: 1973. Adapts the Kirkpatrick model by adding a fifth step: ROI. The purpose is to translate the business impact of learning into monetary terms so that it can be compared more readily.

  • Reaction
  • Learning
  • Behavior
  • Results
  • ROI
ROI Institute
Value of Learning Model

Valerie Anderson

Year: 2007. Consists of a three-stage cycle applied at the organizational level. It is one of the few models that does not necessarily use the course or initiative as the unit of measurement. Anderson also introduced the term “Return on Expectation” as part of her work.

  • Determine current alignment against strategic priorities
  • Use a range of methods to assess and evaluate the contribution of learning, including learning function measures, return on expectation (ROE), return on investment (ROI), and benchmark and capacity measures
  • Establish the most relevant approaches for your org
“A new model of value and evaluation”
CIPP Model

Daniel Stufflebeam

Year: 1973. The framework was designed as a way of linking evaluation to program decision-making (i.e., making decisions about what happens to the program). It has a use case for resource allocation and/or cost-cutting measures. It utilizes the following areas:

  • Context
  • Input
  • Process
  • Product (or results)
“The CIPP Evaluation Model: How to Evaluate for Improvement and Accountability”
UCLA Model

Marvin Alkin

Dale Woolley

Year: 1969. Five kinds, or need areas, of evaluation, each designed to provide and report information useful for making judgments relative to these categories:

  • Systems assessment
  • Program planning
  • Program Implementation
  • Program Improvement
  • Program certification
“A Model for Educational Evaluation”
Discrepancy Model

Year: 1966. Used in situations where there is an understanding that a program does not exist in a vacuum, but instead operates within a complex organizational structure.

Program Cycle Framework:

  • Design
  • Installation
  • Process
  • Product
  • Cost-benefit
“ABCs of Evaluation”
Goal Free Evaluation

Michael Scriven

Year: 1991. Focuses on actual outcomes of a program rather than only those goals that are identified. Scriven believed that goals of a particular program should not be taken as a given.

  • Goals and objectives
  • Processes and Activities
  • Outcomes
“The ABCs of Evaluation”
LTEM: Learning Transfer Evaluation Model

Will Thalheimer

Year: 2018. Designed to help organizations get feedback to build more effective learning interventions and validate results.

  • Tier 8: Effects of Transfer
  • Tier 7: Transfer
  • Tier 6: Task Competence
  • Tier 5: Decision-making Competence
  • Tier 4: Knowledge
  • Tier 3: Learner Perceptions
  • Tier 2: Activity
  • Tier 1: Attendance
“The learning-transfer evaluation model: Sending messages to enable learning effectiveness.”

Justification as the goal

Much of the literature reviewed focused on utilizing learning evaluation, measurement, and analytics to either prove L&D’s worth to the organization or to validate L&D’s choices and budget. Words and phrases like “justify” and “show value” were used often.

Interestingly, according to David Wentworth at Brandon Hall Group, the pressure to defend L&D’s decisions and actions appears to come from the L&D function itself (44%) rather than from other areas of the business (36%).2 In other words, while business leaders may not be explicitly asking for “proof”, L&D departments most likely feel the need to quantify employee development in order to have that proverbial seat at the table.
The literature also focused heavily on return on investment, or ROI. How-to articles and research in this space continue to attempt to tie the outcomes of a specific program or initiative to financial business results.

Course-focused

Almost all of the literature we reviewed utilized the ‘course’ or ‘program’ as the unit of measurement. While several models address the need to take into account environment and other variables, they appear to do so in order to either control the entire experience or be able to isolate the “learning” from everything else.

To date, we have not been able to find any literature that addresses evaluating or measuring continuous learning as we understand it (i.e., individuals utilizing the environment and all resources available to them to continuously develop and improve). We feel that this is a shortfall of the current research and should be addressed.

Finally, the research focused heavily on learning from the L&D function’s point of view. Few authors appear to be looking at the field of learning evaluation/measurement/analytics from a holistic viewpoint. We expected to see more literature addressing L&D’s role in delivering the business strategy, or at least providing information to other functions that could be helpful to them in making decisions.

Aged ideas

While we do not disparage any of the great work that has been done in the area of learning measurement and evaluation, many of the models and constructs are over 50 years old, and many of the ideas are equally old.

On the whole, the literature on learning measurement and evaluation failed to take into account that the world has shifted – from the attitudes of our employees, to the tools available to develop them, to the opportunities we have to measure. Many articles focused on shoe-horning the new challenges L&D functions face into old constructs and models.
We realize that this last finding may spark a heated conversation – to which we say, GOOD! It’s time to have that conversation.

5 articles you should read

Of the literature we reviewed, several pieces stood out to us. Each of the following authors and their work contained information that we found useful and mind-changing. We learned from their perspectives and encourage you to do the same.

Article 1:  Making an Impact3

Laura Overton and Dr. Genny Dixon at Towards Maturity

“96% are looking to improve the way they gather and analyze data on learning impact. However, only 17% are doing it.”

Highlights:

  • Points out areas where L&D functions measure and what is important to them
  • Provides an interesting discussion on evidence to understanding impact
  • Shows compelling data about benefits of those who measure vs. those who guess
  • Gives some good hints for getting started

Towards Maturity’s 2016 report provides some interesting statistics about the world of learning metrics / measurement / analytics / evaluation. This article provides a sound platform for continued research on learning measurement and evaluation, as it provides a good summary of how learning leaders are currently thinking about the space.

Article 2:  Human Capital Analytics @Work4

Patti Phillips and Rebecca Ray at The Conference Board

“Aspirational organizations use analytics to justify actions. Experienced organizations build on what they learned at the aspirational level and use analytics to guide actions.”

Highlights:

  • Outlines an analytics maturity model for organizations to gauge their evolution when it comes to using data.
  • Provides a good discussion on “converting data to money”, or utilizing data to provide a comparison of cost savings
  • Identifies four key elements to help organizations make analytics work: Frameworks, Process Models, Guiding Principles, and Capability, all of which should be considered when putting together a learning strategy
  • Recounts some good examples and case studies

This article broadens the discussion about learning measurement to people analytics in general – something that L&D functions should be considering as they revamp their measurement and evaluation methods.

Article 3: The Learning-Transfer Evaluation Model5

Will Thalheimer at Work-Learning Research, Inc.

“For too long, many in the learning profession have used these irrelevant metrics as indicators of learning.”

Highlights:

  • Honest (yet biting) assessment of the current 4-level models and their success to this point in time
  • Section about the messages that measurement can send
  • Discussion on measuring negative impact, as well as positive impact
  • Introduction of the first new model for learning evaluation in about 10 years

Will addresses several points that have evolved our thinking. On top of that, Will is a witty writer who is easy to read and downright entertaining.

Article 4: Making data analytics work for you – instead of the other way around6

Helen Mayhew, Tamin Saleh, and Simon Williams

“Insights often live at the boundaries.”

Highlights:

  • Emphasizes the importance of focusing on “purpose-driven” data, or data that will help you meet your specific purpose.
  • Introduces the idea that large differences can come from exploiting and amplifying incrementally small improvements.
  • States that incomplete information is not useless and should not be treated as garbage – it has value and can be essential in helping people connect the dots
  • Provides a good discussion on using feedback loops instead of feedback lines

This article addresses data analytics in general, but provides several applicable points that L&D departments can incorporate.

Article 5: Leading with Next-Generation Key Performance Indicators7

Michael Schrage and David Kiron

“Measurement Leaders look to KPIs to help them lead — to find new growth opportunities for their company and new ways to motivate and inspire their teams.”

Highlights:

  • Provides a decent discussion on Key Performance Indicators and what they currently mean in organizations
  • Points to Chief Marketing Officers and their increasing accountability for growth-oriented objectives (we think CLOs and L&D in general are close behind).
  • Has an excellent discussion on leading versus lagging indicators, and the importance of both in “measuring”.
  • Recounts several good case studies that helped us think differently about what a KPI is and how it can be used

We found this article eye-opening. While it is not geared specifically to “learning”, it provides several, adaptable ideas that we feel will be important for next-generation learning measurement and evaluation.

Bonus: 4 Measurement Strategies That Create the Right Incentives for Learning8

Grovo

“As human beings, we are compelled to adapt our behavior to the metrics we are held against.”

Highlights:

  • Has a great discussion on how even the act of measuring learning can be a motivation.
  • Introduces several non-traditional ways to “measure” learning
  • Makes the point that measurement strategy should be part of the learning strategy itself, not merely a way to measure its effectiveness.

Yes, we know it’s a blog, and yes, we realize it was written by a vendor. But this piece made some interesting points – particularly, how what we measure impacts the business.

Overall Impressions

If we were to sum up all we read into a short statement, it would be this: L&D has a long way to go. That said, we are also hopeful. As L&D functions further integrate into the rest of the business, as tools for analytics and measurement get better, and as we begin to define new models that incorporate new ways of learning and new environmental variables, we can imagine a world, in the not too distant future, where we finally – after more than 50 years of trying – maybe crack this nut.

We would love to hear what you think – what did we miss? What else should we be looking at? Comment below.


D&I Tech: The Rise of a Transformative Market

Posted on Tuesday, February 5th, 2019 at 10:47 PM    

In this Research:

Diversity and inclusion is not a new idea for today's corporations, but over the last 18 months, the slow D&I burn has turned into a flashpoint, in part due to the #MeToo moment. Leaders across organizations are asking: "How can we systematically challenge the status quo, and build a more diverse and inclusive workforce?"


It is upon this foundational question that technology companies have begun to construct dozens of new and innovative ideas to support equity, diversity, and inclusion in the workplace – recognizing that new technological capabilities, paired with this increased urgency, represent an opportunity to address D&I challenges in novel ways.


Learning Technology Landscape

Posted on Friday, November 2nd, 2018 at 7:53 PM    

In this Research:

The learning technology landscape has gotten complicated in recent years. Whereas, in the past, learning and HR leaders simply had to choose which LMS to purchase, now they are faced with endless options. However, while these options increase complexity, they also offer new and more effective ways to develop and engage workforces.

This report dives into the learning technology landscape, offering insights on trends, growth, innovation, and the learning tech providers that make it up. Specifically, it addresses:

  • The latest in learning tech trends, growth of the market, and new innovations
  • A framework for enabling employee development with technology
  • Tips on making better purchasing decisions

Humanizing Learning

Posted on Saturday, September 15th, 2018 at 4:36 PM    

For almost all organizations, digital transformation is inevitable. Google Search data shows a 900% increase in searches for the term “digital transformation” over the last 4 years. Yet the focus on digital transformation has created a lot of panic. In navigating the differing opinions about what the future holds – AI, automation, and robots – we find it interesting that there has been little discussion about the uniquely human characteristics that have made our species successful in the first place.

This research reveals how forward-thinking leaders are leveraging the characteristics that make us uniquely human to make their organizations more competitive. How can we nurture these uniquely human traits in workplace culture to create a high-performing organization?


D&I Tech: A Question Becomes a Quest

Posted on Tuesday, September 11th, 2018 at 4:14 PM    

Back in March 2018, I posted to LinkedIn what I thought would be a rather quickly forgotten question: What technology had others seen that focused on improving diversity and inclusion (D&I) in companies? The response was huge, with lots of people I'd never met sharing how their company was using technology to tackle diversity and inclusion in ways that I'd not even dreamed of. Clearly, something big was happening – so the question turned into a quest to understand this new market.

We've wrapped up the first 2 phases of that quest with the publication of our research on D&I tech, Diversity and Inclusion Technology: The Rise of a Transformative Market, which we at RedThread Research completed in partnership with Mercer.

Let me take a step back and tell you why I was even asking the question. Years ago, I'd asked folks what vendors they used to help with D&I. Most people just scratched their heads, and said, “Huh? I don’t understand what you mean.” So, I went about my merry way working on a study that ultimately focused on D&I practices, with no technology component.

Post #MeToo. Post many public D&I missteps that cost executives their jobs and companies their stock prices. I thought, surely, now, there must be technology focused on this space. But I just hadn’t read that much about it.

I started talking to a lot of people about this topic and found that it resonated with many of them. One of those people was Carole Jackson, a former colleague and current Principal at Mercer, focused on their When Women Thrive research. We found a shared passion for this topic and we agreed to partner on this research to bring a heightened understanding of the D&I technology market to both vendors and customers.

So, what began as my simple question ended up turning into a quest to find as many technology vendors focused on D&I as possible – and document who they are and what they do. Why? Three reasons:

  1. This market is exploding with new vendors – Our study has nearly 100 in it (and that's in just this 1st phase of the research) and many of them have only started within the last 3 years. Given this, organizational leaders need to better understand the innovative technology solutions available, and technology vendors need to see where opportunity for new products and solutions exists.
  2. D&I technology has the potential to be a disruptor – Structural biases hide in our processes and behaviors and, applied correctly, D&I technology can enable scalable, consistent treatment of people decisions while also alerting users to previously hidden patterns of bias. That said, our glasses are not so rosy as to blind us to the potential limitations and even detrimental impacts of D&I tech.
  3. Too little information is available on the market – The folks over at Gartner have written a report on this topic, but not everyone can access that. Further, focusing on the question of “If There’s Too Much Diversity Tech?” doesn’t give folks insight into the range and capabilities of D&I tech. We wanted to do an in-depth study that would help vendors and buyers truly understand the market.

To that end, our study answers 5 questions:

  1. What is D&I technology?
  2. Why are D&I technologies coming to market right now?
  3. What are the benefits and potential risks?
  4. What types of D&I technologies exist?
  5. Who are some of the players in the different D&I technology categories?

This report is both a qualitative and quantitative study that summarizes the D&I tech market landscape, based on a vendor and customer survey, customer interviews, and the feedback we received. It also includes an interactive market map tool that allows readers to quickly understand which vendors are in the market.

THANK YOU! To everyone – practitioners and vendors alike – for participating in this research! We hope you'll continue to be part of the D&I tech conversation going forward!


Snacking can make you fat (headed)

Posted on Wednesday, May 9th, 2018 at 8:36 PM    

Let me start by saying that I have argued the other side of the case I’m about to make.

Also, I see great value in just-in-time learning, feedback in the moment, and the ability to access the exact piece of information you need at the exact moment you need it. Digitizing and chunking content that we used to put into two- or three-day workshops is wonderful, and, with the use of technology, allows us to build really personalized development experiences for employees. I think it’s great for developing skills and improving performance.

I do wonder, however, about the broad stroke with which the idea of “snackable” learning is discussed and applied. Is there a place for it? Absolutely. Have we relied on courses only for too long? For sure. Is making something shorter the key to solving all employee development problems? Nope.

In the past, we needed employees to complete certain tasks in a certain way in order to increase the efficiency of our organizations. Today, business is moving so fast that we need them to think outside the box, be agile, and improve the system as they go. We need them to think critically. And often, to teach employees to do this, long form works better. Some things need to be presented in context. Sometimes a story works better than bullet points. And sometimes we should encourage employees to spend an hour thinking rather than surfacing an answer immediately.

Ironically, instead of a long-form blog about this topic, I’m going to provide a bulleted list of reasons that long form may be a good addition to the L&D quiver of tools:

  • Jeff Bezos says so. In his 2018 annual letter, Jeff Bezos reiterated his rule that PowerPoint is banned from executive meetings. He maintains that “narrative structure” is more effective because stories inspire, bullet points don’t. Instead of presentations, he asks “presenters” to craft a six-page narrative (no bullets and real sentences). The team spends 30 minutes reading in silence and then they discuss.
  • “Snackable” often creates soundbites and echo chambers instead of real learning. So personal example here: I posted an article and quoted a stat this week about organizations that measure learning impact. I didn’t quote it correctly, which gave the impression that the stat was global, not local to India. One person corrected me (bless him). Everyone else shared it. There is opportunity for deeper context and higher precision in long form that isn’t available in the soundbite.
  • There is a case to be made for “effortful” learning. Mary Slaughter and David Rock from the NeuroLeadership Institute wrote an article in Fast Company this week about achieving “desirable difficulty”. They posit that the brain needs to feel some discomfort when it’s learning, much like your muscles need to feel some level of discomfort when you’re training. Long form often requires more effort.
  • Executives prefer long form for business insights. A study done last year by Forbes and Deloitte lists the top two preferred formats of executives for business insights as feature-length articles and reports, and business books. Interestingly, while they are very pressed for time, the C-Suite prefers longer forms for learning. Bruce Rogers, Chief Insights Officer at Forbes Media says: “CXOs need to think and act strategically, which is why they more often opt for longer pieces that take them from hypothesis, through case studies, to conclusion, and are based on credible data.”

I’m interested in your thoughts – how often are you incorporating long form into your employee development plans, and/or are you seeing a resurgence?


Learning Technology, or just Technology?

Posted on Thursday, April 12th, 2018 at 8:28 PM    

I spend one day a week tutoring immigrants who have left their home countries in search of a better life. They dedicate two hours, two days a week to learning English so that they are able to be active members of society. Sometimes I feel like the only two productive hours I spend each week are the two I spend with them. Incidentally, it’s one of the best tutoring programs I’ve been a part of and they’re always looking for volunteers. If you live in the Salt Lake City area, you should volunteer: Guadaloupe Schools Adult Education Program.

But I digress. The thing that constantly surprises me about the time I spend with that group is how much I actually learn. Sure, they correct my horrible Spanish and weed out the Dominican slang I picked up, but more than that, many of the ideas I decide to pursue from a research perspective start with the things I notice with this tutoring group.

One of those things actually made my list of topics for this year: adapting technologies that are not learning technologies for learning. I started thinking about it because of an experience I had with a guy in my tutoring group about 6 months ago. We were working on the pronunciation of a particularly difficult word. He’d say it, then I’d say it, then he’d say it, and on and on. Finally, he stopped, pulled out his cell phone, pulled up Google Translate, and said the word into his phone until it recognized the word he was trying to say.

Of course, it worked. Better than the conversation with me was working. It was a tool he knew was available in his environment, it was familiar to him, and I got the feeling that this wasn’t the first time he had done it.

In the past year or so, I’ve seen technologies that were meant for some other purpose adapted for employee development. More than that, I’ve seen tech organizations start to play in the learning space. Google, Apple, Microsoft, and most recently, Amazon have started to offer technologies geared specifically toward learning and development. YouTube has long been a staple for lifelong learners (I used it recently to change my car headlamps).

In light of that, I have some high-level, totally unscientific advice for organizations trying to figure out what their learning tech stack should look like:

  1. Consider everything. By this I go beyond the technologies that are specific for learning and that show up on your L&D balance sheet. Need some ideas? Start with Jane Hart’s annual Top 100 Tools for Learning list.
  2. Notice what people are already using. In many cases, it makes much more sense to commandeer something that is familiar and accepted than to try to find a learning tech knock-off. Is WhatsApp or Slack a staple when it comes to sharing knowledge and expertise? Embrace it.
  3. Experiment. We talk to lots of companies and the more evolved ones tend to reconsider technology at more frequent intervals. They’re constantly trying stuff out, updating, and sunsetting tech that no longer works for them.
  4. Be flexible. There is absolutely a place for large, enterprise solutions, and most organizations use at least one as a hub for other technology used for learning. But don’t be afraid to piece together best of breed solutions to meet your needs.
  5. Use data. Data exists somewhere in your company that can give you a better idea of what is being used and what is not. Find it. Use it. Make decisions from it.
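
Point 5 is the most concrete of the list, so here is a minimal, hypothetical sketch of what "use data" might look like in practice. The log schema and tool names below are invented for illustration; a real version would read from whatever SSO, proxy, or platform logs your company actually keeps.

```python
# Hypothetical sketch: mining an existing access log to see which tools
# employees actually use for development. Schema and data are invented.
from datetime import date

# Each record: (employee_id, tool, date_accessed)
access_log = [
    ("e01", "video_library", date(2018, 4, 2)),
    ("e02", "video_library", date(2018, 4, 3)),
    ("e01", "course_catalog", date(2018, 4, 5)),
    ("e03", "chat_qna", date(2018, 4, 5)),
    ("e02", "video_library", date(2018, 4, 9)),
]

# Distinct users per tool is often more telling than raw hit counts.
users_per_tool = {
    tool: len({emp for emp, t, _ in access_log if t == tool})
    for tool in {t for _, t, _ in access_log}
}
print(sorted(users_per_tool.items(), key=lambda kv: -kv[1]))
```

Even a rough count like this can show which tools to embrace (point 2) and which to sunset (point 3).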

And finally, the ask. Because this is an ongoing point of study for me, I’d love to hear from you if your org uses something unconventional for employee development. I’m also super interested in how organizations are putting technologies together to provide the right kind of experience and to get the right kinds of results. If you’ve got stories, I’d love to hear from you!


L&D Trends for 2019: The Agile Workforce

Posted on Monday, April 2nd, 2018 at 8:21 PM    

Simply put, L&D’s sole reason for existing is to ensure a skilled workforce. Hard stop.

In a world where businesses change so rapidly, employees move around frequently, and roles are constantly being adjusted, the job is now harder; but not impossible. New mindsets, technologies, and ways of working are creating opportunities for innovation in the employee development space.

Our learning and career research for the next six months will focus on creating an agile workforce. And as we set out to determine what exactly we should study, five fairly significant trends emerged.

  • The rise of reputation
  • Using tech to do completely different things
  • A more integrated breed of L&D function
  • Data as a development enabler
  • Learning organisms

Trend #1: The rise of reputation.

To this point in history, organizations have generally determined who needs what training, or who gets what role, based on a very one-dimensional view of the employee – typically what can be found on a resume or in an employee profile: level, education, role, tenure, or leadership responsibility. Learning and performance initiatives, not to mention readiness discussions about subsequent roles, are often triggered by one or more of these variables.

However, this is no longer adequate for two reasons. First, organizations are developing more open career models and encouraging movement outside of traditional career paths. Second, employees find development opportunities on their own – both inside and outside of the organization. As a result, most companies lack a good understanding of current skills and knowledge of employees, let alone the direction they’d like to take in their careers.

Organizations are beginning to augment information found on resumes or in employee profiles with other information that indicates the reputation employees have developed. For example, organizational network analysis, or ONA, provides information about an employee’s reach within the organization, which can indicate a person’s influence, the resources they have at their disposal, and what parts of the organization hold their interest. Several vendors, including Degreed, Pathgather, and Edcast, among others, build transparency about networks into their systems.
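
To make the ONA idea concrete, here is a minimal, hypothetical sketch that uses degree centrality – simply counting distinct connections – as a rough proxy for "reach." Real ONA tools draw on far richer signals (email metadata, meeting patterns, message frequency); the employee names and edges below are invented.

```python
# Minimal sketch of one ONA signal: degree centrality over a hypothetical
# collaboration network. Employees and edges are invented for illustration.
from collections import defaultdict

# Each pair means the two employees collaborate regularly.
edges = [("ana", "ben"), ("ana", "cho"), ("ana", "dev"),
         ("ben", "cho"), ("dev", "eli")]

degree = defaultdict(int)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# Rank employees by number of distinct collaborators ("reach").
reach = sorted(degree.items(), key=lambda kv: -kv[1])
print(reach[0])  # ana is the most connected person in this toy network
```

A high score here might flag an informal influencer who never shows up in the org chart or the LMS.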

Other reputation markers or indicators, shared through other systems like LinkedIn, GitHub, Yelp, or, if you’re in academia, RateMyProfessor, provide external data about how employees are perceived among their peers, or how they may need to develop to be more successful.

Trend #2: Using tech to do completely different things.

Most organizations use learning technology to automate things that they are already doing. A classic example of this is moving a course online rather than teaching it in the classroom. It’s both cheaper and more accessible, but chances are, there are few differences otherwise.

But we think there is more. In the past three months or so, we have seen organizations and tech vendors break out of traditional learning molds and begin to do completely different things through new technologies or combinations of technologies. While AI, VR, wearables, and the like were considered too futuristic even a year ago, we are finding applications that help organizations personalize development experiences and build skills in ways they haven't before.

Additionally, organizations are beginning to leverage technologies originally intended for other purposes for employee development. Slack and other messaging tools, as well as business tools like O365 (have you seen the resume helper that pops up when it thinks you're drafting a resume?), are able to integrate opportunities for growth at the point of need. In fact, some of the biggest threats to the learning technology space will most likely come from the outside.

In the next few months, we’ll be talking about these technologies and their applications. Our goal is to help leaders categorize, understand, and make better decisions about the technology they use for development.

Trend #3: A more integrated breed of L&D function.

In the past, L&D functions have tended to be fairly siloed and often internally focused. Many have used vocabulary, metrics, and infrastructures that make sense only to them. This has made progress difficult for the L&D function, but it has also hobbled the larger organization. The function's tendency to remain separate has slowed its ability to react to changes in strategy and align with other business functions.

Lately, however, a new breed of L&D function has (finally) begun to emerge. Often led by leaders without an L&D pedigree, these functions focus on alignment to the business strategy and to external customer needs.

One way they do this is by integrating more tightly with other people practices. This integration enables systemic solutions – particularly those that result in a culture that supports the strategy. For example, if an organization competes on customer service, then how employees are rewarded, trained, recruited, and led should all be aligned to delivering great customer service.

These L&D functions are also aligning more closely to the rest of the business by viewing learning as something that happens inside the context of the work itself. Using the work for development simplifies the learning process (because context is built right in) and allows the organization to develop individuals at the same time it is improving the work, which leads to a more agile workforce.

Trend #4: Data as a development enabler.

While we’re at the beginning of this movement, we are seeing organizations start to use data to personalize development. Latent data collected from existing work tools – such as email, cloud storage, and calendars – provides rich and useful information.

External vendor partners, such as Cultivate AI and Keen Corp, are leveraging natural language processing (NLP) to turn latent data into data that can be analyzed. Analysis can determine politeness, engagement between individuals, and even bias. This information allows the system to provide in-the-moment feedback, making employees aware of language choices or biases that may hold them back while they can still correct their work.
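To illustrate the general idea, here is a toy lexicon-based scorer of the kind that could flag tone in workplace messages. This is only a sketch: the lexicons and scoring are made up, and real vendor systems use far more sophisticated NLP models than word counting.

```python
# Hypothetical sketch: a toy lexicon-based politeness scorer.
# The word lists are invented for illustration only.
POLITE = {"please", "thanks", "thank", "appreciate", "could"}
BLUNT = {"asap", "immediately", "wrong", "unacceptable"}

def politeness_score(message: str) -> float:
    """Return a score in [-1, 1]; positive means more polite markers."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    # Net count of polite minus blunt markers, normalized by length.
    hits = sum(w in POLITE for w in words) - sum(w in BLUNT for w in words)
    return max(-1.0, min(1.0, hits / max(len(words), 1) * 5))

print(politeness_score("Could you please send the report? Thanks!"))  # positive
print(politeness_score("This is wrong. Fix it immediately."))         # negative
```

A production system would replace the lexicon with a trained language model, but the pipeline is the same shape: turn free text into a score, then surface that score as feedback at the moment of writing.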

This type of data can be a game changer for L&D. It moves them beyond smile sheets and completion data and helps them create systems that deliver business results, not just fulfill learning objectives. We’ll most likely be talking about this quite a bit in the following months.

Trend #5: The learning organism.

In more evolved organizations, learning has taken on a life of its own; these organizations have, in essence, become organisms that learn, grow, and develop based on their habitat and their ability to make use of it. The more in sync these organisms are with their habitat, the more quickly they are able to react to change, take calculated risks, and evolve as necessary.

We’ve noticed that more evolved organizations view the habitat in at least four buckets: the company’s attitude toward learning in all of its forms, the ability to use work and tasks as the main conduit for development, the infrastructure and technology in place, and the actual physical and virtual environments. Evolved organizations emphasize alignment of these four areas, making them into a cohesive environment that encourages the organism to find what it needs, when it needs it.

In the coming months, we’ll be exploring this idea further, providing case studies, insights, and best practices for building the right habitat for development.

Do these ideas resonate? Did we miss something big?


Talent Management Trends for 2018: Accelerating Toward a More Responsive Organization

Posted on Tuesday, March 27th, 2018 at 4:06 PM    

With unemployment numbers currently extremely low, many organizations are searching for ways to respond better to their employees' needs and are increasingly investing in this space. This is important, as this accelerating investment is driving several other substantial trends. Below are the top five on our radar right now.

  • Converging people practices – but they need to create business results (not just a common employee experience)
  • Designed networks – seeing the world and creating it as we want it to be
  • Diversity and inclusion – now core HR responsibilities
  • A new era in people data – with great power comes great responsibility
  • Leading in a time of artificial intelligence and other advanced technologies – developing new leadership muscles and reflexes

Trend #1: Converging people practices that create business results

Many organizations are trying to be more responsive to employees’ needs. However, if talent organizations operate in silos (e.g., separate performance management, learning, leadership, and other talent management activities), it is difficult to adequately understand employees’ needs and respond appropriately. Understanding this, many leaders are talking about “talent management and learning converging” and creating a “consistent employee experience.”

There is a lot good about this approach. Many organizations are trying to holistically understand employees’ experiences and bring together their talent practices in more integrated ways. Companies are using a variety of tactics, such as design thinking and agile development methods, as well as new tools, such as employee listening and pulse survey technologies (vendors include Glint, TinyPulse, and Waggl, among others), to create programs and experiences that are much more holistic, consistent, and responsive to employees. This is good.

This approach can use some refinement, however, when it comes to why organizations are creating a “consistent employee experience.” The purpose cannot be just to “treat employees like customers” or to increase engagement scores or happiness (not that we have anything against engagement or happiness). Rather, the purpose of an employee experience should be to reinforce the organizational activities and behaviors necessary to drive business results. For example, if an organization needs to focus on innovation, then its “consistent employee experience” should focus on driving innovative behaviors. The organization should recruit, develop, assess, promote, and reward for the characteristics that drive innovation. A consistent employee experience should exist to keep the business laser-focused on success.

Trend #2: Designed networks – seeing the world and creating it as we want it to be.

A lot has been written about the importance of networks in organizations, but leaders are beginning to design for them more intentionally. For example, Cisco implemented Team Space to help leaders better understand their teams and how to work with them more effectively. Vendors have also taken up the charge, with organizations such as Polinode, Syndio Solutions, Swoop Analytics, and TrustSphere, and consultants such as Rob Cross, offering solutions that help companies understand the networks in their organizations and how to design them intentionally. Some learning vendors, such as Degreed, EdCast, and Pathgather, as well as performance vendors such as Zugata, are also beginning to integrate network data into their solutions to make them more responsive and personalized.

This focus on designed networks will likely accelerate, as new data make clear the impact of individuals’ context on their performance and how changes to networks and teams can drive impact for the organizations. Yet, a focus on networks and teams will force a re-thinking of talent management activities. For example, how should an organization approach learning, succession management or performance management, when the focus is first on the network, not the individual?

Trend #3: Diversity and inclusion – now core HR responsibilities.

Recent social movements, epitomized by the #MeToo movement, have highlighted that many HR departments have not responded adequately to issues of diversity and inclusion (D&I) in their organizations. As employees expect their organizations to be more responsive, this will include D&I. Expect to see D&I more integrated into sourcing, talent selection, performance management, learning, leadership development, succession management, and other practices. Given the positive impact of creating an inclusive culture on business outcomes, this pressure to integrate is good. Also, we are at the beginning of a rush of technologies that will help leaders understand opportunities to behave in different ways, not just count representation numbers (for example, ADP, Cultivate AI, Entelo, Limeade, SAP SuccessFactors, Syndio Solutions, and Zugata all have solutions focused in this space).

Trend #4: A new era in people data – with great power comes great responsibility.

People have been shouting about Big Data from the mountaintops for so long that it is hard to hear the messages about it anymore. That said, technology solutions are beginning to capture pre-existing data that could not be analyzed before – and organizations are starting to take action on those insights. As mentioned above, there are a host of vendors focused on organizational network analysis. Other vendors (such as Fama, Cultivate AI, Glint, and IBM) offer natural language processing that translates text into data, which can help identify trends in text feedback.

But, as they say, with great power comes great responsibility. While there is a rise in powerful tools to analyze new data types, there is also a lot of discussion about data privacy and ethics. This is even more so the case now, with the recent Cambridge Analytica story — and that company's ability to predict behaviors by combining personality, relational, and activity data — coming to light. Europe is much further ahead of the United States when it comes to data and privacy rules, with the European Union’s General Data Protection Regulation (GDPR) coming into effect this May. This topic of data ethics and transparency will likely accelerate dramatically across the next year.

Trend #5: Leading in a time of AI and other technologies

As others have written, rapid changes in technology, as exemplified by artificial intelligence, automation, and cognitive computing, represent large-scale opportunities and disruptions for organizations. Much less discussed is how leaders’ behaviors need to change to be more responsive to employees' needs.

There are at least three questions to examine here. First, how can these advanced technologies enable leaders to be more effective and responsive than before? For example, technology from Cultivate AI and Keen Corporation analyzes sentiment, tone, and response time in email and chat interactions, enabling leaders to understand when there has been a change from historical levels. Other technology, such as that from Bunch AI, allows organizations to analyze historical and current communications in Slack and compare them to common cultural models and norms. The technology then provides suggestions on how to evolve culture, along with tools to monitor it on a continuous basis. While these tools (and many others not mentioned here) are potentially powerful, leaders need to understand how and when to use them effectively. Unfortunately, there is currently little information on this topic.

Second, how do these advanced technologies change the experience of leaders’ “followers”? Historically, at least some portions of leaders’ power came from information asymmetry – leaders had information that their followers did not. However, information is increasingly ubiquitous, and with the rise of technologies such as those cited in the paragraph above, information and insights may become known to followers before or at the same time as to leaders. Further, as exemplified by “fake news,” the information followers receive may not be accurate, but followers may not understand this. Finally, with the increasingly sophisticated analysis and communication tools available, followers may create insights or find someone who has knowledge that exceeds that of their leaders.

Third, given these changes, how do leaders need to behave differently? We posit that a big part of the shift will come from leaders diminishing or relinquishing a “command and control” approach in favor of a “curate and coach” approach to leadership. While information is critical, understanding context, mapping potential actions and their consequences, determining appropriate communication approaches, and connecting followers to others within their network will become increasingly critical for leaders. Doing this effectively will require leaders to develop new muscles and reflexes that many lack today.

This represents our initial thinking on what’s changing with talent management today. What do you think? Is there anything you especially agree – or disagree – with from this list? Are you a vendor offering solutions in this space (if so, let me know!)? What other suggestions do you have? We’ve put together this conjoint analysis survey, where you can vote on the top trends for talent management, make suggestions of others to add to the list, and see what others think. You can also feel free to email me at stacia at redthreadresearch.com or make a comment in the comment field below.

RedThread Research is an active HRCI provider