Posted on Thursday, February 24th, 2022 at 7:52 PM
Organizations are investing more than ever in diversity, equity, inclusion, and belonging (DEIB) efforts. We see an opportunity for L&D functions to do the same, beyond simple diversity training. With their influence on culture and reach across the enterprise, L&D functions are well-positioned to improve the DEIB culture in their organizations.
And L&D functions want to do more on DEIB. In LinkedIn Learning’s 2021 Workplace Learning Report, 64% of L&D professionals globally and 73% in North America said DEIB programs were a priority. Our own experience tracks with this trend: RedThread community members are asking more and more about DEIB and learning.
But L&D functions seem to struggle to identify the best ways to help. That’s why we launched a research study focusing on this question:
What are the most impactful things L&D functions can do to help build a robust DEIB culture in their organizations?
To get a grasp on the current DEIB and learning conversation, we reviewed nearly 100 articles, books, podcasts, and reports. We expected, frankly, to find a lot about diversity training and not much else. And, as expected, there was a lot about diversity training. But there were more interesting ideas, too.
This short article summarizes the key ideas we found, including:
- 4 themes from the literature
- 1 hidden gem
- 5 articles that caught our attention
- 6 additional articles to check out if you have time
What we found: 4 themes from the literature
The literature has lots of ideas about DEIB and learning. These ideas fell into 4 themes:
- L&D is tangential to the DEIB conversation
- L&D is focused on improving diversity training
- Developing underrepresented groups is a common DEIB strategy
- L&D functions need to take a hard look at themselves
L&D is tangential to the DEIB conversation
In the literature we reviewed, DEIB or org psych professionals sometimes wrote about diversity training or unconscious bias programs. But not many L&D professionals ventured into the broader DEIB conversation.
Additionally, a few studies we ran across revealed that L&D functions are on the periphery of DEIB efforts. In one survey by i4cp, only 25% of respondents said L&D is “heavily tasked” with efforts to advance diversity and inclusion goals.
Many articles noted that L&D and DEIB teams often do not work together as closely or as effectively as they could. As a result, L&D functions are sometimes left out of key DEIB strategy, goal-setting, and planning decisions. These pieces argued that if L&D functions want to do more on DEIB, they need to partner better with stakeholders across the business. For example, Matthew Daniel, principal at Guild Education, wrote:
"Rather than siloing objectives onto separate teams, CLOs and CDOs can accomplish more by working together, while also measuring and tracking progress at the same time."
Other pieces echoed Daniel’s point about measuring and tracking progress. They suggested that L&D functions should know how success on DEIB is defined, tracked, and measured in their organization. Then, they said, L&D should align the learning strategy to those goals and metrics.
L&D is focused on improving diversity training
We expected to see many articles arguing that compliance-focused, event-based DEIB training doesn’t work. And there were lots of articles about diversity and unconscious bias training. To our surprise, however, these articles took the ineffectiveness of these training programs as a given. They often cited the 2016 article, “Why Diversity Programs Fail,” as proof.
There were 2 broad threads in this portion of the literature:
- Training effectiveness: Ideas about making diversity training more effective in changing employee behavior. For example, articles mentioned using AR / VR simulations to encourage empathy and help employees practice skills.
- Inclusivity: Suggestions for making all training (especially diversity training) more inclusive. For example, the literature suggested soliciting diverse perspectives when designing training and content.
Some articles did explore additional learning methods that might be used to develop employees’ DEIB skills. Of these, many mentioned coaching managers on being more inclusive leaders. Others discussed microlearning and “nudges” that space learning over time. But these articles did not explore ways for L&D functions to improve DEIB outside of creating learning programs.
Developing underrepresented groups is a common DEIB strategy
The literature agreed that organizations should develop individuals from underrepresented groups. As one study by McKinsey pointed out, employees in underrepresented groups report having fewer development opportunities than other employees. Several articles argued that active and intentional support of underrepresented groups could help reduce this gap.
The literature also noted that employees from underrepresented groups are more likely to use and benefit from structured programs. There were many ideas about programs that might enable these employees to develop and advance. Some of the ideas mentioned included:
- Employee Resource Groups (ERGs)
- Work-study programs
- Work assignments (e.g., international postings)
- Rotational schemes
- Tuition reimbursement
- Talent marketplaces (to enhance visibility and access to opportunities)
- Intrapreneurship programs
- Communities of practice
- “People advisors” who provide career coaching
- Mentoring and sponsorship
In reviewing this theme, we noticed a disconnect: Many articles pushed for more development of underrepresented groups. But others noted that L&D isn’t heavily responsible for DEIB efforts (as we saw in the first theme of this review).
These threads seem contradictory. If developing underrepresented groups is so important, why isn’t L&D more central to DEIB strategies? The literature didn’t answer this question directly. But it’s interesting to note that many of the above programs aren’t traditionally L&D’s responsibility (e.g., rotations, ERGs). We think that may be the reason so many authors emphasized the need for L&D functions to partner with key stakeholders, as mentioned above.
L&D functions need to take a hard look at themselves
A few articles in the literature asked L&D functions to do some serious self-reflection. They are not the bulk of the literature—not by a long shot. But we are calling them out as a theme because they highlighted an issue with substantial DEIB implications: L&D’s own lack of diversity. These articles—especially the ones from authors Gena Cox and Katy Peters, Ave Rio, and Maria Morukian—noted that most L&D functions are majority white and majority women (except at senior levels). Most L&D professionals hold advanced degrees. That means:
White women with advanced degrees dominate L&D. At more senior levels, white men with advanced degrees do.
According to these articles, non-diverse L&D functions might find it harder to drive DEIB efforts and make employee development more diverse, equitable, and inclusive. For example, some articles noted that a lack of diversity might allow bias to creep into the ways that L&D functions tend to:
- Define, prioritize, and measure skills, aptitude, and abilities
- Use data to make decisions about learning
- Decide which development opportunities to offer
- Choose learning methods to invest in
These articles explored how the L&D function might need to change itself to address potential biases. They are a great start to a broader conversation about all the ways L&D functions can contribute to DEIB efforts in their organizations.
Hidden gem: A systems approach to DEIB and learning
We found a handful of articles that took a systemic view of how L&D functions might influence DEIB. They thought more broadly about how to make learning more equitable and inclusive, rather than just about the programs L&D functions might create.
J.D. Dillon, CEO of learning vendor Axonify, wrote:
"Restoring learning equity requires a fundamental mindset shift. Rather than relying on programs as the basic unit of learning, professionals should adopt a systems approach."
By a “systems approach,” these articles meant looking at things like accessibility and opportunity:
- Who is offered access to development opportunities, and why?
- How might access to development opportunities vary based on an employee’s location, access to tech, or ability to use nonworking hours for development?
- Are learning opportunities easy for all employees to find? Are they widely and effectively marketed to all employees?
We appreciated these prompts to think about how L&D functions can ensure that all employees have equitable access to development opportunities. And we believe a systemic lens will reveal many additional ways that L&D functions can make learning more diverse, equitable, and inclusive. We plan to investigate this systemic approach in more depth as part of this research.
What caught our attention
Of the literature we reviewed, several pieces stood out to us. Each of the articles below contained information that we found helpful and / or intriguing. We learned from their perspectives and encourage you to do the same. Click on the titles to go to the full articles.
"The biggest opportunities for TD professionals to make a difference lie in three important but often overlooked segments: knowledge management, career and leadership development, and coaching."
This article has detailed, practical advice for L&D professionals who want to do more on DEIB, above and beyond DEIB training. It also has some great examples of what good looks like—and what good doesn’t look like.
- Training courses are one part, but not the cornerstone, of a strong DEIB strategy.
- L&D functions can use their knowledge management expertise to make tacit DEIB knowledge more explicit, storable, and shareable.
- Inclusive, equitable employee development programs require DEIB and L&D staff to work together.
- Coaching can build more diverse, inclusive, and equitable workplaces by equipping managers with DEIB skills.
"What if the L&D professionals who measure achievement of… skills understand the day-to-day experience of only a subset of their colleagues? What if the career progression decisions from those measurements perpetuate some of the same distorted effects that are now evident in educational assessment?"
This article examines how L&D’s potential biases and blind spots might lead to inequitable employee development. It makes a case for a proactive, systemic approach to overcoming those biases.
- The L&D profession lacks racial and ethnic diversity, potentially leading to blind spots, biases, and inequity.
- The way skills are currently defined, prioritized, and measured may lead to biased outcomes.
- Overcoming L&D’s blind spots requires a systemic approach that re-examines many long-standing L&D practices, including how skills are defined and how data are used.
- A proactive approach to addressing L&D’s blind spots will help make workplaces more inclusive.
"Our research made clear that who you know is as important—often more so—than what you know when it comes to rising through the ranks."
Organizational network analysis (ONA) can reveal who knows whom. It can uncover who has access to informal networks and sources of information about development opportunities. Using ONA, L&D functions can also identify members of marginalized groups who could be invited to targeted development opportunities.
- One study revealed that men’s informal relationships with their male managers could explain nearly 40% of the gender pay gap.
- Women are less likely to be at the center of the networks that matter: knowledge, innovation, and critical decision-making networks.
- L&D functions can impact DEIB by codifying and sharing the networking strategies of people with solid and diverse networks.
- L&D functions can use ONA to assess the effectiveness of specific diversity training and other learning programs.
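To make the ONA idea concrete, here is a deliberately generic sketch (not the methodology of the article above) of one of the simplest ONA measures: comparing average degree centrality across two groups in a communication network. All names, edges, and group labels below are invented for illustration.

```python
# Hypothetical ONA-style check: given a "who communicates with whom"
# edge list (e.g., from email or calendar metadata) and a demographic
# attribute, compare average degree centrality across groups.
from collections import defaultdict

edges = [("Ana", "Ben"), ("Ana", "Cy"), ("Ben", "Cy"),
         ("Cy", "Dee"), ("Dee", "Eli"), ("Eli", "Fay"), ("Dee", "Fay")]
group = {"Ana": "A", "Ben": "A", "Cy": "A",
         "Dee": "B", "Eli": "B", "Fay": "B"}

# Count each person's distinct contacts (their network degree)
contacts = defaultdict(set)
for a, b in edges:
    contacts[a].add(b)
    contacts[b].add(a)

n = len(group)
# Degree centrality: the share of all other people someone is connected to
centrality = {p: len(contacts[p]) / (n - 1) for p in group}

# Average centrality per group; a persistent gap can flag a group
# sitting at the edge of the informal network
for g in sorted(set(group.values())):
    members = [p for p in group if group[p] == g]
    avg = sum(centrality[p] for p in members) / len(members)
    print(f"group {g}: average degree centrality = {avg:.2f}")
```

Real ONA work uses richer measures (betweenness, brokerage) and far larger networks, but the basic comparison follows this shape.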
"‘Here we are in Taiwan, in Asia, where they were doing training and learning way before the US, and the two major keynoters they got were white guys over 60 from New York,’ Masie said."
This article is packed with quotes from L&D and DEIB experts. These experts explain why L&D functions must reflect the employee population in terms of race, ethnicity, gender, background, etc.
- The number of people of color in L&D does not reflect the communities L&D serves.
- L&D functions are often asked to be the ambassadors of organizational culture, which is difficult if they aren’t representative of the workforce.
- Thought leaders in L&D are often older white men, reflecting the people who pioneered the field in the 1960s and 1970s.
- To increase diversity, L&D functions need to be intentionally inclusive about whom they highlight as thought leaders.
- L&D’s role in DEIB must be part of a larger organizational strategy.
"When asked if their company offers support for women from executives and middle managers, 72% of male respondents say yes, compared with only 54% of women."
This report helps companies identify the specific diversity and inclusion initiatives—including learning initiatives—that offer the greatest payoff for gender equity. It breaks initiatives into 4 helpful categories: Proven Measures, Hidden Gems, Baseline Measures, and Overrated Measures.
- Proven measures are valued by women and known to be effective by leaders. For example, a proven measure related to L&D is sponsoring women at scale.
- Hidden gems are highly effective initiatives that many organizations should pursue. For example, a hidden gem related to L&D is offering professional development for underrepresented groups.
- Baseline measures are basic steps that all organizations should do, but that don’t have a transformative effect on women’s daily experience. For example, a baseline measure related to L&D is mentoring women.
- Overrated measures are seemingly promising efforts that often do not lead to real cultural change. For example, an overrated measure related to L&D is one-time diversity training sessions.
Additional articles to check out
- "Are learning equity issues affecting your company?" J.D. Dillon, TD Magazine, 2021.
- "Improving Workplace Culture through Evidence-Based Diversity, Equity, and Inclusion Practices," S. Creary, N. Rothbard, and J. Scruggs, The Wharton School of the University of Pennsylvania, 2021.
- "How internal talent marketplaces can help overcome seven common DEI strategy pitfalls," M. Heiskell, D. Kearns-Manolatos, and M. Rawat, Deloitte, 2021.
- "Assignments are critical tools to achieve workplace gender equity," E. Macke, G. Gall Rosa, S. Gilmartin, and C. Simard, MIT Sloan Management Review, 2022.
- "How does your company support ‘first-generation professionals’?" M. Burwell and B. Maldonaldo, SHRM, 2022.
- "Providing performance feedback to support neurodiverse employees," M. Hamdani and S. Biagi, MIT Sloan Management Review, 2022.
Posted on Friday, February 18th, 2022 at 6:53 PM
One year and 2 weeks ago, Microsoft launched its employee experience offering, Viva. We wrote about it back then and explained why it was a big deal for the HR tech market. Yesterday, Microsoft announced that it will transition Glint, an employee engagement solution, from LinkedIn (which acquired Glint in 2018) to become a core part of Viva. The company is set to bring Glint completely into Viva in 2023.
Keeping up with the momentum from 2021, the HR tech market continues to provide us with a show in 2022.
Just last week we wrote about Perceptyx, another employee engagement and experience vendor, acquiring Cultivate, a digital coaching tool, to further enhance its listening capabilities. The news of Glint integrating into Viva is not nearly as surprising or market-shifting (Glint was already integrating into one of the Viva modules as a partner, providing joint customers access to its analytics insights). Still, it is notable for a few reasons, not least that it makes Viva a serious player in the analytics space.
Before we dive into the details of what this means for customers, Glint, Microsoft, and the HR tech market, let’s do a quick recap of what Microsoft Viva is.
What is Microsoft Viva?
Built on top of Microsoft 365 and Teams, Viva is an employee experience solution that offers four modules (see Figure 1), which combine existing Microsoft offerings into a single solution:
- Connections – Creates a “digital campus” where all policies, benefits, communities, and other centralized resources are available.
- Insights – Provides employees with insights on how they work, and gives managers and leaders information about their teams, burnout risk, after-hours work, etc.
- Topics – Leverages Project Cortex to identify knowledge and experts across the organization, generating topic cards, topic pages, and knowledge centers (including people – not just information) for others to access – a “Wikipedia of people and information” for the org.
- Learning – Integrates LinkedIn Learning (formerly Lynda.com), Microsoft Learn, and other external sources (including LMSs or LXPs such as Cornerstone and Skillsoft) into a single location within Microsoft.
Figure 1: Summary of Microsoft Viva | Source: Glint, 2021.
When Microsoft launched Viva 1 year ago, Glint was integrating into the “Insights” module of the offering as a partner. It provided analytics based on combined Viva and Glint data via Power BI dashboards in Viva for Microsoft customers, and pulled Viva data into its own dashboards for Glint customers. In the near future, Glint will be completely integrated into the core Viva offering, meaning customers should be able to access insights in the tools where they work, i.e., Teams.
Let’s break this down and find out what it means for everyone involved.
What does it mean for the customers?
Overall, this should be good news for both Glint and Microsoft customers for a few reasons.
- Customers should be able to access employee insights more seamlessly. Currently, customers need both Viva Insights and Glint to access insights based on a combination of employee perception data (Glint) and behavioral data (Viva Insights) through dashboards in Viva or Glint. Once Glint is integrated into Viva, users should be able to receive these insights more easily and quickly.
- The annual engagement survey from Microsoft will be moved over to Glint and called “Employee Signals” to reflect a more continuous, always-on listening approach. This should allow Microsoft customers to capture employee perception and behavioral data holistically all in one place, without needing to add any additional vendors.
- The integration will provide users with a stronger set of capabilities that deliver feedback, recommendations, and action items within the productivity tools they already work in, thus enabling leaders to respond to needs in a timely manner.
What does it mean for Glint?
According to Microsoft, existing Glint customers will be able to continue using the current Glint offering delivered by LinkedIn. For now, new customers can purchase the existing Glint service on a standalone basis through LinkedIn or bundled with Viva through Microsoft.
Glint will continue to benefit from combining passive data with its perception data and providing valuable insights to its customers around engagement, well-being, and burnout. Specifically, the integration will help Glint with:
- Access to rich, continuous passive behavioral data that will deepen its insights on employees and managers. For example, we expect Glint should now be able to access metrics around the learning sources and knowledge topics being accessed (through the Topics and Learning modules). Those metrics could be put to many uses, such as understanding DEIB, performance management, and learning and development.
- Having a greater impact on their customers. Once fully integrated, Microsoft HR customers should be able to access the deeper analytics side of the product provided by Glint, while managers and leaders should be able to receive recommendations and feedback in the apps they use. This should result in more action taking, wider action adoption, and greater impact in general among the customers.
What does it mean for Microsoft Viva?
Given that Glint was already integrating into Viva’s Insights module, a full integration into the entire suite makes sense for Microsoft. Overall, it does a few things for Viva.
- Equips Microsoft with enhanced employee listening. By rolling its engagement survey into Glint, Microsoft will be able to leverage Glint’s impressive employee listening capabilities for its customers through Glint’s various surveys.
- Provides more value for potential customers. Viva just became a whole lot more appealing to existing and potential customers who might have been considering adding Glint to their people analytics ecosystem.
- Adds a more focused approach to supporting managers and employee development. Glint has long been building capabilities that empower managers to engage and develop their employees through feedback and recommendations. As part of Viva, these capabilities should make the solution more attractive to leaders.
What does it mean for the people analytics and employee experience tech market?
There are 2 big implications of this move for the tech market.
- At its inception, Microsoft Viva was essentially a modern take on the intranet, with the addition of analytics. With a complete integration of Glint and its focus on supporting managers and employees by providing them data-based insights and recommendations, Viva should be able to become a serious player in the people analytics space.
- The move highlights the rise of employee engagement and experience vendors that can bring in rich passive data to supplement employee perception data. Perceptyx and Viva have built into their products an impressive suite of capabilities that provide a more complete picture of not just what employees are feeling, but when and why they are feeling it, and what leaders can do to address challenges as quickly as they arise.
We will have to wait another year to see what the final integration looks like. It will be interesting to see whether insights from Glint will be connected with those from Ally.io, an OKR (objectives and key results) tool that Microsoft acquired in 2021 and integrated into Viva, and how they will be delivered to users, especially given that Glint has a performance management product as well.
Glint has always applied a thoughtful approach to its product and the organizational needs it should help address. Given that, it’s safe to say that Microsoft Viva will benefit significantly from making Glint a part of it. We look forward to seeing the final product.
Posted on Tuesday, February 15th, 2022 at 1:49 PM
This morning, Perceptyx, an employee engagement and experience provider, announced it acquired Cultivate, an employee listening tech vendor. This acquisition is representative of a broader shift we’ve been talking about within the employee engagement and experience market. As such, we’ll cover what is happening in this space and then dive into the specifics of this acquisition.
Evolutions in the Employee Engagement & Experience Market
As we mentioned in our 2021 study, People Analytics Tech: Deep Dive into Employee Engagement & Experience, vendors in the employee engagement and experience space offer capabilities that enable customers to do four things:
- Collect and analyze perception data directly from employees
- Collect and analyze passive data on interactions/work/environment
- Integrate and analyze different data that drive employee engagement and experience
- Highlight areas of concern and recommend actions to users
The first and last items are where employee engagement and experience vendors have typically focused. Over the last few years, we’ve seen these vendors increasingly dip into the other two.
In addition to this evolution, the employee engagement and experience market has been converging with others. As we wrote in our summary of Peakon’s 6-month integration into Workday, the following areas have been coming together for years:
- Employee engagement/experience
- Performance management
- People data integration
When you look at the intersection points of all these areas and vendors, you get a chart that looks something like Figure 2 (note, we are in the process of conducting our 2022 People Analytics Tech study, so the vendors’ names will be updated shortly).
As we said before, we are seeing so much overlap because vendors and customers realize that it is no longer realistic to separate these concepts. How do you understand engagement without also understanding performance feedback? And how do you give performance feedback without also giving recognition? And how do you understand any of this without solid people analytics?
Given this backdrop, let’s turn to this acquisition.
Who is Cultivate?
Cultivate is an AI-powered coaching platform that absorbs data from other systems – such as email – and provides feedback to managers about how they are communicating with their team members. It uses a combination of data and insights to make managers aware of behaviors and suggest ways to change them. For example, a manager may get a prompt telling her that she is sending less than 5% of emails to her team after work hours—which is good—but that prompt may also identify the individuals to whom she is sending after-hours emails. In our 2021 study, Coaching Tech Landscape: Humans and Robots, we call Cultivate a “coach on the shoulder.”
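To make the “coach on the shoulder” idea concrete, here is a deliberately simplified sketch of the kind of check described above. This is not Cultivate’s actual implementation; the working hours, timestamps, and the 5% threshold are all assumptions for illustration.

```python
# Hypothetical "coach on the shoulder" check: measure what share of a
# manager's emails were sent after working hours and surface a private
# nudge if the share exceeds a threshold. Not Cultivate's real logic.
from datetime import datetime

WORK_START, WORK_END = 9, 18  # assumed working hours (9am-6pm)

def after_hours_share(send_times):
    """Fraction of emails sent outside the assumed working hours."""
    after = sum(1 for t in send_times
                if t.hour < WORK_START or t.hour >= WORK_END)
    return after / len(send_times)

# Invented send timestamps for one manager over a day
sent = [datetime(2022, 2, 14, h) for h in (9, 10, 13, 15, 20, 22)]
share = after_hours_share(sent)

# Only the individual manager would see this; per the article, companies
# and managers' managers get high-level summaries, not individual data
if share > 0.05:
    print(f"Nudge: {share:.0%} of your emails were sent after hours.")
```

A production system would, of course, work from real message metadata, respect time zones, and deliver the nudge inside chat or email rather than printing it.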
Some of the things we like most about Cultivate include (check them out in our PAT tool):
- The use of AI to make specific individualized suggestions on how to improve digital behaviors and relationships
- Personalized nudges delivered before specific events, such as 1:1s, via email or chat, instead of requiring managers to access dashboards
- Partnered with Harvard Business Publishing (HBP) and mapped HBP’s library of leadership content to Cultivate signals, to pair suggestions and opportunities with HBP’s bite-size learning content on how to take action
Why did Perceptyx buy Cultivate?
It may not be immediately apparent why an employee engagement and experience vendor bought a vendor that’s a “coach on the shoulder,” but there are two really smart reasons for this buy:
- The underlying technology
- Building out and augmenting Perceptyx’s overall product offering
Cultivate has some of the most sophisticated NLP and AI capabilities on the market to understand employee tone, sentiment, and behaviors. This capability can be integrated throughout Perceptyx’s offering. This will widen and strengthen Perceptyx’s ability to deliver insights on qualitative data.
Further, this underlying technology will allow Perceptyx to better offer the other two items in Figure 1:
- Collect and analyze passive data on interactions/work/environment
- Integrate and analyze different data that drive employee engagement and experience
Specifically, Cultivate will enable Perceptyx to move from offering point-in-time perception data to truly “ubiquitous” listening by ingesting passive data on employees’ interactions, looking for patterns, and providing employees with feedback. Incorporating these data into Perceptyx’s overall data analytic capabilities will make the solution more robust. It can start to give the why around employee engagement and experience, not just the what.
Finally, this acquisition – along with the July 2021 acquisitions of Waggl and CultureIQ – enables Perceptyx to better build out its comprehensive employee engagement and experience offering.
As shown in Figure 3, they now offer:
- Ask: Employee surveys (the historical Perceptyx survey product)
- Sense: Lifecycle surveys (again, historical Perceptyx) and always-on listening (Cultivate)
- Dialogue: Crowdsourced idea-generation and voting (Waggl)
- Develop: Multi-rater feedback (historical Perceptyx) and recommendations (Cultivate via HBR mapping)
All of this is anchored by the Perceptyx People Insights Platform, which provides a single location to integrate and understand all these different insight “channels.” It also provides a way to prioritize next steps and act.
We are fortunate to know both companies well and are very positive about this acquisition. We have long thought highly of Joe Freed, CEO of Cultivate, his team, and the technology they’ve built, and are pleased to see them align with Perceptyx, another company we respect immensely. This acquisition will round out Perceptyx’s offerings while enabling the Cultivate team to significantly expand their impact in the market. We think this is a sound decision for all involved and will provide significant benefits to customers.
That said, I would likely as soon stop breathing as stop having concerns and questions. Given that, here’s what we are going to be watching:
- Ethics and monitoring: We all know there’s a fear out there of extensive monitoring by companies – especially with so many people working from home – and lots of people are searching for monitoring tools.
(Just Google “big brother monitoring” and see what the autofill provides. Don’t worry; I will wait here – go do it. Oh, you also found “big brother monitoring tool free download”? So surprising!)
Now, Cultivate has historically avoided this issue by making their tool opt-in – so employees choose whether to use the software – and ensuring that only employees get individualized data and recommendations. Companies and managers only get high-level summaries of what is happening. So far, this seems to be working: our understanding is that, to date, fewer than 5% of people have either opted not to participate or opted out after participating.
Perceptyx will have to continue to hold that line on opt-in and privacy with this acquisition, and they have indicated to us that they plan to do so. Further, there will be questions about how passive data should be combined with engagement data. The ethics and privacy considerations around these questions are not necessarily defined, let alone answered. This will be something for Perceptyx to work through in tight concert with their customers in the coming months.
- Acquisitions everywhere: Perceptyx has acquired three companies in nine months, which is a rapid clip for any company, let alone one that has fewer than 500 employees. Perceptyx is aware of the challenges this can create and is rapidly integrating its acquisitions and reconfiguring teams. Hence, there aren’t historical Perceptyx and new acquisition teams, just new teams. Even so, the frequent evolution of teams, additions of new people, combinations of cultures, etc., will be challenging, and making everyone work together well will be a constant effort for months to come.
- Slowing evolution to PM and recognition offerings: As mentioned in Figure 2, we see vendors increasingly adding performance, recognition, and learning to their engagement offerings as customers look for less fragmented solutions and easier data integration across core talent management activities. We understand that Perceptyx was also headed down this path before these acquisitions and that these events will slow down its addition of these offerings. On the one hand, this will allow Perceptyx to focus more specifically on employee engagement and experience; on the other, it may prevent them from keeping up with some of their competitors. We will watch how these acquisitions impact their customer growth rates and satisfaction.
Despite these concerns and questions, the bottom line on this acquisition is that it is a good thing. Congratulations to the Cultivate and Perceptyx teams on an exciting new future ahead of them!
Posted on Wednesday, February 9th, 2022 at 2:44 PM
At this point, the business case for diversity, equity, inclusion, and belonging (DEIB) is clear. Our own research (see Figure 1) shows the relationship between having a strong DEIB culture, and critical individual and performance outcomes.1
Yet, for years, the representation of diverse populations in organizations improved almost imperceptibly.
Then we had a global pandemic and the rise of a social justice movement, sparked by the murder of George Floyd. Along with that came the heightened awareness that the pandemic was impacting diverse populations much more—particularly women and people of color, who were dropping out of the workforce at higher rates than other populations. As a result of this confluence of events, organizations began making big promises on DEIB in the summer of 2020.
When this happened, one of our first questions was how organizations would show that they’d made good—or at least made progress—on those commitments. While DEIB metrics—measurements designed to understand DEIB—are the obvious answer, how to select, collect, use, and maintain those metrics is not so clear.
Thus, this research initiative on DEIB metrics and analytics was born. The first article in this series, “DEIB Analytics: A Guide to Why & How to Get Started,” provides leaders with a plan on how to begin using DEIB metrics and analytics. We’ve shared an 8-step guide with details on the actions and considerations that organizations need to take to effectively implement DEIB metrics.
This article: An essential guide to DEIB metrics
This report focuses more narrowly on the appropriate metrics and analytics for DEIB. We aim to provide DEIB leaders, people analytics practitioners, HR business partners, and workforce planning and talent management leaders with:
- A foundational understanding of the different metrics that can be used to measure and track their DEIB performance
- Insights on how those different metrics might vary, depending on their org’s sophistication with DEIB and analytics
This article is based on a wide range of information, including our research on:
- People analytics technology2
- DEIB analytics3
- DEIB strategies4
- DEIB technology5
- A literature review of DEIB and analytics6
- Interviews with ~20 people analytics and DEIB practitioners
Our research focuses specifically on the people within an organization’s existing workforce. We know a number of other DEIB metrics exist that orgs should also consider, such as those which apply to their supply chain, community efforts, ESG (environmental, social, and governance) requirements, etc. While critical, those areas are outside the scope of this report.
We would also like to mention that this report is the first of its kind, in that it attempts to provide a holistic look at all talent-related DEIB metrics. Any first try will miss some critical elements and we acknowledge this report may be incomplete. We invite you to share any suggestions, feedback, or additions you think appropriate by emailing us at [email protected].
The DEIB space is evolving quickly, and we will only make progress by putting out our best ideas and amending them quickly as new information becomes available. Thank you for being part of that process and pushing forward toward greater opportunities for all.
Let’s start our essential guide by defining our terms (see Figure 2).
Why are DEIB metrics & analytics important?
Some of the common reasons why leaders start to focus on DEIB metrics and analytics include:
- Creating a clear business case for DEIB
- Measuring the return on investment (ROI) of DEIB expenditures
- Tracking the impact of critical DEIB initiatives
In addition to these, a few more reasons why orgs should use DEIB metrics and analytics include:
- Busting myths or addressing anecdotes that may or may not be true
- Checking assumptions about DEIB
- Meeting consumer, investor, and employee expectations when it comes to progress on DEIB
While these are all good reasons to use DEIB data, one of the most compelling motivations for why DEIB is critical was articulated by one of our interviewees:
“Companies have been setting diversity goals for decades but have struggled with ‘goal-getting’—meaning the clear accomplishment of those goals—because of a lack of feedback and data to help them get after those goals every day. Without any feedback on progress, companies lose sight of the goals.”
—Phil Willburn, Head of People Analytics & Insights, Workday7
Why do orgs find DEIB data difficult to use?
Many leaders struggle to use DEIB data for reasons such as the following (see Figure 3):
- Challenges in identifying and using appropriate metrics. Historically, very few orgs have attempted to track metrics for DEIB and even fewer have ventured beyond collecting diversity data. Often, leaders are unsure which metrics can and should be measured for DEIB. Even if they’re able to identify them, leaders then often face challenges around tracking and integrating the data.
- Legal, security, and privacy issues. DEIB data involves sensitive information—and this comes with legal and security challenges around data collection, storage, and usage. As a result, some orgs hesitate to collect and use it. Additionally, employees may be hesitant to provide it, due to data privacy and access concerns.
- Poor alignment with goals. Orgs find it challenging to use the data if there’s poor or no alignment between the data collected and the overall DEIB goals the company wants to achieve. As a result, there can be a sense of helplessness, which makes the data less useful.
- Data responsibility issues. Because DEIB data can reside in multiple systems under several functions (e.g., HR, D&I, IT, sales), there can be a lack of clarity around who is primarily responsible for the data and how / when it can be shared.
- Data interoperability issues. Related to the previous point, orgs often find it challenging to use data collected in one system within another, due to integration issues and the capabilities of the tech solutions in place.
For this article, we focus on the first bullet to help orgs identify the range of metrics they can use.
“When you have members of a minority group who are leaving at a higher rate, that’s telling you something is wrong, and it helps steer you to where the problems are. It needs to be measured at quite a low level in the company because that’s the way you find where your hot spots are.”
—Fiona Vines, Head of Inclusion and Diversity and Workforce Transition, BHP8
Clarifying diversity metrics
As we highlight in our report “DEIB Analytics: Getting Started,” the essential first step to creating diversity metrics is collecting appropriate demographic data. Essentially, the data collected should allow orgs to answer 3 questions:
- What does our current workforce look like across different levels (hierarchy) and functions / business units?
- Who are we hiring (internally and externally) across different levels?
- Who is leaving the org and at which level(s)?
It’s important that leaders not only look at simplistic diversity numbers, such as gender or race / ethnicity—they also need to consider multilevel diversity, known as intersectionality, such as Black women or gay Asian men. This additional analysis helps leaders understand their workforce at a more nuanced level, and make better recommendations and changes.
Many orgs track basic diversity numbers: 96% of U.S. companies report the gender representation of their employees at all levels and 90% report gender representation at senior levels.9 However, far fewer orgs look at intersectionality: Only 54% of companies track gender and race / ethnicity—such as Black or Latina women in senior leadership.10
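For teams with a people analytics toolkit, the step from single-dimension counts to intersectional counts is small. The sketch below (in Python with pandas; the column names and toy data are invented for illustration, not a standard HRIS schema) computes the share of each gender-by-ethnicity group within each level:

```python
import pandas as pd

# Hypothetical HRIS extract; column names and values are illustrative only
workforce = pd.DataFrame({
    "gender":    ["F", "F", "M", "F", "M", "M", "F", "M"],
    "ethnicity": ["Black", "White", "Black", "White", "White", "Asian", "Black", "White"],
    "level":     ["IC", "Senior", "IC", "Senior", "Senior", "IC", "IC", "IC"],
})

# Single-dimension view: overall gender mix
gender_mix = workforce["gender"].value_counts(normalize=True)

# Intersectional view: share of each gender x ethnicity group within each level
intersectional = (
    workforce.groupby(["level", "gender", "ethnicity"]).size()
    .div(workforce.groupby("level").size(), level="level")  # share within level
    .rename("share")
    .reset_index()
)
print(intersectional)
```

The single-dimension view can look balanced while the intersectional view reveals that, for example, one group is concentrated at junior levels—which is exactly the nuance the paragraph above describes.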
Figure 4 is a list of common demographic data that we’ve seen orgs collect (for a more comprehensive list of data that could be collected, please see our definition in the earlier section). It’s important to note the significant legal limitations in different countries as to which of the following can be collected and stored. Your org’s legal counsel should always be involved in determining which data to collect.
While comparatively easy to collect and analyze, orgs should be wary of trying to do everything at once when it comes to diversity metrics. Leaders should first figure out the immediate challenges or business issues they want to solve for and identify the appropriate metrics accordingly.
Examples of diversity metrics
Figure 5 offers a list of the metrics that orgs can use to measure diversity. Many orgs already collect most of these metrics through their human resource information system (HRIS) or applicant tracking systems (ATS). By adding a demographic lens to these metrics, orgs can quickly understand the state of diversity within the org.
Using diversity data to improve hiring11
As part of its diversity goals, an industrial manufacturer wants to achieve 50% female parity in leadership roles by 2030 and create a globally diverse workforce with inclusive leaders and teams. To do so, the company needed an accurate picture of its current workforce diversity mix and recruiting pipeline.
Working with a technology provider, the company looked at its recruiting pipeline to better understand how women and minorities move through the full process, from recruiter review to meetings with the hiring manager. A review of the talent acquisition process revealed that the number of women applicants was disproportionately lower than that of their male counterparts. Additionally, as women moved through the hiring process, they were more likely to be dropped during the interview stage.
To tackle these challenges, the company implemented:
- Programs for hiring managers, including unconscious bias training
- Workshops on inclusive conversations to enable a better hiring experience for women and minority candidates moving through the process
As a result of these actions, the company is in a better position to meet its 2030 goals. It’s also working to attract more women and minority job applicants through strategic partnerships with the Society of Women Engineers and the National Society of Black Engineers, among others.
Understanding equity metrics
Equity metrics can help orgs understand the effectiveness of their processes, and identify unfair or biased systems, practices, and policies. Research conducted in 2021 revealed that when employees are treated fairly, they’re:12
- 8 times more likely to look forward to going to work
- 3 times more likely to have pride in their work
- 4 times more likely to want to stay a long time at their company
Equity metrics can be measured from data collected via several sources, such as:
- Learning and development data
- Performance management data
- Employee engagement / experience data
Ensuring fairness in the distribution of resources, opportunities, and access can help leaders address existing systemic inequities within their orgs. The point to note here is that the distribution needs to be fair, not equal. The difference between these two concepts is shown in Figure 6.
Thus, the goals of measuring and tracking these metrics should not be to ensure equality or sameness for everyone, but rather to:
- Detect areas in which systemic inequities exist
- Identify differences in capabilities, resources, and needs
- Implement systems and processes that take these into account
While orgs have a strong case for creating a fair and equitable environment, many struggle to do so. For example, our 2021 study on performance management trends revealed that only 48% of employees believe their performance evaluation process is fair and consistent.13 As orgs continue to manage unique needs and challenges for different employees, leaders will increasingly need to address issues around managing fairness and equity across varied employee experiences.
Source: Robert Wood Johnson Foundation, 201714
Examples of equity metrics
Below is a list of metrics that orgs can use to understand, measure, and track equity. All metrics should be analyzed by the different demographics collected by the org to understand the differences in opportunities, access, and remuneration for various groups.
Using people analytics to create a more equitable environment
- Uber.15 Shortly after the start of the COVID-19 pandemic, Uber’s People Analytics team found that employees with children younger than 5 years of age scored lower than the company average on engagement and satisfaction metrics. To help provide them with the support they needed, the company added some flexibility options to help those employees balance childcare with work.
- A midsized U.S. law firm.16 Upon auditing its performance evaluations, the firm found that only 9.5% of people of color received mentions of leadership in their performance evaluations—more than 70 percentage points lower than white women. The firm changed to an evaluation form that broke down job categories into competencies and asked that ratings be supported by at least 3 pieces of evidence. It also developed a 1-hour workshop to teach everyone how to use the new form.
As a result of these changes:
- Comments with constructive feedback for people of color increased from 17% the year before to 49%
- Women also received more constructive feedback (from 10.5% the previous year to 29.5%)
Identifying inclusion metrics
After diversity, inclusion is the most common area that organizations tend to measure. According to a 2018 study, a little more than 50% of orgs measured inclusion.17 While the focus and urgency around this area has increased over the years, few orgs are doing anything beyond tick-the-box exercises.18
“Let's say that the engagement score for our company is high at 80%, and that makes us happy. And then you realize that 80% of your employees are White—which means that you’re not really hearing the voice of those under-represented groups. Inclusion analytics is about pulling that out, and making sure you have a good sense of where everybody's falling on all of your core metrics.”
—Hallie Bregman, PhD, Global Talent Strategy and Analytics Leader19
There are a few reasons why orgs should focus on understanding and measuring inclusion. Orgs with an inclusive culture:20
- Are twice as likely to report meeting business goals over the last 3 years
- Are 81% more likely to indicate high customer satisfaction
- Have employees that are 45% more likely to stay
- Have employees that are 2 times more likely to give a positive Net Promoter Score® (NPS)
If these reasons weren’t enough, the volatility of 2020 and 2021 has resulted in many companies facing tough questions around their efforts in this area. According to a recent analysis of S&P 500 earnings calls, the frequency with which CEOs talk about issues of equity, fairness, and inclusion on these calls has increased by 658% since 2018.21
Inclusion metrics can help orgs understand whether employees feel:
- Accepted by others in the workplace
- Integrated into and a part of the wider organization
- Respected for their work by others
As alluded to above, orgs can typically approach inclusion metrics in 2 ways—employee perception data and objective data. We explain the differences between the 2 in Figure 8.
Examples of inclusion metrics
Figure 9 offers a list of metrics that orgs can use to understand, measure, and track inclusion. These include metrics that directly impact an employee’s sense of inclusion (e.g., mentor relationships and strength of connections with others), as well as some not-so-obvious metrics that can drive inclusion (such as the average distance between office and home, which can adversely affect employee experience).
Real World Threads
Understanding and embedding inclusion within everyday behaviors
When it comes to inclusion analytics, an international electronics company believes in embedding inclusion in everyday behaviors, activities, and processes across the company. It’s been collecting data and doing the research for more than 5 years to understand the key behaviors that impact inclusion at the organization. Because of its groundwork, the company was able to identify 4 metric areas that they needed to track and analyze on a regular basis:
- Net Promoter Score
- Job fit
- Employee engagement score
- Intention to turnover
The people analytics team approaches these metrics in 2 ways, by:
- Checking in with new hires and collecting the data from them
- Making sure that all employee surveys administered by the org contain questions that tie into these metrics
By collecting this information regularly, the company has been able to identify pain points and concerns experienced by diverse populations, especially in the current times—and plan initiatives and appropriate decisions around topics, such as vaccinations, return to offices, rollouts of wellbeing programs, and measurement of the financial impact of those programs.
Specifically, the company has extended its remote working policy after determining that a return to office would disproportionately impact its female workforce and potentially increase their turnover by 33%. It also rolled out a $300 COVID Wellbeing credit that can be used towards children’s tutoring costs, wellbeing app subscriptions, tax preparation costs, etc. to help employees—especially parents and caregivers, who are more impacted by the pandemic. Additionally, the company re-examined and adjusted its communication and approach on vaccine education as a result of employee feedback.
In addition to these measures, the people analytics team has also been able to use insights from inclusion analytics to identify areas in which different groups need support. For example, the company found that its millennial workforce needed and wanted greater support for financial planning as part of its benefits program. The company added a specific financial wellbeing offering to its annual benefits open enrollment to support Millennials and Gen Z.
In another example, the company was able to build more inclusive policies around statutory and floating holidays that take into account the fact that employees with different religious backgrounds might want to take different holidays.
As a result of these efforts:
- Net Promoter Score of the company increased by 7%
- Confidence in Leadership increased by 8%
- Employee Engagement increased by 5%
Defining belonging metrics
While belonging is conceptually close to inclusion, it’s important that orgs pay equal attention to measuring and understanding it. We explain how belonging differs from inclusion in Figure 10. A high sense of belonging among employees can result in:
- An increase in employee happiness and employee engagement, which in turn impacts employee retention22
- A significant increase in job performance23
- A reduced turnover risk and a decrease in employee sick days24
Analytics based on belonging metrics can serve as a leading indicator of critical diversity outcomes as well. Specifically, belonging metrics can help orgs to:
- Gain a deeper understanding of the sense of security experienced by employees
- Find out if employees feel connected with the org’s values and purpose
- Bolster their ongoing efforts around inclusion and equity
“When someone is experiencing a sense of Belonging, they feel freer, they feel more creative and their opportunity to potentially have an impact at work is significantly increased.”
—Kate Shaw, Director of Learning, Airbnb25
Examples of belonging metrics
Figure 11 offers a list of metrics that orgs can use to understand, measure, and track belonging. While some metrics speak to belonging directly (e.g., a belonging index as part of an engagement survey), others should be used in combination with one or more additional metrics to gain a better understanding. For example, by looking at metrics around the number of employee resource groups offered and their participation rates, orgs can try to understand whether employees feel supported. Employee feedback comments specific to these topics can provide even more context on the underlying issues.
Real World Threads
Using nontraditional metrics to add depth to understanding26
A number of companies look beyond the obvious metrics and data to gain a deeper understanding of the current state of DEIB within their orgs. For example:
- Cindy Owyoung, the Vice President of Inclusion, Culture, and Change at Charles Schwab, looks at the metrics around growth and vitality of the company’s employee resource groups (ERGs). By tracking metrics such as the number of ERGs and the number of participants in them, the company is able to really understand the work Schwab’s ERGs are doing and whether they are providing value to their members.
In addition, these metrics can also be indicative of whether employees have the support they need to be able to participate in the ERGs and do the work that needs to be done.
- Zoom Video Communications is another company that places emphasis on such metrics. According to Damien Hooper-Campbell, the company’s Chief Diversity Officer, these nontraditional metrics “serve as bellwethers.” The company looks at metrics around its ERGs and keeps track of the number of allies who are active in them.
According to Hooper-Campbell, “If you have a women’s employee resource group, do you have any men who are part of it? How many non-Latinx folks are part of your Latinx employee resource group and are contributing to it, or coming and listening to it?”
Such metrics can offer a more nuanced understanding of the extent of support experienced by different groups across the org.
DEIB metrics: Strengths & limitations
DEIB metrics are most effective when multiple types of metrics are combined to gain a clearer picture of DEIB holistically. (See Figure 12.) For example, by combining inclusion metrics with equity metrics, orgs can understand not only that different groups may be feeling less included, but also the specific reasons (e.g., unequal development opportunities or biased performance reviews) for it.
Using data sources for DEIB
Now that we’ve covered the specific metrics, let’s look at the data sources orgs can use for them. Orgs should keep a few things in mind when using such data:
- All data should be looked at with a demographic lens. For example, the number of trainings accessed by the workforce means little unless it’s analyzed to see whether, for example, white women access training more often than Black women.
- Data are more powerful when combined with other data. For example, data from the HRIS that shows exit rates should be combined with data from exit interviews, surveys, and employee comments on external review websites.
- Connectivity between data sources is essential to using the data effectively. Data interoperability, or the ability of data from different systems to work together, is a necessity for orgs to drive DEIB. As such, they should look for tech and tools that enable it.
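As a minimal illustration of the second and third points above, the sketch below (the field names and toy records are assumptions, not any real system’s schema) joins HRIS exit records to exit-survey responses on a shared employee ID, and flags exits that have no survey context:

```python
import pandas as pd

# Hypothetical extracts from two separate systems; real systems rarely
# share a schema out of the box, which is why a common key matters
hris_exits = pd.DataFrame({
    "employee_id": [101, 102, 103],
    "exit_date":   ["2021-06-01", "2021-07-15", "2021-09-30"],
    "ethnicity":   ["Black", "White", "Asian"],
})
exit_survey = pd.DataFrame({
    "employee_id":  [101, 103],
    "would_return": [False, True],
    "comment":      ["Limited growth opportunities", "Relocation"],
})

# A left join keeps every exit even when no survey response exists;
# indicator=True adds a _merge column showing which rows matched
combined = hris_exits.merge(exit_survey, on="employee_id", how="left", indicator=True)

# Exits with no survey context are a data-quality gap worth tracking
missing_survey = combined[combined["_merge"] == "left_only"]
print(missing_survey["employee_id"].tolist())
```

The join is trivial once a shared key exists; the hard part, as the bullets note, is agreeing on that key and on access rules across the functions that own each system.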
- The partnership between DEIB and people analytics functions is critical. As we mention in our report “DEIB Analytics: Getting Started,” DEIB and PA leaders often come from different backgrounds and parts of the org, which means partnership challenges may exist that must be addressed. The insights and expertise of both groups are necessary to use and interpret DEIB metrics effectively.
Common data sources for DEIB
Figure 13 shows that most of the data sources can be used for more than one DEIB area.
Beginning the DEIB metrics journey
Orgs at the beginning of their DEIB journey should try to answer the question: What’s the current state of DEIB within the org? As such, they should focus on 2 things:
- Understanding the state of diversity
- Identifying “low-hanging” challenges—areas that need attention and are easy to quickly start working on
When it comes to selecting metrics, orgs should start with the basics, like:
- Getting their basic demographic data in order
- Measuring metrics around headcount, retention, and turnover to understand diversity
- Leveraging employee perception data—such as engagement surveys, feedback, and focus groups—to understand how different groups perceive DEIB at the org
Orgs should ensure that the selected metrics are clearly tied to overall strategy and that processes exist to track their progress.
A people analytics leader we spoke to mentioned creating a Python script to pull different metrics that they’re already collecting around talent acquisition, internal mobility, performance, engagement, and exit rate to understand where the biggest gaps are between different employee groups. This allowed them to quickly identify those areas, start working on them, and track progress over time.
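We don’t have that leader’s actual script, but the idea can be sketched roughly as follows; the metric names, groups, and numbers are illustrative assumptions:

```python
import pandas as pd

# Illustrative per-group metric summary; in practice these rates would be
# pulled from the ATS, HRIS, and engagement survey systems
metrics = pd.DataFrame({
    "group":          ["Group A", "Group B", "Group C"],
    "hire_rate":      [0.12, 0.10, 0.05],
    "promotion_rate": [0.08, 0.07, 0.03],
    "exit_rate":      [0.10, 0.12, 0.21],
}).set_index("group")

# For each metric, the gap is the spread between the best- and worst-off group
gaps = (metrics.max() - metrics.min()).sort_values(ascending=False)

# Start with the metric showing the widest gap between groups
biggest_gap_metric = gaps.index[0]
print(gaps)
print("Start with:", biggest_gap_metric)
```

Ranking metrics by between-group spread is one simple way to operationalize the “find your biggest internal gap first” advice in the quote that follows.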
“The DIB world is so enormous, and you could do a thousand things. It's hard to understand where to start and where to focus your efforts. We should be intentional about identifying our biggest gaps. Every company has some problems around DEIB, but we should work on finding where our biggest internal gap is and focusing on that first.”
—Head of People Analytics, a large technology company
Moving up to an intermediate level with DEIB metrics
Once orgs have a clear sense of where they stand, or the “what,” they need to understand the “why,” such as:
- Why do certain groups experience a low level of inclusion and belonging?
- Why are certain groups being promoted at lower rates than others?
Orgs can begin to supplement existing data to gain a deeper understanding of the systemic issues that impact DEIB. When it comes to metrics, orgs should look at data from existing systems:
- Learning & development data
- Performance management data
- Payroll data
- Wellbeing data
- Data from employee feedback comments
A technology provider shared an example of a customer project that used text analysis of employee feedback to understand why promotion rates for women were low at a company. The analysis revealed that the existing initiatives to drive promotions favored men, who gave them more positive feedback than women did. Among the concerns that surfaced were the difficulties women faced around childcare and inflexible work schedules. Analyzing the data allowed the company to identify the systemic issues that were negatively impacting promotion rates for women and its overall DEIB efforts.
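Real projects like this typically use proper NLP tooling, but the core idea, comparing what different groups say in their feedback, can be illustrated with simple term counts (the comments and group labels below are invented):

```python
import re
from collections import Counter

# Hypothetical feedback comments tagged by respondent group
feedback = [
    ("women", "childcare makes the schedule impossible"),
    ("women", "no flexibility in my schedule for childcare"),
    ("men",   "the promotion program gave me great visibility"),
    ("women", "schedule inflexibility blocked my promotion case"),
    ("men",   "great sponsorship through the program"),
]

def term_counts(group):
    """Count word frequencies across one group's comments."""
    words = []
    for g, comment in feedback:
        if g == group:
            words.extend(re.findall(r"[a-z]+", comment.lower()))
    return Counter(words)

women_terms = term_counts("women")
print(women_terms.most_common(3))
```

Even this naive frequency comparison would surface “schedule” and “childcare” as distinctive themes in women’s feedback; production text analysis would add stop-word removal, phrase detection, and topic modeling on top of the same basic comparison.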
“Metrics are a way to communicate what’s important. Orgs should limit themselves to how many metrics they push. It’s like the weather, I don’t want a million different metrics to know if the weather is good or not. Orgs should figure out the goal (what is ‘good’ weather) and the metrics should help achieve that.”
—Dirk Jonker, Chief Executive Officer, Crunchr
Using a mature approach to DEIB metrics
The questions orgs should look to answer at this stage are:
- How can we address existing issues and drive our DEIB efforts effectively?
- How can we measure progress longitudinally?
- What creative analyses or approaches might help us answer questions we haven’t yet been able to answer?
When it comes to metrics and data, orgs should consider complementing existing data with:
- Network data
- Communication data from sources such as emails, calendars, meetings, etc.
- Workplace tech data from tools used by employees to get work done such as Zoom, SharePoint, Slack, Teams, and Asana
- Employee reviews and comments on external websites
Orgs should consider using advanced approaches to people analytics such as connecting text analytics with social network data. Text analysis can help orgs identify existing gaps in inclusion. Network analysis can help identify influencers. Orgs can relay feedback to influencers and leverage them to fill those gaps and drive greater efforts.
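As a toy illustration of the network half of that approach: degree, the number of direct connections a person has, is the simplest proxy for influence in network analysis. The names and edges below are invented, and real analyses would derive edges from opt-in communication metadata:

```python
from collections import Counter

# Illustrative communication edges (who meets/emails whom)
edges = [
    ("Ana", "Ben"), ("Ana", "Chloe"), ("Ana", "Dev"),
    ("Ben", "Chloe"), ("Dev", "Elif"), ("Elif", "Farid"),
]

# Count each person's direct connections (their degree)
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# The highest-degree people are candidate influencers to relay feedback through
influencers = [name for name, _ in degree.most_common(2)]
print(influencers)
```

More sophisticated analyses would use centrality measures beyond raw degree, but the output is the same in spirit: a shortlist of well-connected people who can help carry inclusion feedback across the org.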
DEIB is a continuous effort, not a once-and-done exercise. Orgs should look externally to compare their performance, both to avoid becoming complacent and to update their goals regularly. Specifically, orgs should benchmark against other orgs that rank high on DEIB, instead of industry or national averages.
“When it comes to selecting metrics, don’t go with the flow, and get something off the internet or another company. How you define metrics really matters, and orgs need to be intentional about what and how they measure them.”
—Lydia Wu, Head of Talent Analytics and Transformation, Panasonic North America
When it comes to DEIB, orgs need to do more than provide training and courses to employees. They need to think about and approach it holistically so that it’s built into the way the business is managed, rather than being an afterthought or a special initiative.
To that end, orgs need to:
- Understand where they currently stand and how they are perceived by their employees. They should know what issues currently exist.
- Understand why those issues exist. Orgs need to find out the reasons why they are falling short in those areas.
- Identify what they can do to fix them. Orgs should plan targeted initiatives and interventions in order to get the maximum value and results from their efforts.
To achieve that, companies need to apply greater focus and put more emphasis on using metrics and data than they currently do. As we’ve mentioned before, the demands from customers, investors, and employees for more action on DEIB are likely to keep growing. Orgs stand to lose a lot more if they do nothing, not just in terms of lagging performance, engagement, and innovation—but also in future talent that’s going to place a lot more importance on these issues going forward.
It’s time companies took their DEIB data seriously. Moving forward, we hope to see greater acceptance of, and creative thinking around, how these data and metrics can be used to enable all people to do their best work.
Below we share our own indices, as well as those used by other organizations, to help understand DEIB culture.
Figure 18: Gartner inclusion index | Source: Gartner.27
Figure 19: University of California San Francisco’s Belonging Index | Source: University of California San Francisco.28
Posted on Tuesday, January 4th, 2022 at 10:06 AM
Over the past few years, several orgs have dramatically changed their existing practices, including those around performance management. Our latest study on performance management trends compares our fall 2019, fall 2020, and new fall 2021 data on performance management practices. We also conducted a literature review of more than 60 articles, a roundtable discussion with over 25 leaders, and a quantitative survey of 621 HR leaders and employees, conducted over the summer and fall of 2021.
This infographic (click on the image below to get the full version) highlights key insights from our report Modern Performance Management Trends.
As always, we’d love your feedback at [email protected]!
Posted on Tuesday, December 21st, 2021 at 1:09 PM
This readout provides an overview of key themes, insights, and quotes from our second roundtable. We brought together learning leaders and professionals who shared the skills the L&D function needs and the strategies L&D leaders can use to make an impact in their orgs.
Our discussion centered around 4 topics:
- The evolution of the L&D function
- Skills for the L&D function’s strategic role
- How L&D skills are being continuously upgraded
- How the L&D function is partnering with other org functions
From our discussion, we identified 4 key takeaways:
- The L&D function has more influence and should take advantage of it
- L&D leaders have a seat at the table; they should pull up a chair and sit down
- L&D functions should keep it simple
- L&D functions should leverage internal champions
The following sections provide an overview of the major points associated with each key takeaway.
The L&D function has more influence and should take advantage of it
L&D functions are moving closer to the center of business strategy, which means that the scope of problems they’re being asked to solve is broader and more complex. We’re talking about big, hairy, audacious priorities such as DEIB, performance, mobility, technology, reskilling/upskilling, etc. More leaders are finding themselves in strategy discussions where the C-suite is asking them to help craft initiatives to solve some of these challenges.
Participants identified skills like seeing around corners (also described as anticipating needs), understanding learning trends, and leveraging tech as important to materially adding to larger business strategy discussions.
“The move from content and events to experiences and from the margins (of the org) to the center (of business strategy) are related, because we’re being given bigger, tougher problems to solve. And we’ve been preaching for years that people don’t learn from a class. It needs to be put into action.”
– L&D Consultant
L&D leaders have a seat at the table; they should pull up a chair and sit down
Many L&D leaders in our roundtable recognized that they (finally) have a seat at the table, but several mentioned that their teams feel unequal to the task. While they recognize that they bring unique skills and points of view that can help orgs build skilled workforces, they were uncertain about their ability to influence or persuade.
Participants also listed skills like intellectual humility (being able to unlearn and relearn quickly), design thinking, storytelling with data, and agility as essential to L&D leaders being confident contributors to strategic discussions.
“We’ve got the seat at the table so pull up a chair – I'm feeling that gap…the ability to really connect what’s happening in the business world to a change in how we approach learning.”
– Career Development leader
L&D functions should keep it simple
Participants talked about the complexity and interplay of the challenges they are being asked to solve and identified that complexity as a challenge moving forward. Specifically, they spoke of the need to develop skills like information literacy and environment-sensing to identify the most important priorities and the simplest solutions.
They also noted that many L&D professionals feel an emotional stake in wanting to respond to everything, which can hurt their ability to simplify what signal is most important. Leaders mentioned that skills such as prioritization, business mindedness, and critical thinking can help them avoid this pitfall.
“We’re people people so we hear a need and want to respond, but that then can distract us from the most important priorities. That critical appraisal to maintain focus while also being sensitive and responsive to what’s happening on the ground.”
– EVP of Talent Engagement & Development
L&D functions should leverage internal champions
The L&D function’s responsibilities have shifted from creating and distributing all training to instead building systems that enable all in the org to play an active role in their own and others’ development.
One really effective way to do this is by partnering with internal champions.
A few participants used the analogy of ‘following the energy’ when searching for partners. L&D leaders are seeking out leaders in other departments who are willing to experiment with new methods, tools, and ways of developing employees.
Participants identified a focus on tech, tools, and systems (notably, there wasn’t a lot of talk about content) to enable leaders and employees within their orgs. They also mentioned forming strong alliances with groups like IT, Comms, and Analytics, both to gain necessary skills and to leverage those departments’ skills in building these systems and tools.
Participants also mentioned that as L&D functions deputize leaders and others within the org and share responsibility for employee development, they free up time to focus on some of the larger strategic initiatives they’re being asked to tackle. Skills like business acumen, relationship building, and communicating strategic ideas can help L&D leaders work more effectively with their champions.
“We’re building the infrastructure for a learning culture – enabling it so all the tools are there – but relying on external partners or partners within the business to deliver. We’re not delivering in response to requests anymore, we’re directing people to the right places in the infrastructure to get what they need.”
– Chief Learning Architect
Thanks to all the roundtable participants and breakout leaders who contributed to an incredibly engaging conversation. We learned a lot about important skills and strategies the L&D function can develop and use to make an impact on their orgs.
Posted on Wednesday, December 15th, 2021 at 11:04 PM
The past few years have been a difficult journey for most organizations. To manage, orgs have had to dramatically change their practices, especially when it comes to performance management. While we heard from leaders and read articles about how different companies are approaching performance management in 2021, we wanted to understand in depth how performance management is evolving in current times, and specifically how it differs from 2019 in terms of philosophies, practices, and approaches.
Our latest study compares our fall 2019, fall 2020, and new fall 2021 data on performance management practices. We also conducted a literature review of more than 60 articles, a roundtable discussion with over 25 leaders, and a quantitative survey of 621 HR leaders and employees, fielded over the summer and fall of 2021.
As you will read in the report, there are some very exciting changes happening in orgs, such as:
- Employees having a much clearer understanding of their goals and expectations
- More frequent conversations between employees and managers
- Increased transparency in terms of compensation and feedback
Yet, there are some real concerns, too. For example, managers’ openness to new information and providing employees with autonomy have both declined since the fall of 2020. Further, less than half of employees indicated that the performance assessment process is consistent and fair.
Clearly, there’s still a lot of work to be done on improving performance management and we hope this report will help you more quickly identify areas of needed focus.
Posted on Tuesday, December 7th, 2021 at 11:32 AM
Posted on Tuesday, November 30th, 2021 at 3:02 PM
The panic—and the opportunity
The pandemic forced most L&D functions to throw out their tried-and-tested, in-person, instructor-led-learning playbooks. Indeed, in the early months of the pandemic, one of the most common questions we got was, “How do I get all my learning online, ASAP?”
And then there were several months when leaders realized that they might never get all their classroom training online, and what’s more, maybe that shouldn’t be the goal. Even before the pandemic, it was increasingly clear that the waterfall development methods, reliance on courses, and one-size-fits-all approaches of the past were no longer working.
Leaders realized they might never get all their classroom training online, and what’s more, maybe that shouldn’t be the goal.
For one thing, these approaches haven’t supported the ways employees learn for a long time. Survey after survey has shown that employees learn more through the informal stuff—and therefore rely more on it—than the heavy, expensive courses L&D functions have tended to focus on.
And for another, the logistics of traditional learning approaches keep orgs from being as agile and responsive as they need to be in an unstable and fast-changing world. Today orgs can’t afford to wait 6 months for a training course to come online, constantly take employees away from their work to learn, and narrowly define “learning” so that only the formal stuff counts. Instead, they need a continuously upskilling workforce. And even the very best instructional design team can’t do that by themselves.
As orgs settle into new ways of working—hybrid, remote, flexible, whatever—leaders dealing with this new reality are hyperaware of the need to do learning and upskilling differently. And not just different-for-this-point-in-time, but differently forever. The ways people work have changed and will continue to change; the ways they learn must help them keep pace with and even stay ahead of those changes.
The ways people work are changing; the methods they use to learn must help them keep pace with those changes.
Learning methods—literally, the ways people learn—are key to the question of how orgs can enable learning and upskilling differently. There’s a wealth of learning methods that can be leveraged in different ways to help employees develop their knowledge and skills.
To do this, though, L&D functions must know what those methods are and decide on the right ones, in the right combinations, for their org.
Which brings us to this study. Over the past few months, we’ve investigated both the methods themselves and how organizations are choosing them. We looked at over 60 articles, hosted a roundtable on the topic, and talked in depth with 15 learning leaders.
This report outlines what we found. Specifically, we’ll introduce:
- An overview of learning methods and how they align to RedThread’s Employee Development Framework
- How leaders are deciding (on a continual basis) what methods work best for their orgs
- Real-life examples of how orgs are leveraging learning methods in different ways to help employees develop
The next section introduces a comprehensive list of learning methods we’ve found in our research and discusses some of the major trends we’re seeing. We then examine how those methods map to the RedThread Employee Development Framework and how different methods enable different employee behaviors.
(Mostly) familiar methods, new applications
When we started this research, we were in search of the novel: innovative learning methods that cropped up in response to (or in spite of) the pandemic. But, surprisingly, most of them were familiar to us. Figure 1 shows the major learning or development methods we found through our literature review, interviews, and roundtables.
That isn’t to say we didn’t see innovation: if we hadn’t, this would be a very short paper. But it didn’t take the form we expected. While the discrete learning methods were familiar, some of the ways those methods were being utilized were surprising.
While many learning methods are familiar, the ways they’re being used are new and innovative.
We’ll provide specific examples throughout the paper; here are the general trends we’re seeing.
More automation
Unsurprisingly, we have all gotten much better at using technology over the past 2 years. Also unsurprisingly, that improvement has yielded greater know-how about automating learning. Many of the vendors we spoke to are actively taking the “stupid work,” like curation of learning content, off the plates of L&D professionals and using automation to enable employees to find the learning content and opportunities they need.
More personalization
As L&D has gotten better at automation, we’re also seeing more personalization as orgs move away from rote, unchanging learning paths to something much more dynamic. We’re not just talking about branching scenarios: L&D functions are leveraging learning methods that help to personalize the entire development experience, helping both the individual and the org accomplish their goals.
Leveraging the existing
They say that necessity is the mother of invention, and we’ve seen that in the past couple of years. Many L&D functions are leveraging what already exists—content, technology, ways people are already learning—instead of investing in or developing new ones. For example, one company we spoke with ditched a “social learning platform” for WhatsApp groups, which accomplished the same goals in a platform employees were already using.
More in the work itself
The increasingly urgent conversations in many board rooms and in cyberspace about skilling, reskilling, and upskilling have changed the types of methods orgs are choosing for development. Traditional methods like classroom training will always have their place, but increasingly apprenticeships, individual development plans, job rotations, and stretch assignments are being leveraged to build skills while the employee is doing the work. Our friend Chris Pirie likes to say, “Learning is the new working.”
More self-service
The pandemic made asynchronous and self-service learning an imperative, building on the fact that employees are increasingly likely to create their own career paths rather than following traditional, predictable ones. In response, orgs are offering more self-service, employee-driven learning methods, rather than curricula that serve only the most obvious or common career paths.
More combinations of methods
We mentioned earlier that many of the methods identified by this study are familiar. What’s new, though, is that more combinations of those methods are being used to accomplish certain development goals. L&D leaders are thinking more holistically about using learning methods to accomplish a goal—so a leadership course may have a coaching element, an on-the-job capstone project, and technology that nudges participants toward the right behavior, rather than relying solely on classroom instruction.
In the midst of all this innovation, it might be helpful to introduce a structure that shows how all these learning methods can complement one another and be used systematically toward org goals. That's where we turn next.
Learning methods and the Employee Development Framework
A few years ago, we introduced the RedThread Employee Development Framework, shown in Figure 2. This framework describes the behaviors orgs should be enabling in their employees in order to have a solid learning culture. We use this framework to make sense of the world of employee development and to help leaders identify any gaps they should be paying attention to.
The Employee Development Framework offers a structure leaders can use to understand the universe of learning methods.
The Employee Development Framework shows that L&D functions should focus their time on enabling employees to:
- Plan: Understand their career options and the development they’ll need to get them where they want to go.
- Discover: Find the opportunities and content that will help them develop the knowledge and skills they need to take their career in the direction they want.
- Consume: Easily access relevant learning content—a challenging feat, given the amount of content available.
- Experiment: Practice new knowledge and skills on the job; try, fail, and learn from that failure.
- Connect: Learn from one another to gain new knowledge and skills.
- Perform: Learn on the job and improve performance at the same time.
For this study, we mapped the learning methods we identified earlier in this report (Figure 1 above) against the 6 behaviors in the Employee Development Framework. The results are shown in Figure 3 below. Similar methods are then grouped together under each behavior. This clarifies which learning methods can be leveraged to enable which behaviors.
Different learning methods fall into different categories and enable different behaviors.
We see, for example, that courses enable consumption, talent marketplaces enable experimentation, mentoring enables connection, and so on.
The remainder of this section addresses each of the 6 behaviors and the categories of learning methods that enable them. We’ll also highlight real-life examples of orgs using these learning methods to enable each behavior.
Helping employees Plan their development
Helping employees plan their careers hasn’t always been considered part of L&D’s job. In recent years, however, L&D functions have recognized that career planning is a critical part of employee development: As L&D moves away from a one-size-fits-all approach to development, employees will need help figuring out what their own paths look like.
But L&D functions aren’t the sole owners of career planning. It touches other areas like performance and workforce planning. To successfully enable employees to plan, L&D functions need to work with other HR teams and business units to ensure systems, policies, processes, and methods are synced up. And as we’ll see below, some of the methods that support planning may not be owned by the L&D function, either, highlighting the need for close collaboration.
Learning methods that enable employees to plan their careers fall into 2 broad categories (shown in Figure 4):
- Information gathering
- Development planning
These 2 categories approach planning in different ways. Let’s look at how they do this in more detail.
Methods for information gathering
Info-gathering methods help employees collect information about the skills they have and the skills they need. Methods in this category include:
- Skills assessments
- Skills ratings
- Informational interviews
- Critical org skills definitions
These methods help employees develop a clear understanding of their own current state as well as the “skills market” they will likely face in the future. Leading practices surface info about both the supply side (employees’ skills) and demand side (org needs) of that market.
Methods for development planning
Development-planning methods help employees identify and commit to the development activities they’ll undertake to achieve their goals, and the order in which they’ll undertake them. Methods for doing this development planning include:
- Career coaching
- Individual development plans
- Action planning
Orgs leveraging these methods well tend to tie together development planning, performance, and the employee’s and org’s skills needs. While learning and skills platforms are making this more possible, it takes some insight on the part of talent leaders to align all the methods, as well as the motivations for using those methods.
Real-world thread: Digitizing the career development plan
Career development plans (CDPs)—also known as individual or personal development plans—are one method orgs use to link development planning, performance, and skills. Until relatively recently, though, CDPs tended to be manual, static, and paper-based: An employee filled out a form that listed their goals and planned development activities. The employee had to find development opportunities themselves, list those activities on the form, and update the form as activities were completed. All too often CDPs would be filed and forgotten because they quickly felt irrelevant.
Career development plans are becoming more relevant, helpful, and dynamic as they are digitized and linked more closely to development, performance, and skills.
Digitization can help address these challenges by automating pieces of the process and linking CDPs to systems that contain relevant info (e.g., HRIS, learning libraries, skills platforms).
For example, an American multinational tech company makes CDPs that incorporate skills assessments available to all employees. Employees can use the online CDP tool to:
- Self-assess their skills in an area they’re interested in
- Ask their manager to verify the skills
- Receive recommendations for relevant learning opportunities to develop the skills
- Log activities they do to develop the skills
The CDP tool tracks the difference between current and desired skills and recommends learning paths to close the gaps, updating the recommendations as new activities are logged.1
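The gap-tracking loop described above can be sketched in a few lines. This is a hypothetical illustration of the pattern, not the vendor's actual tool: the class name `CDPEntry`, the 0–5 skill levels, and the `recommend` helper are all assumptions.

```python
# Hypothetical sketch of a digitized CDP: track the gap between
# current and desired skill levels, and recommend catalog items for
# the largest gaps. Names and level scale are illustrative.

from dataclasses import dataclass, field

@dataclass
class CDPEntry:
    skill: str
    current_level: int       # self-assessed, 0-5
    desired_level: int       # target level for the desired role
    verified: bool = False   # manager sign-off on the self-assessment
    activities: list = field(default_factory=list)

    def gap(self) -> int:
        """Difference between desired and current skill level."""
        return max(self.desired_level - self.current_level, 0)

    def log_activity(self, name: str, levels_gained: int = 0):
        # Logging an activity can raise the current level, shrinking
        # the gap and changing future recommendations.
        self.activities.append(name)
        self.current_level = min(self.current_level + levels_gained,
                                 self.desired_level)

def recommend(entries, catalog):
    """Suggest catalog items for the skills with the largest gaps."""
    open_gaps = sorted((e for e in entries if e.gap() > 0),
                       key=lambda e: e.gap(), reverse=True)
    return [(e.skill, catalog.get(e.skill, [])) for e in open_gaps]
```

The point of the sketch is the feedback loop: each logged activity updates the current level, which in turn changes what gets recommended next.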
Digitizing CDPs has 3 main benefits. First, it takes much of the paperwork burden off employees. Second, the automatically updated learning recommendations are far more relevant and useful to employees than a list they themselves created a year ago. And third, data from CDPs can give the org a dynamic picture of the workforce’s current and projected future skills.
Helping employees Discover the right development
Discovery is a critical component of learning: it connects employees to the development (content and opportunities) they need. L&D functions should make it easy and intuitive for employees to find relevant learning opportunities.
It’s a challenge that’s only getting harder as the ocean of learning content and development opportunities gets bigger and bigger. In the past few years, L&D functions and vendors alike have tackled this discovery problem with a vengeance. Earlier this year we offered our take on making sense of the chaos of learning content.2
Orgs are using increasingly scalable, automated methods to help employees find personalized, relevant development opportunities.
Here, we focus on how orgs are using learning methods to help employees discover learning opportunities more easily. Learning methods that enable discovery fall into 3 categories (shown in Figure 5):
- Centralized “push” communications
- Employee browsing / searching
- Recommendations
We discuss each in more depth below.
Centralized “push” communications
Almost all the leaders we talked to said that L&D functions rely on “push” communication methods to tell employees about available development opportunities.
Methods in this category include:
- Informational emails
- Newsletters highlighting offerings
- Nudges to explore or complete assigned training
The big pro of centralized “push” communication methods is that they can be easier to administer and automate. It is relatively easy to send mass emails to specific groups—for example, all new managers are sent a list of available courses, learning pathways, and articles that pertain to them. It’s also getting easier, through technology, to personalize these communications at scale based on preferences or career paths.
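The segmentation behind these targeted "push" communications can be sketched simply: select the recipients matching a rule (for example, all managers), then personalize the message for each. The employee fields below are illustrative, not a specific HRIS schema.

```python
# Minimal sketch of segment-targeted "push" communications.
# Employee fields ("role", "tenure_months") are illustrative.

def recipients(employees, segment):
    """Return employees matching every key/value pair in `segment`."""
    return [e for e in employees
            if all(e.get(k) == v for k, v in segment.items())]

def personalize(template, employee):
    """Fill a message template with employee-specific fields."""
    return template.format(name=employee["name"])

staff = [
    {"name": "Ana", "role": "manager", "tenure_months": 2},
    {"name": "Ben", "role": "analyst", "tenure_months": 14},
]
new_managers = recipients(staff, {"role": "manager"})
messages = [personalize("Hi {name}, here are courses for new managers.", e)
            for e in new_managers]
```

The same selection logic extends to the personalization-at-scale idea: swap the hand-written segment rule for one derived from preferences or career paths.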
Employee browsing / searching
Employees can discover development opportunities on their own by searching or browsing. The “Netflix of Learning” movement relies heavily on employees knowing what they want to consume and how it may benefit their career. Methods in this category include:
- Searching or browsing on the internet
- Searching or browsing the org intranet
- Searching or browsing in an LMS, LXP, or other learning platform
A challenge with these methods is helping employees find development opportunities that are relevant to them. To tackle this problem, many orgs are implementing methods that rely on ratings and reviews to surface the best opportunities. We’re also seeing methods that rely on artificial intelligence to parse massive amounts of text, audio, and video content to draw out themes, assign tags, and serve up highly relevant content.
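The ratings-and-reviews approach mentioned above can be sketched as a simple ranking: order items by average rating, requiring a minimum review count so a single 5-star review can't dominate. The function and field names here are assumptions for illustration.

```python
# Illustrative sketch of ratings-based surfacing: rank learning items
# by average rating, with a minimum-review floor. Names are assumed.

def surface_best(items, min_reviews=3, top_n=3):
    """items: dicts with 'title' and 'ratings' (a list of 1-5 scores)."""
    eligible = [i for i in items if len(i["ratings"]) >= min_reviews]
    ranked = sorted(eligible,
                    key=lambda i: sum(i["ratings"]) / len(i["ratings"]),
                    reverse=True)
    return [i["title"] for i in ranked[:top_n]]
```

The `min_reviews` floor is the design choice worth noting: without it, sparsely rated content would routinely outrank well-reviewed content.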
Recommendations
Learning methods in the Recommend category personalize suggestions for development opportunities for each employee. Recommendations help employees quickly cut through the masses of learning content and opportunities to find something relevant to them. This category includes:
- Automated recommendations (learning platforms)
- Recommendations from managers
- Recommendations from colleagues / peers / social network
Initial, non-scientific observation tells us that employees may value certain types of recommendations over others, as the following example shows.
Real-world thread: Personalized recommendations to help employees discover
Different types of recommendations hold different value to employees—a fact that may influence leaders’ choices about learning methods.
Matthew Daniel, a principal at Guild Education and former head of learning innovation and technology at Capital One Bank, once ran a test to see how employees relied on different types of recommendations for learning opportunities. The results were:
- Employees relied overwhelmingly on recommendations from a manager or teammate
- Recommendations from business executives were next
- Recommendations from the learning tech system or L&D team were dead last, because “What do they know about me?”
“Keep in mind that not all recommendations are made equal.” – Matthew Daniel, Principal, Guild Education3
Although Daniel’s experiment was limited in size, it aligns with our own observations that the more personalized a recommendation, the higher value employees tend to give it. However, we expect to see more and more improvements in learning tech tools’ ability to deeply personalize recommendations at scale. As leaders consider what learning methods to invest in, it’s worth keeping in mind the value of these personalized recommendations.
A final note on methods for Discovery: Discovery has always been important, but it has become even more so in the wake of recent social justice movements. Many orgs are realizing that their Discovery methods are inherently biased. Our work with orgs has surfaced 3 explicit ways this bias shows up:
- Opportunities open to only a few. As orgs make use of more learning methods, they should open those opportunities to as many as possible. One complaint we have heard over and over is that fairly inexpensive (or even, if scaled, free) learning opportunities are only open to some people in some parts of the org. Is there really any harm in opening up a basic accounting class to someone who is currently in supply chain?
- Not making opportunities explicit. Information within an org often flows through informal channels; many learning opportunities, like job rotations or special assignments, are open only to those who know about them. We have seen a recent push by orgs to explicitly state all opportunities so that everyone knows what’s available.
- Failing to take data into account. Finally, data can help orgs understand who their message is reaching (or not). One org found that the majority of people taking advantage of an upskilling opportunity were white males, presumably because they had the most discretionary time. Data can help leaders understand how their messaging needs to change in order to provide opportunities to all.
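The data check described in the last point can be sketched as a comparison of each group's share of participants against its share of the workforce. The 0.8 threshold below is an illustrative choice, not a standard; group labels and field names are assumptions.

```python
# Hedged sketch of a participation-equity check: flag groups whose
# share of program participants falls well below their share of the
# workforce. The threshold is an illustrative choice.

from collections import Counter

def participation_gaps(workforce, participants, threshold=0.8):
    """Return {group: participation ratio} for underrepresented groups.

    A ratio of 1.0 means the group participates in proportion to its
    workforce share; below `threshold`, the group gets flagged.
    """
    wf = Counter(workforce)
    pt = Counter(participants)
    flags = {}
    for group, n in wf.items():
        wf_share = n / len(workforce)
        pt_share = pt.get(group, 0) / max(len(participants), 1)
        ratio = pt_share / wf_share
        if ratio < threshold:
            flags[group] = round(ratio, 2)
    return flags
```

A check like this is what surfaces patterns such as the upskilling example above, where one group's participation far outstrips the others'.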
L&D functions have quite a bit of power when it comes to Diversity, Equity, Inclusion, and Belonging (DEIB). Understanding the data with a focus on inclusivity can ensure that they wield that power for the good of the employees and the org as a whole.
Helping employees Consume relevant learning content
Enabling employees to consume content is where L&D functions have historically spent most of their time and energy. Creating and delivering training courses is a core L&D competency. Which is great: there will always be a place for courses.
What this research emphasizes, however, is that there’s much, much more to employee development than courses alone—and L&D needs to expand its repertoire of learning methods accordingly.
There’s an abundance of learning methods that enable employees to consume learning content—and L&D needs to expand its repertoire.
Given L&D’s historical focus, it is unsurprising that there’s an abundance of methods that enable employees to consume learning. At their core, these methods aim to deliver relevant (and, ideally, personalized and timely) learning opportunities to employees. They fall into 3 categories (shown in Figure 6):
- Consuming in groups
- Consuming individually
- Interacting with content
Let’s look at these categories in more detail, highlighting leading practices for each.
Consuming in groups
This category encompasses methods that orgs have traditionally thought of as “learning”—instructor-led courses that primarily “download” information from a teacher to a group of students. It also includes other methods that primarily rely on one-way flows of information to a group of people. Specifically, methods included in this category are:
- Instructor-led courses (virtual, in-person, hybrid)
- Town halls
- Live webinars
Because these methods are delivered in a prepared format to a large group of people, they don’t allow for much personalization of content or flow, and generally don’t take individual needs or preferences much into account.
Not surprisingly, the pandemic has motivated L&D functions to look for ways to mitigate the shortfalls of these methods. A whole category of tech that integrates with traditional meeting software has popped up to help prevent “Zoom fatigue” and make the online environment more engaging.
Some miss the mark dramatically (conducting meetings via Second Life? For reals?), but we applaud any attempt to engage participants through interactive elements, reflections, discussions, and the like.
Consuming individually
Consuming content individually includes methods where employees receive “downloads” of information individually, at their own pace, and often without the blessing of their L&D function. Methods in this category include:
- Self-paced online courses
- Articles / blogs
- User-generated content
- On-demand webinars
This list shows that orgs are taking content well beyond the simple course. For years, the e-learning course was the default for individual consumption—and there is still a place for it. But orgs are beginning to adapt to the way their employees want to learn, through curated articles and videos, podcasts, webinars, and books.
Some orgs provide access to these learning methods by adopting a next-gen LXP content aggregator to leverage machine learning and create a “front door” from which all learning can be accessed. Others have built content directly into the work, providing access to information where it is most needed. Still others invest in digital or actual libraries to provide access to learning methods that appeal most to employees.
Interacting with content
Learning methods that encourage employees to interact with content add dimension to the experience; instead of being passive participants, employees become active ones, interacting with the content and / or each other. The methods included in this category are:
- Interactive apps
- Adaptive learning
Leaders tell us that these methods can be a blessing or a curse, depending on how they’re used. For example, apps can be seen as gimmicky or bothersome if used incorrectly. But with the right application and engagement, they can be really helpful. One leader said about a SMS-based learning app:
“An HR business partner in my org is using a texting app to develop new habits. It texts her a little bit of info every day. She told me recently, 'I didn’t realize how much I’ve learned!'" – Kelly Rider, CLO, PTC4
All of these learning methods are enabled by tech and share a potential shortcoming: they can be used as hammers in search of a nail. L&D functions should understand when these methods are most appropriate and utilize them accordingly.
Real-world thread: Implementing a system with little formal training
Professional services firm Deloitte US was launching a project to implement a new technology that would help improve processes in its audit and assurance business. Previous comparable system implementations had been accompanied by roughly 24 hours of classroom training, which created some challenges for employees who often faced huge separations (in terms of time and distance) between learning and actually using the new system.
Deloitte’s shift in learning methods made learning more relevant, contextualized, and useful to employees.
Deloitte’s US L&D team, headed by Eric Dingler, Chief Learning Officer, Deloitte LLP, decided to take a totally new, “minimal formal training” approach to onboarding employees to the new system. The team reduced the number of classroom training hours from 24 to 3 and ensured those 3 hours of training were delivered right before an employee started to use the new system.
They also developed over 175 learning assets that were delivered as popups as employees used the new system. The assets, which included videos and text, were based on skills (rather than roles) and aligned to employees’ workflows. Many assets could be skipped if the employee already had the skill. They were regularly updated based on usage data.5
We liked this example because it so clearly demonstrates how a shift in methods can make learning more relevant, contextualized, and useful to employees, driving adoption of new behaviors.
Helping employees Experiment with new knowledge and skills
A well-established and growing body of research points to the importance of experimentation, failure, and reflection in learning.6 The more opportunities employees have to try new skills in realistic environments and then reflect on their mistakes, the better.
The methods we saw that help employees experiment with new knowledge and skills fell roughly into 2 categories (shown in Figure 7):
- Experimenting within learning experiences
- Experimenting on the job
Experimenting is a behavior over which the L&D function doesn’t have complete control. For example, they can purposely build reflection exercises and role plays into the experiences they create, but they can’t ensure managers help employees learn from mistakes on the job—they can only prepare managers to do so.
L&D functions should put in place learning methods, systems, and processes to make experimentation easier and more natural.
This means that helping employees experiment is more about putting the right learning methods, systems, and processes in place so that experimentation is as easy and natural as possible.
Experimentation within learning experiences
When L&D functions think about enabling experimentation, they usually go straight to experimentation within a planned learning experience. That’s not surprising, nor is it wrong.
L&D functions can have quite a bit of sway in helping employees use new knowledge and skills. A phrase we hear a lot is “experimenting in a safe place.” This generally includes methods like:
- Role plays
- AR / VR / immersive 3D
- Reflection activities
In each of these instances, employees get to try out their new knowledge in low-risk settings. These methods are especially valuable in situations (and cultures) where mistakes carry large consequences, whether those consequences are perceived or real.
Some of the situations where these methods may be the most appropriate option include:
- Realistic simulations that take participants through catastrophic failure scenarios or scenarios that would be physically dangerous in real life (e.g., nuclear power plants, active shooter scenarios)
- Role plays that help new managers learn how to give appropriate, non-biased feedback
- Use of AR / VR to help employees get over their fear of public speaking, or as part of a DEIB training to build empathy
These experimentation methods tend to be costlier than many others, and this is where we caution L&D leaders to use some discretion: many leaders we spoke to are enamored with the idea of AR / VR, for example, but just because it can be used in a situation doesn’t mean it should be.
Experimenting on the job
The methods that fall in this category are some of our favorites—and for good reason. We’ve long been proponents of defaulting to the work first to teach new knowledge and skills. This means that L&D functions should think about developing skills in the context of the work and try to build opportunities into the flow rather than defaulting to learning activities that would take employees out of their job, and therefore out of context. Methods that fall in this category include:
- Job rotations
- Talent / gig marketplaces
- Volunteering outside of work
- Stretch assignments
- Job shadowing
- Reflection activities
Most of the L&D world understands that these methods are effective. Why, then, aren’t they used more? We think it comes down to the fact that the L&D function often doesn’t “own” them. They don’t have direct control over the systems that determine job rotations or stretch assignments, for example.
L&D functions often don't "own" the systems that determine job rotations or stretch assignments—but that's changing.
That is changing. In many orgs, for example, the L&D function is one of the strongest proponents of a talent or gig marketplace. They see these marketplaces as a way not just to build new skills and knowledge, but also to collect information about those skills and knowledge, helping the L&D function to determine where the greatest need is.
L&D functions are also being included in larger discussions about talent development in general. Performance, engagement, mobility, and employee development are becoming 1 conversation instead of 4, making it easier to influence how and whether these methods are leveraged.
For L&D functions that aren’t yet included in those discussions, we strongly recommend finding a way to be included. One leader we spoke to invited himself to important meetings about talent. Another used her influence to build relationships with her peers in other HR practices so that these methods could be included in the overall development strategy.
Real-world thread: Enabling employees to experiment through job rotations
The L&D function at Boston-based software company PTC is a strong proponent of job rotations to develop employee skills.
PTC’s early-in-career rotational program, which moves junior employees through various business functions over the course of 2 years, has been highly successful, with 100% retention of employees who participate. Similarly, the HR function rotates employees across the various HR teams.
These programs offer 2 main benefits. First, employees get to practice new skills in real work environments, giving them context for the things they’re learning. Second, PTC is building a more agile and resilient workforce by developing employees with transferable, cross-functional skills.
Kelly Rider, CLO at PTC, said:
"We’re more agile now because we can say, for example, 'Oh, this person has skills in recruiting. Let’s pull them over to this project that needs those skills.'" – Kelly Rider, CLO, PTC7
As leaders consider which methods to invest in and how to message their decisions to managers and employees, it’s worth remembering that these on-the-job experimentation methods benefit both employees and the entire org.
Helping employees Connect with each other for learning
Also known as “social learning,” methods that connect employees to each other for purposes of sharing knowledge and developing skills are already in many L&D functions’ quivers. L&D functions have long been interested in these methods and have tried to codify and formalize them for years.
Tech platforms have been developed to aid in this socialization, but connections often happen more organically: one employee asks her colleague how to do something; other employees attend conferences together, tend to their social media channels, or seek out a mentor.
The L&D function’s role in helping people connect has as much to do with building cultures that encourage sharing as it does with formal processes or tech. For example, coaching has seen a revival in recent years, and is an important formal way to help people connect for learning. Equally important is normalizing virtual collaboration channels, such as Teams or Slack, as ways to collect and share information that could be useful more broadly.
As with some methods for experimentation, L&D functions often do not own a lot of the methods that enable employees to connect. Rather than trying to control these efforts, we’re seeing L&D functions reach out, partner with other functions, and focus on convening people and amplifying what’s good.
The L&D function can help people connect by focusing on convening people, amplifying what’s good, and creating a culture that encourages connection.
For example, a handful of leaders in different orgs gave us the specific example of mentoring programs. Rather than creating mentoring programs themselves, they’re codifying and sharing info about what’s working in some pockets of the org so people who are interested in starting mentoring programs in other areas don’t have to reinvent the wheel.
Interestingly, as the pandemic forced many employees into their homes and away from face-to-face work interactions, the value of true human connection skyrocketed.8 This theme showed up as strongly in learning methods as it did in other areas of work: in the roundtable we held as part of this research, a significant portion of the discussion was dedicated to how L&D functions can invest in learning methods that foster meaningful connections between employees.9
Methods that help employees connect for learning fall into 3 categories (shown in Figure 8):
- Connecting 1:1
- Connecting groups
- Connecting to the outside
Let’s discuss these categories in more detail.
Connecting 1:1
It’s not surprising that employees connect 1:1 to learn—one of the easiest ways to find something out at work is to walk down the hall and ask a colleague (or, these days, send them a Slack or Teams chat). It’s hard to beat the level of personalization and contextualization that comes with a 1:1 conversation with someone who’s been there, done that.
L&D functions have some options when it comes to connecting people 1:1. The methods we ran across in this study include:
- Coaching
- Mentoring
- Leader as teacher
- Expert directories
All of these methods can happen with or without the involvement of the L&D function. Many employees find their own coaches, mentors, and experts who can help them develop the knowledge and skills they need for their career.
For years, L&D functions have tried to provide a more systematic and scalable approach to these methods to ensure that those who need or want this type of experience get it. This is increasingly important as orgs are being scrutinized by their boards and by the public for their efforts to provide equitable access to development opportunities and advancement.
Recently, there has been an uptick in the use of these methods, particularly coaching. Orgs have many creative ways to scale these 1:1 connections. You can read about them in the final report of our recent coaching study.10
Connecting groups
As orgs adopt more team-oriented workstyles and judge and compensate team performance accordingly, it makes sense that employees would also connect more as groups for learning. Methods for connecting groups for learning that we identified in our research include:
- Employee resource groups (ERGs)
- Communities of practice
- Virtual collaboration (Slack, Teams, etc.)
- Team coaching / training
- Knowledge sessions (e.g., brown bag lunches)
- Book clubs
- Discussion forums
While L&D functions may not have direct responsibility for all the methods listed above, they can either influence their usage or leverage them for learning purposes.
Group-based development both fosters human connection and enables employees to practice new skills with the people they'll be using those skills with on the job.
L&D functions likely own methods like team coaching and knowledge sessions and can therefore structure them in ways most beneficial to the learning goals of the org. Leaders said they’re experimenting more and more with group-based development because it does 2 things at once: it fosters human connection among group members and enables employees to practice new skills with the people they’ll be using those skills with on the job.
L&D functions may not be the sole owner of ERGs and communities of practice, but they can influence them by helping to craft charters and training leaders, and they can leverage them to develop necessary skills and awareness in those taking part—DEIB awareness or wellness, for example. Participants can also be tapped to help L&D functions understand where the development needs are and, in some cases, help develop the content.
One learning leader told us that convening groups across the org is a key way his team supports connection at scale. His org has a strong culture of building communities of practice that connect people across disciplines to share templates, best practices, and insights.11 The central learning team leverages this culture (remember that trend of building on what already exists?) to connect groups for learning.
Connecting to the outside
Finally, orgs use methods that connect employees to people and ideas outside of their own walls. This category focuses on ensuring employees have the external connections they need to be successful, both in their current roles and in their long-term careers. With so much change in almost every industry and function, both orgs and employees benefit when people are able to forge connections outside the org. The methods in this category include:
- Industry conferences
- Professional organizations
- Professional / personal networks
While these methods are sometimes seen as the responsibility of the individual employee, we think that L&D functions should invest to enable them, as they yield pretty large benefits to orgs:
Expanding access to opportunities
It’s a (true) cliché that who you know influences what jobs you get; it’s also true that who you know influences the development opportunities you’re able to secure. Both our skills research and our internal mobility research earlier this year revealed that employees’ personal and professional networks strongly influence the opportunities they find out about and have access to.12 Enabling all employees to build these relationships is a critical task for L&D functions.
While methods to bring the outside in may often be seen as the responsibility of employees, we think orgs should invest in them because they benefit the org, too.
Bringing outside info in
We’ve worked with a number of orgs this year who’ve talked about the need to “look up and around”—to stay current on trends, leading thinking, and leading practices in a fast-changing environment. Enabling employees to connect to the outside is one way to do this.
Developing interpersonal and networking skills
Networking is an increasingly important skill. Because one person cannot know everything, orgs should enable employees to know the people who know all the things. Conferences, professional organizations, and building professional and personal networks are key. In some industries, such as consulting or sales or politics, a good network is crucial to good performance.
Real-world thread: Supporting employees to connect for learning
We mentioned above that orgs are finding lots of ways to systematize and scale 1:1 connections. One way they’re doing this is by focusing on sharing info, providing guidance that applies to the whole org, and highlighting leading practices.
An org that’s doing this well is a large US insurance company which supports a variety of mentoring initiatives. The central L&D team understands that in an org of 60,000 employees, different business areas have different ways of doing things and that sometimes employees find mentoring opportunities on their own—and that’s a good thing.
Kaitlyn M., formerly a learning leader at this company, said:
“Mentoring happens in the community, at centers of worship, at connections from other companies. Those aren’t things our company can control, nor should we." – Kaitlyn M., former learning leader, large US insurance company
Accordingly, the central L&D team focuses on providing enterprise-wide guidance and sharing leading practices about mentoring in the company. They disseminate answers to questions like:
- What’s cutting-edge in mentoring?
- What’s the definition of mentoring at the company?
- What are the qualities of a good mentor?
- How long should mentoring relationships last?
The central L&D team intentionally looks for answers to these questions inside as well as outside the org. An analyst on the L&D team does research within the org to identify which groups are doing mentoring well. The team then works to formalize and scale these leading practices as enterprise guidance.13
Focusing on org-wide guidance and sharing what’s already working strikes us as a highly effective and efficient way of fostering connections at scale.
Helping employees Perform better on the job
L&D functions should be helping employees learn on the job and perform better while doing it. Performance support has long been part of the L&D repertoire—think standard operating procedures and other methods that attempt to make task performance as predictable and standardized as possible. But there’s more to it than that.
We’re seeing new ways of enabling learning-while-performing in 3 key areas:
Pushing data down
More orgs are enabling employees to access data about their own learning and performance. Data such as customer feedback, sales numbers, performance reviews, and learning strengths are shared with managers and with employees themselves, because orgs recognize that the employee is the person best equipped and most motivated to act on that data.
Leveraging employee knowledge
In keeping with the trend toward user-generated content, leaders said they’ve started to invite employees to take a more active role in creating and updating documents related to performance support (such as job aids).
Culture of feedback
Historically, feedback has flowed from manager to employee and, in some rare, formal cases, from employee to manager via a 360 review (or a formal complaint to HR). However, many orgs are actively looking for ways to make feedback a part of the culture. More on this below.
Orgs are experimenting with learning methods that push performance data down to the individual—the person best equipped and most motivated to do something with it.
Methods that enable employees to perform better on the job and learn while doing it fall into 3 categories (shown in Figure 9):
- Instructions
- Collaboration spaces
- Feedback
Let’s discuss these categories in more detail.
Instructions
For about a century, one of L&D’s primary roles has been to help employees perform certain tasks as similarly as possible. Efficiency, standardization, and predictability were the name of the game—think manufacturing. The instructions category reflects that history, and is all about providing employees with detailed descriptions of how to perform critical tasks in their role. Methods in this category include:
- Job aids
- Standard operating procedures
- Job safety analysis
But there have been some exciting developments in recent years, moving these established methods into the 21st century. Newer approaches to these methods include ways to involve employees in creating and updating the information. People who have done the job are often responsible for updating the documents, reflecting a belief that learning doesn’t need to be prescriptive or from a centralized source—it can be more helpful “from the horse’s mouth.”
Collaboration spaces
If the instructions category is fairly set—“this is how this job is done”—then methods in the collaboration spaces category are more fluid. They’re about creating places for employees to share their knowledge on a topic, update that knowledge as needed, and easily access the knowledge from inside their work. These methods work best, of course, when they’re well-organized or easily searchable so people can find what they’re looking for.
Methods in this category include:
- Shared files
In many cases, L&D functions do not own the sources of information for these methods: someone closer to the work does. L&D’s role therefore becomes about providing visibility and accessibility to content, not necessarily creating or monitoring that content.
Because they don’t own the sources or the content, L&D functions often need to work with other HR functions or the business function to make sure employees have the needed visibility and access.
Feedback
Learning methods in the feedback category give concrete and actionable information to employees about how they’re doing and how they can improve. This is one of our favorite categories, since it can so strongly enable a learning culture.
Feedback gives concrete and actionable information to employees about how they're doing and how they can improve.
Methods in this category include:
- After-action reviews
- Performance statistics (e.g., sales revenue)
- Feedback from customers, managers, and peers
L&D functions often influence this category indirectly: they don’t have direct control over the feedback people give one another, but they do shape what people learn about when and how to give good feedback. For example, they can’t ensure managers provide feedback, but they are responsible for what managers learn about giving it well. They can also help employees seek feedback, which improves the situation all around.
Feedback can be tech enabled or not. On the tech-enabled end of the spectrum, a lot can be done by putting actionable data directly into employees’ hands. For example, one vendor we talked with showed us their tech platform, which sends customer ratings and reviews to call center reps within minutes of a service call—allowing the employee to adjust efforts in time for the very next call.
On the less tech-enabled end of the spectrum, we’ve heard a lot about working with other functions to build feedback into processes, as in the after-action review example below.
Real-world thread: Creating a culture of feedback
An L&D function can’t create a culture of feedback alone. But they can influence the culture by working with other functions to implement systems and processes that make feedback part of the way business is done.
The US Navy SEALs have embedded feedback deeply into their culture, partly through the use of their famous After-Action Reviews (AARs). In an AAR, all members of a SEAL team gather immediately after a mission or training session to break down the event in detail. They ask:
- What went wrong?
- What did each person do, and why did they do it?
- What will we do differently next time?
The candid feedback team members deliver to one another in AARs is often uncomfortable, but it’s considered an essential part of how the team gets better.14
We’ve implemented AARs in our teams at RedThread and have found enormous benefit in the structured ways they help us learn from mistakes and improve our work. We’d encourage leaders to consider this and other methods of embedding feedback—first in the L&D function itself and moving out from there.
All the methods
There you have it: The dozens of learning methods we’ve found, the categories they fall into, and the 6 behaviors they can enable (Plan, Discover, Consume, Experiment, Connect, and Perform). It’s a lot—a lot of methods, and a lot of ways they can each be used.
Not all orgs need all learning methods, of course. But L&D functions do need to think about whether they are appropriately supporting each behavior, and whether they have the right mix of methods to enable their employees to build the skills that serve the org’s strategy.
One learning leader, whose team focuses on enabling external customer learning, described her org’s thought process this way:
“We think about the customer’s learning journey and the different methods that are required at every single point. What methods drive discoverability? Adoption? And so on.” – Sonia Malik, Learning Alliances Manager, IBM15
With a clearer understanding of what learning methods are available and how the Employee Development Framework can be used to conceptually organize those methods, let’s turn to the question of what methods might be best for your org—and how to decide.
The right methods in the right combinations
So far, we have reviewed the Employee Development Framework and identified several dozen learning methods that can enable each of its behaviors. Just having a (fairly) comprehensive list of learning methods can help L&D functions begin to think beyond the course and introduce, recognize, or support other ways of learning in the org.
However, as most L&D leaders understand, having information and knowing what to do with it are 2 different things. In our conversations, we heard of 5 innovative ways leaders are deciding which methods work best for their orgs:
- Understand the messages your methods send
- Experiment, iterate, and push boundaries
- Design bike paths, not buses
- Squirrel!: Don’t get distracted
- Let go of what’s not working
Note that none of these 5 approaches indicates a one-and-done decision. Instead, “deciding” on the right learning methods in the right combinations is an ongoing process of experimentation and improvement, trial and error.
Deciding on the right learning methods is an ongoing process of experimentation and iteration.
Let’s look at each of the 5 approaches to deciding on learning methods in more detail.
Understand the messages your methods send
We heard a consistent refrain from leaders in this research: know your audience. We’d expand this sentiment to: know your culture, and know the culture you want to create. The decisions L&D functions make about learning methods strongly shape how learning happens in the org. That means those decisions need to both fit the current ways learning happens and nudge the org in the direction of the learning culture your org wants to have (or further in that direction, if you’re already on the way).
For example, a business services org we have worked with was heavily relationship-oriented. That feature was a strength of the culture, and one the L&D function wanted to support and enhance. Instead of drowning employees in a sea of e-learning courses, they focused on live and virtual events, coaching, and leaders as teachers.
When it comes to deciding on learning methods, the saying, “actions speak louder than words” is true. L&D leaders should think carefully about what their learning methods “say” to the org, because those messages help nudge the org toward or away from the culture the org wants to create.
Here are a few examples of intentional or unintentional messages sent by chosen learning methods.
Mandatory individual development plans
An org that institutes individual development plans (IDPs) for all employees and makes them easy to access and linked to learning opportunities and performance data might send the message that, “We continuously develop at this company. We want and expect you to develop while you’re here.”
Many orgs have some development opportunities that are limited to certain groups or types of employees. Making those opportunities visible to all, but available to only a few, might send the message that, “This opportunity is only for special people (and you’re not special).”
Coaching for all
An org that actively implements various types of coaching, at least some of which are open to all employees, might send the message that, “We are investing in you. We care about everyone’s development, including yours.”
Orgs that implement job rotation programs and market them widely internally (as opposed to programs that exist but remain largely unknown and unused) might send the message that, “Mobility is important to us. It’s important for you to get exposure to other areas of the org.”
Orgs that invest in sending employees to conferences on a regular basis might send the message that, “We’re interested in your gaining outside perspectives.” Orgs that also encourage employees to present at conferences send the message that, “We want you to be seen as an expert in this area.”
Orgs that use after-action reviews as a part of how they operate might send the message that, “We learn from our mistakes.”
These are just a few of the potential messages these learning methods might send. Bear in mind that whatever methods you decide on are going to send a message. We encourage leaders to think carefully about what those messages might be and whether they will nudge the org in the direction you want.
Experiment, iterate, and push boundaries
Given how traditionally many L&D functions say they operate, we were pleasantly surprised at the extent to which leaders said they take an iterative approach to learning methods. They try new methods, or new approaches to existing methods. They see what works and improve over time.
This iteration mindset—and it is a mindset—takes conscious effort to embed into the org. Leaders said they make a point to celebrate failures as well as successes, highlighting failures as opportunities for learning. They also work hard to build a sense of community and psychological safety so that all L&D team members feel encouraged to try new things.
Leaders should experiment, see what works, assess what doesn’t, and improve over time.
Another way leaders encourage experimentation and iteration within the L&D function is by thinking about learning as a product. They think in terms of a product development cycle: discovery / understanding needs, developing a minimum viable product, piloting and testing, and so on.
And like good product developers, the L&D function shouldn’t be just an order-taker that simply caters to the current needs and desires of employees or other business functions. Instead, L&D functions have a responsibility to help the org and its employees continually develop. That sometimes means gently helping people step outside their comfort zones by painting a picture of what the future might look like and how they will benefit from that future.
Leaders in this area also obsessively use data and feedback to inform experiments. One leader placed trackers on different pages in a new learning portal and used A/B testing to see which pages were most effective. Others pay close attention to the comments and reviews from employees in their learning systems and adjust based on that feedback.
Real-world thread: Pushing boundaries
L&D leaders can use data to push boundaries. At a Fortune 100 manufacturing org, a new method for cybersecurity training was making some people uncomfortable. The L&D function was able to use data to help people see that it was working.
In early 2021, the central L&D team paired with the IT group’s cybersecurity team and business functions to launch a cybersecurity training. The training sent mock phishing emails to employees, tracked how they handled the mock emails, and recommended follow-on training opportunities in the L&D org’s learning tech platform. The learning tech platform also captured employee comments on the training.
After sending the mock emails, the IT group received about 30 emails (including one from a very senior leader) criticizing the training. However, the mock email campaign also received about 300 positive comments in the learning tech platform, and many leaders began sharing the recommended training with their teams.
The central L&D team was able to look at the data and fully contextualize those 30 criticisms against everything else that was going on.
The CLO said:
“All this goodness was happening, and we were able to package all the data to show the full picture to our partners in IT and our senior leaders. That was an early example of not freaking out about something new because we had the data to show it was working." – CLO, Fortune 100 manufacturing org16
This focus on data and feedback can help make the case for new approaches that might otherwise be overcome by org inertia or resistance.
Design bike paths, not buses
We love the following analogy from Eric Dingler, CLO, Deloitte US. Eric shared this analogy to illustrate how L&D functions have historically used inflexible, course-focused learning methods:
“L&D is great at designing buses. We need to get better at designing bicycle paths. Not 1 million paths, sure, but a lot more than 1 per year per level.” – Eric Dingler, CLO, Deloitte US17
Buses accommodate a big group of people but are inflexible in their seating, their route, their destination, and their stops. They’re designed to efficiently get a number of people with the same general need to a place relatively close to their final destination.
Bike paths, on the other hand, may lead to the same destination, but accommodate slower riders or professional cyclists, those out for a Sunday roll or those in a race. People can ride together, but don’t have to. Parents can teach kids to ride on their own along the way. Everyone can stop for some ice cream, and some can even get off the path for a while.
Eric is in favor of methods that act more like bike paths. At Deloitte, he and his team are building learning opportunities that are more flexible and therefore easier for each individual employee to use in ways that work for them: employees can jump on a path whenever they want, proceed at their own pace, and jump off when they’re at their “destination” or need a break.
Note that the analogy doesn’t imply pure chaos or lack of structure. Indeed, L&D functions should still provide a lot of structure to ensure employees get where they need to go. The key seems to be creating systems that allow for flexibility within that structure.
We heard similar sentiments from other leaders when describing their attempts to enable flexibility within structure. One put it this way:
"It’s necessary to provide enough structure so that experimentation happens organically. So that it feels easy and effortless." – Ryan Cozens, L&D Lead, Well Health18
This type of flexibility benefits more than just the employee. Orgs using “bike paths” can respond more quickly to disruption—changes in strategy or worldwide pandemics, say. They also have more data because they have more employee touch points and can therefore better understand what skills they have and where they should focus their development efforts. And they can more easily experiment with new methods and change things that aren’t working without a huge investment of effort, time, or money.
How are leaders going about building flexibility within structure? The exact approaches vary from company to company, but 3 common themes emerged.
Modular
If flexibility is the goal, then it makes sense to make learning opportunities more modular so they can be switched around, combined, or removed as needed. This movement toward modularity applies to learning content, sure, but also to the ways learning opportunities are pieced together.
For example, if an org wants all employees to have data analytics as a skill, traditionally they might have sent all employees through the same course. However, if they’re thinking in terms of modularity, they may combine many learning methods to meet the employees wherever they are.
A financial analyst may just need to verify that she already has the necessary skills. An engineer may find he needs a few refresher videos and a practice exercise. And an artist may discover that she benefits from all of the modularized videos, all of the practice assignments, and verification that she has the necessary skill.
Integrated with the work
Leaders also told us that they were trying to integrate the learning with the work as much as possible. This meant different things to different leaders. Those new to the idea focused on “bite-sized” bits of learning that could be easily integrated. Those with more experience were specifically utilizing methods in Experiment, Perform, and Connect to make the work itself the main learning tool.
Skills-based
Most of the leaders we talked to are adopting a skills-based approach to learning methods. The granularity of skills makes it much easier to assemble, break down, and reassemble learning methods, and to set up “create-your-own-journey” learning paths for employees. L&D’s role in this case is to clearly tie methods and content to skills, and to provide assembly instructions for those building their own paths.
Squirrel!: Don’t get distracted
From both the roundtable and our interviews, leaders told us that the first criterion for selecting learning methods should be that the development helps to solve a business challenge. We love this, and we absolutely think it should be applied to learning method decisions.
Knowing the L&D space like we do, and also understanding L&D’s propensity for research and love of tech, we want to point out an obvious but maybe painful truth about this group: we like shiny things. We like new, and we’re constantly looking for things that can help us do what we do (and therefore help employees do what they do) better / faster / shinier.
That said, the well-grounded leaders in our roundtable pointed out that the methods you select should solve a challenge the org is having. While methods don’t tie directly to business challenges (you can use several to solve the same challenge), we agree that a focus on the org, not just the employees and their experience, is key when choosing methods.
The methods you select should solve a challenge the org is having. A focus on the org, not just employees and their experience, is key.
In this study, leaders mentioned 3 questions they ask themselves when considering a new method for their org to keep themselves from getting distracted.
Does the method make financial sense?
There are lots of sexy learning methods out there, and all of them cost money. Leaders spoke not just of finding the budget or justifying the cost, but actually analyzing the method financially to determine whether it made sense for the problem it was meant to solve, including internal resources that will be needed to implement, coordinate, and service it.
Does the method work for the intended audience?
We heard a lot from leaders about the need to know your audience. They ask questions like: Are the methods likely to achieve the goal, given the audience’s needs and preferences? For example, the needs and constraints of an office worker are quite different than the needs and constraints of a manufacturing supervisor. Understanding those differences and choosing methods that work well for them is pretty important when enabling their development.
Does the method work with your org’s tech stack?
Finally, leaders talked about their org’s tech stack—both leveraging what is currently there and implementing new methods with tech components. Two areas they mentioned specifically were user experience (did this align with what learning meant in the org?) and data (how would this data be passed to other tech or stored in a data lake?).
If the answer was yes to all of these questions, leaders were more comfortable implementing—or even just experimenting with—the method in question.
Let go of what’s not working
Leaders told us that the first thing to do when a method isn’t working is to figure out why. Sometimes a quick tweak, more communication, or better visibility can increase engagement and value to the org. Sometimes it can’t. Sometimes you have to let things die. And sometimes you have to cut them loose.
Letting things die is often a challenge for L&D functions. Candidly, there’s an emotional aspect to letting go. L&D teams work hard to create learning methods that are successful for a time. Or they work hard on things that just flop. In either case, there can be natural resistance to letting go of methods that no longer work. But as one leader said:
"The world is changing. Don’t get attached to a learning method. If it works, great. If not, move on and find something that works better." – Jeffrey Mills, Manager, Org Learning & Development, QSC19
More tactically, the leaders we spoke with revealed 3 enlightening ways to make it easier for their L&D functions and L&D professionals to let go of what’s not working.
Make methods disposable
Some leaders consider all learning methods disposable, or as having a short shelf life. Everything has a life cycle, they said, which means it’s natural for things that no longer work to be pulled out of the system.
This is akin to hand-making valentines. You put a lot of effort into them to benefit someone, but you know they’re likely going to appreciate them (or not) and then toss them on February 15. By adopting a mindset of disposability, L&D leaders and their teams can make more rational decisions about what to invest in and what not to invest in, both originally and on an ongoing basis.
Implement the strategic pause
Sometimes it can help to simply put time and distance between the realization that something isn’t working and the final act of letting go. One leader said:
“If something isn’t working, we’ll do a ‘strategic pause.’ We stop providing the program and evaluate if it’s really needed. If so, we rework the offering. If not, we find a way to offboard it."
– Roundtable participant, Oct 202120
Delegate the decision to the business function
In orgs that use a charge-back system (business functions contract with a central L&D function to create a learning initiative), L&D leaders can leave the keep / toss decision to the business function. If a program or a piece of content continues to be useful, the business function keeps it alive. If not, they let it go.
Cutting them loose
In some instances, L&D functions may take a more proactive approach to getting rid of learning methods. In such cases, the cutting-loose is planned as a part of the method’s implementation. Orgs focusing on skills, for example, understand that the learning methods, the data, and the tech associated with teaching skills that they’re using in this moment are not the ones they will be using in 3 years—and maybe not even in 6 months.
Tech, data, and methods continue to get better. So rather than just thinking of certain methods as disposable, it’s important to make them disposable. One leader mentioned 2 main ideas.
First, contracts with vendors of all sorts should get shorter. Large learning tech implementations are sometimes 10 years long, which gives plenty of time to implement the tech, work out the bugs, see the benefits, and then ride the wave for a while. However, if your org knows it will be jumping to something better as soon as it’s available, contracts need to be negotiated with shorter terms.
Building an exit strategy into the implementation of learning methods ensures that they won't outstay their welcome.
Second, there has to be an exit strategy. As learning tech gets more sophisticated, it becomes more integrated into an org’s tech stack, and more data flows into and out of the system. When an org plans to kill a method or tech, it needs a plan in place for what happens with those integrations and, even more importantly, that data.
We love the idea of building an exit strategy into the implementation of learning methods. Although it doesn’t apply to all methods, it ensures that when it’s critical, methods won’t outstay their welcome.
Real-world thread: Letting go of what’s not working
When it becomes clear that a learning method isn’t working, the right course of action is sometimes to ditch it in favor of something that does work.
That’s what happened at NASCO, a healthcare tech company. A customer-facing learning program initially created a full user guide to help customers use a NASCO product. As the Workforce Readiness Solutions team watched customer usage, they realized customers weren’t using the guide. Instead, customers were capturing video snippets and using them to train employees on the product.
The NASCO team used that data to adjust their efforts. They started releasing short video snippets, eventually replacing the full user guide with video snippets as a part of the customer’s knowledge management solution.
We loved NASCO’s willingness to let go of a product they’d devoted a lot of time to, and pivot to ensure their learning methods were as useful to their customers as possible.21
Although most of the learning methods we’ve covered in this report have been around for a while, the ways they’re being used in many orgs are quite innovative. The 6 behaviors in the Employee Development Framework offer a way to understand what learning methods are available and how they can be used to enable different learning behaviors—making it easier to assess whether an org has the right methods, in the right combinations, for their goals.
We expect to see more orgs taking a broader approach to learning methods in the future, because leaders see the value in the flexibility and personalization such approaches offer in a changing business climate. We look forward to seeing how that approach works and continues to evolve.
Appendix 1: Methodology
We launched our study in the fall of 2021. This report gathers and synthesizes findings from our research efforts, which include:
- A review of 60+ articles, podcasts, videos, and books from business, trade, and popular literature sources
- 1 roundtable with 28 participants
- 15 in-depth interviews with leaders on learning methods
For those looking for more info on this topic, you’re in luck: We have a policy of sharing as much information as possible throughout the research process. Please see these articles on our website:
- Premise: Choosing the right development opportunities for your employees
- Next-Gen Learning Methods: Literature Insights
- Roundtable readout: Choosing, evaluating, and offboarding learning methods
Appendix 2: Contributors
Thank you so much to those of you who participated in our roundtable and interviews. We couldn’t have done this research without you! In addition to the leaders listed below, there are many others we can’t name publicly. We extend our gratitude nonetheless: You know who you are.
- Adrian Klemme
- Alison Antolak
- Ann Boldt
- Brian Richardson
- Buck Seward
- Dan Balzer
- Deanna Foster
- Doug Osborn
- Eric Dingler
- Erik Soerhaug
- Jeffrey Mills
- Jim Maddock
- John Fallon
- Kaitlyn Mathews
- Kate Earle
- Kelly Rider
- Kim Ziprik
- Lisa Ross
- Rachel Hutchinson
- Ryan Cozens
- Sonia Malik
- Stephen Wilhite
- Stephen Young
- Zachary Pfau
In addition, we thank Ferenc Laszlo Petho for graphic design and Jenny Barandich for layout.
Posted on Tuesday, November 16th, 2021 at 4:30 AM
The Evolution of Coaching
Coaching is a big dang deal right now. A developmental method that, in the past, was reserved for leaders, future leaders, and problem children has, in a sense, gone mainstream. In a nutshell, more org leaders are using coaching for more reasons, more coaching flavors are surfacing, and more coaching configurations are being developed.
Let’s explore some of the larger changes we’re seeing.
From scarce to abundant(ish)
To put it simply, there is a larger supply of coaches. According to ICF’s Global Coaching Study, there are approximately 71,000 professional coaches in the world as of 2019, a 33% increase over a 2015 estimate.1
Not only is there a larger supply, but access to that supply has been simplified. The pandemic has normalized virtual connection, making it easier to receive coaching from any distance. Additionally, coaching tech platforms continue to make it easier to find, vet, hire, and engage with coaches, and to improve the coaching experience overall.
From exclusive to inclusive
Because of high cost and low scalability, most orgs have reserved coaching for specific audiences—generally leadership—which gave it an air of exclusivity. In recent years, however, movements such as #metoo and #blm have made orgs take a hard look at how opportunity is distributed and rethink how and what they’re offering in terms of development.
This has caused many orgs to look for ways to offer coaching more broadly and evenly. Orgs that value coaching are finding ways to make it more available to more individuals—causing them to throw out traditional definitions in favor of more inclusive, broader coaching configurations (more on this later).
From narrow scope to broad scope
When we asked leaders why they were investing in coaching, their answers varied much more than we expected. Coaching now has larger goals than simply improving the behavior of one person. Some of the most common goals we heard included:
- Connection and engagement. As workforces have moved to either hybrid or remote situations, the need for human connection has become increasingly pronounced. Medical research tells us that the quality and quantity of individuals’ social relationships have been linked not only to mental health, but also to morbidity and mortality.2 Coaching likely appeals to orgs right now because most coaching activities are about connection between humans. As such, we’re also seeing coaching being used by several organizations as a way to engage employees.
- Personalized development. While personalized development has long been the pipe dream of learning leaders and academics, the pandemic forced the issue: employees not only couldn’t gather to learn the same stuff, they actually needed different stuff. Coaching, at least in some of its more scalable forms, can meet the diverse development needs of many employees without breaking the bank.
- Change management. As orgs face an unprecedented rate of change, they are grappling with issues such as hybrid work, wellness and burnout, and diversity, equity, inclusion, and belonging (DEIB), among others, and are looking for ways to reach large portions of the employee population. New coaching methods enable individual behavior changes at scale.
As coaching becomes easier, less expensive, more applicable, and more scalable, more orgs are experimenting by offering it to more people. Coaching tech has likely also played a role in amplifying the discussion about coaching. These solutions run the gamut, from simplifying the administration of traditional coaching initiatives to delivering coaching by way of tech, and they are generating noise, curiosity, and experimentation (for more on coaching tech, see our sister report, Coaching Tech Landscape: The Humans and the Machines).
Coaching is being used to engage employees, personalize development, manage change, even experiment.
From stand-alone to integrated
As coaching becomes more mainstream, it is also becoming more streamlined through integration with other activities. While it used to be a stand-alone, one-off development opportunity rarely tied to anything else, coaching efforts are now being coordinated more broadly.
Coaching is now being integrated into career and development discussions and activities (mobility and career), performance improvement actions and independent development plans (performance management), and leadership development initiatives and skills discussions (learning and development). As such, coaching no longer belongs to just one vertical—it is, and should be, a shared initiative.
From one definition to many
Finally, the very nature of coaching is changing. In its most traditional sense, coaching is defined as a 1-on-1 relationship between a professional coach and a coachee, which extends over a period of time to improve the behavior, knowledge, and skills of the coachee. The goal is to improve the individual—the one person receiving the coaching.
That is changing. While some leaders we spoke with still use this traditional definition, the boundaries of coaching are being pushed. The lines between coaching, mentoring, educating, championing, and any number of similar roles are blurring, which has led to several new flavors of coaching.
For the purposes of this paper, we’re not going to clearly define what “coaching” is. We have found that semantics discussions rarely benefit anyone. Instead, we will focus on the broader definitions offered by the leaders we spoke with and the literature we read.
Boundaries around how coaching is defined and executed are expanding.
Which is what we’ll do next.
Coaching Methods: All the Choices
As we outlined in the first section of this paper, there are now fewer rules surrounding both the definition and application of coaching. As with any area that experiences a lot of growth and invention, there is also a lot of confusion around coaching.
What used to be a fairly straightforward proposition has morphed into something impressively more complicated: leaders aren’t just choosing who gets access to coaching anymore; they’re also choosing what kinds of coaching they’ll invest in and who gets access to which kinds of coaching. This was the most common question we got: Which one should I be implementing?
There is no "right" answer to how coaching should be implemented; but there are lots of choices.
While a “right” answer would be nirvana, we unfortunately didn’t come up with one. What we did find, however, is that there are many choices when it comes to coaching and that, depending on your goal, some choices may be better than others.
We identified 9 methods that orgs are utilizing to coach employees, as shown in Figure 1 below.
While the chart above provides a nice summary of options leaders have for coaching, it does a bit more than that: It provides a way to think about those options.
To this point, org leaders may have assumed that coaching is for the individual. Earlier, we mentioned that coaching has traditionally been utilized to influence one individual at a time. However, many of the methods we identified do not fit that mold: they’re increasing their scalability and impact by focusing on group or org effectiveness as well.
Individual effectiveness
Orgs interested in affecting the effectiveness of just 1 individual tended to stick to methods that focus on helping individual employees reach their potential. These tend to rely on the traditional coaching 1-on-1 relationship, although we did see variations on that theme.
Group or team effectiveness
Orgs interested in affecting the effectiveness of a group focused on coaching methods that help managers and teams work better together. This often included equipping managers with coaching skills or bringing in an outside coach or tech to work with an entire team.
Org effectiveness
Orgs interested in developing a culture of coaching or large culture changes tended to use methods that help orgs build coaching cultures, increase feedback and learning from each other, and focus on large-scale org initiatives. Some of the more revolutionary types of coaching methods fell here, including things like coaching circles, tech-led coaching, and initiative-driven coaching (DEIB and wellness).
Coaching can affect a single individual, a team, or an entire org.
The balance of this report will address these 3 major targets or goals and their respective coaching methods. We’ll provide more information on the goals, respective methods, and leading practices. We’ll also highlight examples of how organizations are using these methods and provide a list of questions that leaders should ask themselves as they consider the methods that may be best for their orgs.
Coaching for individual effectiveness
We’re big fans of using coaching to improve individual effectiveness. We think 1-on-1 interactions and guidance are a fantastic way to develop employees. And we’re not alone. In fact, the majority of leaders we spoke with are focusing a large part of their coaching initiatives on improving individual effectiveness.
Coaching methods that focus on improving individual effectiveness generally have a few things in common.
- Professional or trained coaches. These methods generally involve either a professional coach or, at minimum, coaches who have some training. We were both surprised and delighted that most leaders we spoke with take the idea of coaching seriously and ensure that coaches have the skills they need. (This point obviously varies slightly when talking about self-coaching.)
- One-way. Coaching methods in this group are not mutually beneficial—the coach is in place to help the coachee achieve something. Some of the other methods we will discuss have a different dynamic, but in the case of coaching for individual effectiveness, coaches help coachees.
- Individual. Coaching methods for individual effectiveness are just that: individual. These methods leverage a 1-on-1 relationship between coach and coachee, and the work is individualized to the coachee’s specific needs.
We identified 3 coaching methods associated with individual effectiveness, each leveraging 1-on-1 interactions. These methods include:
- Traditional, professional coaching
- Drop-in coaching
- Self-coaching
For each of the methods in this section and throughout this paper, we’ll provide a description and information about how the method is most commonly used (including its audience), outline some leading practices, and give leaders some questions to think about.
Method 1: Traditional, professional coaching
What is it?
Traditional professional coaching is exactly what it sounds like: a professional coach works with a coachee to identify and work on development areas. These engagements are generally longer term. Usually, orgs identify professional coaches they can trust and then assign them as necessary to individuals within the organization. Of all the methods we came across in the research, this method was by far the most well-known and the most commonly implemented.
Traditional professional coaching is also the most expensive, the least scalable, and the least inclusive of all of the methods we ran across, so it isn’t surprising that this method is often reserved for leaders or those likely to become leaders (high potential employees).
Traditional professional coaching involves a professional coach working with a coachee to identify and work on development areas. These engagements are generally longer term.
How is it applied?
Most orgs utilizing traditional professional coaching had specific ideas about how it should be applied. Some of the ways we saw it applied are listed below.
Building leadership skills
Because traditional coaching is often reserved for leaders or potential leaders, it makes sense that building leadership skills is one of its main objectives. Coaches and coachees determine which skills need to be developed and work through a process for developing them.
Often, 360s or other assessments are utilized to determine areas of focus. Assessments can align to the org’s particular brand of leadership if they have one, or they can leverage one of many leadership assessments available on the market. Coaches are often certified in one or more of these models.
Engaging employees
Interestingly, many organizations use traditional professional coaching as a way to engage employees. In this case, the fact that it is both expensive and exclusive works in its favor: traditional professional coaching sends a message to coachees that they are valuable to the organization and worth the investment.
Bringing the outside perspective in
Experienced leaders building their strategic chops may also be assigned a coach with significant leadership or industry experience. The coach may act as a sounding board or outside perspective for leaders making important strategic decisions.
Correcting unacceptable behavior
While most leaders we spoke with said that coaching has a positive connotation within their orgs, a few are still using traditional professional coaching to help leaders work through behaviors that put their careers or the company at risk.
Many leaders have dreams of offering professional coaching to more people at more levels within an org, but resources remain a challenge. While coaching technology is increasing accessibility to skilled, even credentialed coaches—both simplifying administration costs and sometimes even enabling orgs to find more cost-effective coaches—it remains a fairly expensive way to coach entire employee populations.
That said, one leader we spoke with was solving this challenge by systematically creating, and then employing, professional coaches throughout their org. Read more below.
“The reality is, we can’t train enough for the 90,000-person organization to meet the needs of people who want coaching, so we were going to have to look at the hybrid strategy of internal and external.”
– Global Head of Coaching Center of Expertise at a large healthcare company
The goal at one global healthcare company is to grow and extend coaching beyond senior leaders. In fact, the Global Head of Coaching Center of Expertise told us his personal goal is to provide a coach to anyone in the organization who wants one.
Because coaching can be costly, this company is using a hybrid model to build an internal supply of coaches. The model works like this: a number of leaders at a time go through a coaching certification program. Once leaders are certified coaches, they act as coaches for future leadership development programs for 2 years.
This hybrid program benefits the org and individual leaders in several ways. First, it does indeed increase the number of available coaches and does so at a reduced cost. Second, it gives leaders valuable coaching skills that they can use in managing their teams. And third, leaders are certified coaches—a credential that can help them in other aspects of their lives and allows them to coach other individuals as well.
Method 2: Drop-in coaching
What is it?
Drop-in coaching refers to a specific type of 1-on-1 coaching that focuses on in-the-moment needs. Drop-in coaching is characterized by both its specificity and its “quick hit” nature.
Employees taking advantage of this coaching method generally seek out coaches with expertise in certain areas, and the coaching relationship is generally short-lived: it doesn’t require, or necessarily encourage, a long-term relationship. Drop-in coaching can utilize either internal subject matter experts or external professional coaches.
Interestingly, we heard about drop-in coaching from both org leaders and coaching tech providers. Both groups see tremendous value in providing 1-on-1 human interaction to develop skills or knowledge, but also understand that it can be cost prohibitive to provide coaching broadly, to all employees.
Drop-in coaching refers to a specific type of 1-on-1 coaching focusing on in-the-moment needs. It is characterized by both its specificity and its “quick hit” nature.
How is it applied?
Orgs utilizing this coaching method are generally interested in scaling coaching to include more employees in the hopes of engaging them. We also identified a couple of other, more specific, applications.
Working through immediate, time-sensitive challenges
A few orgs we spoke with used drop-in coaching to meet immediate needs of employees. While traditional coaching focuses on long-term behavior change, drop-in coaching focuses on the here and now. Orgs provided access to coaches for employees looking for help with specific issues right this minute. For example, employees may schedule half an hour with a drop-in coach if they are looking for feedback on a presentation, are planning a tough conversation with their employee or manager, or need an outside perspective on a touchy business decision.
Interestingly, several tech vendors we interviewed offer an option to utilize professional coaches in this capacity.
Subject matter expertise
Orgs can also use drop-in coaching to take advantage of subject matter expertise existing in the company. Employees needing a particular expertise can schedule time to work with more experienced employees.
Career coaching
Internal mobility is becoming more of a focus for orgs as well as their employees. One org we spoke with uses a simplified drop-in coaching approach as a way to leverage employees willing to talk about their own careers and help others make decisions about theirs. Another org utilized drop-in coaching as a way to make the career coaches in their HR department more visible and accessible to employees looking for help.
Coaching as a benefit
Vendors and thought leaders alike are touting coaching as a benefit,3 providing coaching services for employees looking for guidance on their careers, health, wellness, finances, and even sleep. Coaching as a benefit has the advantage of engaging with employees on issues that they care about.
Because there is so much noise about it, we were surprised and a little disappointed that not one of the leaders participating in this research is using coaching in this way. But it is available, and being touted by several coaching tech vendors.
Kelly Kinnebrew, currently a consultant in M&A Strategy at KPMG, mentioned that during her time as Principal of Organizational Development at Dartmouth Hitchcock Health, they leveraged the idea of drop-in coaching—which she described as coaching for “things you have to do tomorrow” rather than long-term projects or development. Kelly pointed out that drop-in coaching is better than no coaching at all:
“If your choices are an expensive relationship with a very skilled coach versus no coaching at all because it’s too expensive for the front line, I would ‘drop-in’ it every time.”
She mentioned that this type of coaching has 2 main benefits. First, it’s motivating to employees to learn a new skill when they can use it to address a specific situation in real time with a real human being. She cited examples like conversations with bosses about wins or challenges, career conversations, roleplaying difficult situations, or receiving honest feedback.
Second, she said that, unlike a manager as coach, external drop-in coaches without any awareness of the specific context are able to coach from a detached perspective. Context matters a lot, certainly, but there are also benefits to the coach not being part of the system where the coachee resides.
Method 3: Self-coaching
What is it?
In self-coaching, individuals are provided the tools and frameworks that can help them guide their own growth and development. Self-coaching is a popular method for guiding individuals through transitions in their professional lives. Orgs are leveraging the idea of self-coaching by providing employees tools that help them explore potential development areas and resources to help them self-correct.
Tools can be either analog or digital. Good examples of tools for self-coaching include both the GROW model4 and the ABC model.5 Both models provide logical ways for individuals to think through their goals, assess where they are now, and plan steps to move forward. Orgs can create internal sites and job aids to make models like this widely available. Communication about the org’s use of such models also helps.
Digital tools often take it a step further by creating awareness of behavior in certain situations. They can be as simple as nudges or prompts to remind you to reflect on something, or they can be more complex, such as data-driven dashboards.
Often, these tools use technologies such as natural language processing (NLP), AI, or text analysis to capture and share information within the workflow. One example we saw monitors email and helps managers understand their own patterns, such as sending emails after work hours, or using a different tone with certain people; it then provides both data and content to encourage change.
For self-coaching, employees use tools and frameworks to guide their own growth and development.
How is it applied?
Self-coaching can be leveraged in several ways across the org.
Onboarding
Orgs will often introduce self-coaching models when employees onboard and provide them with some tools to help them navigate their new environment.
Remote or hybrid work
Orgs undergoing changes in the way employees work—either remote or hybrid—are also facing the challenge of establishing new norms and breaking old habits. Self-coaching tools, particularly digital ones, are being utilized to draw attention to current behaviors and provide some guidance on how to change those that may be troublesome.
Leadership development
As with most of the coaching methods introduced in this paper, self-coaching can be heavily leveraged during leadership training initiatives to provide both frameworks for problem-solving and self-awareness, and data and feedback that prompt change.
The team that focuses on executive and leadership development at a global food production company uses coaching to develop crucial skills. They leverage several methods to coach their employees, and recently started to experiment with bots as part of their coaching strategy for managers.
Coaching technology has not only helped them scale, but has also ensured that managers receive consistent information across the engagement.
However, there have been mixed reactions, especially from frontline leaders, since access is limited in factory settings. Many of their frontline workers work in factories where sanitation restrictions are very tight. Rules govern actions as small as picking up a dropped pen, making the use of a phone to access coaching tech a non-starter in many cases.
The Head of Executive and Leadership Development, thinking of this use case, described how she’d like to see the tech work: like a NASCAR driver’s earbud, directing and feeding insights in real time on the plant floor, without requiring interaction with a mobile device.
Coaching for group effectiveness
The second primary reason that orgs are investing in coaching is to increase group or team effectiveness. By this, we mean ensuring that groups or teams are led more effectively and work together more effectively.
The pandemic has drawn attention to the importance of cohesion among groups, and orgs have realized that while they focus on skills and knowledge for the individual, often there isn’t enough focus or support for better functioning groups.
Group and team effectiveness, along with org and culture effectiveness (which we’ll discuss in the next section), is where we started to hear quite a bit about creating a coaching culture. Orgs attempting to create a coaching culture want to take some of the best aspects of coaching (candidness, open feedback, sharing information, seeking information, etc.) and embed them deeply into the way work gets done. They’re beginning with coaching methods that affect groups.
Orgs looking to build a coaching culture often start with the coaching methods that affect group or team effectiveness.
We identified 3 methods orgs are using to coach better teams or groups:
- Manager as coach
- Embedded coaching
- Team coaching
We discuss each below.
Method 4: Manager as coach
What is it?
As many orgs are figuring out how to function in this new world (with remote teams, back-to-office plans, and hybrid work arrangements, to name a few), they view managers, particularly managers taking part in coaching efforts, as a means of ensuring more human connection and smoother transitions.
Manager as coach was identified as one way to provide coaching to more employees within the organization. Manager as coach is exactly what it sounds like: giving managers the skills they need and then leveraging them as coaches to their direct reports.
During our roundtable on coaching, leaders made 2 points we think are important to pass on. First, manager as coach is not a command: it’s a system. Just telling a manager to coach does not provide them with either the impetus or the skills necessary to do so. Three things leaders mentioned that can help:
- Help managers build skills. Don’t assume that asking managers to coach is enough. Provide them the skills necessary to do it.
- Give managers tools. Support managers by offering them nudges, prompts, dashboards, and reports that can give them more information about their own and their employees’ behavior.
- Create the impetus. Communicate, encourage, and even compensate managers to act as coaches—don’t just assume that coaching will happen.
The second point leaders made in the roundtable was that a manager, by nature, is directive. Managers have a dual responsibility of meeting business goals and developing employees. Coaching is a good development tool for that second responsibility, but orgs need to help managers understand when to utilize it and when the situation calls for something more directive.
Managers armed with coaching skills are often leveraged to provide connection and coaching to their teams.
How is it applied?
Orgs are leveraging manager as coach in several ways. We discuss some of the most common below.
Interestingly, manager as coach was one of the first things we heard when we asked leaders how they were planning to scale coaching. Leaders identified managers as the first line of defense for both skill building and engagement, and were working to ensure managers and leaders had the skills they needed in order to act as coaches for their direct reports.
Managers are also often the first line of defense when it comes to helping employees navigate their careers. Along with regular conversations with employees about their aspirations, managers also coach employees on what skills they should develop, what resources are available to them, and what stretch assignments can be taken on.
Increasingly, coaching conversations are being integrated into performance management processes. Hybrid work and remote work have encouraged more, not less, checking in with employees, and many of those conversations revolve around how employees can meet their performance objectives.
Ginny Gray, Director of Global Coaching, Assessments, and Facilitation at Intel, oversees a 5-month program, in partnership with a coaching vendor, dedicated to developing the formal coaching skills of 50-75 managers within the org each year. These leaders learn coaching skills by working with their direct reports and, after the program, are expected to commit to voluntarily coaching other managers across the company. She shared:
“Part of standard expectations and capabilities going forward is that shift into being more of a facilitative manager and that part of their role is to show up in a more thoughtful way, asking thoughtful questions, less command and control and telling you what to do, instead focusing on asking the right questions to get them to figure out what they might want to do. Building out a coaching culture at the company.”
They also offer a variety of internal programs to coach around 800-1,000 managers a year out of the 14,000 managers at the company. Here, the focus is on managers of managers (3,500), who can have a greater impact and create a ripple effect within the org. To identify who gets coached, Intel considers critical needs across the company as well as retention issues.
Finally, to reach even more managers outside of those invited to their formal programs, Intel also has built out coaching resources in their learning pathways, setting up the expectation that an effective manager at Intel has good coaching skills.
Method 5: Embedded coaching
What is it?
Embedded coaching surfaced, not through our conversations with leaders, but through our briefings with coaching tech vendors. Still, leaders gave interesting examples of embedded coaching as they talked about skilling up their workforces.
Embedded coaching is coaching that is integrated into some other learning method—generally formal programs, and usually those requiring specific skills, such as leadership, although we’d like to see it used much more broadly. Embedded coaching gives participants in a program access to a coach. In some instances, coaches coach the group together; in others, employees can sign up for individual sessions with the coach.
Embedding coaching into other programs accomplishes a few good things. First, it provides a way to scale coaching to more employees in a fairly cost-effective way. Second, it pairs coaching with other types of learning, allowing employees to learn, practice, and receive feedback, all in context.
Coaching is often embedded or integrated into development programs and other learning initiatives, allowing employees to practice and receive feedback in context.
How is it applied?
Some of the more common ways that orgs are using embedded coaching are discussed below.
Leadership development programs
Including a coaching element in a leadership development program was the most common example of embedded coaching. Many leaders shared examples of providing coaching to participants to build their coaching chops. Embedding coaching in leadership development initiatives gives participants the opportunity to get 1-on-1 attention and feedback, an element often missing in group learning.
Follow-on to programs
Embedded coaching sometimes also takes place after the fact. Some leaders mentioned follow-on coaching offered as a complement to the main learning initiative. Participants were encouraged to reach out for extra guidance, either by participating in group coaching sessions or by scheduling a 1-on-1 feedback session. Because the follow-on is largely optional and as-needed, orgs are able to use fewer resources while still meeting the needs of their employees.
Caterpillar, an American manufacturer of construction and mining equipment, engines, and locomotives employing 100,000-plus people, shared that coaching plays a significant role in its leadership development programs, aimed at its 11,000-12,000 leaders.
Amy Ashley, Organizational Development Manager, explained that building coaching skills is as important as direct coaching engagements for reaching these leaders and improving how they lead their teams.
Within their mid-level manager program, which they run several times a year, 30 managers (of the 3,000 mid-level managers at the company) are invited to the program and assigned a coach – so that they can experience coaching in addition to learning coaching skills.
In these leadership development programs, they carve out coaching practice for multiple levels of leaders (e.g., mid-level managers, frontline leaders, and senior leaders and executives). Leaders work through varying scenarios related to their positions in triads, where they receive feedback on their coaching style. They offer coaching-related content as well as opportunities to practice asking good questions, listening well, and building safe and trusting relationships.
Method 6: Team coaching
What is it?
The 6th coaching method we came across as a part of this project is team coaching. Team coaching generally involves a single coach—either a skilled outsider, or in some instances a team leader—working with a group to increase their effectiveness. Coaching can include both 1-on-1 sessions to help team members understand their role and the dynamics they help create, and sessions with the entire group to find solutions to some of their team’s challenges.
Team coaching has increased in popularity in recent years as orgs have moved toward team-based work.
The way people work increasingly involves other people. In fact, people spend about 50% more time collaborating than they did 10 years ago.6 As employees find themselves relying more and more on teammates to get work done, this coaching method has found its footing.
For coaching to have a positive effect on team performance, it needs to focus on the most salient team performance processes for a given task.7 Team coaching is not about simply improving interpersonal relationships; it is focused primarily on helping the team meet specific goals.
How is it applied?
Because team coaching has a fairly specific use case, applications are much narrower than with some of the other coaching methods in this study: they revolve around helping a team function better. Three situations were specifically addressed in the literature.
Team coaching may be used upon team formation to set the right tone for how work gets done. A team coach may help a team set up a charter, work structure and procedures, and communication norms to ensure the bones of solid working relationships are in place. They may also set expectations for how conflicts should be resolved.
Orgs can leverage team coaching throughout the lifecycle of a team, checking in with members and the group as a whole to ensure that the team is working together to accomplish their goals. In cases where orgs are using team coaching to continuously ensure team health, the role of team coach is often taken on by the team leader. Orgs should ensure that leaders acting as team coaches are equipped with the skills needed to do the job effectively.
Conflict can keep teams from performing at their best. While conflict is not the primary application of team coaching, team coaches can be brought in to help with alignment, interpersonal skills, expectation setting, and communication.
The director of the tax and treasury department at Specsavers, a multinational optical retailer, identified a growth area for her department: becoming less reactive and more proactive in responding to org needs. A coach was assigned to the team and conducted monthly team coaching sessions, in addition to 1:1 coaching with each member, to practice more strategic action.
This coaching helped the team become more aware of each other’s strengths, develop clear objectives, and move toward proactively accomplishing their goals. As a result, communication and productivity increased significantly, leading the team to exceed its goal of saving the business £1 million.8
Coaching for org and culture effectiveness
During this study, the term “coaching culture” came up a lot—describing everything from managers having the ability to coach their teams to developing a culture of feedback. Orgs are looking for ways to infuse the benefits of coaching into the culture. Methods in this group focus on not only increasing access to coaching, but also building coaching skills within individuals.
Additionally, these methods can and are being used to address specific large-scale changes, such as burnout, wellness, and DEIB. These methods are scalable, inexpensive, and inclusive, making them good ways to affect large swaths of the employee population.
Coaching for org or culture effectiveness includes coaching methods that require less oversight and ongoing support from the org. They most often don’t require professional coaches, or even extensive coaching training. In many cases, these methods are more systemic: orgs support the initiative with guides and tools rather than human coaches.
Three coaching methods were identified in this category:
- Coaching circles
- Peer coaching
- Large-scale coaching
We discuss each in more depth below.
Method 7: Coaching circles
What is it?
In its purest form, a coaching circle is a group of 5 or 6 individuals coming together and “synchronizing” their coaching to support a colleague.9 Coaching circles take advantage of the collective knowledge of a group and focus it to help individuals one at a time with challenges or situations they are facing.
Coaching circles can have many benefits over traditional 1-on-1 coaching for the individual:
- Accountability. Leaders mentioned that accountability is often a challenge with many coaching initiatives. Coaching circles have built-in accountability, which can help with follow-through.
- Diverse perspectives. Coaching circles can offer diverse perspectives rather than the perspective of a single coach.
- Lasting networks. Coaching circles build networks that often outlast the coaching circle itself. Participants develop trust, and can call on each other throughout their careers.10
Coaching circles also appeal to orgs because they can be quite cost effective and reach a large number of employees. Resources that would otherwise be directed toward finding or developing coaches for a select few can be used to establish systems and guides to help employees coach each other.
Coaching circles are smaller groups of employees coming together and “synchronizing” their coaching to support a colleague.
How is it applied?
Coaching circles can be used as a way to help any group connect around common challenges. Some of the ways we saw them applied are discussed below.
A few orgs we talked to are utilizing coaching circles to augment Employee Resource Groups (ERGs). Offering coaching circles within an ERG can mobilize an already interested population to help each other problem-solve challenges they may be experiencing in the workplace.
The literature mentions coaching circles as a good way to help leaders, particularly newer ones, navigate their roles. One org we spoke with reduced the length of their standard leadership programs by providing leaders with better data about their teams and clarity on their goals, and then augmenting the data with a coaching circle. Their thought was that people on the ground doing the job were likely to offer better guidance than any practice scenario a leadership training program might offer.
As talent mobility and career development get more attention, orgs are using coaching circles to provide needed support. While a lot of career development will naturally occur within coaching circles organized for other purposes, org leaders can also design coaching circles specifically for roles or career paths they see as key to future business success (e.g., data scientists).
Heather Bahorich, Talent Management Lead at Centric Consulting, a digital and tech company, noted the success of the coaching efforts in one of their mature local geographies in Ohio. Here, they are piloting a robust coaching program through coaching circles. They currently have 5 active groups: young professionals, business development, tech, business consulting, and leadership.
Bahorich noted that these circles focus on career development more effectively than ERGs, in addition to creating a social community. These coaching circles offer a place for connection and development for those with similar interests or who find themselves in similar stages of their careers.
Method 8: Peer coaching
What is it?
Peer coaching generally involves pairing up 2 individuals at a similar leadership level to coach one another. While some peer coaching happens naturally in most professional settings, more orgs are providing slightly more formalized relationships to get the most out of it.
To make the most of peer coaching, most orgs provide at least 2 things to peer coaches. First, they provide tools and ground rules to guide peer coaches through coaching sessions, including questions to ask each other and follow-up items. Second, they provide some guidance or training on how to give and receive feedback.
Peer coaching involves pairing up 2 individuals at a similar leadership level to coach one another. Orgs often provide structure and guidance for sessions.
How is it applied?
In our search of the literature and our interviews and roundtables, we really only saw 2 applications for peer coaching, discussed below.
Interestingly, much of the literature was written with the education or nursing professions in mind. Peer coaching has been used by teachers for at least 3 decades.11 Teachers observe each other teach and provide each other feedback, observations, support, and problem-solving. While the term “peer coaching” is newer to other professions, we see it as a valuable tool in any situation where feedback and support could be helpful—which describes basically all situations.
As with many of the coaching methods in this study, we saw peer coaching used heavily in the leadership development space. Some orgs are including peer coaching in leadership training programs: participants are paired up and utilize new coaching and management skills to coach each other after the program has been completed. Others are using peer coaching as a substitute for traditional professional coaching.
One of the best examples we’ve seen of peer coaching is actually a peer mentoring program at a large hospitality company (remember when we said the lines were blurring?). The Senior Learning Designer and Leadership Coach at the company oversees coaching and mentoring within the central learning team, 1 of the 4 learning areas focused on leadership development, learning labs, and DEIB (diversity, equity, inclusion, and belonging).
The company relaunched its mentorship program this past fall; anyone in the company can become a mentor or mentee, and 1 out of 7 employees is already participating.
The goals of the program are threefold: 1) to make new connections, 2) to create a sense of belonging which the company defines as “being respected, valued, and able to contribute”, and 3) to accelerate growth, aligning with their mission of connection and belonging.
This alignment of the mentorship program with their core values is key to their work in creating a coaching culture at the company by fostering greater transparency and reflective practice.
Method 9: Large-scale coaching
What is it?
Large-scale coaching is an umbrella term for initiatives, often involving tech, that support broad culture change efforts such as DEIB or wellness. This type of coaching is often quite programmatic, ensuring that all employees receive the same access to information that can help with the org change.
Large-scale coaching can happen manually (paper and pencil), but usually involves tech. During our sister study on coaching tech, several vendors introduced us to “coaching” solutions that address these large-scale culture change initiatives. These solutions generally had an AI or machine learning component to provide the right information at the right time to participants.
This method is probably the least “coachy” of all the methods we discuss. In fact, there are naysayers who would not even describe this as “coaching,” as it doesn’t necessarily involve humans. But we include it because it has many of the characteristics we associate with coaching:
- Personalized development. These initiatives differ from courses or knowledge-sharing in that they are interactive and personalized. Technology or programs adapt to specific areas of study depending on the individual.
- Personalized data. Often tools or programs come with self-assessments and/or dashboards. Data acts as a mirror, much as a real coach would, giving direct and personalized feedback to help participants understand how they are performing or reacting.
- Self-reflection. Initiatives often have built-in opportunities to reflect and apply new information over a period of time. Nudges and other tools are often incorporated to help employees apply new knowledge and skills during work.
Because large-scale coaching is usually applied to fairly important topics, it is often part of larger org initiatives and includes significant communications. For example, an org wouldn’t apply large-scale coaching to a DEIB initiative without significant communication about what the organization expects, what it’s doing, and how it will affect everyone.
Large-scale coaching, often quite programmatic and often involving tech, supports culture change initiatives, such as DEIB or wellness.
How is it applied?
Important org initiatives
Large-scale coaching is often used when an initiative is important enough that everyone needs to know and apply the new information. Examples include initiatives on such issues as DEIB, wellness, and burnout.
In the current environment, large-scale coaching is also being used for initiatives like “return-to-work.” Orgs put together a guided coaching process to help employees understand what is expected and coach them through the steps necessary for a smooth transition.
Marriott International has a strong focus on global well-being for their associates and wanted to embed this further into the culture based on the conviction that employees who are equipped in this arena are more productive.
Pre-pandemic, they partnered with a tech vendor to discuss providing their employees with the resources needed to increase resilience, improve mental health, and manage workplace stress. This partnership proved increasingly relevant as the pandemic hit, and Marriott made the coaching platform accessible to all employees in 2020.
Lance Bloomberg, Vice President of Employer Brand and Communications, and Colin Minto, Vice President of Talent Acquisition, Planning, & Employer Brand (EMEA), discussed how this rollout helped support Marriott’s global well-being initiative, Take Care. By the fall of 2020, they had users registered in 27 countries. And by using heat maps, they can also identify areas of interest and customize their well-being offerings to specific local regions.12
So far, we have told you why you should probably care about coaching, identified some of the major trends we’re seeing, and introduced you to 9 coaching methods we came across in our research. In the last meaty section of this report, we want to provide you with some guidance on how to get started with coaching or evaluate it as a development method.
For a solid coaching initiative, all leaders should understand 3 things: the need, the constraints, and what success looks like.
We suggest that any leaders charged with coaching initiatives understand the following 3 aspects:
- The need
- The constraints
- What success looks like
Let’s discuss each.
Articulating the need
Most leaders we spoke with addressed their coaching initiatives as a foregone conclusion: they have senior leadership, ergo, they have a coaching program. Interestingly, however, not many were able to clearly articulate why they were investing in coaching specifically.
We strongly recommend being able to clearly articulate the need a coaching initiative fills. Without a solid reason for an investment in coaching, it is difficult to justify and impossible to measure (which we’ll talk about a little later). To get a clear picture of your need, ask yourself the following 4 questions:
What exactly are you solving for?
Understanding exactly what outcome you want and expect is key to ensuring that your resources are being used well. In the first section, we mentioned several reasons, in addition to development, that orgs are investing in coaching, including engagement, personalized development, and change management. All are valid reasons to invest.
What makes coaching the best option for solving it?
Although it is a sexy and seemingly straightforward option, coaching isn’t always the best solution. It is often, however, one of the most expensive. Leaders should understand what other options are available to them to ensure coaching is the right option.
Who is the initiative for?
What may be appropriate for a particular audience (or audience size) may not be appropriate for another. Being clear on who the audience is, as well as the criteria for determining that audience, is important to articulating the need.
For example, it is not enough to specify the audience as “senior leaders.” What level of senior leader? Senior leaders in all regions? Senior leaders on the front line and in the plants? Senior leaders in a hybrid environment? A more specific response to any of those questions may lead to different coaching methods.
What will this coaching initiative interact with?
As we mentioned when discussing each of the 9 coaching methods, coaching is very often not a stand-alone development tool: it is often combined with other initiatives. When deciding what to solve for, it’s important to understand not only whether coaching is the best option, but also what coaching will be interacting with. Integrating coaching into other initiatives (or, on some occasions, vice versa) not only ensures that participants receive consistent information, but also allows orgs to leverage the initiatives together.
Coaching doesn't exist in a bubble – it interacts (and should therefore align) with other initiatives.
A few of the leaders we spoke with were thoughtful about tailoring their coaching approach to the broader initiative it supports. The 9 coaching methods we introduced provide more options to org leaders when designing initiatives. Being clear on what you’re solving for, what makes coaching the best solution, what audience it’s serving, and how coaching will interact with other aspects of the initiative can help leaders make sounder decisions.
Working within the constraints
One question we asked in every interview in this study was, “What are your biggest challenges with coaching?” Without fail, leaders mentioned the resources necessary to run a coaching program. It turns out that development budgets are not bottomless, and lack of resources kept leaders from using coaching as much as they’d like.
Part of this is because many of the leaders we spoke with hadn’t read this report yet and were thinking about coaching only in the most traditional way – which is expensive, hard to scale, and very exclusive.
All coaching methods have constraints – understand them to choose the method that will work best for your org.
Which brings us to constraints. All development methods have constraints and the 9 coaching methods we introduced earlier are no different. Constraints can motivate (and even force) leaders to choose one over the other. Understanding the constraints early on can lead to better decisions. For coaching, we identified 3 common constraints, discussed below.
Scalability
When seen in its most traditional light, coaching is not that scalable: talent leaders keep a binder of coaching resumes on hand, and then assign coaches as the need arises. Lots of work can go into managing the coaching relationships, getting through procurement processes, and ensuring that coach quality is maintained. Many of the newer methods, however, solve for scalability by changing the coach / coachee relationship, utilizing tech, or introducing coaching in different formats.
Scalability was the second most common constraint we heard from leaders (after cost), and many are actively looking for ways to provide more coaching to more employees. Three of the coaching methods we identified are reasonably scalable.
Inclusivity
Coaching has long been seen as a developmental opportunity for senior leaders or high-potential employees. It has been reserved for those deemed “special” and those the organization wanted to invest in. The exclusivity can rankle employees in orgs taking a more inclusive approach to employee development, and some leaders see the cost as particularly high when it only benefits a small group of employees.
In most orgs, coaching is seen as a positive thing. The odd leader we spoke with was actively trying to change its perception from negative to positive, but the majority actively use it as both a development and an engagement tool, particularly given recent social movements, such as #BLM and #MeToo. Many coaching efforts have been scaled up to address specific audiences.
We’re now seeing leaders take it a step further by looking for inclusive solutions, not just expanding the exclusive solutions. Four of the 9 coaching methods we identified in this research have the potential to be fairly inclusive.
Affordability
As we’ve mentioned, traditional coaching is not the cheapest option for employee development. This is partly because it cannot be scaled, and partly because coaching often leverages external resources, whose rates typically run between $200 and $600 per hour, depending on the level and experience of the coach.13 One org we recently spoke with is spending well over $15M on executive coaching alone.
Affordability continues to be a challenge, but as with scalability, leaders and vendors are finding ways to stretch the coaching budget. Four of the coaching methods discussed in this research are fairly cost-effective, either because they rely on internal resources or because they leverage resources already in place.
Figure 2 provides a brief summary of these 3 constraints and how each coaching method stacks up on them: how scalable, inclusive, and affordable it is. We use a common red / yellow / green rating to give you a sense of how each compares.
Even so, in the 4 months we have been conducting this research, we’ve already seen changes; if you have noticed or are using methods we’re not discussing, or if you have found ways to make them more scalable, inclusive, and affordable, we’d love to hear from you.
Finally, leaders implementing coaching initiatives should decide in advance how they will define success. They should understand what criteria they will use to determine whether to continue investing, and the metrics associated with those criteria.
Measuring the success of coaching is immature – both because it has been seen as a confidential relationship and because there has been no central system to gather data.
One thing was clear from the research: coaching measurement is in its infancy. And there are a couple of reasons for that.
First, traditional coaching relies on a confidential relationship between coach and coachee. Professional coaches we talked to held that relationship as almost sacred. What goes on in coaching discussions is akin to therapy sessions. Information is not shared beyond the participants. This desire for confidentiality has kept many orgs from collecting data beyond completion rates and coach evaluations. This is changing.
As orgs standardize their coaching processes, many find it advantageous to also standardize the models used during coaching engagements. Sometimes models are wrapped around 360s or other assessments, and sometimes they’re wrapped around leadership or other frameworks. But the standardization allows orgs to collect information on things like general themes and progress in certain areas. This data at least provides administrators with general information about how coaching is going.
The second reason coaching measurement isn’t super mature is that, until recently, there was no central access point, and therefore no central repository for data about coaching engagements. This is also changing.
As orgs take a more standardized approach to coaching (similar models, similar metrics, and tech that helps track both), more data becomes available: general themes can be analyzed across engagements; completion rates can be tracked without surveys or follow-up; and coach evaluations can help maintain a cadre of the most effective coaches. The data is helping to both scale and improve the coaching experience.
All of that said, good data relies on a tenacious employee development team and good feedback mechanisms. It also relies on the desire to actually obtain and analyze the data. Many leaders we spoke with felt no impetus for this final step because it was universally assumed that coaching is a valuable development opportunity, worth the effort and money invested in it.
We asked every interviewee, “How do you measure the success of the coaching initiative?” Answers varied widely. But some of the most common responses fell into 3 categories:
- Participation / satisfaction
- General coaching themes
- Correlations with business results
Participation / satisfaction
By far, the most common way coaching is currently measured is by gauging participation and satisfaction. That is, orgs measure how effective coaching is by how many people participated in coaching sessions and how well they liked them.
“I know we do everything from the basic level, like how many coaching sessions have there been. And then we have a scale of when somebody has finished a coaching session, for each coaching session, they get a quick assessment that pops up in the tool and says, how was your session?”
– Head of People Development, large software company
This is by no means a bad thing. It gives leaders a sense of how much coaching is being utilized and if adjustments need to be made. It is also understandable, considering the challenges orgs have faced to this point when measuring coaching.
But it should only be a starting point. As more tech is utilized to guide coaching initiatives, more data becomes available, which can provide richer and more useful insights.
General coaching themes
Some of the more data-savvy (and data-rich) org leaders we spoke with are also looking at consolidated data on general coaching themes. General coaching themes help leaders understand what challenges employees face, but they can also help the L&D function (or other talent function) understand where their development initiatives may be missing the mark. For leaders utilizing internal coaches in their initiatives, data is also available on the areas where individual coaches feel confident enough to coach.
"I actually can see the goals and skills that people are most seeking on, and what goals and skills people are most willing to offer on. And that can translate into the skills that we should think about potentially supporting from a central learning perspective."
– Leadership Coach, Hospitality Company
This data may also be key to the larger skills discussion happening in most orgs. Coaching themes can help leaders understand both the skills they have (from the coaches), and skills they need (from the coachees).
Correlations with business results
Finally, there were a few valiant leaders looking at correlations between their coaching numbers and KPIs important to the rest of the organization. Specifically, leaders told us they were looking at 3 major areas.
Forward-thinking orgs are correlating coaching data against engagement scores, retention, and even performance.
Most orgs nowadays conduct an employee engagement survey that includes questions about engagement, development, and other important people matters. Some leaders we spoke to correlated the information from those surveys against coaching data, particularly where the manager was acting as coach, to understand how the most effective managers addressed these concerns.
"We do an annual employee insight survey, so we do look for clues through the questions that we ask in that survey and they may not be directly called out to coaching, but a subset of those questions give us indication—particularly when you talk about diversity and inclusion, building trust with your supervisor—those kinds of questions help us to understand where we have made an impact and where we have a missed opportunity."
– Director of Global Coaching, Assessments, and Facilitation, (another) tech company
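As a purely illustrative sketch (this report does not prescribe any particular analysis), a first cut at this kind of correlation can be done with a few lines of Python. All names, flags, and scores below are invented for illustration:

```python
from statistics import mean

# Hypothetical per-employee records: (received_coaching, engagement_score).
# Scores are on a 1-5 scale; every value here is made up for illustration.
records = [
    (True, 4.0), (True, 4.5), (True, 4.0), (True, 4.5),
    (False, 3.0), (False, 3.5), (False, 3.5), (False, 3.0),
]

coached = [score for flag, score in records if flag]
uncoached = [score for flag, score in records if not flag]

# A simple starting point: compare average engagement across the two groups.
print("Mean engagement, coached:  ", mean(coached))    # 4.25
print("Mean engagement, uncoached:", mean(uncoached))  # 3.25
```

A gap between the groups is only suggestive, of course; in practice leaders would want larger samples and would control for the other factors (role, tenure, manager) that drive both coaching assignments and engagement.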
In light of recent social disruptions, including movements like #BLM and #MeToo, coaching has become an even more important tool for engaging with certain populations. Some orgs are paying more attention to both the coaching initiatives associated with these groups and their retention numbers, to understand how better to meet their needs.
Finally, some org leaders mentioned that they were actually looking for a change in performance. How this was measured varied. Some orgs track performance scores year over year, looking specifically at those who have received coaching. Others provide 360s or other assessments at the beginning, midway through, and at the end of a coaching engagement to determine performance improvement. Still others rely on manager observation to understand if changes were being made.
"What we actually look for is, has your thinking changed, the way you think, the way you feel, the way you act—are those things changing."
– Global Head of Coaching Center of Expertise, large healthcare company
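The before-and-after assessment approach can be sketched in the same illustrative spirit; again, these are hypothetical numbers, not data from the orgs we spoke with:

```python
from statistics import mean

# Hypothetical 360 scores (1-5 scale) taken before and after a coaching
# engagement for four participants; all values are invented.
before = [3.0, 3.5, 2.5, 3.0]
after = [3.5, 4.0, 3.0, 3.5]

# Per-participant change, and the average improvement across the group.
deltas = [a - b for b, a in zip(before, after)]
print("Per-participant change:", deltas)  # [0.5, 0.5, 0.5, 0.5]
print("Average change:", mean(deltas))    # 0.5
```

Pairing each participant's pre- and post-scores, rather than comparing group averages, is what lets a midpoint or endpoint assessment speak to individual change.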
Measurement is unsexy, we understand. But as talent leaders have increasingly more choices for employee development, it’ll become more important to understand which of those choices are actually impacting the org. Understanding – from the outset – what success looks like and how you’ll measure it can help talent leaders get out ahead of the ask that is sure to come from senior leadership.
At the beginning of this report, we mentioned that coaching is going mainstream. We’ve introduced a lot of ways that is happening: more flexibility in what we mean by coaching, more coaching configurations, more reasons for coaching, and more experimentation.
It’s that last point, the experimentation, that we think has driven a lot of the progress we’ve seen. Leaders have recognized the value that coaching can bring and they have, sometimes through brute force, found ways to make it available to more people. We think this resonates particularly well right now because the world is uncertain. We’re all looking for ways to connect with each other and we’re all looking for direction. Coaching helps us do both.
While we have no crystal ball, we’d bet that we’ll continue to see new iterations and configurations of coaching, and that leaders will continue to push the boundaries of what coaching is. We are excited to see what’s next.
Appendix 1: Research Methodology
We launched our study in the summer of 2021. This report gathers and synthesizes findings from our research efforts, which include:
- A literature review of 60+ articles from business, trade, and popular literature sources
- 1 roundtable with a total of 27 participants
- 16 in-depth interviews with leaders on their experiences and thoughts on coaching
For those looking for specific information that came out of those efforts, you’re in luck: We have a policy of sharing as much information as possible throughout the research process. Please see:
- Coaching Premise: https://redthreadresearch.com/coaching-the-newest-old-way-to-develop-people/
- Coaching Literature Review: https://redthreadresearch.com/coaching-lit-review/
- Coaching Roundtable 1 Readout: https://redthreadresearch.com/coaching-roundtable-readout/
- Coaching Infographic:
And please read our sister report on coaching tech: https://redthreadresearch.com/coaching-tech-landscape-humans-and-robots/