What are some of the obvious and less obvious risks of choosing and implementing D&I technology? In our recent study with Mercer, we examined the emerging market for D&I tech. As part of our exploration, we needed to take a step back to understand the potential gains or pitfalls that come when well-intentioned companies use technology and AI to solve endemic people challenges.
Here is an excerpt from that report which breaks down some of the risks and payoffs of implementing D&I tech:
What are the benefits and risks of D&I technology?
While there are many potential benefits of D&I technology, the most apparent is the opportunity to create consistent, scalable practices that can identify or mitigate biases across organizations, often in real time. Many people-related decisions leave considerable room for bias, particularly those involving an assessment of a person’s skills, behaviors, or value (e.g., for hiring, performance evaluation, promotion, or compensation).
Much of the technology on the market today is designed to change the processes that enable bias or identify that bias exists. Another benefit customers see in D&I technology is the increased understanding of the current state of diversity and inclusion throughout the organization. With greater visibility, leaders can better measure and monitor the impact of D&I initiatives.
The benefits of D&I technology include:

- Implementing more consistent, less-biased, and scalable people decision-making processes
- Increasing the understanding of the current state of diversity and inclusion across the entire organization, using both traditional and new metrics
- Measuring and monitoring the impact of efforts designed to improve D&I outcomes
- Raising awareness of bias as it occurs, in real time and at the individual level, and enabling a range of people to act on it
- Enabling action at individual levels by making new, appropriate information available to employees at different levels within the organization
- Signaling broadly the importance of a diverse and inclusive culture to the organization
The risks include:

- Implementing technology that is itself biased, due to the data sets on which its algorithms are trained or a lack of diversity among the technologists creating it
- Creating legal risk if problems are identified and the organization fails to act
- Creating the perception that the technology will solve bias problems, rather than the people who remain responsible for solving them
- Reducing people’s sense of empowerment to make critical people decisions
- Implementing technology or processes that are disconnected from other people processes or technologies
- Enabling employee perceptions of “big brother” monitoring, an over-focus on “political correctness,” or “reverse discrimination”
Explore our interactive tool and infographic summary, and download the rest of the report, which includes our detailed breakdowns of D&I tech categories and solutions, along with some predictions for the future of this market. Also check out our most recent summer/fall 2019 update on the D&I tech market.