Foreword
Talent development reporting informs organization leaders about how much the organization invests and how well investments in people pay off. Consumers of talent development data are C-suite executives who decide how and where to allocate organization resources. They are managers and supervisors accountable for business performance whose employees are participants in programs. They are the human resources business partners looking for solutions to their clients' needs. And they are employees who want insight into which programs can propel them along their development journey. Finally, consumers are the heads of learning, program owners, and facilitators who monitor learning investment progress and improve programs to drive even greater value. To steer consumers of data in the right direction, learning leaders need a reporting strategy built around a practical framework. This strategy should lead to reporting that makes it easy for data users to get the information they need when they need it, and to understand what it means once they have it. This book describes how to develop such a strategy.
Progress With Measurement
The talent development industry is making progress in measurement. In 2010, ROI Institute partnered with ATD to conduct a first-of-its-kind study to determine what CEOs think about the learning investment. CEOs indicated the types of measures they were receiving and those they wished they were receiving to understand talent development's value. CEOs also ranked the measures in terms of their importance in resource allocation decisions. Of least importance were data describing inputs, efficiency, and participant reaction to learning; yet these were the data most executives reported receiving. Impact and ROI were the top two most important measures to CEOs, yet only 7 percent reported receiving impact data, and only 4 percent reported receiving ROI. This gap between what CEOs receive and what they want was a wake-up call for many talent development leaders.
Five years later, in 2015, Chief Learning Officer's Business Intelligence Board Measurement and Metrics study reported that 71.2 percent of 335 CLOs indicated they were either using or planning to use ROI as a measure of learning performance. In 2017, Training magazine's Top 10 Hall of Fame report acknowledged that the success of any program is based on whether it improves business results. Today, organizations recognized as Training's Top 125 must report how their talent development investments deliver business results to their organizations.
Progress with measurement was further evident in ROI Institute's 2020 benchmarking study. When comparing our recommended percentage of programs evaluated at the different levels to the survey respondents' results, we were happy to see the progress talent development is making in connecting programs to the business. While the percentage of programs evaluated at reaction and learning was lower than our recommendation, the rate of programs evaluated at impact and ROI was impressively higher. Survey respondents reported that they evaluated 37 percent of their programs to the impact level, compared to our recommended 10 percent. They also said they evaluated 18 percent of their programs to ROI, compared to our recommended 5 percent.
Despite this progress in measurement, many stakeholders still have questions about talent development's value and how best to allocate resources. Why? Because reporting still fails to communicate performance effectively.
How Talent Development Reporting Fails
Talent development leaders have at their fingertips measures of activity, such as:
number of programs
number of employees reached
number of employees participating
learning assets produced
spend per learning hour consumed.
These metrics, accessible in any learning management system, describe where funds are going, but not what the organization is receiving in return. These activity-based measures are easy to report, yet reporting them makes it difficult for others to recognize the learning investment's real value. For this reason, TD funding is an easy target when cost-cutting measures ensue. Activity-based reporting focuses on the learning leader as the consumer, ignoring other consumers' data needs.
Another way reporting fails is that even when the data reported are results-based rather than activity-based, reports often omit targets. While some would argue targets are unnecessary, the question is, how does one define success if there is no basis for comparison? Omitting results-compared-to-target leaves the interpretation of success up to the consumer.
A third way reporting fails is that the data tend to be static. Measures and metrics with nice graphics make for an interesting report, but consumers sometimes struggle to understand the "so what?" of it all. Reporting data without insight offers little value.
Reporting fails to communicate talent development performance effectively because it, like evaluation, is often an afterthought. Yes, the easy data are automated and placed on dashboards, but the real value of reporting comes when it clearly communicates information that can influence the consumers who make the ultimate talent development funding decisions. These decision makers will certainly base decisions on activity, but those decisions are altogether different from decisions based on results.