In the mid-1990s, labor economist Laurie Bassi began to notice a strong correlation between public companies’ stock prices and how much they invested in training-and-development programs. On average, the more a firm invested in training in a given year, the higher its stock went the following year.
The big difference between this form of spending and other investments that can also boost share prices — such as those for research and development, capital equipment, and marketing — is that the latter show up on financial statements, allowing analysts and investors to incorporate them into stock valuations. Not so with money spent on training and development, which does not have to be reported and therefore gets little attention.
That means companies that spend a lot on training are undervalued, but only in the short term, says Bassi. As companies spend more on training, the benefits to overall financial performance accumulate and the stock price does eventually reward the investment.
But Bassi, who was formerly the research director for the American Society for Training and Development (ASTD), got little reaction from investors even after publishing what is considered seminal academic research on the business impact of training. She churned out a substantial body of work, “but it was largely ignored,” she says. “I kept thinking, ‘Why doesn’t someone take up this mantle and build this idea into investment portfolios?’”
Eventually she decided to do it herself. She opened a human-capital consultancy in 2001, and as a sideline she became a registered investment adviser and created a “family-and-friends” fund populated with stocks of companies known to educate workers on a large scale. The fund, still going strong, has significantly outperformed the S&P 500 for the past decade, yet learning expenditures haven’t caught on as a major metric for analysts and investors.
No Need to Know
The training community has long debated how — or even whether — to calculate the bottom-line impact of increased spending on employee development. But there is no doubt that few firms make a rigorous effort to quantify the financial value of their learning programs — even though, according to ASTD, U.S. companies spent an estimated $126 billion in 2009 to educate their workforces.
Companies do measure certain results of training, but usually not in a way that directly connects such investments to overall financial performance. One reason is that financial performance is believed to be influenced by so many factors that trying to prove what role training-and-development investments play has struck many as a waste of time.
For example, Radiant Systems, a vendor of point-of-sale technology, believes that well-trained people come up with better products and other ideas, which drives better financial performance, says CFO Mark Haidet. But the closest the company comes to documenting that link is a career survey that asks workers whether they’re getting the needed level of training.
Says Tamar Elkeles, chief learning officer at Qualcomm: “Where people get messed up is in thinking that they have to measure individual training classes to make sure that participants [absorbed] the information and are applying it to their jobs. I’m not interested in that, and neither is our CFO, Bill Keitel. He wants to measure whether managers are performing well and whether the organization is productive.”
Missed Opportunity?
But there are vocal critics of such a mind-set. While most of Bassi’s research has focused on stock price rather than financial performance overall, she supports a scientific approach to finding good financial metrics for training. “It’s a learning officer’s professional obligation to optimize the resources devoted to the function,” she says. “There is no excuse for suboptimizing, which is almost certainly what you’ll be doing in the absence of measurement.”
And, indeed, there is a cottage industry devoted to helping companies identify the return on investment (ROI) for training. For example, Knowledge Advisors applies the “wisdom of crowds” to estimate the future impact of just-completed courses. If you ask a large enough number of participants whether they’re going to be more productive after having received training, the results will be a good predictor of future performance, says Kent Barnett, the firm’s founder and CEO. But such measurements are rarely taken, and he believes that half of all training spending is wasted.
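As a rough illustration of the crowd-based approach Barnett describes, here is a minimal sketch in Python (the function and field names are hypothetical, not Knowledge Advisors’ actual product or method): it simply averages participants’ self-reported expected productivity gains from post-course surveys and treats that average as a directional forecast of the course’s impact.

```python
import random
from statistics import mean


def forecast_productivity_gain(survey_responses):
    """Average self-reported estimates from course participants.

    survey_responses: list of dicts such as
        {"participant": "A123", "expected_gain_pct": 12.5}
    Returns the mean expected productivity gain -- per the
    wisdom-of-crowds idea, a directional predictor of the course's
    future impact, not a precise measurement.
    """
    gains = [r["expected_gain_pct"] for r in survey_responses]
    return mean(gains) if gains else None  # no responses, no forecast


# Hypothetical example: 200 participants report expected gains of 0-30%
random.seed(1)
sample = [{"participant": i, "expected_gain_pct": random.uniform(0, 30)}
          for i in range(200)]
print(f"Forecast productivity gain: {forecast_productivity_gain(sample):.1f}%")
```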
The Human Capital Lab at Bellevue University in Omaha helps organizations measure the impact of learning on salespeople and call-center staff. The approach is similar to drug trials: give a class to some people and not to others, then measure the difference in the groups’ subsequent performance.
IBM has taken a similar tack. It analyzed the performance of new salespeople who were given “robust training” compared with others who were not, and concluded that the ROI was 480%. “We’re very satisfied that there is a relationship between skill proficiency and performance variables that have economic impact,” says Michael Bazigos, strategy and change executive at IBM’s Center for Learning and Development.
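The arithmetic behind such a controlled comparison is straightforward. The sketch below is a hypothetical Python example with made-up figures (it is not IBM’s or Bellevue’s actual model): it treats the per-person revenue gap between a trained group and an untrained control group as the benefit attributable to training and divides the net benefit by the program’s cost.

```python
def training_roi(trained_revenue, control_revenue, trained_headcount,
                 control_headcount, total_training_cost):
    """Estimate training ROI from a trained group vs. an untrained control.

    The per-person revenue difference between the two groups is treated
    as the incremental benefit attributable to training. Returns ROI as
    a percentage: (net benefit / cost) * 100.
    """
    trained_per_head = trained_revenue / trained_headcount
    control_per_head = control_revenue / control_headcount
    incremental_benefit = (trained_per_head - control_per_head) * trained_headcount
    return (incremental_benefit - total_training_cost) / total_training_cost * 100


# Illustrative numbers only (not from the article): 50 trained reps sell
# $6.0 million, 50 untrained reps sell $4.8 million, and the program
# cost $200,000 to deliver.
roi = training_roi(6_000_000, 4_800_000, 50, 50, 200_000)
print(f"Estimated training ROI: {roi:.0f}%")  # prints 500%
```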
But in most enterprises, “learning is the only operation that isn’t accountable for demonstrating the impact of the expenditure,” says Mike Echols, a former General Electric executive who runs the Bellevue lab. He points to accounting rules that keep human-capital “assets” off balance sheets as the root cause of that anomaly.
Another academic, John Boudreau, at the University of Southern California’s Marshall School of Business, agrees that measurements like those done by Bellevue and IBM are valid. But he takes issue with Echols’s reasoning as to why companies don’t value finance-based training metrics, saying it mischaracterizes business leaders’ level of sophistication about training.
Most believe implicitly that training pays off, says Boudreau, and they would rather not spend capital on measuring the impact. “We certainly have no evidence that they would invest more in training if they knew the dollar value of it,” he says.
The rare CFO who is inclined to dabble in the ROI of learning investments generally looks only for directional data rather than precise calculations, says Ed Boswell, the former CEO of a training consultancy and now leader of the People and Change practice at PricewaterhouseCoopers. And while firms often resolve to test for ROI as they prepare to invest in training, most of the time they don’t follow through.
In the end, it may come down to a CFO’s inherent orientation. “My sense is that those who believe training is a good investment don’t need a lot of data to reinforce that belief,” Boswell says. “And for those who don’t believe it, no amount of data will convince them otherwise.”
David McCann is senior editor for human capital & careers at CFO.
Different Strokes
Even if a company does want to measure the bottom-line benefit of learning programs, it may first have to ask, “What do we consider to be learning?” Sometimes, there are unexpected answers.
At Educational Testing Service, the definition includes assembling cross-functional teams of high-potential employees to solve business problems. “It’s training in the sense of learning how to work together toward a common goal and understanding the implications of various courses of action before you take them,” says Frank Gatti, the just-retired CFO of ETS. “Those are important things that I believe are missing from the higher-education system.”
For example, one project undertaken through the company’s Learning for Business Results program revamped and streamlined the process for responding to requests for proposal. ETS was easily able to identify a subsequent tripling of its RFP win rate and a sharp drop in resources used in the process.
IBM creates a variety of limited-term measurement experiments for what it considers forms of learning that lie beyond the traditional definition of training. In one example, a couple of years ago it began to use its own business-intelligence software, augmented by an advanced text-recognition feature, to analyze the learning quality of employee interactions via e-mail, messaging, and social media (the employees who participated agreed to waive their privacy rights). IBM uses the same technology to assess the content of periodic “jams” in which company employees brainstorm via an IBM intranet. They contribute and discuss ideas on various topics, allowing them to learn from one another — and the company to learn from them. — D.M.
Still No App for That
With a low-end smart phone packing more computing power than the giant machines that directed a spaceship to the moon in 1969, you might expect mobile devices to be a hot new platform for training and development. But the pace of adoption has been slower than analysts expected. Bersin & Associates, a corporate-learning consultancy, says that in 2010, about 20% of U.S.-based organizations provided mobile learning, up from 9% in 2007. That’s a healthy jump, but hardly the sea change one might expect given the ubiquity of mobile devices today.
Why the modest growth? Some say it’s because mobile devices don’t offer a quality learning experience. Screens are too small to hold an employee’s attention for more than a few minutes, says Norbert Büning, a managing director at Accenture. Tablet computers might address that in part, but another wrinkle concerns not so much the devices themselves as where they are typically used: while employees are in transit, which is rarely a good time to dive into a training session.
To address that, some companies are packaging training modules into four- or five-minute segments. Health-care provider Sta-Home Health and Hospice uses that approach with its on-the-go nurse practitioners. That helps personalize the learning process, says Sta-Home information-technology manager Glenn Wood, as employees can proceed at their own pace. But breaking content into bite-size chunks may do little to alleviate other problems of the truly mobile worker, such as the ambient noise on a train or plane ride.
Companies may also remain hesitant about mobile learning because other computer-based training programs have not worked well. “CFOs have spent a lot of money on e-learning and haven’t seen the results,” Büning says. — Marielle Segarra
