Demand for performance data is skyrocketing within organizations. Arthur Kordon, leader in the data mining and modeling group at the Dow Chemical Co., says his team has never been so inundated with requests as it has been during the economic crisis. “Executives are coming to us as sources of last hope to get answers,” Kordon says.
It’s not surprising: in times of economic uncertainty, corporate decision makers need every scrap of information they can get. The importance of data and analytics has also been underscored by the banking crisis. At its heart, banking’s problems were not only about greed, excessive risk-taking, and lax regulation, but also about data quality, says Thomas Redman, president of Navesink Consulting Group and a keynote speaker at “Predictive Analytics in Perilous Times,” a CFO conference held in San Francisco earlier this week.
“The data was wrong, or it simply wasn’t available. Bad data and the lack of transparency made it easier for the greed to go unchecked,” says Redman, a former Bell Labs quality specialist. In some cases, data was also mishandled, misused, or not used at all.
It behooves CFOs and other senior executives to head off such problems with a more proactive and aggressive strategy to capture high-quality data and leverage it in decision making, according to Redman. The consequences of not doing so: acting on data and analyses that should not be trusted or not acting on data and analyses that should be trusted. Both come with high costs.
So where should CFOs aim their guns? While past data errors can be costly, management’s energy is better spent on preventing future errors, Redman says; doing so can reduce the cost of poor data quality by one-half to two-thirds. “When you find root causes, you eliminate thousands of future errors at a stretch.”
From that starting point, there are many things CFOs can do to improve data accuracy. Two key tasks: establishing controls at all organizational levels to halt simple errors and formalizing management’s accountability for data quality.
The responsibility for data lies with the business, not the information-technology department, Redman stresses. That requires managers to measure data quality at the source – and in terms meaningful to business units. For a bank, a quality measure could be as simple as the percent of customer statements that contain an error. For the U.S. Defense Logistics Agency, the supplier to the Defense Department, it could be the percentage of weapons systems that ship on time to battlefield troops in Afghanistan.
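Redman’s bank example can be reduced to a simple, business-facing calculation. The sketch below is purely illustrative – the statement records and the `has_error` flag are hypothetical, not drawn from any bank’s actual systems:

```python
# Minimal sketch of a business-facing data-quality metric:
# the percent of customer statements that contain an error.
# The records below are hypothetical, for illustration only.

def statement_error_rate(statements):
    """Return the share of statements flagged with at least one error."""
    if not statements:
        return 0.0
    flawed = sum(1 for s in statements if s["has_error"])
    return flawed / len(statements)

statements = [
    {"id": 1, "has_error": False},
    {"id": 2, "has_error": True},
    {"id": 3, "has_error": False},
    {"id": 4, "has_error": False},
]
print(f"{statement_error_rate(statements):.1%}")  # prints 25.0%
```

The point of such a measure is that it is expressed in business terms (statements, customers), not in database terms, so line managers can own the number.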
Top-flight organizations also typically invest time to extract high-quality data from suppliers. “You must work back into your supply chain to make sure you know the quality of the ‘raw materials’ that go into things like financial products,” Redman says.
Striving for continuous improvement and setting aggressive error-rate targets are also habits that leading companies adopt, as is leadership from a broad-based, senior management team. What’s more, management will support cleanup efforts if it sees them as crucial to overall corporate strategy.
Of course, data quality matters little if a company is focusing on the wrong measures. The best companies adopt a customer-oriented definition of data quality and recognize that all items of data are not created equal, Redman says. Which information is strategic? “Data that helps you operate more efficiently and data that you have [particularly customer data] that no one else has,” Redman says.
The Treasury Department’s plan to rescue banks, which has come under fire for a lack of results, suffers from a lack of metrics, experts say. “No one understood what metric the government was going to use to see if the program was working, and what data they were going to collect to judge that,” says David Friend, the chairman of Palladium Group, a consultancy.
The Numbers Game
In contrast, in Major League Baseball, team front offices are starting to adopt analytics that translate on-field results into financial effects. In particular, baseball executives are looking at the return on investment of winning, says consultant Vince Gennaro, author of Diamond Dollars: The Economics of Winning in Baseball.
An MLB team that makes the playoffs one year generates as much as $25 million more in sales over the next five years than if it hadn’t made the postseason, Gennaro says. Teams that measure the relationship between wins and revenue have a better tool for accurately placing a value on factors that increase win total, like signing high-quality free-agent players. “Proprietary metrics – measuring the ‘hard to measure’ – has definite value for corporations,” Gennaro says.
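A Gennaro-style “ROI of winning” calculation can be sketched in a few lines. The only figure below taken from the article is the $25 million playoff value; the revenue-per-win figure and the free agent’s projections are hypothetical assumptions, not Gennaro’s actual model:

```python
# Sketch of translating a free agent's expected marginal wins into revenue.
# Only the $25M playoff figure comes from the article; the per-win value
# and the player projections are hypothetical, for illustration.

def value_of_wins(marginal_wins, revenue_per_win,
                  playoff_odds_gain=0.0, playoff_bonus=25_000_000):
    """Expected revenue from extra wins, plus the expected value of an
    improved chance at a playoff berth (worth up to $25M over five years,
    per the article)."""
    return marginal_wins * revenue_per_win + playoff_odds_gain * playoff_bonus

# A free agent projected for 4 extra wins, each worth a hypothetical $2M,
# who also raises the team's playoff odds by 10 percentage points:
print(value_of_wins(4, 2_000_000, playoff_odds_gain=0.10))
```

Framing the signing this way lets a front office compare the player’s expected revenue contribution directly against his salary demand.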
Of course, data is useless if it doesn’t lead to better decisions. The need to execute strategy is the biggest issue in C-suites around the world, Friend says. “You have to take data and turn it into information and turn information into insight.”
Decisions based on the data, then, have to be monitored for results, and the lessons incorporated into planning. Risk management, indeed, is an area in which CEOs and boards are clamoring for data to make decisions.
Understanding risk is a start, but then companies have to develop meaningful data. For every key performance indicator (KPI), for example, companies should be tracking a key risk indicator (KRI), Friend says. “You plan not just for results, but for contingencies. What happens if sales are down 20 percent? If I’m a quarterback, what happens if the ball is intercepted?”
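Friend’s KPI/KRI pairing can be sketched as a simple lookup plus a trigger check. The indicator names, targets, and the 20 percent threshold (taken from his sales example) are illustrative assumptions:

```python
# Sketch of pairing each key performance indicator (KPI) with a key risk
# indicator (KRI), per Friend's suggestion. The indicators and thresholds
# are hypothetical examples.

kpi_kri_pairs = {
    "quarterly_sales": {
        "kpi_target": 10_000_000,       # planned result
        "kri": "sales_shortfall_pct",   # what triggers the contingency plan
        "kri_threshold": 0.20,          # act if sales fall 20% short
    },
}

def contingency_triggered(actual, pair):
    """Return True if the shortfall against the KPI target breaches the
    KRI threshold, meaning the contingency plan should kick in."""
    shortfall = max(0.0, 1 - actual / pair["kpi_target"])
    return shortfall >= pair["kri_threshold"]

pair = kpi_kri_pairs["quarterly_sales"]
print(contingency_triggered(7_500_000, pair))  # True: sales are down 25%
```

The design point is Friend’s: the plan records not just the target but what happens when the target is missed, so a 20 percent sales drop arrives with a prepared response rather than a scramble.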
The lack of contingency planning evident in the banking crisis was amazing, Friend says. “You have to plan for bad things happening,” Friend says. “A 30 percent reduction in house prices should not have been a shock.” Planning does not take the form of a budget, he emphasizes, but involves “how do you get from point A to point B.”
Failure to plan for adversity produced a nightmare for the banking industry. “If you’re not modeling for rare but consequential events, you’re not doing your job,” Friend concludes.