No matter how much rigor a company may bring to it, building a data warehouse remains an act of faith. It can also entail a lot of pain.
So what prompts a company to do it? Pain.
That was the case at Impax Laboratories, which embarked on a data-warehousing project in 2006 after determining that printing out reports from two different ERP systems and manually entering the data into spreadsheets for analysis was impeding its ability to make smart business decisions.
Data warehouses synthesize data from disparate sources and present it in an easy-to-comprehend manner, often via a “dashboard” or similar data-visualization technique. This, at least in theory, promotes more-informed decision-making. They are often at the heart of business-intelligence (BI) and analytics systems, but many small and midsize companies believe that building a data warehouse is beyond their capabilities. After all, it entails acquiring technology, migrating databases, writing integration code, validating data accuracy, layering BI tools on top of the underlying warehouse, and cajoling and training employees to use the new system.
But, daunting as such projects may be, they can drive great — if difficult to quantify — value. At Impax Labs, a $358 million, publicly held specialty pharmaceutical company, the effort began as a relatively low-budget affair. It had just two dedicated employees, including project leader Jonathan Retano, a former Federal Reserve financial analyst whose five-year tenure at Impax had included stints in both sales operations and accounting. He moved to the IT department as associate director for business intelligence.
“I’m certainly not a traditional IT person, but with my background I had an understanding of the pain points for users,” Retano says. The goals of the program, he explains, were to make it easier for users to get access to data and to generate reports; establish a single, accurate version of the data; enable data sharing companywide; and combine data from different sources to provide a 360-degree view of the business.
Getting there has involved a long procession of small steps. Retano began by researching best practices as described in books by experts like Ralph Kimball and lining up an advisory firm, ISA Consulting. The first half of 2007 was devoted to buying and customizing IBM’s Cognos BI software suite for data extraction, modeling, and reporting, and then setting up the architecture (using Microsoft’s SQL Server as the platform).
That established a foundation that allowed Impax to cross a major hurdle: conducting a test run to make sure employees would embrace the concept. The test demonstrated how they could easily generate two reports they used often, on daily sales activity and back orders. Once employees compared the new reports with what they had been getting, they were enthusiastic. “It was a way to say, ‘Here’s what we can do. If you like it, we can continue and add more stuff,'” Retano says.
Indeed, “add more stuff” has become a rallying cry for the project. With an initial thumbs-up from employees, Impax marked 2008 by rolling out new BI capabilities one after the other: first, information on products, customers, shipments, and credit/debit memos, given their importance to many areas of the business; then manufacturing priorities such as inventory transactions and balances. Also added was market-share data from external sources, so that someone reviewing a product’s net sales and margins could factor in, say, new competitors entering the market. The system’s capabilities were expanded every quarter (see the chart at the end of this article), and Retano says that it continues to evolve and expand.
Battle by Battle
While winning user acceptance was critical, Retano didn’t waste much time exploring how users wanted the data to be presented. “We found out early on they didn’t know what they wanted,” he says. “You need to develop something quickly and get it out to them. Then the light will go on and lead you in the direction you need to go.”
That’s part and parcel of what Retano calls the “divide-and-conquer approach,” one that emphasizes piecemeal progress versus a grand rollout of massive capabilities. “You target a specific topic in a given area, such as shipping performance, and just focus on validating that. It goes a lot smoother that way,” he says.
“Divide and conquer” also helped overcome another challenge: maximizing the effectiveness of employee training. Employees were able to learn the system incrementally, easing the strain. “You can’t just put the data out there. People have to get to the point where they’re comfortable using the tool to improve their work lives,” Retano says. Still, mitigating the normal human aversion to change required constant vigilance. Retano adopted the role of evangelist, seizing any opportunity to demonstrate the tool to executives and managers and point out its time-saving benefits.
There were tricky technical aspects to the project as well, like merging data from the two ERP systems and pulling information from a separate system used to track chargebacks for indirect sales by distributors to pharmacies. But Retano found those easier to handle than the more nebulous, people-oriented issues. “Those are defined problems that you can see right in front of you,” he says.
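The article doesn’t describe Impax’s actual integration code, but the basic idea of merging records from two ERP systems into one warehouse schema can be sketched roughly as follows. All field names, mappings, and the unit conversion below are hypothetical illustrations, not Impax’s real data model; in practice an ETL layer in a suite like Cognos would do this declaratively.

```python
# Minimal sketch of consolidating records from two ERP systems into one
# common warehouse schema. All field names are hypothetical.

def from_erp_a(rec):
    """Map a record from the first ERP system into the common schema."""
    return {
        "product_id": rec["ItemNo"],
        "customer": rec["CustName"],
        "net_sales": rec["NetAmt"],
        "source": "ERP_A",  # keep lineage so bad rows can be traced back
    }

def from_erp_b(rec):
    """Map a record from the second ERP system, which in this sketch
    uses different field names and stores amounts in cents."""
    return {
        "product_id": rec["sku"],
        "customer": rec["customer_name"],
        "net_sales": rec["net_sales_cents"] / 100,  # normalize units
        "source": "ERP_B",
    }

def load_warehouse(erp_a_rows, erp_b_rows):
    """Merge both feeds into one uniform list of warehouse records."""
    return ([from_erp_a(r) for r in erp_a_rows]
            + [from_erp_b(r) for r in erp_b_rows])
```

The point of the sketch is the mapping step: each source system’s idiosyncratic field names and units are translated into one agreed-upon schema before any report touches the data, which is what makes a “single source of truth” possible.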
Probably the most vexing problem was, and is, maintaining a high level of data quality. If, for example, a product is assigned to the wrong business segment in the BI tool, reports generated from it will be wrong. Users then will lose trust in the tool and revert to the more laborious manual processes. Bad data is problematic even if there is no BI initiative, of course, but since a data warehouse is intended to provide a widely disseminated “single source of truth,” people expect the data to be correct.
Even a midsize company like Impax has an enormous volume of data, and “data cleansing” can be complex. Retano says it’s not just an IT issue, but a companywide concern, so he frames it around “the four Cs” — everyone responsible for data should regularly check that it is correct, current, consistent, and complete. “Bad data is the quickest way to short-circuit a BI initiative,” he says, “but addressing it is not fun or exciting, so it doesn’t get the attention it deserves.”
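Retano’s “four Cs” read naturally as a checklist, and that kind of checklist can be automated. The sketch below shows one hypothetical way to flag records that fail each test; the field names, valid segments, and the 30-day staleness threshold are all assumptions for illustration, not Impax’s actual rules.

```python
# Hypothetical "four Cs" data-quality check: correct, current,
# consistent, complete. All field names and thresholds are assumptions.
from datetime import date, timedelta

VALID_SEGMENTS = {"generics", "brand"}  # hypothetical business segments
REQUIRED_FIELDS = ("product_id", "segment", "net_sales", "as_of")

def four_cs_issues(record, today, peer_segments):
    """Return a list of data-quality issues found in one record.
    peer_segments maps product_id -> segment as recorded elsewhere,
    so cross-record inconsistencies can be detected."""
    issues = []
    # Complete: every required field is present and non-empty.
    for f in REQUIRED_FIELDS:
        if record.get(f) in (None, ""):
            issues.append(f"incomplete: missing {f}")
    # Correct: the segment must be a known business segment.
    if record.get("segment") not in VALID_SEGMENTS:
        issues.append(f"incorrect: unknown segment {record.get('segment')!r}")
    # Current: data older than 30 days is flagged as stale.
    as_of = record.get("as_of")
    if as_of and today - as_of > timedelta(days=30):
        issues.append("not current: record is stale")
    # Consistent: the same product should not appear under two segments.
    pid, seg = record.get("product_id"), record.get("segment")
    if pid in peer_segments and peer_segments[pid] != seg:
        issues.append(
            f"inconsistent: {pid} also filed under {peer_segments[pid]!r}")
    return issues
```

Running checks like these on a schedule, and routing the resulting issue lists back to the people responsible for each data source, is one way to make “not fun or exciting” cleansing work routine rather than heroic.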
Retano organized an unofficial data-quality group that meets every other month. Its members are part of what Impax calls its Business Intelligence Competency Center, a collection of people who have proven adept at getting value from the BI deployment and are available to offer advice to other users.
Now You See It
The BI system continues to evolve. This year one focus is on the development of custom portals, secure internal Websites devoted to specific subject areas and departments. The new Key Products Portal, for example, provides executive-level users with key performance indicators about the company’s top products. The portal also offers news stories relevant to those products, their customers, and competing options.
Another new portal, for the accounting department, highlights crucial information relating to accounts payable, accounts receivable, shipments, and credit/debit memos. An AP staffer can immediately see the current state of AP: how many invoices are past due, how much cash is outstanding, and so forth.
The portals are designed to provide what Retano calls “four-in-four” information: the four things that a manager or executive should know within four seconds of launching the portal at the start of the day. Using the number four is partly aesthetic (“We tried five, but four graphs fit nicely in a window,” Retano says) and partly psychological, as the human mind cannot easily comprehend more than four objects or images simultaneously.
But What’s the Payoff?
Retano says the goals of the data-warehouse project — easy access to a consistent database, sharable data, a more informed view of the business — are being realized, but he admits that the economic value of such capabilities is very difficult to quantify. It’s possible to measure the time that users save in not having to manually generate reports, but a data warehouse is fundamentally a decision-support tool — and how can you quantify the results of a smart decision, such as suspending an underperforming product line, when so many factors will influence the result of that decision?
That challenge bothered Impax CFO Art Koch when he approved the program in 2007. The firm attempted a cost-benefit analysis, but the results were inconclusive. Koch admits he had doubts; he knew that “projects like this are built on great aspirations, but don’t always deliver.” His approval ultimately hinged on testimony from the firm’s sales group, who argued that enhanced data-analysis capabilities would provide competitive advantages.
James Kobielus, a senior BI analyst with Forrester Research, notes that vendors like to push what he calls “the lottery value of BI” — that a single great decision can transform a company. But in practice, he says, most decisions supported by BI are routine and operational — and unlikely to provide great incremental value.
Nonetheless, Kobielus maintains that even if decision-making remains more an art than a science, the value of systems that support it can be quantified. He is developing a conceptual ROI model for decision-support infrastructures, based in part on assertions that every bad decision has a calculable “do-over” cost, and every optimal decision has a calculable monetary return.
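The model itself is only described in outline, but the underlying arithmetic of the “do-over cost” idea can be sketched: treat the benefit of decision support as the returns on optimal decisions plus the do-over costs of bad decisions the system helped avoid, then compare that against what the infrastructure cost. This is an interpretation for illustration, not the analyst’s actual model, and every figure below is hypothetical.

```python
# Back-of-the-envelope sketch of a decision-support ROI calculation,
# interpreting the "do-over cost" idea. All figures are hypothetical.

def decision_support_roi(returns_from_good_decisions,
                         do_over_costs_avoided,
                         infrastructure_cost):
    """ROI = (total benefit - cost) / cost, where benefit is the sum of
    returns on optimal decisions plus the do-over costs of the bad
    decisions the system helped avoid."""
    benefit = sum(returns_from_good_decisions) + sum(do_over_costs_avoided)
    return (benefit - infrastructure_cost) / infrastructure_cost
```

The hard part, of course, is not the arithmetic but the inputs: attributing a dollar return to any one decision is exactly the difficulty Retano and Koch describe above.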
Retano says there is a danger in overrelying on data when making decisions, and that the subjective component of experiential wisdom plays a key part. “But,” he says, “at least BI gives you a better base from which to extrapolate your own conclusions.”
David McCann is senior editor for technology at CFO.