Improving workflows is challenging when much of the critical information comes from other parts of the business, far beyond the CFO’s line of sight. Finance and accounting organizations that are divided by competency or business unit rarely coordinate effectively across entire accounting processes. Analysts and accountants spend much of their time “circling back” to understand discrepancies and clarify information. Problems are typically discovered long after they originate, and gaining cooperation to fix errors is difficult. Over time, correcting errors takes priority over eliminating root causes.
How do you solve someone else’s (another department’s) problems? That might seem like a trick question, because leaders view all problems as their own. Unfortunately, this sense of shared responsibility can be blurred when functional silos interrupt flows of work and information across the enterprise. Finance and accounting departments can become too focused on their internal performance and lose sight of the overall objectives of their work.
For example, a unit controller may neglect to properly document some journal entries in her rush to meet close milestones. Juggling multiple priorities, a manager might book several fixed assets under the same record. Two finance analysts might use different source data for their revenue reports. These are just a few of the issues that lead to rework and inefficiency in finance and accounting, and each is caused by inadequate coordination.
John A. Kahler, PwC
The CFO of a large entertainment enterprise found that, before the company improved how it managed its accounting and finance functions, problems resulted in finger-pointing between departments and were rarely resolved. Each group focused on solving problems within its own functional silo; they made small improvements, but larger issues persisted. When department heads were asked to collaborate on creating a better workflow, the idea was initially met with skepticism.
But as they laid out the end-to-end value stream, they began to see that each department played a critical role in delivering clean financials and timely analysis. Their initial reluctance vanished as they recognized how functional hand-offs created gaps in the information flow. Improving communication and coordination between groups eventually eliminated delays and rework cycles. By bringing people from across the process into the same room, they were able to resolve some of the most intractable barriers to improvement. Quality and timeliness improved as people shared accountability for the overall outcome.
From “inspect and correct” to “quality at the source”
Much of a typical accountant’s job entails reviewing and reconciling information. But any time spent tracking down incomplete information, correcting data or clarifying narrative is considered rework and waste. In most cases, problems with source data are noticed long after the fact by someone far from the point of origin. This rework interrupts the workflow and delays production of reports and analysis.
The critical process gap occurs when the person who originates a bit of information thinks the job is done right, but someone else who uses that information for a subsequent task discovers that the information can’t be processed. Why are these two people working with different concepts of timely, complete and accurate information? Defects often arise when there’s no clearly defined standard for what good data looks like.
Ask the people who discover the problems how they determine whether the data is right or wrong. What, specifically, do they look for? These reviews should be described in precise, detailed terms. Ideally, every defect will have its own specific test, and those tests will be defined by the people who receive the information.
That highlights a critical relationship in any workflow: information receivers are customers, and information originators are suppliers. The customer, internal or otherwise, must ultimately define the requirements for good and bad information. Suppliers must conform to the customer’s definition of quality.
Once the CFO has identified the most common errors, everyone should be shown how to check their work to ensure that it meets the quality standard. Data suppliers should adopt the same checks their customers use to verify that the information they pass forward will be suitable for use. It’s important that these reviews are performed exactly the same way, so that all the errors are discovered at their point of origin, before the data is sent on to the next task.
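To make that concrete, here is a minimal sketch, in Python, of what a shared check could look like when the customer’s review is written down as a routine that both sides run. The field names and rules are hypothetical, not drawn from any client example; the point is simply that supplier and customer test the data against one definition of quality.

```python
# Hypothetical sketch only: one shared definition of "good" data, written by the
# downstream customer and run by the upstream supplier before the hand-off.
# Field names and rules (asset_id, in_service_date, cost, supporting_doc) are
# invented for illustration.

from datetime import date


def fixed_asset_entry_defects(entry: dict) -> list[str]:
    """Return the list of defects in a fixed asset entry; an empty list means it passes."""
    defects = []
    if not entry.get("asset_id"):
        defects.append("missing asset_id")
    if not isinstance(entry.get("in_service_date"), date):
        defects.append("in_service_date is missing or not a valid date")
    if not entry.get("cost") or entry["cost"] <= 0:
        defects.append("cost is missing or not a positive amount")
    if not entry.get("supporting_doc"):
        defects.append("no supporting documentation reference")
    return defects


# The supplier runs the customer's checks before sending the entry forward,
# so the defect surfaces at the point of origin rather than weeks later.
outgoing = {"asset_id": "FA-1042", "in_service_date": date(2014, 3, 1), "cost": 25000}
print(fixed_asset_entry_defects(outgoing))  # ['no supporting documentation reference']
```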
Identifying errors at the source of the data gives originators a chance to minimize the disruptive impact of rework and, better still, to make timely and ongoing process improvements that reduce or eliminate the errors altogether.
Flow-through reviews drive efficiency and accountability
The point of establishing quality at the source is to achieve a continuous forward flow of information. Moving quality checks upstream to the data source can dramatically improve flow by reducing the occasions when data moves backward, from customer to supplier. The errors won’t be eliminated overnight, but downstream finance and accounting activities can often be completed with a thorough narrative in lieu of complete and accurate data.
For each point where poor source data is discovered, move the quality check back to the data source. Identify the ideal information set, along with the situations and variations that currently require questions and rework, and then agree on a suitable substitute, such as a narrative or an alternative transaction, that allows each variation to be accepted and used by the downstream customer without rework.
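As an illustration only, that agreement can be captured as a simple decision rule shared by supplier and customer: variations that meet the ideal are accepted, agreed substitutes such as a short narrative are accepted as-is, and anything else goes back for rework. The fields and categories in the sketch below are invented; in practice the downstream customer would define them.

```python
# Hypothetical sketch only: the supplier-customer agreement written as a triage rule.
# Which records meet the ideal information set, which can be accepted with an agreed
# substitute (such as a short narrative), and which still need rework.
# The fields and rules are invented for a fixed asset example.

ACCEPT = "accept"
ACCEPT_WITH_NARRATIVE = "accept with narrative"
RETURN_FOR_REWORK = "return for rework"


def triage_asset_record(record: dict) -> tuple[str, str]:
    """Classify one record against the agreed standard and substitutes."""
    has_core_data = bool(record.get("cost")) and bool(record.get("in_service_date"))
    if has_core_data and record.get("location"):
        return ACCEPT, "meets the ideal information set"
    if has_core_data:
        # Agreed substitute: a short narrative naming the owning business unit is
        # acceptable until the location master data is updated.
        return ACCEPT_WITH_NARRATIVE, "location missing; note the owning business unit"
    return RETURN_FOR_REWORK, "cost or in-service date missing; cannot be depreciated"


print(triage_asset_record({"cost": 25000, "in_service_date": "2014-03-01"}))
# ('accept with narrative', 'location missing; note the owning business unit')
```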
One of our clients, a global business services company, used this method as part of a two-phase approach to improve flow and data quality in its fixed asset depreciation process. Beginning with a key checkpoint where most data problems were discovered, the company identified the most common errors and omissions that created rework in the process. It shared the tests that revealed each problem with the people who supplied the data, and together they worked out standard explanations or substitute information that could be used when the ideal data weren’t available.
Finally, the data suppliers looked for ways to reduce or eliminate the root causes of the data variations. Using this approach, the company cut its depreciation process lead time nearly in half, and improvements to quality and efficiency continue.
One of the key lessons from transforming accounting and finance functions is that everyone in a workflow is accountable for the overall result. Recognizing this enables people to move past the blame game and collaborate to serve each customer of the data. That’s when the CFO really begins to get quality information.
John A. “Jack” Kahler is an Advisory director with PwC focused on operational excellence in finance and back-office workflows. He can be reached at [email protected].
© 2014 PricewaterhouseCoopers LLP, a Delaware limited liability partnership. All rights reserved.
This content is for general information purposes only, and should not be used as a substitute for consultation with professional advisors.