Improving workflows is challenging when most critical information comes from other parts of the business, far beyond the CFO’s line of sight. Finance and accounting organizations that are divided up by competency or business unit can rarely coordinate effectively across entire accounting processes. Analysts and accountants usually spend much of their time “circling back” to understand discrepancies and clarify information. Problems are usually discovered long after they originate, and gaining cooperation to fix errors is difficult. Over time, correcting errors takes priority over eliminating root causes.

How do you solve someone else’s (another department’s) problems? That might seem like a trick question, because leaders view all problems as their own. Unfortunately, this sense of shared responsibility can be blurred when functional silos interrupt flows of work and information across the enterprise. Finance and accounting departments can become too focused on their internal performance and lose sight of the overall objectives of their work.

For example, a unit controller may neglect to properly document some journal entries in her rush to meet close milestones. Juggling multiple priorities, a manager might book several fixed assets under the same record. Two finance analysts might use different source data for their revenue reports. These are just a few of the issues that lead to rework and inefficiency in finance and accounting, and each is caused by inadequate coordination.


John A. Kahler, PwC

The CFO of a large entertainment enterprise found that before improving the management of the accounting and finance functions, any problems resulted in finger-pointing between departments and were rarely resolved. Each group was focused on solving problems within their own functional silos; they made small improvements but larger issues persisted. When department heads were asked to collaborate on creating a better workflow, the idea was initially met with skepticism. 

But as they laid out the end-to-end value stream, they began to see that each department played a critical role in delivering clean financials and timely analysis. Their initial reluctance vanished once they recognized how functional hand-offs created gaps in the information flow, and better communication and coordination across those hand-offs eventually eliminated delays and rework cycles. By bringing people from across the process into the same room, they were able to resolve some of the most intractable barriers to improvement. Quality and timeliness improved as people shared accountability for the overall outcome.

From “inspect and correct” to “quality at the source”

Much of a typical accountant’s job entails reviewing and reconciling information. But any time spent tracking down incomplete information, correcting data or clarifying narrative is considered rework and waste. In most cases, problems with source data are noticed long after the fact by someone far from the point of origin. This rework interrupts the workflow and delays production of reports and analysis.

The critical process gap occurs when the person who originates a bit of information thinks the job is done right, but someone else who uses that information for a subsequent task discovers that the information can’t be processed. Why are these two people working with different concepts of timely, complete and accurate information? Defects often arise when there’s no clearly defined standard for what good data looks like.

Ask the person who discovers the problems how they determine whether the data is right or wrong. What do they look for to determine if information is correct or incorrect? These reviews should be described in very detailed terms. Ideally, every defect will have its own specific test, and those tests will be determined by the people who receive the information.

That highlights a critical relationship in any workflow: information receivers are customers, and information originators are suppliers. The customer, internal or otherwise, must ultimately define the requirements for good and bad information. Suppliers must conform to the customer’s definition of quality.

Once the CFO has identified the most common errors, everyone should be shown how to check their work to assure that it meets the quality standard. Data suppliers should adopt the same checks their customers use to verify that the information they pass forward will be suitable for use. It’s important that these reviews are performed exactly the same way so that all the errors are discovered at their point of origin, before the data is sent on to the next task.
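A minimal sketch of what such a shared check might look like in practice, assuming the downstream team publishes its defect tests in a form the data suppliers can run before submitting an entry (the field names, tolerance and rules below are hypothetical, not drawn from any particular system):

    # Hypothetical shared validation rules for a journal entry, published by the
    # downstream "customer" team and run by upstream "suppliers" before submission.
    def check_journal_entry(entry):
        """Return a list of defects; an empty list means the entry passes."""
        defects = []

        # Every entry needs a supporting description the reviewer can understand.
        if not entry.get("description", "").strip():
            defects.append("missing description")

        # Debits and credits must net to zero (within a rounding tolerance).
        lines = entry.get("lines", [])
        if abs(sum(line["amount"] for line in lines)) > 0.01:
            defects.append("entry does not balance")

        # Every line must carry an account code.
        for line in lines:
            if not line.get("account"):
                defects.append("line missing account code: %r" % (line,))

        return defects

    # The supplier runs exactly the same test the customer will apply later.
    entry = {
        "description": "Accrue Q3 licensing revenue",
        "lines": [
            {"account": "4000", "amount": -125000.00},
            {"account": "1200", "amount": 125000.00},
        ],
    }
    problems = check_journal_entry(entry)
    print(problems if problems else "Entry passes the downstream checks.")

Because the supplier applies the very test the customer will apply, a defect such as an unbalanced or undocumented entry surfaces before the data ever leaves its point of origin.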

Identifying errors at the source of the data gives originators a chance to minimize the disruptive impact of rework and, better still, to make timely and ongoing process improvements that reduce or eliminate the errors altogether.

Flow-through reviews drive efficiency and accountability

The point of establishing quality at the source is to achieve a continuous forward information flow. Moving quality checks upstream to the data source can dramatically improve flow by reducing the number of times data moves backward, from customer to supplier. The errors won't be eliminated overnight, but downstream finance and accounting activities can often be completed with a thorough narrative in lieu of complete and accurate data.

For each point where poor source data is discovered, move the quality check back to the data source. Then identify the ideal information set, along with all the situations and variations that require questions and rework, and determine a suitable substitute (a narrative, an alternative transaction, and so on) that allows the downstream customer to accept and use each variation without rework.
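Purely as an illustration (the statuses, field names and substitution rule below are assumptions, not part of any client's actual process), a source-side check along these lines could classify each record as clean, acceptable with an agreed substitute such as a standard narrative, or blocked pending correction:

    # Illustrative sketch: classify a fixed-asset record as clean, usable with an
    # agreed substitute (e.g., a standard narrative), or blocked for correction.
    def classify_asset_record(record):
        """Return (status, payload) for a fixed-asset record."""
        # Ideal case: every field the downstream customer needs is present.
        required = ("asset_id", "cost", "in_service_date", "useful_life_years")
        missing = [field for field in required if not record.get(field)]
        if not missing:
            return "clean", record

        # Agreed variation: a missing in-service date can be accepted with a
        # provisional date plus a standard narrative, per the downstream customer.
        if missing == ["in_service_date"] and record.get("provisional_date"):
            accepted = dict(record, in_service_date=record["provisional_date"])
            accepted["narrative"] = "In-service date provisional; confirm at next close."
            return "accepted_with_substitute", accepted

        # Anything else still has to move backward for correction at the source.
        return "blocked", {"missing_fields": missing}

    status, payload = classify_asset_record(
        {"asset_id": "FA-1042", "cost": 80000.0, "useful_life_years": 5,
         "provisional_date": "2014-07-01"}
    )
    print(status, payload)

The important design choice is that the acceptable variations and their substitutes are agreed with the downstream customer in advance, so each record keeps moving forward instead of bouncing back for clarification.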

One of our clients, a global business services company, used this method as part of a two-phase approach to improve flow and data quality in its fixed asset depreciation process. Beginning with a key checkpoint where most data problems were discovered, the company identified the most common errors and omissions that created rework in its process. It shared the tests that revealed each problem with the people who supplied the data, and together they worked out standard explanations or substitute information that could be used when the ideal data weren't available.

Finally, the data suppliers looked for ways to reduce or eliminate the root causes of data variations. Using this approach, they reduced their depreciation process lead time by almost half, and improvements to quality and efficiency continue.

One of the key lessons learned when working on the transformation of the accounting and finance functions is that everyone in a workflow is accountable for the overall result. This enables people to move past the blame game and collaborate to serve each customer of the data. That’s when the CFO begins to really get quality information.  

John A. “Jack” Kahler is an Advisory director with PwC focused on operational excellence in finance and back-office workflows. He can be reached at john.a.kahler@us.pwc.com.

© 2014  PricewaterhouseCoopers LLP, a Delaware limited liability partnership. All rights reserved.

This content is for general information purposes only, and should not be used as a substitute for consultation with professional advisors.


6 responses to “How Can a CFO Get Timely, Clean Financial Data?”

  1. Mapping the value stream in the initial phase sets a solid foundation for achieving a seamless flow of work, making each unit/department responsible and eager to provide the right information at the right time. At Minacs, this process is done in the early stages of any transition, increasing the predictability of delivery agreed as part of the service levels and reducing process time considerably.

  2. Love the concept of “quality at the source” and “that everyone in a workflow is accountable for the overall result”.

    We are undertaking a move to the cloud with our accounting and association management systems and are following your playbook: first looking at all workflows in major business lines, and then looking for the “place of best opportunity” for maximizing the benefits of the key cloud systems we are implementing (Tallie for expense management, Bill.com for payables/receivables and Intacct for ERP/GL). The other key concept I would add is to make sure your teams resist the urge to “pave the cowpaths” when automating workflows, by first looking for new and improved ways of processing before customizing.

    My second insight is that this article describes the intent behind the recently enacted federal DATA Act (Digital Accountability and Transparency Act), which is about creating “quality at the source” by having recipients of federal money “tag” their financial transactions with a standardized coding system (taxonomy) that will allow for more accurate, faster and much more transparent accounting of federal spending.

    Finally, there are new tools like XBRL and XBRL GL that can help bring standardized data together from disparate systems and support the workflow and analysis functions.

  3. Two things impact a firm’s ability to get things done – the organizational structure and experience of, in this case, the CFO. The reporting structure can be a nightmare. Similarly, most CFOs do not have the “hands on” experience of detailed accounting and financial analysis since technology has eliminated the need for “bookkeeping”, manual compilation of critical ratios, and in many cases, procedures manuals.

    We take what the computer generates as gospel. I don’t see a proper fix unless we understand the systems, procedures and processes. Automation has replaced these but the automation has been generated by people who most likely don’t understand the intricate parts of accounting and financial control. You ask people today to explain general ledger cards and posting machines – they can’t.

  4. I do agree with Alan Lester about the overall effect of automation. Recent accounting graduates are less likely to understand traditional bookkeeping, and so may not understand the full effect of transactions and processes. Procedure manuals such as Balance Sheet Books make a difference, especially when they are put together by front-line accounting staff and reviewed by a seasoned accounting manager. Middle managers with bookkeeping experience can also bridge the gap with other program heads to ensure ‘quality at the source’. Great article, one to which many of us can relate.

  5. At the heart of any process is the need to measure “success”. Often success needs to be defined by the receiving party (another internal department, even a customer) as much as it does by the originating party.

    Define success or output improperly and you can end up making your mistakes faster, which can appear efficient but is neither an effective use of resources nor a goal one should be proud of.

    What gets measured gets managed.

  6. Identifying errors at the source of the data is the best possible solution. We have recently implemented an ERP system. To ensure accurate, correct and complete data, we have issued SOPs (standard operating procedures) with a proactive approach that encourages users to come back to us as soon as they run into a problem or something they don’t understand. We have champions who attend to those issues, and in that way we ensure clean and complete data.
    We intend to review the daily activities of all processes so that we know immediately where rework is required and timely action can be taken.
