Dependency on data, and on the accuracy of that data, is pivotal for any organization. Yet in a recent research report from insightsoftware, fewer than a third (27%) of organizations said a majority of their users can forecast at scale using current data analysis tools. As dependency on data increases across industries, low confidence in that data may result in less reliable business forecasts.
Monica Boydston, vice president of product management at insightsoftware, said that working through the deep, complex layers of leveraging data can make a major difference for a finance executive strategizing toward considerable growth. “While the latter steps in the data fluency hierarchy, like predictive analytics, might be more attractive, executives need to focus on data quality and consistency,” said Boydston. “Without quality data, you’re going nowhere fast.”
Connected Data
As organizations rush their data platforms into public and private clouds, their data may suffer a disconnect in the transition, according to the survey. Use of aggregation techniques on connected data has dropped 14% since 2021, and a nearly identical drop (13%) occurred in the blending of data from multiple sources to build analytics applications. Because rapid access to and processing of data are needed to make the most accurate predictions, the push to modernize data storage may actually be slowing productivity in everyday data use.
According to Boydston, shifts in work environments also contribute to the drop in data quality during infrastructure transitions.
“The remote working boom has brought with it a huge proliferation in cloud technology, as organizations seek to enable their home-bound workforces,” she said. “That comes with its advantages, but it also reduces a non-technical employee’s ability to seek technical help. If they are using a complex analytics tool and need to generate some ad hoc analysis for leadership, they must wait until IT responds to deliver that report, when it may already be too late.”
As the aggregation and blending of data decrease within data systems, issues around data consistency continue to rise. More than a third (37%) of teams reported problems with data inconsistencies, up from 29% in 2021.
Organizations that want to offer flexible work environments while maximizing data quality and consistency must home in on properly connected data, according to Boydston. “In a remote working environment, it’s even more important to enable your workforce with intuitive tools that allow them to get most of their job done autonomously.”
Forecasts Require Quality Data
Despite the widespread digitization of businesses, a significant number of global professionals are concerned about the quality and consistency of their data. Over one-third of respondents indicated problems in three different areas.
More than a third (39%) of respondents do not rate their data quality highly for completeness, according to survey results. A nearly identical number of respondents (38%) said they don’t rate their data highly for consistency, while another 35% do not rate their data highly for accuracy.
When asked about forecasting frequency, Boydston again highlighted the need to improve data quality and consistency. For her, quality over quantity is paramount in forecasting.
“Accuracy is far more important in forecasting than frequency,” Boydston said. “You could be churning out quarterly forecasts, but if you or leadership do not trust the underlying data, then it’s useless.”
“Market uncertainty is forcing CFOs to look ahead, and rightly so, but to see a clear picture of the future, you must first have a clear picture of the present,” she continued. “To this end, data quality and consistency are essential to effective forecasting. Once you have that nailed down, and tools and processes in place to boost forecast efficiency, then you can look to increase your frequency.”
Fixing the Problem
Even as some issues worsen, the desire to fix the problem is widespread. Nearly three-quarters (74%) of those surveyed believe it is important for users to become more data literate. Majorities of respondents believe increasing data literacy is a priority and that it is valuable for users to work toward a data-fluent state (71% and 61%, respectively).
“Major systems overhauls and audits are not fun for anyone and take a significant financial and time investment,” said Boydston. “Thankfully, there are tools available that can automate the most painful aspects of data cleaning, significantly reducing the burden of [improving] data quality.”
According to Boydston, the starting point for solving data problems is convincing leadership of the value of leveraging data well.
“The first step is taking stock of where you are, and then, where you want to be. Break down the gap between those two positions, what it will cost to close, and the value your decision-makers can expect as a result.”
Poor data can even make it difficult to demonstrate why that data is faulty. “Explaining the organization’s current data position is harder than it sounds, especially if you are dealing with incomplete or inconsistent data sets,” she said.
“The very first step for this, or any other major data-change process, is sorting out current systems and processes to ensure that you trust the analysis you are generating. If you don’t trust it, there’s no way that leadership will.”
