Will an economic downturn hit in the next year or two? No one can say for sure, but a lot of people — including more than 70% of 250 C-level executives recently surveyed by Grant Thornton — are expecting one.
During times of uncertainty, the organizations that are best informed can take advantage of the evolving landscape. And the use of a next-generation analytics program (a contemporary one, as opposed to the previous-generation capabilities that debuted around a decade ago and are still in use) can be the calling card of a nimble, agile organization.
C-suite executives, of course, require actionable insights to make timely and useful decisions. An analytic toolset can be likened to an organization’s flight instruments, telling executives when to turn and pull up. By applying analytics correctly, an organization can stay ahead of myriad risks and its competitors.
CFOs can use analytic trending and predictive insights to better anticipate cash flow and market volatility, essentially looking outside the organization for what’s to come. These external views may allow companies to see further ahead than their rivals can.
Next-generation analytics can also be applied internally to manage crises or unplanned events. By swiftly modeling the impacts of emerging operational issues, companies can better plot and assess potential remediation strategies. Such internal views allow timely decisions to be made before a problem escalates.
If a recession is likely within two years, companies need to adopt the enabling technologies and train their employees now. Innovation investments made early will offer the greatest opportunity for return. Companies can turn regulatory changes, exchange rate volatility, and many other marketplace conditions to their advantage if they have the tools to forecast and adapt.
Next-gen analytics programs adhere to three best practices in their implementations, as described below. Each of them can be helpful individually, but combining them is more likely to provide the agility necessary to contend with the uncertainty of an economic downturn.
The Use of a Flexible Framework
In 2008, during the last economic downturn, even organizations that employed data analytics fell victim to the turbulent times.
At that time, technology lent itself to two types of analytics usage: a decentralized approach (each department had its own spreadsheet/transactional query) and a centralized one (a data warehouse). Both were problematic.
The fault in the decentralized approach was that siloed data often resulted in disjointed views, with no single enterprise viewpoint. In the centralized approach, while data and viewpoints were normalized, modeling the warehouse for an unanticipated scenario often took weeks, sacrificing the advantage of agile planning.
Over the past few years, tremendous progress in the capabilities of data modeling frameworks has propelled analytics beyond the “Catch-22” situation that prevailed in 2008.
Technology has moved beyond centralized warehouses to big data architectures such as data lakes and data vaults, along with visual data discovery tools. These technical disruptions enable rapid modeling of data, even in unanticipated ways, giving a company a tighter turn radius.
Moreover, artificial intelligence, machine learning, and natural language processing can help companies pose questions directly to their analytics and receive predictive responses.
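As a minimal sketch of that schema-on-read agility, the snippet below queries hypothetical parquet files sitting in a data lake with DuckDB; the file path, column names, and business question are all assumptions for illustration. The point is that no upfront warehouse modeling is required, so an unanticipated question can be answered the same day it is asked.

```python
# Minimal sketch of schema-on-read against a data lake.
# The path 'lake/orders/*.parquet' and the columns supplier,
# order_date, and ship_date are hypothetical; DuckDB infers the
# schema at query time, so there is no warehouse modeling step.
import duckdb

# Ad hoc question posed directly against raw files:
# which suppliers show the worst average shipping delays?
result = duckdb.sql("""
    SELECT supplier,
           AVG(date_diff('day', order_date, ship_date)) AS avg_delay_days
    FROM 'lake/orders/*.parquet'
    GROUP BY supplier
    ORDER BY avg_delay_days DESC
""").df()

print(result.head())
```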
The Ability to Fail Fast
Thomas Edison famously reframed his failures as discoveries; he hadn’t failed to make a lightbulb, he had found thousands of ways not to make one. Today, we often exhibit little tolerance for failure, taking a binary approach that relegates experimentation to the laboratory rather than the corporate world.
Failures cost money and don’t show an obvious ROI. When failures such as a mistimed acquisition are built on the static structures of the previous generation of analytics, they can prove fatal.
Enter the cloud. Many cloud providers make heroic claims about what they can provide, citing varying degrees of value. However, one characteristic that all clouds have in common is that they’re not a permanent cost to a company. Cloud computing/storage services are “rentals” that can be turned off or on, sized up or down on demand.
By aligning a flexible analytics framework with the elasticity and non-permanence of the cloud, an organization creates the conditions to rapidly experiment with an on-demand analytics solution, one that is low-cost (thanks to cloud pricing) and can be swiftly disposed of without penalty if deemed a “failure.”
An organization that can fail rapidly through iterations will incrementally improve its analysis with each subsequent experiment. Embracing fast, impermanent failure is a foundational characteristic of companies wishing to apply next-gen analytics.
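In code, a fail-fast posture might look like a disposable experiment loop. The sketch below is illustrative only: provision_cluster and teardown_cluster are hypothetical stand-ins for whatever on-demand cloud service an organization rents. The essential property is that teardown always happens, so a failed experiment costs hours of rental rather than a permanent investment.

```python
# Illustrative fail-fast loop; provision_cluster and teardown_cluster
# are hypothetical stand-ins for a real cloud SDK.
from contextlib import contextmanager

def provision_cluster(size: str):
    """Stand-in for renting on-demand compute."""
    print(f"provisioning {size} cluster")
    return {"size": size}

def teardown_cluster(cluster):
    """Stand-in for releasing the rental so it stops costing money."""
    print(f"tearing down {cluster['size']} cluster")

@contextmanager
def disposable_cluster(size="small"):
    cluster = provision_cluster(size)
    try:
        yield cluster
    finally:
        teardown_cluster(cluster)  # always runs, even when the experiment fails

def run_experiment(hypothesis, model_fn, threshold=0.7):
    """Run one cheap experiment; a 'failure' just means iterate again."""
    with disposable_cluster() as cluster:
        score = model_fn(cluster)
    verdict = "keep" if score >= threshold else "discard and iterate"
    print(f"{hypothesis}: score={score:.2f} -> {verdict}")
    return score

# Example: a hypothetical demand model scores poorly; the lesson was cheap.
run_experiment("demand model v1", lambda cluster: 0.42)
```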
Actions Based on Trend vs. Detail
While technology allows analysts to rapidly sift through volumes of data, analysts must be prepared to recommend action just as rapidly.
In previous generations, common practice was to wait for a particular indicator, such as a ratio turning negative, before acting; that habit made forecasting and trend analysis difficult.
In today’s analytics landscape, details are less important than trends. A trend line allows prediction and, ultimately, the foresight to act. Analysts must be coached to assign confidence to a trend line and recommend action early. One might posit that a timid analyst with a modern-day toolset is as unfortunate as a Ferrari whose driver rides the brake.
Still, how can we prevent analysts from being reckless, if they base their analysis primarily on trends and less on details?
Here are two thoughts that are useful for calibrating analysis: (1) An analytics system is only as good as the analysts and the sophistication of the toolsets they employ. (2) A truly next-gen analytics program must encourage analysts to take advantage of the technology tools they now have at their disposal.
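To make “assigning confidence to a trend line” concrete, here is a minimal sketch using a hypothetical weekly cash-flow series. It fits a linear trend with scipy.stats.linregress and recommends action only when the 95% confidence interval on the slope excludes zero, which is one way to separate an early call from a reckless one:

```python
# Minimal sketch: assign confidence to a trend line before acting.
# The weekly cash-flow figures below are hypothetical.
from scipy import stats

weeks = list(range(12))
cash_flow = [102, 99, 101, 97, 95, 96, 92, 93, 90, 88, 89, 85]  # $K

fit = stats.linregress(weeks, cash_flow)

# 95% confidence interval on the slope (t-distribution, n - 2 dof)
t_crit = stats.t.ppf(0.975, df=len(weeks) - 2)
lo = fit.slope - t_crit * fit.stderr
hi = fit.slope + t_crit * fit.stderr

print(f"trend: {fit.slope:.2f} $K/week, 95% CI [{lo:.2f}, {hi:.2f}]")
if hi < 0:
    print("downtrend is statistically credible: recommend action now")
elif lo > 0:
    print("uptrend is credible: no defensive action needed")
else:
    print("trend not yet distinguishable from noise: keep watching")
```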
Analytics Now Compared with 2008
Next-generation analytics, composed of flexible frameworks, fast-fail prototyping, and a trend mindset, provide insights that were unavailable to executives in 2008. Together, these characteristics offer companies a path to turn an obstacle into an opportunity.
Let’s pose a scenario and consider the analytics options that could be tapped in 2008 vs. those that are available today.
Scenario: Recession has struck our suppliers in the Rust Belt, causing pricing fluctuations and product delays
2008 Analytics Options
- Analyze profitability adjustments from delays and higher cost of goods sold (COGS)
- Query order fulfillment impacts and lost revenue
- Submit specs to IT to create new analysis for alternate suppliers (estimate several weeks to complete)
- Pray
Potential 2019 Analytics Options
- Analyze profitability adjustments from delays and higher COGS
- Query order fulfillment impacts and lost revenue
- Leverage cloud data lakes to create self-service analytics for alternate suppliers within hours
- Develop forecasting models for continued Rust Belt economic degradation, providing insight on the most at-risk products over three, six, and nine months
- Geographically plot orders from source to fulfillment to determine whether opportunities exist to lessen freight cost, making up the shortfall of higher COGS
- Model the impact of regulatory tariffs if a non-U.S. supplier is required; provide optimal offshoring recommendation
- Develop simulations using AI to determine alternate suppliers and the impacts of each course of action
- Ingest publicly available information, like product reviews, using natural language processing tools to determine recommendations for the best alternative suppliers based on quality metrics (see the sketch after this list)
- Utilize natural language processing to conduct social media monitoring for additional customer perspectives
- Conduct M&A analysis of acquiring distressed suppliers using cloud analytics for rapid diligence reviews…
- …and many more
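As one example, the natural language processing item above might start as simply as the sketch below, which scores hypothetical supplier reviews against a tiny keyword lexicon and ranks alternatives by average sentiment. A production version would swap in a full NLP library, but the shape of the analysis is the same:

```python
# Minimal sketch of the NLP-based supplier-quality item above.
# The reviews, supplier names, and keyword lexicon are hypothetical.
from collections import defaultdict
from statistics import mean

POSITIVE = {"reliable", "fast", "excellent", "consistent"}
NEGATIVE = {"late", "defective", "poor", "inconsistent"}

reviews = [  # (supplier, public review text)
    ("Acme Metals", "Reliable deliveries and consistent quality"),
    ("Acme Metals", "Shipment was late but parts were excellent"),
    ("Beta Alloys", "Defective castings and poor communication"),
    ("Beta Alloys", "Late again, inconsistent tolerances"),
]

def score(text: str) -> int:
    """Crude sentiment: positive keyword hits minus negative hits."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

by_supplier = defaultdict(list)
for supplier, text in reviews:
    by_supplier[supplier].append(score(text))

# Rank alternative suppliers by average review sentiment
ranking = sorted(by_supplier.items(), key=lambda kv: mean(kv[1]), reverse=True)
for supplier, scores in ranking:
    print(f"{supplier}: avg sentiment {mean(scores):+.2f}")
```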
Companies have a choice: they can continue to use previous-generation frameworks, which yield previous-generation insights, or they can adopt today’s new tools, which enable flexible modeling and improved decision-making that drives growth. The latest tools enable swift experimentation with corporate data, driving better metrics and benchmarks.
In this period of economic uncertainty, the application of a next-gen analytics program can help savvy organizations remain agile and prepare for the downturn.
Jeff Silverman is a business analytics specialist for Grant Thornton who helps commercial and government clients develop next-gen analytic solutions. Jeff is also a military intelligence lieutenant colonel in the U.S. Army Reserve, where he applies creative analytic solutions to solve problems.
