It was only early last year that General Electric started to talk about its plan to digitise its entire business, but arguably the whole thing started much earlier, in the mid-1990s. That was when GE launched its “Six Sigma” initiative, the management method on which its relentless quest for perfection is based. It is no accident that this effort was first led by Gary Reiner, the company’s chief information officer.
Six Sigma, in essence, is a way of creating a closed-loop system to make continuous improvements in business processes. First, you pick a goal, often customer-related, for instance the time it takes to deliver a product, and measure how well you are doing against that goal, not on average but in terms of variation. Then you try to change the business process to reduce that variation as much as possible. If you hit your goal 99.9997% of the time, you have achieved “six sigma”, a statistical term describing the degree of variation. In Six Sigma parlance, it means your “defects per million opportunities” are down to 3.4.
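The arithmetic behind the 3.4 figure can be checked in a few lines of Python. One caveat: the calculation below uses the standard Six Sigma convention of allowing the process mean to drift by 1.5 standard deviations, which is why “six sigma” quality corresponds to only 4.5 sigmas of headroom on one tail of the normal curve.

```python
from statistics import NormalDist

# Six Sigma convention: the process mean is assumed to drift by up to
# 1.5 sigma over time, so "six sigma" quality leaves 6 - 1.5 = 4.5
# standard deviations between the mean and the nearest tolerance limit.
shift = 1.5
sigma_level = 6.0

# One-tailed probability of a result falling beyond 4.5 sigma.
defect_rate = 1 - NormalDist().cdf(sigma_level - shift)
dpmo = defect_rate * 1_000_000   # defects per million opportunities

print(f"Hit rate: {1 - defect_rate:.6%}")                 # roughly 99.9997%
print(f"Defects per million opportunities: {dpmo:.1f}")   # roughly 3.4
```

Dropping the 1.5-sigma shift and reading the normal tables at a full six sigmas would give about two defects per billion, a far stricter standard than the one the methodology actually uses.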
GE has trained tens of thousands of its managers in Six Sigma techniques, which now makes it much easier for the company to introduce real-time technology. That technology allows results to be measured easily, and business processes to be adjusted quickly to improve them. In fact, most of the start-ups mentioned in this article say that their products are designed for exactly this kind of closed-loop decision-making.
Not many firms will be as prepared as GE to go real-time. Many will have to adapt their culture and the way they do business. And, for better or worse, things are likely to get more quantitative, centralised and ever-changing. Until recently, the corporate spreadsheet—the IT guts of a firm—was shaped mostly by the organisation for which it was designed. Now it is the spreadsheet which in many ways will shape the organisation.
Once Again, with More Feeling
Not that re-engineering combined with IT is a new concept. In the 1990s, many firms went through a wrenching re-engineering experience, often in parallel with the equally difficult introduction of an enterprise resource planning (ERP) system. But these were one-off efforts usually limited to one company. Real-time technology should make it possible to re-engineer business processes on a continuous basis, and across the boundaries of many firms.
Both authors of the infamous 1993 bestseller “Re-engineering the Corporation”, Michael Hammer and James Champy, have recently written new books. In “The Agenda” (Crown Business, 2001), Mr Hammer emphasises the need to “institutionalise a capacity for change”. Mr Champy, now chairman of consulting at Perot Systems, in “X-Engineering the Corporation” (Warner Books, forthcoming) invokes the increasingly pressing need for “cross-organisational process change”.
At this point, it is anybody’s guess what a typical real-time enterprise will look like. Ray Lane, a partner with Kleiner Perkins Caufield & Byers, predicts that in the long run real-time technology will do away with all the features of a firm that were needed to assure information flow in an offline world: hierarchies, departmental boundaries, paper-shuffling employees. To the former number two at Oracle, this is a tempting prospect because it would empower top executives: they would no longer be isolated from their business by layers of bureaucracy.
Tibco’s Vivek Ranadivé, for his part, already has a rather precise vision of what he calls the “event-driven” firm. If he is right, running a company will be rather like managing an IT system today: machines monitor the business, solve problems by themselves as far as possible and alert managers when something is amiss. Mr Ranadivé calls this “management by exception”, and to some extent already practises it at Tibco: most of the firm’s employees are equipped with a BlackBerry, a wireless device that can receive and send e-mail, so that they can be given warning of an “event” such as an unhappy customer.
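The logic of “management by exception” can be sketched in a few lines. Everything below is illustrative: the event types, thresholds and routing rules are made up for the example and have nothing to do with Tibco’s actual software.

```python
# A toy event router in the spirit of "management by exception":
# routine events are resolved automatically; only exceptions are
# escalated to a human. All names and thresholds are hypothetical.

def handle_event(event, alert):
    """Route a business event: auto-resolve if possible, else escalate."""
    if event["type"] == "order_delayed" and event["hours"] <= 24:
        # Short delays are within tolerance; the system fixes them itself.
        return "rescheduled automatically"
    if event["type"] == "customer_unhappy":
        # Exceptions go straight to a manager's wireless device.
        alert(f"Escalating: {event['customer']} needs a call")
        return "escalated"
    return "logged"

alerts = []   # stand-in for messages pushed to a manager's BlackBerry
handle_event({"type": "order_delayed", "hours": 6}, alerts.append)
handle_event({"type": "customer_unhappy", "customer": "Acme"}, alerts.append)
```

The point of the pattern is the asymmetry: the machine absorbs the routine flow of events, so the only messages a manager ever sees are the ones that genuinely need judgment.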
Yet firms are not just paper- and people-based information systems, easily replaced by more efficient digital ones. Some aspects of non-computerised information systems that have developed over hundreds of years will be hard to digitise, not least because they are not always consciously understood, argues Ole Hanseth, a researcher at the University of Oslo who is currently studying the way information about X-ray examinations is handled in a hospital. One example of such “hidden” data is the precise way a patient’s chart is placed on a desk, which can indicate that the examination is over.
There are similar problems with another theory about the future of the firm: that information technology will slowly but surely deconstruct the company as we know it. According to this view, the fundamental building blocks of the economy will one day be “virtual firms”, ever-changing networks of subcontractors and freelancers, managed by a core of people with a good idea. Eric Raymond, one of the intellectual leaders of the open-source software movement, even thinks that the future belongs to the “ex-corporation”— groups of people held together mainly by idealism or desire for self-expression and led by benevolent dictators, similar to today’s open-source projects such as Linux.
Such predictions are often based on a one-sided interpretation of the ideas of Ronald Coase, a Nobel-prize-winning economist, says Phil Agre, a professor of information studies at the University of California at Los Angeles. True, he explains, technologies that speed up the flow of information bring down transaction costs, which should induce companies to do less themselves and outsource more. But Mr Coase also argued that the size of a firm is determined by organising costs, which technology tends to lower as well, so the real-time enterprise might end up being larger than its less nimble predecessors. And there are other forces that keep a firm together. Even the best information flow cannot replace old-fashioned trust and social bonds. Besides, a company might want to keep control of its supply chain to protect its brand and knowledge.
Yet better IT will certainly allow the economy to be reorganised in more efficient ways, giving rise to ever more specialised firms. For instance, if the underwriting department of an insurance company is delivered internally as a web service, it could easily be outsourced or, more likely, offered as a service to others. The electronics industry is at the forefront of this trend. Most big computer makers no longer build what they sell, but outsource production to huge, albeit lesser-known providers of manufacturing services such as Flextronics or Solectron.
The firm as an institution is thus pulled in two directions: to become smaller as well as bigger. Perhaps the economy will one day look like traffic patterns on the Internet. There are a dozen or so websites, mostly portals such as Yahoo! or AOL.com, that attract millions of visitors every day. Then there are millions of websites that are lucky if they get a dozen hits a week. And there is not much in the middle. Similarly, there could be a few dozen giant global concerns that offer more or less standardised services in manufacturing, finance or computer systems which millions of small firms will use.
What will happen to contracts is much easier to predict than what will happen to the firm. Information technology makes monitoring much cheaper, says Hal Varian, an economics professor at the University of California at Berkeley. And just as good fences make good neighbours, he argues, good monitoring makes good contracts. His favourite example is video shops. Before 1998, they generally paid distributors a whopping $70 for each tape, which meant that they ordered only small numbers of even the most popular films. Now they pay a small charge of $3-8 up front and then hand over 40-60% of every rental fee. This has become possible because smart cash registers and network connections are now cheap enough for distributors to monitor rentals.
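The economics of the two contracts are easy to work through. The figures below assume a $3 rental fee and mid-range revenue-share terms ($5 up front, a 50% split), which are illustrative choices within the ranges the article quotes.

```python
# Back-of-the-envelope comparison of the two video-tape contracts.
# Assumed numbers: $3 per rental, $5 up-front fee, 50% revenue share.
RENTAL_FEE = 3.00   # price the customer pays per rental (assumed)

def old_contract_profit(rentals):
    # Pre-1998: a flat $70 per tape, shop keeps every rental dollar.
    return rentals * RENTAL_FEE - 70.00

def revenue_share_profit(rentals, upfront=5.00, share=0.50):
    # Post-1998: small up-front charge, distributor takes a share.
    return rentals * RENTAL_FEE * (1 - share) - upfront

# Break-even points: rentals needed before a tape turns a profit.
break_even_old = 70.00 / RENTAL_FEE           # about 23 rentals
break_even_new = 5.00 / (RENTAL_FEE * 0.50)   # about 3 rentals
```

Under the old terms a tape had to go out more than twenty times before it earned anything, so shops rationed copies; under revenue sharing it pays for itself after three or four rentals, which is why shelves filled up with the latest hits.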
In this instance everybody wins: video shops can afford to order many more copies of popular movies from Hollywood, and consumers do not have to wait for them. But the effect of new technology may not always please all concerned. Software such as Siebel’s recently released employee-relationship management program allows companies to monitor their employees more effectively, which makes it easier for them to weed out their worst performers. Siebel fires the bottom 10% of its workforce every year.
More and better information will also change contracts with suppliers. Vivecon, a start-up founded by Blake Johnson, a Stanford business professor and former investment banker, has developed software that helps firms manage procurement contracts. For example, should they ask for fixed quantities and prices, or should they have flexible agreements? And what should be the penalty if they want to be released from their commitment?
The aim of the game is to structure agreements with suppliers in a way that lowers cost and risk. A manufacturer hoping to launch a new product for which demand is uncertain may not want to get locked into buying a large quantity of a critical component. It might commit itself to a smaller volume, with a guarantee that more will be available, and pay a bit more per unit.
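The trade-off described above can be made concrete with a toy expected-cost calculation. All prices, quantities and probabilities here are invented for illustration; Vivecon’s actual models are undoubtedly far richer.

```python
# Toy comparison of two hypothetical supply contracts for a product
# launch with uncertain demand. Every number below is made up.

# Two equally likely demand outcomes: the launch flops or it takes off.
scenarios = [(0.5, 50_000), (0.5, 150_000)]   # (probability, units needed)

def fixed_contract_cost(demand, committed=150_000, price=1.00):
    # Locked into the full quantity at $1.00 a unit, sold out or not.
    return committed * price

def flexible_contract_cost(demand, committed=50_000, price=1.25,
                           option_price=1.50):
    # Commit to less at a premium ($1.25), with a guarantee that extra
    # units can be bought later at a higher option price ($1.50).
    extra = max(demand - committed, 0)
    return committed * price + extra * option_price

expected_fixed = sum(p * fixed_contract_cost(d) for p, d in scenarios)
expected_flex = sum(p * flexible_contract_cost(d) for p, d in scenarios)
```

In this toy example the flexible contract costs more per unit in the good scenario but far less overall when demand disappoints, so its expected cost comes out lower. The real art, as the next paragraph suggests, lies in blending such contracts into a portfolio.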
Vivecon also helps firms to choose the right portfolio of these “structured contracts”, along with long-term relationships with suppliers and spot-market purchases. Hewlett-Packard is already using the start-up’s software and services to manage the procurement of some of its memory chips and electric power. And that is only the beginning, says Corey Billington, who runs HP’s supply-chain services. To him, procurement will increasingly become like trading. Yet the recent collapse of Enron, the energy-trading giant, suggests that trading carries risks of its own.
Copyright © 2002 The Economist Newspaper and The Economist Group. All rights reserved.