Have you heard the phrase “Industrial Internet”? Coined by General Electric, it refers to the integration of complex physical machinery with networked sensors and software. Its components include machine learning, big data, the Internet of Things and machine-to-machine communications.
If you think about practical applications of the Industrial Internet, you might think about making machines like jet engines, locomotives or MRIs smarter. Or you might think about networking these machines together, or making sure a water-level sensor can tell a valve to turn off in time. All of those things are important and full of challenges, in terms of both technology and standards.
But there is a bigger picture, and in some part it’s about service.
If you are a CFO in any of a number of industries, such as aviation, health care, hospitality, oil and gas, power and transportation, you are in a service business. And what does that mean? Answering the phone nicely? No, service is delivering personally relevant information. If you ask a hotel concierge to recommend a cheap Italian restaurant nearby, and you’re steered to a good place, that’s service. If a doctor tells you that based on your personal genome and your lifestyle, you need to start taking Lipitor and, by the way, it wouldn’t hurt to start exercising, that’s service.
The consumer Internet of Google, Facebook and Amazon uses lots of computers to deliver information that’s personal and relevant to individuals. But let’s put aside consumer applications and talk about service as it relates to the Industrial Internet.
Consider a next-generation public-safety application. Today you can’t text 911; when you call, you talk to an operator who types in your information, and if you are one of many callers at once, you’re put in a queue. Some of this information is relayed to the police, fire or HazMat departments, but not all of it. Tell me, though: Why should it be that way?
More than half of Americans have smart mobile phones. So, in almost any emergency situation, multiple citizens are armed with tools that collectively can paint a vivid data picture of what is happening. Rather than thinking of how we don’t want 24 people to report the same traffic accident, why don’t we see those 24 reports as 24 sources of data providing timely information from a variety of perspectives, all with some value? Why isn’t all of the location, video and audio information routed to a public-safety, cloud-based computer and storage service running applications that deliver personal and relevant information to everyone involved in the emergency?
Beyond the mobile information, today there is an increasing amount of fixed data on the weather, buildings, traffic and other things that may trigger or influence emergency situations. To take just one of those, “weather data” is a compilation of radar, satellite, infrared, water-vapor, lightning and modeling data for a 10-to-100-square-mile area every 15 minutes.
Further, in a metro area there is also information coming from every traffic intersection. In a city the size of Tucson, Ariz., with thousands of intersections, four gigabytes of such information is produced in an hour. More information comes in from modern smart buildings; there might be 250 of those in Tucson, producing two more gigabytes of data per hour.
Think about a 24-hour emergency. It’s easy to imagine more than 100 terabytes of relevant data being produced. Now imagine that data stream being fed to a next-generation cloud service with perhaps 10,000 servers running applications to deliver information that’s personal and relevant to an office worker on the 20th floor of a burning building, a firefighter at the northeast corner of the building, and an EMT arriving from the west.
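The arithmetic behind that 100-terabyte figure is easy to sketch. The fixed-sensor numbers below come from the Tucson example above; the mobile contribution is a purely hypothetical illustration (5,000 concurrent phone video streams at 3 Mbps are assumed figures, not from the article), but it shows why citizen video, not fixed infrastructure, dominates the data volume.

```python
# Back-of-envelope estimate of data produced in a 24-hour metro emergency.
# Fixed-sensor rates are from the article; the mobile-video assumptions
# (stream count, bitrate) are hypothetical.

GB = 1
TB = 1024 * GB
HOURS = 24

# Fixed infrastructure (Tucson-sized city, per the article):
intersections_gb_per_hr = 4      # thousands of traffic intersections
buildings_gb_per_hr = 2          # ~250 smart buildings
fixed_gb = (intersections_gb_per_hr + buildings_gb_per_hr) * HOURS

# Hypothetical mobile contribution: 5,000 concurrent phone video
# streams at an average of 3 Mbps each.
streams = 5000
mbps_per_stream = 3
gb_per_stream_hr = mbps_per_stream * 3600 / 8 / 1024   # megabits -> GB
mobile_gb = streams * gb_per_stream_hr * HOURS

total_tb = (fixed_gb + mobile_gb) / TB
print(f"fixed sensors: {fixed_gb:.0f} GB")       # 144 GB over 24 hours
print(f"mobile video:  {mobile_gb / TB:.0f} TB") # roughly 150 TB
print(f"total:         {total_tb:.0f} TB")       # comfortably over 100 TB
```

Under these assumptions the fixed sensors contribute only about 144 GB in a day, while mobile video pushes the total well past 100 TB, which is what makes a cloud-scale back end necessary.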
A service application could inform the firefighter that a gust of wind is 85 percent likely to push the fire toward him shortly. A mobile application could tell the office worker to go down the right side of the building rather than the left from floors 10 to 16. All kinds of information could be made relevant to every police and HazMat officer.
This is just one simple example, and it’s from the public-safety domain. What about the rest of the service economy? What would a power-service application look like? Power companies have access to information about the weather, about power generation from solar arrays on homes, about a coming increase in power demand because the Super Bowl will be televised tonight, about generators, about the availability of power from the 10,000 electric cars on the city streets. What information would be personal and relevant to a consumer, a power-plant operator, a chief information officer and a maintenance supervisor? And what if you could tie together the information from multiple power operators to create a massive Industrial Internet for the power sector?
Today, General Electric collects more than 30,000 operating hours of data from more than 1,500 globally deployed power-generation locations on a daily basis, supplementing a 15-terabyte database representing more than 93 million fleet operating hours. A team of more than 20 engineers analyzes upwards of 25,000 operational alarms per year. Using a combination of off-the-shelf and custom analytic tools, the team diagnoses problems ranging from failed sensors to gas turbine compressor damage. The team has developed dozens of physics-based proprietary algorithms that provide early warnings of more than 60 different failure mechanisms.
You might not be in the public-safety or power business, but odds are good that you’re part of the service economy. If you’re a CFO, start asking what information your company has access to that may be personal and relevant to people within the company, to customers, suppliers and outside partners.
Timothy Chou teaches cloud computing at Stanford University. He is the former president of Oracle on Demand and the author of Cloud: Seven Clear Business Models.