Many people think the Internet of Things (IoT) is about your toaster talking to your refrigerator. While there will no doubt one day be very useful consumer IoT applications, more immediately there are many industrial applications, and many more potential ones, to consider.
This article, the first of three on the IoT, constructs a framework for precision technology — that is, an organization of the technologies that will enable the building of precision machines. The second article will take the point of view of a machine manufacturer and discuss the value of using such technology to build precision machines. The third article will discuss how such machines can be used to power precision industries like farming, transportation, healthcare, construction, and power.
Most first- and second-generation enterprise software was focused on us — people, whether individuals or groups. Applications were designed to help people do something useful, like buy a book, issue a purchase order, recruit employees, or communicate with others.
But things are not people. That may seem obvious, but there are at least three fundamental differences that matter for purposes of this discussion.
There are a lot more Things than people. These days, you can’t be on the Internet and not see some pronouncement about how many Things are going to become connected. John Chambers, former CEO of Cisco, recently declared there will be 500 billion “Things” connected by 2025. That’s about 70 times the number of people currently living on this planet.
Things can tell you more than people can. The main mechanism people use to communicate with applications is a keyboard, and most applications use some kind of form to collect simple data from people.
Things, by comparison, may have many sensors. A typical cell phone has about 14 of them, including an accelerometer, GPS, and even a radiation detector. Industrial Things like wind turbines, gene sequencers, and high-speed inserters can easily have 100 sensors.
Things can talk constantly. People don’t actually enter data all that frequently into “Internet of People” (IoP) applications for e-commerce, human resources (HR), purchasing, customer relationship management (CRM), or even enterprise resource planning (ERP). But a utility grid-power sensor can send data 60 times per second, a construction forklift once per minute, and a high-speed inserter once every two seconds.
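The difference in data rates is easy to quantify. A minimal back-of-the-envelope sketch, using the three example devices above (the rates are the ones quoted; the daily totals are simple arithmetic, not measurements):

```python
# Rough daily reading counts for the example Things above.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

devices = {
    "grid power sensor (60 Hz)": 60 * SECONDS_PER_DAY,
    "high-speed inserter (every 2 s)": SECONDS_PER_DAY // 2,
    "construction forklift (every 60 s)": SECONDS_PER_DAY // 60,
}

for name, readings_per_day in devices.items():
    print(f"{name}: {readings_per_day:,} readings/day")
# The grid sensor alone produces over five million readings a day --
# far more than a person typing into a form ever will.
```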
The first generation of enterprise application software from SAP, Oracle, Siebel, PeopleSoft, and Microsoft leveraged the availability of low-cost, client-server computing to automate key financial, HR, supply chain, and purchasing processes. The business model was based on licensing the application software, and the purchasing company was left with the responsibility (and cost) of managing the software’s security, availability, performance, and updates/upgrades.
Around 2000, a second generation of enterprise application software emerged, differentiated largely by a fundamental shift in the delivery model. The software provider took on the responsibility of managing the software, and with that change came a change to the business model.
Rather than an upfront licensing fee, a software-as-a-service (SaaS) model emerged, which allowed customers to purchase the service monthly or annually. Key business suppliers from this era included Salesforce.com, WebEx, Taleo, SuccessFactors, NetSuite, Vocus, Constant Contact, and Workday, to name a few.
As a result, most basic corporate functions — sales, marketing, purchasing, hiring, benefits, accounting — have been automated. You can debate the effectiveness of that automation, but the resulting improvements in operational efficiency from CRM or ERP software, while real, are hardly transformative. It’s really only in retail (think Amazon) and banking (think E*Trade or PayPal) that software has transformed businesses.
Perhaps now, with the changing economics of computing, the continued innovations in communications technology, and the decreasing cost of sensors, we can move to a third generation of enterprise software. That new generation will tackle the challenges of precision agriculture, power, water, health care, and transportation, and fundamentally reshape businesses and our planet.
In order to organize the technology of IoT, let’s define a simple five-layer framework. The first layer is composed of Things. We’ll use the words “Things” and “machines” interchangeably.
In the second layer, Things are connected to the Internet in many different ways. The third layer consists of technologies designed to collect the data — increasingly time-series data sent every hour, minute, or second.
The fourth layer is about learning. Unlike IoP applications, which must entice people to type something, IoT applications can learn constantly from machine data in settings like hospitals, farms, and mines.
Finally, you should ask, what’s all this technology for? What are the business outcomes? This layer, the “do” layer, describes both the software application technologies and the business models affected — by companies that build Things, as well as by those that use them to deliver health care, transportation, or construction services.
Let’s take a closer look at each of the five layers.
Things: Enterprise Things, whether you’re talking about a gene sequencer, a locomotive, or a water chiller, are becoming smarter and more connected. If you’re going to build or buy next-generation machines, you’ll need to consider sensors, CPU architectures, operating systems, packaging, and security.
Sensors are beginning to follow Moore’s Law, becoming dramatically less expensive every year. They are increasingly being attached to low-cost computers, which can range from simple microcontrollers to fully featured CPUs supporting either the ARM or Intel instruction set architecture.
As you move to more powerful processors, more powerful software can be supported — but that software also presents a larger attack surface, and thus a point of vulnerability, in an increasingly hostile world.
Connect: Things can be connected to the Internet in a variety of ways. Connecting Things requires a diverse set of technologies based on the amount of data that needs to be transmitted, how far it needs to go, and how much power you have. Furthermore, there are many choices at a higher level around how to manage the connection and how it’s protected and secured.
Collect: Remember, Things aren’t people. The sheer volume of data generated by Things will be orders of magnitude larger than that of IoP applications. Data might be collected and stored using SQL, NoSQL, or traditional or next-generation time-series collection architectures.
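To make the “collect” layer concrete, here is a minimal, hypothetical sketch of what time-series collection looks like in code. The device and sensor names are invented for illustration, and a real deployment would use a purpose-built time-series database rather than an in-memory list — but the shape of the data is the same:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Reading:
    """One time-series data point from a Thing."""
    device_id: str    # which machine sent it, e.g. "turbine-7"
    sensor: str       # e.g. "vibration", "temperature"
    timestamp: float  # seconds since epoch
    value: float

class TimeSeriesStore:
    """Toy in-memory store, keyed by (device, sensor)."""
    def __init__(self):
        self._series = defaultdict(list)

    def append(self, reading: Reading) -> None:
        self._series[(reading.device_id, reading.sensor)].append(reading)

    def window(self, device_id: str, sensor: str, start: float, end: float):
        """Return readings with start <= timestamp < end for one series."""
        return [r for r in self._series[(device_id, sensor)]
                if start <= r.timestamp < end]

store = TimeSeriesStore()
store.append(Reading("turbine-7", "vibration", 1000.0, 0.42))
store.append(Reading("turbine-7", "vibration", 1001.0, 0.45))
print(len(store.window("turbine-7", "vibration", 1000.0, 1002.0)))  # 2
```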
Learn: With an increasing amount of data coming from Things, we’ll need to apply technology to learn from that data. Learning and analysis products will include query technology, and both supervised and unsupervised machine-learning technologies.
Most of the technology for learning from data streams has been applied to learning from data about people. As with all layers in the stack, there is much room for future innovation.
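As one small illustration of the “learn” layer, a simple unsupervised technique is to flag readings that deviate strongly from a sensor’s recent history. The sketch below uses a basic z-score test and invented vibration readings; production systems use far richer models, but the principle — learning what “normal” looks like from the data itself — is the same:

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean.

    A toy unsupervised detector: no labels needed, but it assumes
    roughly normal sensor noise and enough points to estimate spread.
    """
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Twenty steady vibration readings with one spike:
readings = [0.42] * 10 + [9.8] + [0.42] * 9
print(flag_anomalies(readings))  # [9.8]
```

Note that the spike itself inflates the estimated mean and spread, which is why enough “normal” points are needed before a single outlier stands out.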
Do: As with IoP applications, there will be both packaged applications (ERP, CRM) and middleware to build IoT applications.
Of course, in the end these applications — whether bought or built — will have to drive business outcomes. As machines become increasingly complex and enabled by software, many of the lessons learned in software maintenance and service will also apply to machine service.
As many in the software industry already know, the movement to delivering SaaS has revolutionized the industry. So, maybe we’ll also see “machines-as-a-service.”
Timothy Chou is a lecturer on cloud computing at Stanford University. He is a former president of Oracle on Demand. For more information about IoT and how it might reshape your business, check out his new book: “Precision: Principles, Practices and Solutions for the Internet of Things” and his online class Precision IoT: The Class.