
History and definition

The IoT as a concept wasn't officially named until 1999. One of the first examples of the IoT was a Coca-Cola machine located at Carnegie Mellon University (CMU) in the early 1980s. Local programmers would connect to the refrigerated appliance over the internet to check whether a drink was available, and whether it was cold, before making the trip to the machine.

Kevin Ashton, the Executive Director of Auto-ID Labs at MIT, was the first to describe the IoT in a presentation for Procter and Gamble. During his 1999 speech, Mr. Ashton stated as follows:

"Today, computers, and therefore the Internet, are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the Internet was first captured and created by human beings by typing, pressing a record button, taking a digital picture, or scanning a bar code. The problem is, people have limited time, attention, and accuracy, all of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best."

Kevin Ashton believed that radio-frequency identification (RFID) was a prerequisite for the IoT, and that if all devices were tagged, computers could manage, track, and inventory them.

In the first decade of the 21st century, several projects were developed to implement the IoT philosophy and Ashton's innovative approach in the real world. These first attempts, however, were not very successful. One of the most famous and emblematic cases was the Walmart mandate (2003). By placing RFID tags with embedded circuits and radio antennas on pallets, cases, and even individual packages, Walmart expected to reduce inefficiencies in its massive logistics operations and slash out-of-stock incidents, thus boosting same-store sales.

In 2003, Walmart started this pilot project to put RFID tags carrying electronic product codes on all pallets and cases from all of its suppliers. In 2009, Procter and Gamble, one of the main suppliers involved in the project, stated that it would exit the pilot after having assessed the benefits of RFID in merchandising and promotional displays.

The failure of the Walmart RFID project was due to several factors:

  • Most of the technologies used were in their initial stages of development and their performance was poor. Sensors provided little information, and Wi-Fi or LAN connectivity consumed a lot of power and bandwidth.
  • The sensors and connectivity devices were expensive due to the small market size.
  • There were no common standards for emerging technologies, and there was a lack of interoperability between legacy systems.
  • Business cases were not very accurate.
  • The technology infrastructure and architecture were organized into vertical silos built on legacy hardware and middleware, with little interaction between silos.
  • The technology infrastructure and software architecture were based on a client-server model that still belonged to the so-called second digital platform. 

From 2008 onward, several changes, driven mainly by the mobile market, were introduced to address the preceding issues. These included the following:

  • New, higher-performing processors, produced on a large scale at lower cost, which supported commercial and/or open operating systems.
  • New, much more advanced sensors with embedded computation capabilities, offering high performance at low cost.
  • New network and wireless connectivity options, which allowed devices to be interconnected with each other and with the internet while optimizing bandwidth, power consumption, latency, and range.
  • Sensors and devices built from commercial off-the-shelf (COTS) components.
  • The third, cloud-based digital platform.

Due to these changes, the IoT evolved into a system built on multiple technologies, including the internet, wireless communication, micro-electromechanical systems (MEMS), and embedded systems, with applications such as the automation of public buildings, homes, and factories, wireless sensor networks, GPS, control systems, and so on.

The IoT consists of any device with an on/off switch that is connected to the internet; if it has an on/off switch, it can, in theory, be part of the system. This includes almost anything you can think of, from cell phones, to building maintenance systems, to the jet engine of an airplane. Medical devices, such as a heart monitor implant or a bio-chip transponder in a farm animal, are also part of the IoT because they can transfer data over a network.

The IoT is a large digital network of things and devices connected through the internet. It can also be thought of as a horizontal technology stack linking the physical world to the digital world: by creating a digital twin of a physical object in the cloud, the IoT makes the object more intelligent through the interaction of its digital twin with the other digital twins living in the cloud.
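As a minimal sketch of the digital twin idea, the following hypothetical Python example mirrors a physical device's reported state in the cloud and tracks the desired state an application wants applied to it. The class, field, and method names are illustrative assumptions and do not belong to any specific IoT platform:

# Illustrative sketch only: a cloud-side mirror of a physical device's state.
# Names are hypothetical, not taken from any specific IoT platform or SDK.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class DigitalTwin:
    """Holds the last state reported by a device and the state we want it to reach."""
    device_id: str
    reported: dict = field(default_factory=dict)  # last telemetry received from the device
    desired: dict = field(default_factory=dict)   # settings the application wants applied
    last_seen: Optional[datetime] = None

    def apply_telemetry(self, telemetry: dict) -> None:
        """Update the twin from a telemetry message sent by the physical device."""
        self.reported.update(telemetry)
        self.last_seen = datetime.now(timezone.utc)

    def pending_changes(self) -> dict:
        """Return desired settings that the device has not yet confirmed."""
        return {key: value for key, value in self.desired.items()
                if self.reported.get(key) != value}


if __name__ == "__main__":
    twin = DigitalTwin(device_id="vending-machine-01")
    twin.desired["target_temperature_c"] = 4.0
    twin.apply_telemetry({"temperature_c": 7.5, "stock_level": 12})
    print(twin.pending_changes())  # {'target_temperature_c': 4.0}

In a real deployment, the telemetry would arrive over a protocol such as MQTT or HTTPS and the twin would live in a cloud service rather than a local object; the sketch only shows the idea of a cloud-side image kept in sync with the physical thing.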