Wednesday, April 1, 2015

Deploying the Internet of All Things

Blog Post 2 - Phil Collett
Responding to Trend #4 of McKinsey’s “Ten IT-enabled business trends for the decade ahead” (Bughin, Chui, and Manyika)
Arguably one of the first - if not the very first - documented 'things' to meet the definition of an IoT-connected device was a Coke machine in Doherty Hall at Carnegie Mellon University. In the early 1980s, members of the computer science department affixed light sensors inside the machine. Signals were sent over RS-232 to a nearby LSI-11 that served as a dedicated gateway, polling the Coke machine every second for its inventory status. What set this little extracurricular scheme apart - and arguably qualifies it as the first IoT device - is that the machine's status was linked into the ARPANET. Any ARPANET-connected node could telnet to the gateway and request the status of the Coke machine. There was even a one-line command that would tell the user which button to push to receive the coldest Coke available in the machine. [1]
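The logic behind that famous one-liner is simple enough to sketch. The following Python is a hypothetical reconstruction (all class and method names are invented, not taken from the CMU system): track when each column was last restocked with a warm bottle, and report the button whose front bottle has been chilling the longest.

```python
class CokeMachine:
    """Hypothetical sketch of the CMU Coke machine's status logic."""

    def __init__(self):
        self.loaded_at = {}  # column number -> timestamp (seconds) of last restock

    def restock(self, column, timestamp):
        """A warm bottle enters the front of a column; record when."""
        self.loaded_at[column] = timestamp

    def sell_out(self, column):
        """Light sensor reports the column empty; drop it from consideration."""
        self.loaded_at.pop(column, None)

    def coldest_button(self):
        """Button to push for the coldest Coke: the longest-chilled column."""
        if not self.loaded_at:
            return None  # machine is empty
        return min(self.loaded_at, key=self.loaded_at.get)
```

A gateway like the LSI-11 would update this state once per second from the sensor signals and answer telnet queries from it.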

In 1991, Mark Weiser, head of the Computer Science Laboratory at Xerox PARC in Palo Alto, coined the phrase "ubiquitous computing" to describe what we now call the Internet of Things. In his words, "specialized elements of hardware and software, connected by wires, radio waves and infrared, will be so ubiquitous that no one will notice their presence."

The possible applications for internet-connected smart devices are seemingly limitless. However, one category of IoT innovation is making significant ripples across the tech world: wearables and biometrics. How could smart earrings, GPS shoes, and implanted NFC chips be considered the Holy Grail of IoT technology? In the beginning, technology companies battled to get their products into universities and large firms. After that, they targeted small businesses and, finally, households. Now the name of the game is the individual. At the individual level, data collection is far more valuable, and product consumption at the individual level (as opposed to the household level) represents greater purchasing potential. For example, the single telephone line per household has transformed into a separate cell line for each individual in that household.

The technology behind wearable and biometric devices has made drastic strides in just the past year. For example, Intel, which took a beating from mobile processor makers such as Qualcomm and ARM, is loath to miss out on the small CPU/GPU market that will fuel the IoT tidal wave. Last year Intel created a new IoT division, which just announced a product called Curie: a button-sized system on a chip (SoC) that syncs via Bluetooth and runs its accelerometer, gyroscope, and other sensors on ultra-low power. [2]
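To see why an accelerometer plus a little on-chip compute is enough for a wearable, consider the simplest possible step counter. The sketch below is purely illustrative (it uses no real Curie SDK, and the function name and threshold are invented): flag a step whenever the acceleration magnitude spikes above a threshold, the kind of lightweight check a low-power SoC can run continuously between Bluetooth syncs.

```python
import math

def step_detected(samples, threshold=1.5):
    """Crude step detector for (x, y, z) accelerometer samples in g.

    At rest, gravity gives a magnitude near 1.0 g; a footfall briefly
    spikes it higher. Real pedometers filter and debounce far more.
    """
    return any(math.sqrt(x * x + y * y + z * z) > threshold
               for x, y, z in samples)
```

A real device would buffer these detections and sync the running count to a phone over Bluetooth in periodic bursts, which is where the ultra-low-power design pays off.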
Aside from smaller microchips and sensors, developments such as the University of Manchester's isolation of graphene have the potential to dissolve the physical limitations of existing technology. This flexible yet resilient conductor can be incorporated into clothing, liquids, food packaging, medical devices, and more. [3]
In Gartner's 2013 IoT forecast, the cost of IoT enablement for a wide array of devices is predicted to drop below $1 USD by 2021. With such a low cost hurdle, IoT implementation will not be reserved for high-end products from name brands.

