The Next Big Thing
Posts about next generation technologies and their effect on business.

An update to the IoT model I used 4 years back...

Back about four years ago, I used a model to think about machine-to-machine (M2M) communication from a holistic perspective. Today, this would be viewed more through an Internet of Things (IoT) lens. In talking with some others last week, it seemed that the simple progression from sensors all the way through action is still valid but may need to be expanded a bit.

[Figure: Internet of Things model]


It really starts with the ‘thing’ that has been tagged (along with its sensors and controllers). There is also a supporting device management layer that adds security, power management and other fleet management features. I didn’t really show that the first time.


Data collection continues to have integration capabilities, but the analytics layer needs to add more context and pattern recognition than traditional analytics provides. There is an automation layer that rides on top and performs a number of the action-oriented features.


I didn’t really think about the management layer that is inherent in the approach, even though some functions may only be useful for a subset of the environment. A pluggable set of standards is needed to minimize the complexity.


The Internet of Things will require a significant degree of autonomous control. It can’t be as needy as the tools we’re using today – crying out for our attention all the time.
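To make the layering a bit more concrete, here is a minimal sketch of how a reading might flow from a tagged thing, through device management, into contextual analytics and an automation layer that acts on its own. Everything here (the class names, the thresholds, the "pump-7" device) is hypothetical illustration on my part, not a reference to any particular product.

```python
from dataclasses import dataclass, field
from statistics import mean
from typing import Callable

# Hypothetical illustration of the layers: thing -> device management
# -> data collection -> contextual analytics -> automation.

@dataclass
class Reading:
    thing_id: str   # the tagged 'thing' the sensor belongs to
    metric: str     # e.g. "temperature"
    value: float

@dataclass
class DeviceManager:
    """Fleet-level concerns: security, power and which devices are enrolled."""
    enrolled: set = field(default_factory=set)

    def accept(self, reading: Reading) -> bool:
        return reading.thing_id in self.enrolled  # drop readings from unknown devices

@dataclass
class Analytics:
    """Adds context and pattern recognition beyond simple aggregation."""
    history: dict = field(default_factory=dict)

    def unusual(self, reading: Reading) -> bool:
        past = self.history.setdefault((reading.thing_id, reading.metric), [])
        past.append(reading.value)
        # Flag values that drift well away from this thing's own baseline.
        return len(past) > 3 and abs(reading.value - mean(past)) > 5.0

def automate(reading: Reading, act: Callable[[str], None]) -> None:
    # The automation layer acts on its own instead of asking for attention.
    act(f"adjust {reading.metric} controller on {reading.thing_id}")

# Wiring the layers together for a short stream of readings.
devices = DeviceManager(enrolled={"pump-7"})
analytics = Analytics()
for value in (20.1, 20.3, 19.8, 20.0, 27.5):
    r = Reading("pump-7", "temperature", value)
    if devices.accept(r) and analytics.unusual(r):
        automate(r, print)  # only the out-of-pattern reading triggers an action
```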


Where did the IoT come from?

I was talking with some folks about the Internet of Things the other day and they showed me some analysis that made it look like it was relatively recent.




My view is that its foundations go back a long way. I worked on Supervisory Control and Data Acquisition (SCADA) systems back in the 80s, which were gathering data off the factory floor, analyzing it and performing predictive analytics, even way back then.


In the 70s, passive RFID came into being, and one of the first places it was used was tracking cows for the Department of Agriculture to ensure they were given the right dosage of medicine and hormones – since cows couldn’t talk for themselves.


In the late 70s and early 80s, barcodes became widely used to identify objects, allowing greater tracking of items on manufacturing lines as well as of purchases in stores.


In the 90s, higher speeds and greater range allowed toll tags to be placed on cars, making identification much easier but still making very little use of sensors to collect additional information.


At the turn of the century, the military and Walmart required the use of RFID to track products, and that caused a significant increase in its adoption. About the same time, low-powered sensing capabilities were developed. Since RFID only provided identification and the scanner provided location, people began to look at other information that could be collected, like temperature and humidity, as well as ways to gather information remotely, like smart metering in the utilities space (although even that started much earlier).


Most technology adoption follows an S curve for investment and value generation. We’re just now entering the steep part of the S curve, where the real business models and excitement are generated. It is not really all that new; it is just that the capabilities have caught up with demand, and that is making us think about everything differently (and proactively).
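For anyone who wants to picture that curve, a plain logistic function captures the shape; the midpoint and steepness below are made-up values purely for illustration, not fitted to any real market data.

```python
import math

def adoption(t: float, midpoint: float = 10.0, steepness: float = 0.6) -> float:
    """Logistic S curve: slow start, steep middle, saturating finish."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

# Value accumulates fastest near the midpoint of the curve.
for year in range(0, 21, 4):
    print(year, round(adoption(year), 2))
```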

Is the Internet-of-Things really on the brink of enabling a major shift in business value?

I was talking with some folks the other day about the implications of technology shifts and what they mean to business. Some shifts like Cloud and Big Data advance how we do many of the things we’ve been doing for years. Others like the Internet-of-Things (IoT) enable whole new approaches. I think the impact is being underestimated – probably because these shifts are not as technically sexy.


One of my favorite examples of IoT is the SmartSpud. This sensor pack allows a potato producer to look at the process from the potato’s point of view, very quickly reducing bruising and other issues that cause waste. We’ve just not been able to get this perspective in the past. I grew up on a farm, so the whole issue of organic processes and their optimization is always of interest.


I believe almost every industry has opportunities to use IoT in new ways. This report from the Economist states that this is “an idea whose time has finally come.” In their survey, only 40% of the respondents saw the impact limited to certain markets or industries, and 38% believe that the IoT will have a major impact in most markets and industries. Yet 96% of all respondents expect their business to be using the IoT in some respect within 3 years. When I think about this, it is a matter of people just coming to grips with what can be done to maximize the value of whatever is scarce to the organization.


There are some things holding back the pressure to deploy IoT, chief among them the need for common infrastructure and services that enable secure, fairly reliable transport and analysis of information. All the parts exist; they just need to be bundled so they can be consumed effectively. It is crying out for an innovator’s dilemma approach that is just good enough for what is needed now to get things rolling. The people who want to use these capabilities don’t want to have to understand them deeply or create them from scratch – they just want to buy them and use them. Until we reach that stage, we’ll only have great examples (in isolation) and not real impact.
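To illustrate what “just buy it and use it” could feel like, the sketch below hands a sensor reading off to an already-bundled service with a single call. The endpoint, credential and payload shape are hypothetical placeholders, not a description of any real offering.

```python
import json
import urllib.request

def publish_reading(device_id: str, metric: str, value: float) -> None:
    """Send one sensor reading to a hosted IoT service.
    The URL, credential and payload shape are hypothetical placeholders."""
    payload = json.dumps({"device": device_id, "metric": metric, "value": value})
    req = urllib.request.Request(
        "https://iot.example.com/v1/readings",        # hypothetical bundled service
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer <token>"},   # placeholder credential
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:          # transport, security and
        resp.read()                                    # analysis happen service-side

# publish_reading("soil-probe-12", "humidity", 41.7)   # uncomment against a real endpoint
```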

Update on skin-based sensing

Back in 2011, I created a post about Sensors in Tattoos – taking wearables to a whole other level of monitoring.


Today I saw an update from the University of Illinois about a prototype for an Ultrathin “Diagnostic Skin” that allows for continuous patient monitoring.  This device sticks to the surface of the skin and can survive most normal stretching and pinching motions.


“The technology offers the potential for a wide range of diagnostic and therapeutic capabilities with little patient discomfort. For example, sensors can be incorporated that detect different metabolites of interest. Similarly, the heaters can be used to deliver heat therapy to specific body regions; actuators can be added that deliver an electrical stimulus or even a specific drug.”


The current investigation is tracking skin temperature, which can be used to detect the onset of an illness…
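As a rough illustration of how a continuous skin-temperature stream could feed that kind of early warning (my own simplification, not the Illinois team’s actual method), a monitor might compare each new reading against the wearer’s recent baseline and flag a sustained rise. The window size and threshold below are made up for the example and are not clinically validated.

```python
from collections import deque
from statistics import mean

def onset_alerts(temps_c, window: int = 12, threshold: float = 0.8):
    """Yield readings that sit well above the wearer's rolling baseline."""
    baseline = deque(maxlen=window)
    for t in temps_c:
        if len(baseline) == baseline.maxlen and t - mean(baseline) > threshold:
            yield t                      # sustained rise over the personal baseline
        baseline.append(t)

# A stable baseline followed by a steady climb in skin temperature.
readings = [36.5, 36.6, 36.4] * 4 + [37.1, 37.4, 37.6, 37.8]
print(list(onset_alerts(readings)))      # flags the later, elevated readings
```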


Although not attached directly to the skin, this technology from Rice University attempts to detect malaria by sensing even low levels of infection.


The new diagnostic technology uses a low-powered laser to create tiny vapor “nanobubbles” inside malaria-infected cells. The bursting bubbles have a unique acoustic signature that allows for an extremely sensitive diagnosis. The test requires no dyes or diagnostic chemicals, and there is no need to draw blood.


“A preclinical study published in the Proceedings of the National Academy of Sciences shows that Rice’s technology detected even a single malaria-infected cell among a million normal cells with zero false-positive readings.”


Hopefully this research will go more than skin deep and be deployed.

Earth Insights - Big Data in the Wild

Today at HP Discover, HP announced an innovative collaboration with Conservation International (CI) — a leading non-governmental organization dedicated to protecting nature for people — to dramatically improve the accuracy and speed of analysis of data collection in environmental science.


The initiative, called HP Earth Insights, delivers near-real-time analytics, yielding new information that indicates a decline in a significant percentage of the species monitored. The project serves as an early warning system for conservation efforts, enabling proactive responses to environmental threats.


HP Earth Insights applies HP’s big data technology to the ecological research being conducted by Conservation International and its partners across 16 tropical forests around the world, as part of the Tropical Ecology Assessment and Monitoring (TEAM) Network. Tropical forests are home to approximately 30 million species—half of all the plants and animals on earth—and generate 40 percent of the earth’s oxygen.


According to the United Nations Environment Programme, tropical forests are vanishing at a rate of about 18,000 square miles (4.6 million hectares) per year. Data and analysis from HP Earth Insights will be shared with protected area managers to develop policies regarding hunting and other causes of species loss in these ecosystems.


For this project, HP is providing a customized solution that harnesses the power of its big data offerings to address the challenges faced by CI scientists:

  • The HP Vertica Analytics Platform—a next-generation software tool designed to manage and analyze massive and fast-growing volumes of data with amazing speed and unprecedented accuracy—will be used to address the needs of CI scientists to collect, store and assess a variety of data.
  • The Wildlife Picture Index (WPI) Analytics System—a dashboard and analytics tool built by HP Enterprise Services—allows user-friendly, data-driven insights to be visualized and accessed anytime, anywhere, in near real time (a simplified sketch of this kind of species roll-up appears below).
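As a much-simplified sketch of the kind of species roll-up mentioned above (not the actual Wildlife Picture Index calculation, and with made-up records rather than real TEAM data), per-species detection counts from camera traps could be grouped by year to spot declines:

```python
from collections import defaultdict

# Made-up camera-trap records: (species, year, site). A stand-in for the
# real TEAM data, and far simpler than the actual Wildlife Picture Index.
detections = [
    ("ocelot", 2011, "site-A"), ("ocelot", 2011, "site-B"),
    ("ocelot", 2012, "site-A"),
    ("agouti", 2011, "site-A"), ("agouti", 2012, "site-A"),
    ("agouti", 2012, "site-B"), ("agouti", 2012, "site-C"),
]

def detection_trends(records):
    """Count detections per species per year and report the crude change."""
    counts = defaultdict(lambda: defaultdict(int))
    for species, year, _site in records:
        counts[species][year] += 1
    trends = {}
    for species, by_year in counts.items():
        first, last = min(by_year), max(by_year)
        trends[species] = by_year[last] - by_year[first]
    return trends

print(detection_trends(detections))
# {'ocelot': -1, 'agouti': 2}: ocelot detections falling, agouti rising
```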


Currently, HP Earth Insights manages 3 terabytes of critical biodiversity information, including more than 1.4 million photos and more than 3 million climate measurements, and that store continues to grow every day. The project analyzes the ever-increasing inputs related to species, vegetation, precipitation, temperature, carbon stocks, humidity and more, gathered from over 1,000 camera traps and many climate sensors in 16 countries, to deliver findings about the environment that were previously unknown.


By delivering analytics nine times faster than previously available and generating usable trend information on species and the impacts of climate, people and land use, HP Earth Insights is helping to protect these important resources.


This new solution also serves to demonstrate HP’s end-to-end capabilities in action, from HP ElitePads that meet the scientists’ mobility requirements for capturing data in the tropical forests, to HP ProLiant servers powering back-end data systems, to the build-out of the existing cloud component to meet the project’s growing data needs.


This real-life example shows how big data techniques can take on big problems and provide new value and greater understanding, far beyond the office.
