I was in a discussion today with some folks who are part of the Service Futures SIG of ISSIP. We were talking about the technology trends that will shape our approach to addressing business problems in our organizations. It seems we may be at a pivot point in our perspective.
Every once in a while I get into a conversation with someone who thinks about sensors and says something like “Yes, I can see some uses for sensors but not in my business”, so I have to give some examples...
I am old enough to remember when the first PC landed on my desk, as well as my first laptop and smartphone; each is now an assumed part of work. It takes more than new technology to differentiate an organization.
I sat in on an IEEE Computing meeting in Dallas last week with the title: What’s Next for VoIP? As I sat through the session, I was beginning to wonder about the limited scope and its relevance going forward.
Voice is nice. I’ve mentioned before that Unified Communications is important, but with all the effort focused on video (e.g., conferencing with Skype and efforts like Google Glass), as well as other techniques to provide a wider range of interaction (e.g., Making a Huggable Internet), where will our bandwidth consumption be in a few years? Haptics and other dimensions of sensory interaction, like digital taste and scent, are advancing quickly. I wonder if these techniques will ever significantly affect business computing.
Since the more senses we use to make decisions, the better and faster those decisions may be, it does make me wonder if we’ll soon put a hand on a device (or even our computer) to feel the vibration of a machine in the field. That is essentially what a microphone and a speaker accomplish; they are just aimed at transferring the vibration to our ears. Similar techniques could apply to using our nose to smell the way a chemical process is cooking. Our bodies are made to do that kind of thing. Why limit ourselves to bar charts and other gauges for our eyes to consume? Of course, we’ll need the bandwidth to carry all that information.
Not that long ago, we thought that the entire Internet’s bandwidth would be taken up by spam. In November of 2012, one third of North American Internet bandwidth consumption was taken up by just one application: Netflix. It is strange how quickly our concerns can change.
I mentioned that I was giving a presentation this week at the New Horizons Forum at the AIAA conference. Since it may provide some useful insight about the research underway at HP Labs in a larger context, here is the content of one slide from that presentation:
1 datum is a point
2 data are a line
3 data are a trend
100 data are a picture
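The slide’s progression can be sketched in code. The example below is a toy illustration of my own (the signal and sample points are hypothetical, not from the presentation): two readings from a sensed process always fit a line perfectly, so they can’t reveal whether the line is wrong; only a large number of samples exposes the real shape of the process.

```python
# Toy illustration: "2 data are a line, 100 data are a picture".
# The underlying process is a hypothetical quadratic signal standing
# in for something a sensor might measure.

def signal(x):
    return x * x  # the true (hidden) behavior of the process

# Two data points define a line exactly -- no disagreement is possible.
x1, x2 = 1.0, 3.0
slope = (signal(x2) - signal(x1)) / (x2 - x1)
intercept = signal(x1) - slope * x1

# The line matches the process perfectly at both sampled points...
assert abs((slope * x1 + intercept) - signal(x1)) < 1e-9
assert abs((slope * x2 + intercept) - signal(x2)) < 1e-9

# ...but 100 samples paint the fuller picture, showing how far the
# two-point "trend" drifts from reality across the whole range.
samples = [i * 0.05 for i in range(100)]
errors = [abs((slope * x + intercept) - signal(x)) for x in samples]
print("worst-case error of the two-point line:", max(errors))
```

The point is not the arithmetic but the shift in perspective: sparse measurements force us to extrapolate, while dense sensing lets the data itself reveal structure.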
Having sensors to generate the data that fuels a more proactive business is important, but there is more to sensing than the sensors and the data collected. A holistic ecosystem view is needed. Unfortunately, this means that the tools of today may not be up to the tasks required.
You may have heard about HP’s efforts to place a million-node sensor network in the ground for Shell, gathering seismic information. Traditionally, this kind of information was just a flash of perspective taken in the dark from a few locations. Instead, the sensing effort with Shell generated a much more fine-grained view, taken from a myriad of angles, to understand in depth what lies underground.
To implement the system, HP not only had to invent the sensors (relatively cheap yet very sensitive MEMS devices), but also had to create the networking and management techniques to make them useful. Building upon what we’ve learned, we’ve been researching whole new approaches to information storage and computation that will be required to generate value from massive amounts of information.
HP holds many of the foundational patents on memristor devices and sensing techniques, and we should soon see the shift in storage and computing that implementing these techniques should enable. The whole concept of computing will likely need to bow to the onslaught of information from sensing and the related metadata, changing how information is transferred within the computing environment: shifting from computing on bits to analyzing information in graphs on highly parallelized computing platforms such as Cog Ex Machina.
In addition, research is underway to understand how information can be analyzed, automated and displayed. New techniques can be applied to focus attention on the areas needing the creativity that people can provide.
In the marketplace, last year was the year of Big Data as a buzzword, with the primary focus on generating insight from the massive amounts of information being collected. Frankly, that will not be enough for the future envisioned: we need to shift the focus from insight to time-to-action, and that is what many of our research efforts underway will enable.