The Next Big Thing
Posts about next generation technologies and their effect on business.

Displaying articles for: July 2014

New HP automotive industry e-zine

Lately I’ve been blogging quite a bit about the Internet of Things. Few industries are more permeated with IoT activities (both in production and in their products) than automotive. Periodically the HP Enterprise Services team focused on automotive creates an e-zine, and the new edition is out now (or will be shortly). Here is a brief video about the effort:

The previous edition of the e-zine is located here. If you are really interested, you can sign up for a subscription service so that new information is pushed to you directly.

IoT model update from the one I used 4 years back...

Back about four years ago, I used a model to think about machine-to-machine (M2M) communication from a holistic perspective. Today, this would be viewed more through an Internet of Things (IoT) lens. In talking with some others last week, it seemed that the simple progression from sensors all the way through action is still valid but may need to be expanded a bit.

[Figure: Internet of Things model]

It really starts with the ‘thing’ that has been tagged (and its sensors and controllers). There is also a supporting device-management layer that adds security, power management and other fleet-management features. I didn’t really show that the first time.

Data collection continues to have integration capabilities, but the analytics layer needs to add more context and pattern recognition than traditional analytics provides. There is an automation layer that rides on top and performs a number of the action-oriented features.

I didn’t really think about the management layer that is inherent in the approach, even though some functions may only be useful for a subset of the environment. A pluggable set of standards is needed to minimize the complexity.
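The layers described above can be sketched as a simple pipeline. This is a toy illustration only — every function name, field and threshold below is invented for the example, not part of any HP product:

```python
def sense(thing_id, temp_c):
    """'Thing' layer: a tagged device and its sensor emit a raw reading."""
    return {"thing": thing_id, "temp_c": temp_c}

def manage_device(reading):
    """Device-management layer: security, power and fleet metadata."""
    reading["authenticated"] = True
    return reading

def collect(readings):
    """Data-collection layer: integrate readings, dropping unauthenticated ones."""
    return [r for r in readings if r.get("authenticated")]

def analyze(readings, threshold=80.0):
    """Analytics layer: context and pattern recognition (here, a bare threshold)."""
    return [r for r in readings if r["temp_c"] > threshold]

def automate(anomalies):
    """Automation layer: turn analysis into action-oriented outputs."""
    return [f"throttle {a['thing']}" for a in anomalies]

raw = [manage_device(sense("pump-1", 87.0)), manage_device(sense("pump-2", 62.0))]
actions = automate(analyze(collect(raw)))  # -> ['throttle pump-1']
```

The point of the sketch is the separation of concerns: each layer can evolve (or be standardized) independently, which is exactly why a pluggable set of standards matters.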


The Internet of Things will require a significant degree of autonomous control. It can’t be as needy as the tools we’re using today – crying out for our attention all the time.


Is it time for a Chief Automation Officer?

Over the last few years, there has been quite a bit of discussion about the race against the machines (or the race with the machines), based on the abundance of computing available. When I think about the IoT and its implications for business, it may be that information is just a side effect of an entirely different corporate strategic effort.

Maybe going forward there is more need for a Chief Automation Officer than a Chief Information Officer: someone who looks at the business implications and opportunities of cognitive computing, sensing, robotics and other automation techniques.

Or is automation just assumed to be part of all future strategic planning activities? As I began thinking about it, it became clear that others have considered this CAO role as well, although mostly from an IT perspective rather than one based on business need. It could also be viewed as a role for the CTO or even the enterprise architect.

IoT standards war begins

I seem to have done quite a number of blog posts in the last month related to the Internet of Things. I just noticed that there have been numerous announcements about standards efforts, which may have spurred me on.

There are a number of them, but the three I’ve seen the most about are:

  • The AllSeen Alliance, which supports the open source project AllJoyn, providing “a universal software framework and core set of system services that enable interoperability among connected products and software applications across manufacturers to create dynamic proximal networks.”
  • The Open Interconnect Consortium, with “the goal of defining the connectivity requirements and ensuring interoperability of the billions of devices that will make up the emerging Internet of Things.”
  • And Google (not to be left out) has defined Thread, whose goal is “to create the very best way to connect and control products in the home.” Thread devices run over IEEE 802.15.4.

The IEEE has its own set of IoT standards efforts, but those haven’t been getting as much press as the recently announced efforts above.

It is clear that the IoT needs standards, but if the landscape becomes too fragmented, there will effectively be no standard at all.

Hopefully this will shake out soon, since standards will enable the services and software that actually provide value for the end consumer.

Where did the IoT come from?

I was talking with some folks about the Internet of Things the other day, and they showed me some analysis that made the IoT look like a relatively recent development.

[Figure: Where did the IoT come from?]

My view is that its foundations go back a long way. I worked on Supervisory Control and Data Acquisition (SCADA) systems back in the 80s, which were gathering data off the factory floor, analyzing it and performing predictive analytics, even way back then.


In the 70s, passive RFID came into being, and one of the first places it was used was tracking cows for the Department of Agriculture, to ensure they were given the right dosage of medicine and hormones – since cows couldn’t speak for themselves.

In the late 70s and early 80s, barcodes became widely used to identify objects, allowing greater tracking on manufacturing lines as well as of goods sold to consumers in stores.

In the 90s, higher speeds and greater range allowed toll tags to be placed on cars, making identification much easier, though there was still very little use of sensors to collect additional information.

At the turn of the century, the military and Walmart required the use of RFID to track products, which caused a significant increase in adoption. About the same time, low-powered sensing capabilities were developed. Since RFID only provided identification (with the scanner providing location), people began to look at other information that could be collected, like temperature and humidity, as well as ways to gather information remotely, like smart metering in the utilities space (although even that started much earlier).

Most technology adoption follows an S curve for investment and value generation. We’re just now entering the steep part of the curve, where the real business models and excitement are generated. The IoT is not really all that new; it is just that the capabilities have caught up with demand, and that is making us think about everything differently (and proactively).
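The S curve in question is the logistic curve. A quick sketch — with purely illustrative parameters — shows why the middle of the curve is where growth suddenly feels explosive:

```python
import math

def adoption(t, ceiling=1.0, k=1.0, midpoint=0.0):
    """Logistic S curve: slow start, steep middle, saturating finish."""
    return ceiling / (1.0 + math.exp(-k * (t - midpoint)))

# Period-over-period growth (finite difference) peaks at the curve's midpoint.
growth = {t: adoption(t + 0.5) - adoption(t - 0.5) for t in range(-5, 6)}
steepest_period = max(growth, key=growth.get)  # -> 0, the middle of the S
```

Early on, growth is real but barely visible; at the midpoint each period adds more value than any period before or after, which is where we appear to be with the IoT.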

The shifting world of business continuity

I was in an exchange this week with an individual about business continuity. The view was that business continuity now needs to focus on the following:

An organization’s business continuity approach needs to be reassessed in a world of high levels of automation, contracting for services and reduced latency. The very definitions of foundational terms like ‘work location’, ‘services’ and ‘support’ are changing. Diversity of perspective is likely to be a critical component of any kind of timely, situational response.

“The management of business continuity falls largely within the sphere of risk management, with some cross-over into related fields such as governance, information security and compliance. Risk is a core consideration since business continuity is primarily concerned with those business functions, operations, supplies, systems, relationships etc. that are critically important to achieve the organization's operational objectives. Business Impact Analysis is the generally accepted risk management term for the process of determining the relative importance or criticality of those elements, and in turn drives the priorities, planning, preparations and other business continuity management activities.”


In today’s environment, business impact analysis is becoming ever more technical, and the interconnections between environmental factors more complex. We have seen situations recently with program trading where an entire financial institution was placed at risk when its automated trading responded in an unforeseen fashion or its governance broke down. We’ll be seeing similar techniques applied throughout organizational processes.

The response to almost any situation can be enabled by techniques like VoIP and other approaches that allow additional levels of abstraction. Simulations can be used to understand the implications of various scenarios as part of business continuity planning.
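A toy Monte Carlo sketch of that kind of simulation. The model here is entirely invented for illustration: automated failover succeeds with some probability and costs 15 minutes, otherwise manual recovery takes 4–12 hours — the numbers are placeholders, not benchmarks:

```python
import random

def expected_downtime(trials=10_000, p_failover=0.9, seed=42):
    """Estimate expected downtime (hours) for a single incident,
    under the assumed (invented) two-outcome recovery model."""
    rng = random.Random(seed)
    total_hours = 0.0
    for _ in range(trials):
        if rng.random() < p_failover:
            total_hours += 0.25                    # failover worked: 15 minutes
        else:
            total_hours += rng.uniform(4.0, 12.0)  # manual recovery: 4-12 hours
    return total_hours / trials

hours = expected_downtime()  # analytically about 0.9*0.25 + 0.1*8 = 1.025
```

Even this crude model makes the planning trade-off visible: the expected downtime is dominated by the rare manual-recovery branch, so improving `p_failover` matters more than shaving minutes off the happy path.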


As I mentioned back in March:

Having an effective, robust approach to business continuity is part of management, security and many other roles within an organization. It is important to remember that there is a cost for being unable to respond to an incident.

Metrics usage in an agile approach

A couple of months ago, I did a post on the supply and demand issues of governance, including issues that cause organizations to be blindsided by events.

Lately, I’ve been thinking about this a bit more, but from the metrics side -- defining and collecting the leading and lagging indicators of change associated with governance. There is quite a bit of material on this concept, though most published definitions of leading indicators focus on economic indicators. The concepts for business processes are similar.

Leading indicators show progress; lagging indicators confirm completion (examples of this perspective made me dig up a post I did in 2009 about measuring cloud adoption). Most organizations’ processes only have lagging indicators – metrics that confirm we’ve hit milestones. This can allow efforts to get fairly far down a path before course corrections are made. More predictive approaches are possible, and needed, to adapt to this changing approach to business.
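The distinction is easy to make concrete. A hypothetical sprint (all task names and fields invented), with one metric of each kind:

```python
# Hypothetical sprint data, invented for illustration.
tasks = [
    {"name": "api",   "done": True,  "in_review": False},
    {"name": "ui",    "done": False, "in_review": True},
    {"name": "db",    "done": False, "in_review": True},
    {"name": "infra", "done": False, "in_review": False},
]

# Lagging indicator: confirms what has already completed.
completion_rate = sum(t["done"] for t in tasks) / len(tasks)       # 0.25

# Leading indicator: work in flight hints at where completion is heading.
review_rate = sum(t["in_review"] for t in tasks) / len(tasks)      # 0.5
```

Here the lagging number says the sprint is only a quarter done, while the leading number shows half the work is already in review — the completion rate is about to jump, which is exactly the kind of early signal governance needs.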


When I look at applying gamification, I usually come up with numerous leading indicators, since gamification is about influencing work that is in progress. When approaching change, look for items that show improvement or change, not just validation of achievement.

Will we need to update Moore’s law with the coming innovations...?

I was in an exchange with someone this week about technology advances and some of the exponential ‘laws’ of IT. One of those is Moore’s law of transistor density. It is definitely useful, but maybe transistor density no longer has the meaning or breadth of relevance it used to?

For storage, it can take many transistors to store a one or a zero (a typical SRAM cell uses six). But memristors, and some of the other technologies that will someday compete with today’s memory circuits, do not use transistors at all. Will we need to move to the density of ‘holding either a zero or a one’ instead?


Storage is just the start of the likely shift in computing circuits.

IoT and IT’s ability to foresee unintended consequences

I was recently talking with someone about an Internet of Things study that is coming out, and it really made me wonder. HP has been doing work in the IoT for decades and gets relatively little credit for the effort. In fact, my first job back in the 80s was writing statistical analysis tools for plant-floor (SCADA) data collection – essentially the high-value, big data space of its time, back when a 1 MIPS minicomputer was a high-dollar investment.

The issues we deal with today are a far cry from that era; now we’re as likely to analyze data in the field about wellhead performance or robotics, but many of the fundamentals remain the same. I’ve mentioned the issue of passive oversharing in the past, and addressing that issue needs to be at the foundation of today’s IoT efforts, along with value optimization.

I was in a discussion about vehicle-to-vehicle communication requirements a few months back, and the whole issue of privacy looms much larger than the first thoughts of preventing accidents. I think everyone would agree that having the affected vehicles put on the brakes is a good idea. Should the stop lights recognize bad behavior and visually send a signal to other drivers? There are a wide range of innovations possible with a network like this.
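A hypothetical sketch of that brake-warning fan-out — positions, the warning radius and the data layout are all invented for the example, not taken from any V2V specification:

```python
def vehicles_to_warn(braking, fleet, radius_m=100.0):
    """Fan a hard-braking event out to nearby vehicles only (1-D positions, metres)."""
    return [v["id"] for v in fleet
            if v["id"] != braking["id"]
            and abs(v["pos_m"] - braking["pos_m"]) <= radius_m]

fleet = [
    {"id": "car-a", "pos_m": 0.0},    # brakes hard
    {"id": "car-b", "pos_m": 40.0},   # close behind -> warned
    {"id": "car-c", "pos_m": 500.0},  # far away -> ignored
]
warned = vehicles_to_warn(fleet[0], fleet)  # -> ['car-b']
```

Even this trivial version surfaces the privacy question: to decide who is “nearby,” every vehicle has to share its position, and that position stream is exactly the data the bullets below worry about.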


There are also negative possibilities to be considered:

  • Is passing along this driver performance information to the police a good idea? What about to insurance companies?
  • What about just the fact that your car knows it is speeding – is that something others should know?
  • Or the information about where you’re driving, now that your car is sharing it with other cars and infrastructure (cell phones already do this, by the way)?
  • What if a driver can ‘socially engineer’ the limits of the system to maximize performance for themselves? For example, pushing the system so that yellow lights stay yellow a bit longer because you’re accelerating into the intersection – is that OK?

Some unintended consequences are going to happen. We should be able to see many of them coming if we think creatively. IT organizations will need to develop their implication-assessment skills, for social as well as business effects. The IT team should have a better comprehension of the analysis and data sharing that has happened elsewhere, and its implications, regardless of the business or industry, and be able to advise accordingly. They need to reach out early and often.

Context, automation and the future of services

There was recently a story about a computer program that passed the Turing Test. When you get into the details of what was actually done, I am not sure it really qualifies. The fact that people are talking about the event, though, is enough to show that we’re pretty far down the road toward breaking down the perceived barriers between machines and human interaction.

 
These advanced levels of interaction capability are enabled by a new wave of AI applications that can capture context at scale and in near real-time. When these solutions move out of the labs, they should be able to consume massive amounts of information and generate contextual understanding at a level that even the most intuitive individual would find difficult to match.

You might ask what this means for the future of services, or where it will be of use to your organization. It should be applicable at just about any point where a conversation occurs with a customer, or between:

  • employee and employee
  • organization and organization
  • government and citizen

We may be able to automate interactions that aren’t face-to-face first; those that are may still need to be person-to-person until we can overcome the uncanny valley.

These new context-aware, AI-enabled interactions can provide a multi-level view of engagements and ‘experience’, allowing organizations to filter through the noise and latency (for example, waiting for an agent with certain skills – Spanish language, say) and shift the focus to an enriching experience, relationships and achieving goals. I can easily see a future of talking with an AI agent at the drive-up window as a low-hanging opportunity.

The recent book The Second Machine Age examines how society, the economy and business will transform as digital technologies and smarter machines increasingly take over human occupations.

It makes you look for direction on whom robots will put out of work. This interactive graphic from Quartz takes a stab at answering that question – exploring which U.S. jobs are most likely to become automated, and how many workers could be affected.

About the Author(s)
  • Steve Simske is an HP Fellow and Director in the Printing and Content Delivery Lab in Hewlett-Packard Labs, and is the Director and Chief Technologist for the HP Labs Security Printing and Imaging program.
The opinions expressed above are the personal opinions of the authors, not of HP. By using this site, you accept the Terms of Use and Rules of Participation