The Next Big Thing
Posts about next generation technologies and their effect on business.

Machine Intelligence, business applications and retooling

 

One of the areas that has seen a significant renewal of public interest recently is the application of AI techniques, both in our personal lives and within the enterprise.

 

For those interested in learning more, there are courses on Coursera and edX that cover the foundations of AI, but I have yet to find one that goes into real-world applications. It seems there could easily be some industry-specific coursework defined. Have you seen any that are useful?

 

I was in a discussion just this morning with someone and asked them about the intersection between user interface design and automation. In many cases humans are scarce and computing is abundant, so a human-centered design may actually be self-constraining. This will shift the kinds of designs we will accept.

 

The Machine Intelligence Research Institute recently put out A Guide to MIRI’s Research, which I’ve found to be an interesting resource for those thinking about the application of AI techniques and possible unintended consequences.

 

It states: “AI theory currently isn’t about implementation, it’s about figuring out how to ask the right questions.” That aligns with a post I put out a few weeks back about the goal of cognitive computing.

 

Although the Guide seems to be targeted at people looking to work in AI, there are some areas useful for those interested in learning more about the foundations of the topic. MIRI is fairly focused on AI safety – essentially avoiding a Skynet scenario.

 

Whenever I read material in this space I always think back on the Heinlein book: The Moon is a Harsh Mistress.

 

A display's value is in the eye of the user

 

There was a story a few weeks back that caught my eye, but I didn’t have time to blog about it: 3D printing contact lenses with built-in video. The concept of having sensors and displays directly on the eye is not new, but this is the first time I’ve seen discussion of them being 3D printed.

 

This particular effort is funded by the US Air Force and could be used to display information or to sense the state of the wearer's retina and possibly “monitor pilot health without invasive implants.”

 

I can easily see these high-impact, high-cost applications becoming more available over time and being integrated into roles where timely access to information can make a big difference. There will need to be some significant work on user interface design, since an on-eye display will always be in the way of the user’s vision.

 

The sensing application would be useful for those situations where immediate action could be the difference between life and death (for example, diabetes intervention). I have a hard time imagining its use for everyday service interactions, but I could easily be mistaken. It does make me wonder about the possibilities when integrated with cognitive computing capabilities.

 

Automating programming in a self-aware enterprise

 

There was an interesting article in NewScientist about a new approach to providing computing capabilities: computers with human-like learning that can program themselves.

 

Earlier this year, when The Machine was announced at HP Discover, this scenario was one of the first things that came to mind, since memristors can be used to provide neuron-like behavior. When you have universal memory, whole new possibilities open up. When I saw the NewScientist article, it made me think about a number of applications in the enterprise, since these techniques will be as far beyond today’s cognitive computing as today’s approach is from the mainframe.
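To make the neuron-like behavior concrete, here is a minimal toy model (not HP's actual design, and the class and parameter names are my own illustration): a memristor can act like a synapse whose conductance drifts with the charge that has passed through it, which is what lets the hardware "learn in place."

```python
class MemristiveSynapse:
    """Toy model of a memristor acting as an artificial synapse.

    The device's conductance (its "weight") shifts with the charge
    that flows through it, so simply using the device also trains it.
    """

    def __init__(self, g_min=0.01, g_max=1.0, g=0.5, rate=0.1):
        self.g_min, self.g_max = g_min, g_max   # physical conductance bounds
        self.g = g                              # current conductance
        self.rate = rate                        # drift per unit of charge

    def transmit(self, voltage):
        """Pass a voltage pulse; return the current, and adapt the device."""
        current = self.g * voltage
        # Charge flowing through shifts conductance (a Hebbian-like drift),
        # clamped to the device's physical limits.
        self.g = min(self.g_max, max(self.g_min, self.g + self.rate * current))
        return current
```

Repeated pulses in one direction strengthen the connection until it saturates at `g_max`, which is the rough analogue of a synapse being reinforced by use.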

 

“Always bet on the machine” is a post from 2008 contemplating the future of development. What I probably meant was: those who learn to work with the machine will still have a career.

 

I’ve mentioned before that much of today’s management function is ripe for automation. With approaches like this, an enterprise autopilot is conceivable that could optimize a business’s response to normal business situations. The question probably has more to do with ‘when’ than ‘if’.
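At its simplest, such an autopilot is a monitor-decide-act feedback loop over business metrics. The sketch below is purely illustrative (the metric names and actions are hypothetical, not any real product's API):

```python
def autopilot_step(metrics, targets, actions):
    """One pass of a toy enterprise autopilot.

    metrics: observed values, e.g. {"queue_depth": 120}
    targets: acceptable upper bounds, e.g. {"queue_depth": 100}
    actions: maps a metric name to a pre-approved corrective callable

    Returns the names of the metrics acted on, so a human can audit
    what the autopilot did and why.
    """
    taken = []
    for name, value in metrics.items():
        if name in targets and value > targets[name]:
            actions[name](value)   # metric out of band: apply the fix
            taken.append(name)
    return taken
```

The design choice worth noting is the audit trail: an enterprise would likely want every automated intervention logged and reviewable, which is why the function returns what it did rather than acting silently.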

 

IoT standards war begins

I seem to have done quite a number of blog posts in the last month related to the Internet of Things. I just noticed that there have been numerous announcements about standards efforts, which may have spurred me on.

 

There are a number of them, but the three I’ve seen the most about are:

  • AllSeen Alliance that supports the open source project AllJoyn that provides “a universal software framework and core set of system services that enable interoperability among connected products and software applications across manufacturers to create dynamic proximal networks.”
  • The Open Interconnect Consortium with “the goal of defining the connectivity requirements and ensuring interoperability of the billions of devices that will make up the emerging Internet of Things.”
  • And Google (not to be left out) has defined Thread, whose goal is “to create the very best way to connect and control products in the home.” Thread devices all run over IEEE 802.15.4.

The IEEE has its own set of IoT standards efforts, but those haven’t been getting as much press as the recently announced ones above.

 

It is clear that IoT needs standards, but if the effort is too fragmented, there will effectively be no standard at all.

 

Hopefully this will shake out soon, since standards will help enable the services and software that actually provide value for the end consumer.

 

Context, automation and the future of services

There was recently a story about a computer program that passed the Turing test. When you get into the details of what was actually done, I am not sure it really qualifies. The fact that people are talking about the event, though, is enough to show that we’re pretty far down the road toward breaking down the perceived barriers between machines and human interaction.

 

These advanced levels of interaction are enabled by a new wave of AI applications that can capture context at scale and in near real time. These solutions, when they move out of the labs, should be able to consume massive amounts of information and generate contextual understanding at a level that even the most intuitive individual would find difficult to match.

 

You might ask what this means for the future of services, or where it will be of use to your organization. It should be applicable at just about any point where a conversation occurs with a customer, or between:

  • employee and employee
  • organization and organization
  • government and citizen

We may first be able to automate interactions that aren’t face-to-face; even then, some may need to remain person-to-person until we can overcome the uncanny valley.

 

These new context-aware, AI-enabled interactions can provide a multi-level view of engagements and ‘experience’, allowing organizations to filter out the noise and latency (for example, waiting for someone with particular skills, such as Spanish) and shift the focus to an enriching experience, relationships, and achieving goals. I can easily see a future of talking with an AI agent at the drive-up window as a low-hanging opportunity.
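The latency-filtering idea can be sketched as a simple routing decision. This is a hypothetical illustration (the names, the confidence threshold, and the skill labels are all my own assumptions, not a real contact-center API): an AI agent takes the conversation when it is confident, and only otherwise does the request wait on a human with the needed skill.

```python
def route(request, ai_confidence, humans, threshold=0.8):
    """Decide who handles a customer interaction.

    request: the conversation's needs, e.g. {"skill": "spanish"}
    ai_confidence: the model's self-reported confidence for this request
    humans: available staff by skill, e.g. {"spanish": ["ana"]}

    Returns ("ai", None), ("human", name), or ("queue", skill).
    """
    if ai_confidence >= threshold:
        return ("ai", None)          # AI handles it; no waiting at all
    available = humans.get(request["skill"], [])
    if available:
        return ("human", available[0])
    # Nobody with the skill is free: this is exactly the latency
    # (waiting for a Spanish speaker, say) that the AI path removes.
    return ("queue", request["skill"])
```

The interesting business shift is that as `ai_confidence` improves, the queue branch fires less and less, and the human role moves toward the hard, relationship-heavy cases.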

 

The recent book The Second Machine Age examines how society, the economy, and business will transform as digital technologies and smarter machines increasingly take over human occupations.

 

It makes you wonder: which jobs will robots put out of work? This interactive graphic from Quartz takes a stab at answering that question, exploring which U.S. jobs are most likely to be automated and how many workers could be affected.

About the Author(s)
  • Steve Simske is an HP Fellow and Director in the Printing and Content Delivery Lab in Hewlett-Packard Labs, and is the Director and Chief Technologist for the HP Labs Security Printing and Imaging program.