The Next Big Thing
Posts about next generation technologies and their effect on business.

Displaying articles for: December 2009

4th wave of computing perspectives for 2010

I was asked about my perspective on the future of business computing for an upcoming HP article. In preparation for the interview, I pulled together a "stream of consciousness" set of notes based on the questions they were going to ask me. We eventually talked about a number of other areas as well, but I thought I'd pass along my notes.

 

1. What are the four waves of IT and what is the significance of each in terms of innovation and productivity?

 

The four waves perspective is based on analysis by Forrester covering how IT technology adoption has taken place. The first wave of IT came with the mainframe. Before that point, when you talked about a computer, you meant a person who calculated for a living. The mainframe replaced all that and established an enterprise view of information.

 

The second wave came with the arrival of the personal computer. It allowed individuals to have their own view of the corporate information and work with the data at a personal level.

 

The third wave was about networked computing and allowed a shared view between and across organizations. This included the dot com era.

 

Based on previous trends, this last wave is coming to an end and we're entering a new wave that consists of computers in things instead of "as things" -- with vastly more information about the context of the enterprise being gathered in real time. It's also about analyzing this information and moving from a latency-rich "sense and respond" approach, where you wait for things to happen, to a much more proactive "cause and effect" approach: you understand the underlying causes and address the resulting effects. It's an approach where you "skate to where the puck is going to be," as Wayne Gretzky says. This will significantly change the relationship between IT and the business as a whole, and the perspective of how business value is generated.

2. What is the inflection point we have entered, in terms of the coming next wave? How will the Information Technology paradigm of the past decade be disrupted?

 

The move to the next wave has been delayed by the current economic downturn. What this means is that, when funds start flowing again, they will flow onto a quite different playing field than the IT of the last decade. Technical advances were already in motion and have continued through the downturn, even though organizations have not yet had the freedom to implement them. Organizations that have kept tabs and are prepared for this new approach will have a significant advantage over those that have hunkered down in survival mode during the crisis.

3. What new technologies will be driving this wave?
There are a few key business and technology threads that intersect in a new way for the next wave. First, with all the sensor techniques and new data gathering system, we're getting closer to the age of abundance of information - where we worry much less about if the information is available and much more about when and how to use it.

 

At the same time, with multi-core processors and cloud computing techniques, we're moving closer to the age of abundance of computing, where we can take all that information, model and simulate it, and perform pattern recognition to separate the normal from the unusual -- and have people focus on turning the anomalies into opportunities, rather than messing around in "normal," where they'll just add variation.
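A minimal sketch of that "separate the normal from the unusual" step, using a simple standard-deviation outlier test (my own illustration -- the post doesn't prescribe any particular algorithm, and the numbers are made up):

```python
import statistics

def find_anomalies(readings, threshold=2.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    if stdev == 0:
        return []
    return [x for x in readings if abs(x - mean) > threshold * stdev]

# A sensor stream that is mostly "normal" with one anomaly worth a human's attention.
stream = [10.1, 9.8, 10.0, 10.2, 9.9, 42.0, 10.1, 9.7]
print(find_anomalies(stream))  # → [42.0]
```

The point is the division of labor: the machines chew through the "normal" in bulk, and people only see the anomalies that might be opportunities.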

 

On top of this we have the whole issue of presenting this information in a fashion in which it can be consumed and driven to action. That means personalizing the information flow to the role, availability and responsibility of the individual, instead of treating everyone the same. This means taking advantage of our understanding of the intersection of personal context with enterprise context -- essentially creating a whole new type of enterprise.

 


4. How must network systems and technologies evolve to cope with vast increases in data?
My personal view is that there are certain IT activities that were designed from their inception to be highly parallel in nature. They'll require streams of information (ideally in real time) to function. These will be the approaches that consume significantly more of the cloud and multi-core computing resources than traditional computing techniques. I'm not saying that ERP systems and other traditional software will go away; I'm just saying those techniques are at a significant disadvantage when it comes to using the new computing resources. We'll see them change to take greater advantage of more parallel techniques, or be replaced by whole new software approaches based on modeling, simulation and pattern recognition -- similar to the way green screens (and their applications) are now a rare sight in most organizations. As new techniques become available, organizations will adapt their approaches and develop new expectations.

 


5. Will standalone IT departments survive the fourth wave?
With any trend there are leaders and laggards. I believe that a wholly standalone IT organization will be at a significant disadvantage, since they will be fighting it out on a traditional learning curve, where other organizations using these new techniques will base their work on a learning curve that is advancing much more steeply. IT organizations need to start now to understand the opportunities for them. Start small, but still do something meaningful - otherwise no one will pay attention.

 


6. How should employees be transitioned into this wave?

 

It depends on whether you mean individual employees who work in or with IT, or IT departments as a whole. Individuals should familiarize themselves with the concepts of what is going on in their area. For Microsoft-oriented personnel (since that's where I focus many of my efforts), the new technologies are: Azure, Silverlight, SharePoint and SQL StreamInsight. For IT organizations, the task is to develop a strategy and enterprise architecture that looks for waste in the current environment by assessing the current application portfolio from a value-per-watt and value-per-effort perspective. Since most organizations have the vast majority of their funds locked up in fixed costs built up over years of success, you must first address the fixed costs in order to have investment funds. Addressing those fixed areas should free up funds to focus on strategic activities like:

 


  • data gathering and complex event processing

  • enterprise workflow and process automation

  • virtualization and cloud computing applications

  • attention engineering (ensuring the right information is delivered to the right person, in the right place, in the right format, at the right time, driving the right result) -- maximizing the outcome while minimizing the disruption

  • sustainability -- addressing the efficiency of the enterprise as a whole as well as the IT organization

Electronic trading and the business by wire

Many times when I am talking about "Business by Wire" and how the new computing capabilities will be used to define new levels of business value generation, I use electronic trading as an example. In the upcoming MIT Technology Review there is an article titled Trading Shares in Milliseconds that is worth reading. These techniques provide agility and advantage to organizations, and they have also created some of the largest computing environments in the world. As cloud techniques become more common, this kind of power will enable these techniques for more businesses.

Predictions for 2010



For the last 5 years I've created a blog entry of my predictions for the coming year. Last year I was pretty conservative because of the economic downturn. 2010 will be a bit brighter, so I'll be a bit more adventurous as well.


2010 is also the start of a new decade -- a decade that will be even more tumultuous from an IT perspective than the last one (and that included the end of the .com bubble, a significant financial downturn and numerous political struggles and events). I say that because we are on the cusp of a new wave of technology -- the last decade was the end of the previous wave. The move forward has been held back by financial issues. Once the money starts to flow again, it will flow into a whole different kind of IT -- one based more on computers and information being everywhere. There will be new lessons to be learned and value to be generated.


Areas coming into their own in 2010:


Augmented reality - The merger between reality and computer displays is becoming more prevalent and transparent. There were a few good entries on this topic in 2009, and I expect it to increase radically in the mobile computing space. This is one significant way of overcoming information overload issues with the massive amount of data being collected.


Social computing will continue its mobility path. Although the static desktop will still access it, the vast majority of social computing will be via the mobile platform. It will also continue to use the location information of the mobile device in totally new ways, supplementing our experience, and will likely utilize augmented reality techniques as well.


Another type of augmented reality is through the use of 3D techniques & standards. Now that 3D movies are becoming commonplace and 3D TV is possible, the use of 3D visualization for all kinds of applications will become apparent in 2010. Our minds are set up to take in 3D information so we'll find numerous ways to take advantage of it in business as well. 3D visualization of other emerging technologies (like Nano) will become common as well.


Besides just how we look at data, the way we manipulate it will change in 2010 as well. The venerable SQL database will begin to succumb to increased pressure from highly parallel techniques like MapReduce and other NoSQL approaches. New SQL approaches will need to be developed as well. We are going to hear much more about them in 2010.
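To show what makes this style so parallel-friendly, here is the canonical MapReduce word count in miniature -- a sketch of the programming model itself, not of any particular NoSQL product:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in one document.
    Each document can be mapped on a different machine, independently."""
    return [(word, 1) for word in document.lower().split()]

def reduce_phase(pairs):
    """Reduce: group the pairs by word and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the cloud scales out", "the database scales up"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
print(reduce_phase(pairs))
```

Because the map step has no shared state, adding machines adds throughput -- exactly the property the traditional single-node SQL engine lacks.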


There will be a significant focus on data de-duplication -- a way for organizations to overcome some of the costs related to the age of abundance of data. The amount of information created by mankind doubles every 18 months. For organizations that are deploying sensors and gathering unstructured data in a big way, it can be twice that fast. Much of business data is redundant, like e-mail attachments and backup sets. Organizations will be inundated with hardware and software solutions to address the exponential growth of data storage in 2010, with a whole new set of products and services.
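The core idea behind content-based de-duplication is simple enough to sketch: fingerprint each piece of content and store identical content only once. A toy illustration (my own, not any vendor's implementation):

```python
import hashlib

def dedupe(blobs):
    """Store each unique blob once, keyed by its SHA-256 fingerprint."""
    store = {}
    for blob in blobs:
        digest = hashlib.sha256(blob).hexdigest()
        store.setdefault(digest, blob)  # identical content collapses to one copy
    return store

# Two identical e-mail attachments and one distinct file.
attachments = [b"quarterly report", b"quarterly report", b"meeting notes"]
store = dedupe(attachments)
print(len(attachments), "items,", len(store), "unique")  # → 3 items, 2 unique
```

Real products apply this at the block or sub-file level, but the economics are the same: redundant copies cost only a fingerprint lookup, not more storage.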


In 2009 Cloud Computing was at the Peak of Inflated Expectations in Gartner's Hype Cycle for Emerging Technologies. We've started to hear some early stumbles as the market understands how the current implementations of cloud work. 2010 will likely see a continued dip and the start of further refinements to move onto the Slope of Enlightenment. Most organizations will develop a deeper understanding of cloud techniques, and we should see some significant capabilities to take advantage of its parallel nature in the business analytics space. 2010 will also see more commercial application of Web 3.0 techniques (which will use cloud techniques). The cloud space will continue its bifurcation into public cloud for SMB and private cloud for enterprise clients, until some killer application unifies the two. I am still holding out for a new dimension of business analytics to do that, but I doubt we'll see it in 2010.


IT's focus on overall enterprise efficiency and sustainability will come back when spending increases. The market's expectations in this space will continue and increase in 2010. The focus will move from Green IT to "IT for Green".


In the security space, organizations are going to turn their security perspective inside out in 2010. In response to the deluge of ever more intelligent malware, old security models that use signatures for detection are running out of steam. Whitelisting approaches -- which take a snapshot of a known-good system to create whitelisting rules, monitor systems for unapproved applications, and even prevent unapproved programs from running -- will become more common on servers and other security-sensitive systems. Security will continue to be a huge issue.
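The "inside out" flip is from default-allow (block what a signature recognizes as bad) to default-deny (run only what matches the known-good snapshot). A toy sketch of that check -- the binary names here are made up for illustration:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Hash a binary so we can compare it against the known-good snapshot."""
    return hashlib.sha256(data).hexdigest()

# Snapshot a known-good system: fingerprint every approved binary once.
approved = {fingerprint(b"backup-agent v2.1"), fingerprint(b"payroll-service v7")}

def may_run(binary: bytes) -> bool:
    """Default deny: allow execution only if the binary is in the snapshot."""
    return fingerprint(binary) in approved

print(may_run(b"payroll-service v7"))  # → True
print(may_run(b"shiny-new-malware"))   # → False: unknown code is blocked by default
```

Note the contrast with signatures: nothing here needs to recognize the malware, because anything not explicitly approved never runs.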


The infrastructure space should see the widespread use of solid-state storage, in addition to traditional storage. USB 3.0 will come into its own, and we'll see quite different applications of data transfer and storage because of its significantly higher speeds.


Robots will also be applied to a wider range of tasks in 2010 -- like medical and automotive applications (moving beyond parking and braking) -- although we will still not reach the "robot in every home" target by 2010. IT organizations that are focused on more back office tasks should begin to understand the possible overlap with their roles.


As I finish up this post I realize that there are way too many items for an IT organization to address them all. On the other hand I do think they should watch them all and use their investment dollars wisely to try out the ones that will have the most impact. They should also work on weeding out the IT deadwood, so that more funds are available to accelerate these and other high value techniques and maximize the value of their enterprise.


 

Nanotechnology solution may replace LEDs

An IEEE article states: "The next big thing in solid-state lighting may be exceedingly tiny -- the quantum dot. Researchers from around the world gathered at the Materials Research Society fall meeting in Boston last week to discuss the progress they're making in using quantum dots to enhance the color and efficiency of light-emitting diodes (LEDs)."


If they can be effectively manufactured, these devices are more efficient and can portray color more accurately than existing techniques, so they'll likely be deployed in computer displays first.


When I talk with others about nanotech, they always ask: "Where will it be applied?" Devices like the quantum dot that have clear advantages over more traditional approaches will be one place. Sensing will be another, since nanotech solutions can be designed to look for individual compounds. These sensing capabilities will contribute to the age of abundance of data. There is also the dream of nanotech-based manufacturing to be fulfilled.


 

Data in the age of coopetition

One of the facts of business life today for many sectors is that you're forced to work with your competitor on one deal and compete against them in another -- coopetition. IT organizations need to have a plan in this space to support their corporation's business model.


The same situation is true in politics, at both the local and global level. In these situations, the use of information can be key to allowing the relationship to exist. Being able to share what's important for one situation, and retract that access once the situation or relationship has ended, allows the data owner to have greater confidence that they're being protected.


This article from some folks at HP is one view on addressing this situation - although it is more focused on the retail consumer information space. The consumer space is ripe for legal protection, requiring the use of privacy enhancing technologies.


Digital rights management is the implementation of data control technologies we're most familiar with as consumers (closely coupled with the Digital Millennium Copyright Act).

About the Author(s)
  • Steve Simske is an HP Fellow and Director in the Printing and Content Delivery Lab in Hewlett-Packard Labs, and is the Director and Chief Technologist for the HP Labs Security Printing and Imaging program.