The Next Big Thing
Posts about next generation technologies and their effect on business.

4th wave of computing perspectives for 2010

I was asked about my perspective on the future of business computing for an upcoming HP article. In preparation for the interview, I pulled together a "stream of consciousness" set of notes based on the questions they were going to ask me. We eventually talked about a number of other areas as well, but I thought I'd pass the notes along.

 

1. What are the four waves of IT and what is the significance of each in terms of innovation and productivity?

 

The four waves perspective is based on some analysis by Forrester covering how IT technology adoption has taken place. The first wave of IT was when the mainframe came along. Before that point, when you talked about a computer, you meant a person who calculated for a living. The mainframe replaced all that and created an enterprise view of information.

 

The second wave came with the arrival of the personal computer. It allowed individuals to have their own view of the corporate information and work with the data at a personal level.

 

The third wave was about networked computing and allowed a shared view between and across organizations. This included the dot com era.

 

Based on previous trends, this last wave is coming to an end and we're entering a new wave that consists of computers in things instead of computers as things, with massively more information about the context of the enterprise being gathered in real time. It's also about analyzing this information and moving from a latency-rich "sense and respond" approach, where you wait for things to happen, to a much more proactive "cause and effect" approach, where you understand the underlying causes and address the resulting effect. It's an approach where you "skate to where the puck is going to be," as Wayne Gretzky says. This will significantly change the relationship between IT and the business as a whole, and the perspective of how business value is generated.

2. What is the inflection point we have entered, in terms of the coming next wave? How will the Information Technology paradigm of the past decade be disrupted?

 

The move to the next wave has been delayed by the current economic downturn. What this means is that, when funds start flowing again, they will flow onto a quite different playing field from the IT of the last decade. Technical advances were already in motion and have continued through the downturn, even though organizations have not had the freedom to implement them yet. Organizations that have kept tabs on these advances and are prepared for the new approach will have a significant advantage over those who have hunkered down in survival mode during the crisis.

3. What new technologies will be driving this wave?
There are a few key business and technology threads that intersect in a new way for the next wave. First, with all the sensor technologies and new data gathering systems, we're getting closer to the age of abundance of information, where we worry much less about whether the information is available and much more about when and how to use it.

 

At the same time, with multi-core processors and cloud computing techniques, we're moving closer to the age of abundance of computing, where we can take all that information and model, simulate and perform pattern recognition, separating the normal from the unusual so that people can focus on turning the anomalies into opportunities rather than messing around in "normal," where they'll just add variation.
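
To make that concrete, here's a minimal sketch (mine, not any particular product's approach) of what "separating the normal from the unusual" can look like in code: a toy anomaly detector over hypothetical sensor readings, flagging values that sit far from the rest so a person only has to look at the exceptions.

```python
import statistics

def find_anomalies(readings, threshold=2.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []
    return [(i, value) for i, value in enumerate(readings)
            if abs(value - mean) / stdev > threshold]

# Hypothetical sensor stream: mostly "normal", one reading worth a human's attention
sensor_readings = [20.1, 19.8, 20.3, 20.0, 19.9, 35.7, 20.2, 20.1]
for index, value in find_anomalies(sensor_readings):
    print(f"Anomaly at position {index}: {value}")
```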

 

On top of this we have the whole issue of presenting this information in a fashion that can be consumed and driven to action. That means personalizing the information flow to the role, availability and responsibility of the individual, instead of treating everyone the same. It also means taking advantage of our understanding of the intersection of personal context with enterprise context, essentially creating a whole new type of enterprise.
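
As a rough illustration of what personalizing the information flow might mean in practice, here is a small Python sketch of role- and availability-aware routing. The roles, channels and rules are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Recipient:
    name: str
    role: str
    available: bool
    channel: str  # e.g. "mobile", "desktop", "email"

def route_alert(alert_topic, severity, recipients):
    """Pick who should see an alert, and where, based on role and availability.

    A toy illustration of personalized information flow: the same event is
    delivered differently depending on role, availability and channel.
    """
    for person in recipients:
        if person.role != alert_topic:
            continue  # not this person's responsibility
        if severity == "high" or person.available:
            yield person.name, person.channel

recipients = [
    Recipient("Ana", role="operations", available=True, channel="mobile"),
    Recipient("Raj", role="operations", available=False, channel="email"),
    Recipient("Mei", role="finance", available=True, channel="desktop"),
]

for name, channel in route_alert("operations", "high", recipients):
    print(f"Notify {name} via {channel}")
```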

 


4. How must network systems and technologies evolve to cope with vast increases in data?
My personal view is that there are certain IT activities that were designed from their inception to be highly parallel in nature. They'll require streams of information (ideally in real time) to function. These will be the approaches that consume significantly more of the cloud and multi-core computing resources than traditional computing techniques. I'm not saying that ERP systems and other traditional software will go away; I'm just saying those techniques are at a significant disadvantage when it comes to using the new computing resources. We'll see them change to take greater advantage of more parallel techniques or be replaced by whole new software approaches based on modeling, simulation and pattern recognition, similar to the way green screens (and their applications) are now a rare sight in most organizations. As new techniques become available, organizational approaches will change and new expectations will develop.
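
A tiny Python sketch of the kind of workload I mean: when each piece of work is independent, the same code fans out across however many cores (or cloud instances) happen to be available, which is exactly the property most traditional, serial applications lack. The scoring function here is just a placeholder.

```python
from multiprocessing import Pool

def score_record(record):
    """Placeholder for per-record analysis (pattern matching, simulation, etc.)."""
    return sum(x * x for x in record)

if __name__ == "__main__":
    # Hypothetical batch of independent records arriving from a data stream
    records = [[i, i + 1, i + 2] for i in range(100_000)]

    # Because each record is scored independently, the same code scales
    # across however many cores the machine (or cloud instance) offers.
    with Pool() as pool:
        scores = pool.map(score_record, records, chunksize=1_000)

    print(f"Scored {len(scores)} records; max score = {max(scores)}")
```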

 


5. Will standalone IT departments survive the fourth wave?
With any trend there are leaders and laggards. I believe that a wholly standalone IT organization will be at a significant disadvantage, since it will be fighting it out on a traditional learning curve, while organizations using these new techniques will base their work on a learning curve that advances much more steeply. IT organizations need to start now to understand the opportunities for them. Start small, but still do something meaningful - otherwise no one will pay attention.

 


6. How should employees be transitioned into this wave?

 

It depends on whether you mean individual employees who work in or with IT, or IT organizations as a whole. Individuals should familiarize themselves with the concepts of what is going on in their area. For Microsoft-oriented personnel (since that's where I focus many of my efforts), the new technologies are Azure, Silverlight, SharePoint and SQL StreamInsight. For IT organizations, it means developing a strategy and enterprise architecture that looks for waste in the current environment by assessing the current application portfolio from a value-per-watt and value-per-effort perspective (a rough scoring sketch follows the list below). Since most organizations have the vast majority of their funds locked up in fixed costs built up over a number of years of success, you must first address those fixed costs in order to have investment funds. Doing so should free up funds to focus on strategic activities like:

 


  • Data gathering and complex event processing

  • Enterprise workflow and process automation

  • Virtualization and cloud computing applications

  • Attention engineering (ensuring the right information is delivered to the right person in the right place, in the right format, at the right time, driving the right result), maximizing the outcome while minimizing the disruption.

  • Sustainability - addressing the efficiency of the enterprise as a whole as well as the IT organization.
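
For the portfolio assessment mentioned above, a back-of-the-envelope calculation is often enough to start the conversation. The sketch below (with invented application names and numbers) ranks applications by value per watt and value per effort hour; the laggards on both measures are the first candidates for freeing up fixed costs.

```python
# Toy portfolio assessment: rank applications by business value per watt and
# per unit of support effort. Field names and numbers are invented for
# illustration; a real assessment would pull these from asset and cost data.
applications = [
    {"name": "legacy-erp",    "annual_value": 900_000, "watts": 12_000, "effort_hours": 4_000},
    {"name": "order-portal",  "annual_value": 600_000, "watts": 2_500,  "effort_hours": 900},
    {"name": "report-server", "annual_value": 150_000, "watts": 3_000,  "effort_hours": 1_200},
]

for app in applications:
    app["value_per_watt"] = app["annual_value"] / app["watts"]
    app["value_per_effort_hour"] = app["annual_value"] / app["effort_hours"]

# Apps near the bottom of both rankings are candidates for consolidation or
# retirement, freeing fixed costs for the strategic activities listed above.
for app in sorted(applications, key=lambda a: a["value_per_watt"]):
    print(f'{app["name"]:>14}: {app["value_per_watt"]:.1f} $/W, '
          f'{app["value_per_effort_hour"]:.1f} $/hour')
```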

Predictions for 2010



For the last 5 years I've created a blog entry of my predictions for the coming year. Last year I was pretty conservative because of the economic downturn. 2010 will be a bit brighter, so I'll be a bit more adventurous as well.


2010 is also the start of a new decade, a decade that will be even more tumultuous from an IT perspective than the last one (and that included the end of the .com bubble, a significant financial downturn and numerous political struggles and events). I say that because we are on the cusp of a new wave of technology; the last decade was the end of the previous wave. The move forward has been pent up by financial issues. Once the money starts to flow again, it will flow into a whole different kind of IT, one that is based more on computers and information being everywhere. There will be new lessons to be learned and new value to be generated.


Areas coming into their own in 2010:


Augmented reality - The merger between reality and computer displays is becoming more prevalent and transparent. There were a few good entries on this topic in 2009, and I expect it to increase radically in the mobile computing space. This is one significant way of overcoming information overload issues with the massive amount of data being collected.


Social computing will continue its mobility path. Although the static desktop will still access it, the vast majority of social computing will be via the mobile platform. It will also continue to use the location information of the mobile device in totally new ways, supplementing our experience, and will likely utilize augmented reality techniques as well.


Another type of augmented reality comes through the use of 3D techniques and standards. Now that 3D movies are becoming commonplace and 3D TV is possible, the use of 3D visualization for all kinds of applications will become apparent in 2010. Our minds are set up to take in 3D information, so we'll find numerous ways to take advantage of it in business. 3D visualization of other emerging technologies (like Nano) will become common as well.


Besides just how we look at data, the way we manipulate it will change in 2010 as well. The venerable SQL database will begin to succumb to increased pressure from highly parallel techniques like MapReduce and other NoSQL approaches. New SQL approaches will need to be developed, and we are going to hear much more about them in 2010.
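
For readers who haven't looked at MapReduce, the essence fits in a few lines of Python: a map step that works on each piece of data independently (and can therefore run on many cores or machines at once) and a reduce step that merges the partial results. This is a toy illustration, not any particular NoSQL engine.

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_phase(document):
    """Map: turn one document into partial word counts."""
    return Counter(document.lower().split())

def reduce_phase(left, right):
    """Reduce: merge partial counts from independent mappers."""
    return left + right

if __name__ == "__main__":
    documents = [
        "the cloud is parallel",
        "parallel techniques favor the cloud",
        "sql is not the only model",
    ]
    with Pool() as pool:
        partial_counts = pool.map(map_phase, documents)
    totals = reduce(reduce_phase, partial_counts, Counter())
    print(totals.most_common(3))
```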


There will be a significant focus on data de-duplication, a way for organizations to overcome some of the cost related to the age of abundance of data. The amount of information created by mankind doubles every 18 months; for organizations that are deploying sensors and gathering unstructured data in a big way, it can grow twice that fast. Much of business data is redundant, like e-mail attachments and backup sets. In 2010, organizations will be inundated with hardware and software solutions, and a whole new set of products and services, to address the exponential growth of data storage.
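
The core idea behind de-duplication is simple enough to sketch: store each block or attachment under a hash of its content, so identical copies collapse to a single stored instance. The example below is a toy, in-memory version with made-up data.

```python
import hashlib

def deduplicate(blobs):
    """Keep one copy of each unique blob, keyed by its SHA-256 content hash.

    A toy version of content-addressed de-duplication: identical attachments
    or backup blocks hash to the same key and are stored only once.
    """
    store = {}
    for blob in blobs:
        digest = hashlib.sha256(blob).hexdigest()
        store.setdefault(digest, blob)
    return store

# Hypothetical data: the same attachment appears in three messages
attachment = b"quarterly-report-draft"
blobs = [attachment, b"meeting notes", attachment, attachment]
unique = deduplicate(blobs)
print(f"{len(blobs)} blobs in, {len(unique)} stored "
      f"({1 - len(unique) / len(blobs):.0%} saved)")
```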


In 2009, cloud computing was at the Peak of Inflated Expectations in Gartner's Hype Cycle for Emerging Technologies. We've started to hear about some early stumbles as the market comes to understand how current cloud implementations work. 2010 will likely see a continued dip and the start of further refinements to move onto the Slope of Enlightenment. Most organizations will develop a deeper understanding of cloud techniques, and we should see some significant capabilities that take advantage of the cloud's parallel nature in the business analytics space. 2010 will also see more commercial application of Web 3.0 techniques (which will use cloud techniques). The cloud space will continue its bifurcation into public cloud for SMBs and private cloud for enterprise clients, until some killer application unifies the two. I am still holding out for a new dimension of business analytics to do that, but I doubt we'll see it in 2010.


IT's focus on overall enterprise efficiency and sustainability will come back when spending increases. The market's expectations in this space will continue and increase in 2010. The focus will move from Green IT to "IT for Green".


In the security space, organizations are going to turn their security perspective inside out in 2010. In response to the deluge of ever more intelligent malware, old security models that use signatures for detection are running out of steam. Whitelisting software for servers and other security-sensitive systems, which takes a snapshot of a known good system to create whitelisting rules, monitors systems for unapproved applications and can even prevent unapproved programs from running, will become more common. Security will continue to be a huge issue.
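
Conceptually, whitelisting of this kind boils down to hashing a known-good image and then reporting (or blocking) anything that doesn't match. Here is a stripped-down Python sketch of that snapshot-and-check cycle; the paths are hypothetical, and a real product would hook into the operating system rather than scan a directory.

```python
import hashlib
from pathlib import Path

def snapshot(directory):
    """Record the hash of every file in a known-good system image."""
    return {
        path.name: hashlib.sha256(path.read_bytes()).hexdigest()
        for path in Path(directory).glob("*") if path.is_file()
    }

def check(directory, whitelist):
    """Report files that are not on the whitelist or whose contents changed."""
    for path in Path(directory).glob("*"):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if whitelist.get(path.name) != digest:
            yield path.name

# Usage (paths are hypothetical):
#   known_good = snapshot("/opt/server/bin")      # taken from a trusted build
#   for name in check("/opt/server/bin", known_good):
#       print(f"Unapproved or modified: {name}")
```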


The infrastructure space should see the widespread use of solid state storage in addition to traditional storage. USB 3.0 will come into its own, and we'll see quite different applications of data transfer and storage because of its significantly higher speeds.


Robots will also be applied to a wider range of tasks in 2010, like medical and automotive applications (moving beyond parking and braking), although we will still not reach the "robot in every home by 2010" target. IT organizations that are focused on more back office tasks should begin to understand the possible overlap with their roles.


As I finish up this post I realize that there are way too many items for an IT organization to address them all. On the other hand I do think they should watch them all and use their investment dollars wisely to try out the ones that will have the most impact. They should also work on weeding out the IT deadwood, so that more funds are available to accelerate these and other high value techniques and maximize the value of their enterprise.


 

Multi-core, where will it end?

A number of times I've mentioned the rise of parallel processing and multi-core as well as the effect on development and the kinds of solutions that are consumed. I was talking with some folks at TI the other day about the projections of multi-core processors on the market and the attached chart was discussed.


If you place this information on a logarithmic scale it definitely appears to be exponential in nature, and we all know examples of the shift in thinking that exponential growth causes - essentially a shift from a view of scarcity to one of abundance. In addition to the core growth there is also the growth in the number of threads a processor can run simultaneously to pump more computing through the cores.
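
To see why a straight line on a log scale matters, a short calculation is enough. The doubling period below is an assumption for illustration, not a vendor roadmap.

```python
# If core counts doubled every two years (an assumption for illustration),
# a 4-core part in 2010 would extrapolate like this:
start_year, start_cores, doubling_period_years = 2010, 4, 2

for year in range(start_year, start_year + 11, 2):
    cores = start_cores * 2 ** ((year - start_year) / doubling_period_years)
    print(f"{year}: ~{int(cores)} cores")

# On a log scale this is a straight line, which is why the chart looks
# exponential; threads per core multiply the effective parallelism further.
```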


Windows 7 supports up to 256 processors, and Linux supports a large number as well, but it is the applications that actually add value, not the operating system. What can be written to take advantage of these advances may start to shift what actually gets written.

Cloud to Dampen New Languages?

There seems to be a bountiful crop of new programming languages springing up.


As cloud computing moves beyond IaaS to PaaS, I wonder about the effect of cloud standardization on these new languages. Granted, only an infinitesimal percentage of executable code runs in a cloud environment today, but as the cloud begins to take off, will the rigor of cloud environments make it difficult for these new approaches to flourish there? Could whole new cloud-oriented approaches, made up of higher-level, domain-specific and process modeling languages, arrive instead?


Some of the languages that are coming out of Microsoft (e.g., Axum, F#) can take advantage of the .NET framework and should be able to run in Azure. Languages that can run within the Java Virtual Machine should be able to take advantage of any PaaS solution that works with Java (e.g., Hadoop on top of an IaaS solution). There is also a great deal of cloud activity around Python at Google, Amazon, and other places.


Hopefully the folks working on D and other languages are thinking about having a cloud approach that is further up the stack than IaaS. Addressing the cloud definitely takes a new approach, one that more easily handles the complexity, horizontal scaling, security concerns and so on that come with it. I have no doubt that new languages will be required to take advantage of these environments while maintaining quality and productivity.


The various new client-oriented languages like Leopard and Kodu don't have to worry about running in the cloud, just tapping into it, so I'd expect a great many of these "little" languages to continue showing up.

Axum - New Parallel Programming Language from Microsoft

I mentioned last week that I am trying to develop a better understanding of parallel coding issues and approaches.


Microsoft has a new language for parallel programming in the works named Axum, formerly known as "Maestro". There is a Channel 9 video on the MS site about the effort from last year. Axum is a Domain Specific Modeling research project that Microsoft is creating to help programmers tackle the issue of parallel programming in the .NET environment. It looks like it is even available for download.


Phillips, a program manager on Microsoft's Parallel Computing Platform team, describes the language as:


"Axum is an incubation project from Microsoft's Parallel Computing Platform that aims to validate a safe and productive parallel programming model for the .NET framework. It's a language that builds upon the architecture of the Web and the principles of isolation, agents and message-passing to increase application safety, responsiveness, scalability and developer productivity. Other advanced concepts we are exploring are data flow networks, asynchronous methods and type annotations for taming side-effects."


He also stated:


"We're not talking about objects as a primary concept anymore; it's object-aware rather than object-oriented. In fact, you can't even define objects in Axum. It's special-purpose, so we don't intend for Axum to be the general-purpose language that C# is. You're going to define objects and types in another language like Visual Basic or C# and then you can use Axum to coordinate and get safe concurrency out of it."


The reason specialized language research is important to business is that programming parallel solutions is very complex and error prone. It is not something mere mortals can do effectively with today's tools. New tools and approaches will be required, and this is an example of a start. It will be interesting to see how modeling techniques will be integrated with this approach, since Microsoft has been working in the domain specific modeling space for a very long time.
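
Axum itself isn't shown here, but the isolation-plus-message-passing idea it builds on can be sketched in any language. Below is a rough Python analogy (my own, not Axum syntax): an "agent" runs in its own process, shares no state, and interacts with the outside world only through message queues.

```python
from multiprocessing import Process, Queue

def squarer(inbox, outbox):
    """An isolated 'agent': it shares no state and communicates only by messages."""
    while True:
        message = inbox.get()
        if message is None:          # sentinel: shut down
            outbox.put(None)
            break
        outbox.put(message * message)

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    agent = Process(target=squarer, args=(inbox, outbox))
    agent.start()

    for n in range(5):
        inbox.put(n)                 # send work as messages
    inbox.put(None)

    while (result := outbox.get()) is not None:
        print(result)
    agent.join()
```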
