New research conducted on behalf of HP reveals that one out of every two business executives feels their organization is suffering from innovation gridlock. See “HP Research: Breaking the IT Innovation Gridlock,” Coleman Parkes Research Ltd., April 2010.
- Organizations feel blocked from driving new business innovation because the majority of their IT funding is consumed in operating and maintaining the current environment.
- In addition, almost 60 percent of business and technology executives feel that this gridlock is preventing their organizations from keeping up with the competition.
The inability to respond to such a widely recognized problem can be traced to a number of issues.
- Primary among these is economics. With limited budgets, IT organizations are fighting to keep the wheels on, and this is where the majority of resources are spent. Spending on new projects is roughly one-third of the IT budget; two-thirds is devoted to ongoing operations and maintenance. (See Forrester's "A Workable Application Modernization Framework Is Job No. 1 Now," Phil Murphy, April 26, 2010.)
- A major secondary issue is understanding the application portfolio. What do we have, how does it support our critical business processes, and how much of it is redundant or grossly inefficient? If we decide to eliminate or replace a particular system, what are the ripple effects? Insight into these relationships can be hard to come by and difficult to reason about.
- We have all had experience with failed IT projects, and the numbers are daunting. Various studies have shown remarkable rates of failure or impairment, with reported success rates as low as 16%. A more moderate report, "IT Myth 5: Most IT projects fail" (August 13, 2004), states that in a study of 13,522 projects in 2003, 34% were an unqualified success, 15% failed, and the rest were "challenged," which includes cost overruns, time overruns, and incomplete functionality. If we have a working system, even if it seems overly expensive to run or lacking in agility, we have an instinctive understanding that the risk of replacement can be sizable.
There are answers for all of these issues.
Improving the distribution of resources requires establishing a virtuous cycle: identify initial projects with relatively rapid ROI, execute them, and then use the savings to fund further work.
Capturing a model of the portfolio is a crucial element of any long-term improvement. As I have noted before, shutting systems off is one rapid road to savings. But without a reasonable model of business/application dependencies, the potential for unexpected consequences often leads to paralysis. HP can help you create these models and roadmaps.
One way to reduce the risk of failure is to take on projects that are similar to ones we have previously completed successfully. This is where the combined experience of our modernization specialists and your application subject matter experts is so important. Having team members with both broad modernization experience across a wide variety of systems and deep application knowledge significantly reduces the risk of selecting an improper modernization approach or failing to execute the modernization journey.
By 2019, a $1,000 personal computer will have as much raw power as the human brain. I figure at that price I'll get two; two brains are better than one, and by then I'll be 63. I may need four. This prediction and others like it are right around the corner, according to Ray Kurzweil, a noted futurist. I love to read Kurzweil; his predictions are both exciting and terrifying. Think you're getting a good deal in 2019? Look at what happens in 2045:
$1000 buys a computer a billion times more intelligent than every human combined. This means that average and even low-end computers are vastly smarter than even highly intelligent, unenhanced humans.
Now we're getting personal. Unenhanced humans? Legacy people? I could be legacy in 2045? At the young age of 89?
Kurzweil goes on to imagine a “technological singularity”:
The technological singularity occurs as artificial intelligences surpass human beings as the smartest and most capable life forms on the Earth. Technological development is taken over by the machines, who can think, act and communicate so quickly that normal humans cannot even comprehend what is going on. The machines enter into a "runaway reaction" of self-improvement cycles, with each new generation of A.I.s appearing faster and faster. From this point onwards, technological advancement is explosive, under the control of the machines, and thus cannot be accurately predicted.
The Singularity is an extremely disruptive, world-altering event that forever changes the course of human history. The extermination of humanity by violent machines is unlikely (though not impossible) because sharp distinctions between man and machine will no longer exist thanks to the existence of cybernetically enhanced humans and uploaded humans.
Set your singularity clocks now; we only have 35 years before this happens.
I'd love to have Kurzweil's bookshelf (I'm sure he's well read), but he hasn't spent much time telling us what happens to old technology. What about all the legacy applications? Legacy businesses? All those unenhanced humans milling around writing COBOL. Do these businesses survive the singularity? Or is it a slow extinction? Will legacy businesses become dinosaurs, unable to survive the world-altering singularity?
Maybe this future comes to pass, maybe it doesn't. But one thing is certain: change. Survival of any business is often predicated upon the journey its leaders have mapped out, a journey that isn't always clear, involving technology that is racing toward innovation and accelerating into the future. Want some help comprehending it all? If so, today HP is announcing a major initiative to help you realize your future and Break the Gridlock.
As part of this initiative, HP is offering a promotion of the Transformation Experience Workshop, a highly interactive workshop that helps you understand and map your legacy transformation journey. Please follow the link below to find out more. Hope to see you at a workshop, even if you are an unenhanced human being. http://h10134.www1.hp.com/campaign/applications-workshop/
I’ve been programming for so long, with so many languages, that sometimes I get a lofty feeling of having seen it all and done it all. What could possibly challenge me further? Look no further than Java generics and the JUNG (Java Universal Network/Graph) library.
HP’s second generation of the Visual Intelligence Tools will use the Java Universal Network/Graph Framework (http://jung.sourceforge.net/). The 2.0 version, released in April 2009, has fully embraced Java generics, an improvement that came with Java 5. Generics allow data structures in code to be more flexible and accommodating. And how could I complain when it’s such a powerful library? Just look at the balloon graph below: very cool. Our team sees several uses already, but first we must learn Java generics, and moving the team to generics just doesn’t make us feel any younger. You see, the VI Tools use network science to understand the relationships between modules of code. These bubbles could be a group of similar code. By allowing the nodes to arrange themselves, we get a better picture of what we like to call the unintended design.
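To give a flavor of what that generics-based API looks like, here is a minimal sketch against JUNG 2.0. The module names and edges are invented for illustration; the point is that the graph is parameterized on whatever node and edge types you choose, and a force-directed layout lets the nodes arrange themselves:

```java
import edu.uci.ics.jung.algorithms.layout.FRLayout;
import edu.uci.ics.jung.algorithms.layout.Layout;
import edu.uci.ics.jung.graph.Graph;
import edu.uci.ics.jung.graph.SparseMultigraph;
import edu.uci.ics.jung.visualization.VisualizationViewer;

import javax.swing.JFrame;
import java.awt.Dimension;

public class ModuleGraphDemo {
    public static void main(String[] args) {
        // The graph is generic in its node and edge types: here,
        // plain Strings for module names and dependency labels.
        Graph<String, String> g = new SparseMultigraph<String, String>();
        g.addVertex("billing");
        g.addVertex("ledger");
        g.addVertex("reports");
        g.addEdge("billing->ledger", "billing", "ledger");
        g.addEdge("reports->ledger", "reports", "ledger");

        // A force-directed layout lets the nodes settle on their own,
        // hinting at the "unintended design" of the portfolio.
        Layout<String, String> layout = new FRLayout<String, String>(g);
        layout.setSize(new Dimension(400, 400));

        VisualizationViewer<String, String> vv =
                new VisualizationViewer<String, String>(layout);

        JFrame frame = new JFrame("Module dependencies");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.getContentPane().add(vv);
        frame.pack();
        frame.setVisible(true);
    }
}
```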
Of course, my first move as team leader was to find the O’Reilly book that got the most stars on Amazon: Java Generics and Collections by Naftalin and Wadler. Between each chapter, coffee is advised. But what I’m finding with generics is unbelievable power. I can see why JUNG went all-in with their design.
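As a taste of the sort of power the book drills into, consider bounded wildcards. This little method is my own illustration, not taken from the book: it copies any source of Ts into any destination that can hold a T, which the authors summarize as the "Get and Put Principle":

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;

public class Wildcards {
    // "? extends T" accepts any source of Ts or subtypes of T;
    // "? super T" accepts any destination that can hold a T.
    static <T> void copy(Collection<? extends T> src, Collection<? super T> dst) {
        for (T item : src) {
            dst.add(item);
        }
    }

    public static void main(String[] args) {
        List<Integer> ints = Arrays.asList(1, 2, 3);
        List<Number> nums = new ArrayList<Number>();
        copy(ints, nums);         // Integer source into Number destination: fine
        System.out.println(nums); // [1, 2, 3]
    }
}
```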
Once we get past Java generics, implementing JUNG in our Visual Intelligence Tools is our next hurdle. But once we’ve climbed these learning curves, we hope to have a better view of the capabilities that will end up in our next-generation VI Tools.
What JUNG will allow HP Application Modernization services to do within the VI Tools is to programmatically stand among the nodes in the graph, providing a view that gives us a better sense of what these code duplication patterns mean. Standing within the graph gives us a greater ability to tie together the graph patterns and the code artifacts that produce them.
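To make "standing among the nodes" concrete, here is a hedged sketch of the kind of query we have in mind; the node names and duplication labels are again invented. From any node, JUNG lets us walk outward to its neighbors:

```java
import edu.uci.ics.jung.graph.Graph;
import edu.uci.ics.jung.graph.SparseMultigraph;

public class NeighborhoodDemo {
    public static void main(String[] args) {
        Graph<String, String> g = new SparseMultigraph<String, String>();
        g.addVertex("billing");
        g.addVertex("ledger");
        g.addVertex("reports");
        g.addEdge("dup-1", "billing", "ledger");
        g.addEdge("dup-2", "billing", "reports");

        // Standing "inside" the graph: from one module, walk outward
        // to the code artifacts that share duplicated patterns with it.
        for (String neighbor : g.getNeighbors("billing")) {
            System.out.println("billing shares a pattern with " + neighbor);
        }
    }
}
```

More later from inside JUNG…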
ESPN announced recently that the next big thing in sports broadcasting will be 3D. I would agree. Over the holidays I went to see Avatar, and now I’m trying to imagine watching the Super Bowl in 3D. What I found amazing while watching Avatar is that after the first hour, the 3D became almost unnoticeable. It’s as if your mind just accepts the new reality, the way your sense of smell adapts to persistent aromas, even if the aroma is as pleasant as cinnamon buns in the oven. It’s still a stunning effect, but my mind just accepted the movie as if I were looking out my window at a natural scene in a park.
3D TVs are on the way, and we must wonder whether attendance will drop at the actual events. Of course, there is still the aroma of hotdogs, the cool breeze in your face, and the rumble of the earth when your team scores. My point is that our brains really crave the third dimension. But once we have it, our senses quickly adapt.
Sports seem to have always driven our technology. Early radio, early black-and-white TV, early color TV: all seemed to use sports as their initial venues. In somewhat of a parallel, computer data analysis has enjoyed 3D for quite a while, and 3D frameworks and technology seem to be driven in many cases by the gaming industry. In that vein, I’m closely watching a 3D tool and framework that brings together game developers, artists, and scientists in a single community. The tool is called Processing (http://www.processing.org/). It was created by Casey Reas and Ben Fry in John Maeda’s Aesthetics and Computation Group at the MIT Media Lab.
The kinds of analysis tools created in Processing feel like video games. The web site has dozens of exhibits of the technology in action. My favorite is the Base26 example seen below. You have to click on the link and interact with the tool to appreciate its power.
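If you’ve never seen Processing code, here is a trivial sketch (my own, not one of the site’s exhibits) to show how little it takes to get something interactive on screen. It is written as a plain Java class against the Processing core library; in the Processing IDE you would type only the two methods:

```java
import processing.core.PApplet;

// A trivial interactive sketch: translucent circles trail the mouse.
public class TrailSketch extends PApplet {

    public void setup() {
        size(400, 400);   // window size in pixels
        background(0);    // start with a black canvas
        noStroke();
    }

    public void draw() {
        fill(255, 40);                   // translucent white
        ellipse(mouseX, mouseY, 24, 24); // circle under the cursor
    }

    public static void main(String[] args) {
        PApplet.main(new String[] { "TrailSketch" });
    }
}
```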
Imagine this approach with a touch-screen.
Given the enormous complexity of legacy source code analysis, I envision substantial benefits in applying this technology to legacy transformation.