New research conducted on behalf of HP reveals that one out of every two business executives feels their organization is suffering from innovation gridlock. See "HP Research: Breaking the IT Innovation Gridlock," Coleman Parkes Research Ltd., April 2010.
- Organizations feel blocked from driving new business innovation because the majority of their IT funding is consumed in operating and maintaining the current environment.
- In addition, almost 60 percent of business and technology executives feel that this gridlock is preventing their organizations from keeping up with the competition.
The inability to respond to such a widely recognized problem can be traced to a number of issues.
- Primary among these is economics. With limited budgets, IT organizations are fighting just to keep the wheels on, and this is where the majority of resources are spent. Spending on new projects is roughly one-third of the IT budget; two-thirds is devoted to ongoing operations and maintenance (see Forrester's "A Workable Application Modernization Framework Is Job No. 1 Now," Phil Murphy, April 26, 2010).
- A major secondary issue is application portfolio understanding. What do we have, how does it support our critical business processes, and how much of it is redundant or grossly inefficient? If we decide to eliminate or replace a particular system, what are the ripple effects? Insight into these relationships can be hard to come by and difficult to reason about.
- We have all had experience with failed IT projects. The numbers are daunting. Various studies have shown remarkable rates of failure or impairment, with reported success rates as low as 16%. A more moderate report, "IT Myth 5: Most IT projects fail" (August 13, 2004), states that in a study of 13,522 projects in 2003, 34% were an unqualified success, 15% failed, and the rest were "challenged" - a category that includes cost overruns, time overruns, and incomplete functionality. If we have a working system, even one that seems overly expensive to run or lacking in agility, we have an instinctive understanding that the risk of replacement can be sizable.
There are answers for all of these issues.
Improvement in the distribution of resources requires the inauguration of a virtuous cycle, where we identify initial projects with relatively rapid ROI, execute them, and then use the savings to fund further work.
Capturing a model of the portfolio is a crucial element of any long term improvement. As I have noted before, shutting systems off is one rapid road to savings. But without a reasonable model of business / application dependencies, the potential for unexpected consequences often leads to paralysis. HP can help you create these models and roadmaps.
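To make that concrete, here is a minimal sketch (in Python, with entirely hypothetical system and process names) of the kind of business-process-to-application dependency model I mean, and how even a crude one can flag the ripple effects of retiring a system:

```python
# Minimal sketch of a business-process -> application dependency map.
# System and process names are hypothetical, purely for illustration.
process_dependencies = {
    "order-to-cash":    ["CRM-A", "BillingCore", "GL-Ledger"],
    "customer-support": ["CRM-A", "CRM-B", "KnowledgeBase"],
    "field-service":    ["CRM-B", "Scheduler"],
}

def impact_of_retiring(system):
    """Return the business processes that would lose a dependency if `system` were shut off."""
    return [proc for proc, apps in process_dependencies.items() if system in apps]

print(impact_of_retiring("CRM-B"))  # ['customer-support', 'field-service']
```

A real portfolio model is far richer than a dictionary, of course, but even this level of mapping turns "what breaks if we shut it off?" into a question you can actually answer.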
One way to reduce the risk of failure is to take on projects that are similar to ones we have previously completed successfully. This is where the combined experience of our modernization specialists and your application subject matter experts is so important. Having team members with both broad modernization experience across a wide variety of systems and deep knowledge of the applications themselves significantly reduces the risk of selecting an improper modernization approach or of failing to execute the modernization journey.
In a previous post I wrote that at one time developers had to write a great deal of code that today would depend on prebuilt or commercial components.
There is one class of legacy program that represents the ultimate example of this: large applications that manipulate flat files for reporting purposes. It is common in these instances to see extensive JCL devoted to sorting and copying, and COBOL programs doing substantial filtering, merging, and rewriting of data. Understanding exactly what is happening is complicated by the use of multiple jobs and programs to paste it all together. The fundamentals are actually fairly simple once you work your way through the layers of report-processing logic that were written over and over again.
I've seen programs that write out the table content, line by line, with a tag on the front indicating what type of report line it is. A final pass then creates the actual formatted reports - a kind of primitive model-view-controller, with the model being the original input flat files, the controller being the multiple JCL and COBOL jobs filtering and massaging, and the view being the final COBOL program that emits 132-column lines suitable for your fan-fold, green-bar paper.
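To illustrate the pattern, here is a simplified sketch in Python rather than COBOL and JCL; the tags and record layout are invented, but the two-pass shape is the one I keep running into:

```python
# Simplified rendering of the two-pass reporting pattern described above.
# Pass 1 ("controller"): filter/merge the flat-file records and emit tagged
# intermediate lines; pass 2 ("view"): expand the tags into a formatted report.
# Tags ('H' header, 'D' detail, 'T' total) and the layout are invented.

def pass_one(records):
    """Emit tagged intermediate lines from raw (region, amount) records."""
    lines = [("H", "MONTHLY SALES REPORT")]
    total = 0
    for region, amount in records:
        lines.append(("D", f"{region:<20}{amount:>10}"))
        total += amount
    lines.append(("T", f"{'TOTAL':<20}{total:>10}"))
    return lines

def pass_two(tagged_lines, width=132):
    """Final formatting pass: turn each tagged line into a fixed-width print line."""
    for tag, text in tagged_lines:
        yield text.center(width) if tag == "H" else text.ljust(width)

records = [("EAST", 1200), ("WEST", 950)]
for line in pass_two(pass_one(records)):
    print(line.rstrip())
```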
Today, these programs would be built by linking a reporting tool to a relational database. The endless JCL manipulations would be accomplished by predefined views or appropriate SELECTs. And the results would be published to our corporate intranet with appropriate access controls. Agility, in terms of our ability to produce new or modified reports, would be enormously enhanced and the labor costs associated with maintaining the reporting capability would drop by an order of magnitude.
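As a rough sketch of that modern shape (using SQLite and invented table names purely for illustration), the sort, filter, and summarize work that used to be spread across separate job steps collapses into a single view that a reporting tool can query directly:

```python
# Sketch: the sort/filter/merge work of the old JCL jobs becomes one SQL view.
# Table and column names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('EAST', 'WIDGET', 700), ('EAST', 'GADGET', 500),
                             ('WEST', 'WIDGET', 950);

    -- This view stands in for what used to be several sort-and-summarize job steps.
    CREATE VIEW monthly_sales_by_region AS
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
        ORDER BY region;
""")

for region, total in conn.execute("SELECT region, total FROM monthly_sales_by_region"):
    print(f"{region:<10}{total:>10}")
```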
This is a classic candidate for what Steve Woods has labeled asymmetrical transformation, which involves identifying those portions of your application that truly require custom coding and then ensuring that the rest are implemented by the appropriate COTS solution.
Simplification is a recurring theme in this blog. It is a major goal of modernization. Simpler has many facets. I am going to consider:
- Unique mapping of business processes to applications
- Elimination of redundancy
- Retirement of unused applications
How do we know we have unique realizations for those of our business processes supported by automation? This question is filled with implications - we can't know the answer without having categorized our business processes and business objects and mapped these elements to our IT portfolio.
This mapping enables two things: we can detect duplicates, and we can identify systems that no longer support our current business model. In either case these become candidates for retirement. Retiring systems can be expensive. Consider a case where we use three CRM systems (say, as a result of multiple acquisitions). Turning off two of these will not be a painless flip of a switch. It will require data migration, process changes to match the retained system, and retraining.
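As a toy illustration of how that mapping surfaces retirement candidates (all application and process names here are hypothetical), duplicates and orphaned systems fall straight out of the inventory:

```python
# Hypothetical inventory: each application we own and the current business
# processes it supports (an empty list means no current process uses it).
application_to_processes = {
    "CRM-A":         ["customer-support"],
    "CRM-B":         ["customer-support"],
    "CRM-C":         ["customer-support"],
    "LegacyQuoting": [],                    # no longer supports the business model
    "GL-Ledger":     ["order-to-cash"],
}

# Applications that overlap on the same process are duplication candidates.
coverage = {}
for app, procs in application_to_processes.items():
    for proc in procs:
        coverage.setdefault(proc, []).append(app)
duplicates = {proc: apps for proc, apps in coverage.items() if len(apps) > 1}

# Applications mapped to nothing are retirement candidates.
unused = [app for app, procs in application_to_processes.items() if not procs]

print(duplicates)  # {'customer-support': ['CRM-A', 'CRM-B', 'CRM-C']}
print(unused)      # ['LegacyQuoting']
```

A mapping this coarse will overstate duplication - two applications can support the same process in genuinely different ways - which is exactly why the retirement decision still needs the rigor discussed below.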
Any time we are retiring a system, a critical consideration is its archival requirements. Are we legally required to keep the data? Is this data that we should be mining for historical trends that could be of value to the business? If we intend to retain data from retiring systems, do we need everything we have in existing archives, or is there an opportunity to significantly reduce the volume of data to reflect our actual needs?
Retirement is both a huge opportunity and surprisingly difficult. Nothing saves money like eliminating redundant systems, but it requires confidence to stand up and say it is okay to turn a system off. This confidence only comes by virtue of applying rigor to an analysis of your portfolio - something an HP Application Portfolio assessment can provide.
In the future we will look at some of the other implications arising from simplification:
- Reduced platform variability
- Homogeneous platform management
- Standard interfaces
- Correct dependence on systems of record
ESPN announced recently that the next big thing in sports broadcasting will be 3D. I would agree. Over the holidays I went to see Avatar, and now I'm trying to imagine watching the Super Bowl in 3D. What I found amazing while watching Avatar is that after the first hour, the 3D became almost unnoticeable. It's as if your mind just accepts the new reality, the way your sense of smell adapts to persistent aromas, even when the aroma is as pleasant as cinnamon buns in the oven. It's still a stunning effect, but my mind simply accepted the movie as if I were looking out my window at a natural scene in a park.
3D TVs are on the way, and we have to wonder whether attendance will drop at the actual events. Of course, there is still the aroma of hotdogs, the cool breeze in your face, and the rumble of the earth when your team scores. My point is that our brains really crave the third dimension. But once we have it, our senses quickly adapt.
Sports seem to have always driven our technology. Early radio, early black-and-white TV, early color TV - all seemed to use sports as their initial venues. In somewhat of a parallel, computer data analysis has enjoyed 3D for quite a while, and 3D frameworks and technology seem to be driven in many cases by the gaming industry. In that vein, I'm closely watching a 3D tool and framework that brings together game developers, artists, and scientists in a single community. The tool is called Processing (http://www.processing.org/). It is a technology that grew out of John Maeda's Aesthetics and Computation Group at the MIT Media Lab.
The kinds of analysis tools created in Processing feel like video games. The web site has dozens of exhibits of the technology in action. My favorite is the Base26 example. You have to follow the link and interact with the tool to appreciate its power.
Imagine this approach with a touch-screen.
Given the complexity of legacy source code analysis, I see enormous benefits in applying this technology to legacy transformation.