Look for a number of posts on various blogs and Twitter feeds -- on October 5, HP is hosting an HP Tech Day to celebrate the 10th anniversary of the HP Superdome. The event brings together seven indie tech bloggers to visit the HP Cupertino campus and learn about HP's leadership in mission-critical computing. Too bad only independent bloggers are invited -- so I can't go. :-)
In this month's Wired magazine there is an article titled "The Good Enough Revolution" that discusses how new markets are taken over by products that are "good enough" and undercut technically better approaches. The examples discussed include the Flip camera undermining the camcorder market, MP3 taking on the recording industry, and even the Predator drone replacing manned aircraft.
These products grab market share because they shift our selection standards: what was important before may no longer be a deciding factor in the decision. Clayton Christensen discussed this phenomenon in his book The Innovator's Dilemma (Joe Hill did a blog entry on it a while back). Jerry Pournelle has also been talking about selecting "good enough" products in his Chaos Manor blog for almost as long as I can remember.
One reason this is important to IT organizations today is the whole concept of cloud techniques, including SaaS. Granted, there are many scenarios where an in-house or custom-developed solution would be better, but the decision criteria may have shifted so that those "better" characteristics are irrelevant. We need to focus on "What does the business want vs. what does it need?" and on how the characteristics of the product or service address those requirements. For example: a cloud provider may not be able to guarantee anything close to 100% uptime end-to-end for a transaction, although it may be willing to do so within its own four walls. Is that guarantee good enough? If not, are there alternatives?
I was in an Azure meeting in Dallas the other day, and when I brought this issue up, a person in the audience said, "No one can guarantee end-to-end!" I totally disagree. We've done it for years, although not in a cloud context.
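To put some rough numbers on the end-to-end question: assuming (purely for illustration) that a transaction traverses several components in series, availabilities multiply, so even very good per-component guarantees erode quickly once you leave the provider's four walls. A minimal Python sketch, with made-up availability figures:

```python
# Sketch of serial availability composition. The component names and
# availability figures below are hypothetical, for illustration only.

def end_to_end_availability(component_availabilities):
    """Availability of a serial chain: the product of each link's availability."""
    result = 1.0
    for a in component_availabilities:
        result *= a
    return result

# Hypothetical chain: client ISP, internet transit, provider network, provider app
chain = [0.999, 0.999, 0.9995, 0.9999]
availability = end_to_end_availability(chain)
print(f"End-to-end availability: {availability:.4%}")
```

With these numbers the full chain lands below 99.8%, even though each individual link promises "three nines" or better -- which is exactly why a provider can commit inside its own walls but hesitates end-to-end.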
I'd say organizations need to get used to the "good enough" thought process and be flexible about their critical characteristics. The way they have always measured success may no longer be what is most important to the business. This concept has been around for a very long time; it is not a fad but the future.
An article on the BBC News website caught my eye today: apparently, August 2009 is the 40th anniversary of UNIX. In August 1969, after AT&T-owned Bell Laboratories pulled the plug on the development of Multics (Multiplexed Information and Computing Service), Ken Thompson had the time to write the first version of the as-yet-unnamed operating system, in assembly language for the DEC PDP-7 minicomputer. Thompson named his operating system UNICS (Uniplexed Information and Computing Service), a pun on "emasculated Multics". The UNICS name later evolved into the more familiar UNIX.
AT&T canceled Multics because it was an overly complicated time-sharing system. However, some important principles in Multics were carried over into UNIX. The key to the success of UNIX has been that, unlike Multics, it was designed to be simple and compact. This has allowed it, in many ways, to become a kind of Swiss Army knife operating system that can be adapted to serve many needs without undue complication.
Despite this vision, it is ironic to note how companies' desire to stamp their own brand on UNIX led to the so-called "UNIX wars". While UNIX was trying to get away from the complexity of corporate operating systems, the evolution of the GNU (GNU's Not UNIX) operating system and Linux has shown the strength of the UNIX vision and taken this philosophy even further by creating the concept of an open source operating system.
At 40, UNIX provides the OS for the majority of servers that make up the Internet, has created much of today's open source market, and is the basis for Apple's OS X; even Microsoft Windows supports the communication stack originally developed for UNIX. So happy 40th birthday, UNIX, and best wishes for many more!
2009 marks both the 150th anniversary of the publication of Charles Darwin's foundational work On the Origin of Species and the 200th anniversary of his birth. What would he tell us today about the evolution of cloud computing, or is that a far-fetched question?
Well, not really. Consider the key role of adaptability in the survival of living systems ("Survival of a species is determined by their ability to adapt to a changing environment," a sentiment often attributed to Charles Darwin). This sounds familiar when it comes to IT solutions struggling to keep pace with the ever-changing business environment. So where does this adaptation capability come from, in general terms?
It comes from investing in flexibility in addition to improving efficiency (i.e., cutting cost). Isn't cloud computing largely about balancing economy-of-scale efficiencies with responsiveness and flexibility, through virtualization and automation? If so, Darwin would probably give it a good chance of survival.
Cost, while of paramount importance (in a biological context, think of minimizing energy consumption), is only one side of the coin. To probe further, the book by C.K. Prahalad and M.S. Krishnan (The New Age of Innovation) mentioned in my last contribution provides a good discussion of the efficiency-versus-flexibility tension.
"This device saves you half the work!" - "Fantastic, I take two of them." (Unknown source)