One slide that was used in HP Discover last week quite a bit was this one:
It shows how technology has shifted since the dawn of Information Technology. These changes are not likely to slow down, because they are all fueled by exponential technology growth.
It is about the unimaginable change that is possible when driven by exponential growth. The story starts with the man who invented chess. When he showed the king of India the game, the king was so entertained and excited by the game that he told the man he’d give him anything he asked for – within reason.
The man made what appeared to be a simple request. He asked that, every year for the next 64 years (one for each of the 64 squares on a chess board), the king provide rice in the following manner: a single grain of rice for the first chess square in the first year, with the amount doubling every following year.
The king quickly agreed.
The first year the inventor received 1 grain, the second 2, the third 4… It doesn’t get interesting until you cross over into the 2nd half of the board.
On the 32nd square we are talking about roughly 2 billion grains – still a reasonable amount of rice, one that could be delivered by a large field. At the next square, crossing over into the 2nd half of the board, the king finally took notice, because the deliveries would now start impacting his grain inventory. The king realized that by the time they reached the end of the board, it would require enough rice to cover all of India one meter thick. He’d been had; the inventor’s head was soon cut off, and the rice deliveries were no longer a problem.
I bring this up because all these exponential trends that we’ve been taking advantage of in IT, like Moore’s law, Edholm’s law…, are now reaching into the 2nd half of the board. We’re the ones who need to understand and take advantage of the change, since it is quite different from what we’ve seen to date.
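The arithmetic behind the story is easy to check. A quick sketch in Python (the function names here are my own, not from any source mentioned in this post) shows why the second half of the board is where things get out of hand:

```python
# Grains on a given square: 1 grain on square 1, doubling on each square after.
def grains_on(square):
    return 2 ** (square - 1)

# Total grains delivered through a given square: 1 + 2 + 4 + ... = 2^n - 1.
def total_grains(through_square):
    return 2 ** through_square - 1

print(grains_on(32))     # last square of the 1st half: 2147483648 grains
print(total_grains(64))  # the whole board: 18446744073709551615 grains
```

At a rough 25 grains per gram, the full board works out to hundreds of billions of tonnes of rice – far more than the world grows.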
How many of you have already felt the constraints of your own thinking getting in the way of technology adoption? I know I for one need to take a step back every once in a while and say “what does this really mean?”
We are entering a different world where there is an abundance of data – with all the sensors and mobile devices… We don’t worry as much about whether the data is available, but more about what we can do with it. For those who believe that data is king, it can be a rude awakening to realize that in a world of abundant data, merely having more of it is worthless.
We don’t worry as much about whether we can transport the data to the processing location. The networking is typically there, although it may still cost more than we would wish.
With cloud computing, we have the resources to crunch all that data into something useful.
Additionally, our access to software capabilities is greater than we’ve ever seen before, as captured in the phrase “there’s an app for that.” For businesses, that may be SaaS, Open Source, COTS…
For most businesses, though, the existing systems were designed with a very constrained view of the world. They were based on scarcity of data, computing…, and it is time to step back and really look at that portfolio of applications for what they are actually good at and how they add value.
One of the metrics I talk with people about focusing on is time to action. Being able to identify events is one thing, but making systems take action is a whole other discipline. I just came across this video about a research project at Microsoft called Mayhem that attempts to take some of the complexity out of the action side of the equation.
This is an open source effort, so it will be interesting to see where the market takes it. The makemayhem site is where to learn more.
Every 7-10 years, technology development and delivery undergoes a fundamental shift that opens up new business models and value generation opportunities. These shifts fundamentally change the way that technology is consumed and the value that it can bring. These shifts change what is possible and break down the barriers to innovation. Today, mobility, consumerization and cloud computing are signposts that mark the shift that is underway.
So what is the implication for IT?
- Opportunities - but at the same time, risk
- Agility - but at the same time, a need for control
- Flexibility – but also the possibility of lock-in
The HP cloud offerings announced today are a start down the road to changing the way infrastructure is built, applications are developed, services are defined and information is delivered. I will not cover the details in this post (you can see the press release for the official view), but will focus instead on some of the underlying philosophy.
Early adopters of cloud services have found these techniques can provide both an improved “time-to-value” and cost flexibility. Today, many mainstream organizations see cloud services as a key delivery model that can increase their ability to address organizational objectives in a demanding and unpredictable world – a world where a major constraint is the number of seconds in a day. A world where cloud-enabled practices can be a cornerstone of their ability to gain access to the right IT services, from the right places, at the right time, at the right cost, and create the means to speed innovation, enhance agility and improve financial management.
HP believes organizations will need to implement a hybrid delivery strategy that leverages cloud services as part of their IT delivery and consumption strategy. To make this happen, HP’s focus is on enabling choice, not making choices for organizations. Hopefully everyone recognizes that if you have a well-understood set of computational requirements that are stable and consistent, it is better to own those capabilities – in those cases, “the cloud” will not be cheaper, for much the same reason it can be cheaper to own a home rather than rent one. So our view is that new, more flexible solutions will be combined with traditional means to build and consume IT services. HP also aims to support a market where there are both leaders and laggards in cloud adoption, so one size cannot fit all organizations.
To deliver on the promise of the cloud and hybrid delivery – where everything has to be sourced and assembled at will – information in all forms must be harnessed and exploited, from inside and outside the enterprise, in a secure fashion. This is an area where HP is performing research and development, since there are still many unknowns about the best way to address this need. This demand for flexibility creates an environment where the IT mix can rapidly shift as organizational requirements change. Naturally, this will require changes to how software is architected and written. Application development and operational infrastructure must be visible, accessible and manageable in a consistent manner. Standardization must be in place to allow portability of services across deployment models and reduce lock-in.
HP’s approach to addressing this area is called the Converged Cloud – providing the unconstrained access to IT resources that organizations require to fulfill their objectives. HP Converged Cloud provides access to “Infrastructure Anywhere,” “Applications Anywhere,” and “Information Anywhere.” Today’s announcements are just the start of a whole series of offerings and services we’ll be hearing more about over the coming weeks.
HP will deliver the HP Converged Cloud experience across four key customer scenarios – the typical journeys that customers undertake on the way to fully embracing the cloud:
- Making it safe for corporate developers to unleash innovation for mobility and consumerization while leveraging public cloud infrastructure securely throughout the service lifecycle
- Cloud-enabling existing data centers beyond virtualization to include automation and full hybrid delivery
- Provisioning cloud services from infrastructure-, application-, network-, information-, and SaaS-centric standpoints
- Sourcing new virtual services from outside the enterprise that deliver the information in the context of your enterprise, and then consumption of that information directly by the user, application or business process.
HP’s Converged Cloud will be underpinned by a single architecture built on proven, industry-leading Converged Infrastructure (servers, storage, networking) and new Converged Management and Security software (automation, management, security), combined with enterprise-class, hardened open source technology (OpenStack) to deliver an enterprise-class IT service capability with the flexibility and choice the industry demands.
If you are on Twitter, you might try to watch the tag #convcloud on Thursday starting at about 1 PM EDT – there is a Twitter chat scheduled. I know if I have 2¢ to add, I’ll chime in.
One of the concerns about both public and private cloud deployments is the degree of “lock-in” that is involved in management and operations within the environment. Once you do all the tuning necessary to get the environment working and integrated with the business effectively, you have little flexibility to move the work around, since the control languages can be radically different.
One thing I find interesting is that most lock-in discussions talk about proprietary solutions, but you can get locked in to open source solutions as well. This starts to get into that whole discussion of technical debt, or betting on the wrong horse.
A group at HP Labs has been investigating the development of a lingua franca to hide some of the common tools’ constraints. The article, published at the IEEE 4th International Conference on Cloud Computing (CLOUD 2011), is titled: Elastically Ruling the Cloud: Specifying Application’s Behavior in Federated Clouds (you can only get the abstract without being an IEEE Xplore subscriber).
Further efforts in this kind of integration/interaction work will be required to allow organizations to have the freedom needed to move work around their organization’s definition of cloud. As the article states: “The Cloud is still in its infancy and more type of rules will be included as the variety of actions triggered when a condition is met is also expanded.”
The picture is the top portion of the engraving The Confusion of Tongues by Gustave Doré (1865).
I noticed when reading through a couple of press releases this week, that Microsoft
has made a series of announcements underlining the company’s “commitment to interoperability and
performance on the Web”. Specifically,
one announcement focused on striving to improve interoperability with the booming technology that is jQuery.
So what is significant about jQuery? Well, it’s significant in a few ways, the first being that it is open source, with most of us seeing it in operation every day without realising it. Secondly, it uses very innovative concepts to abstract away browser differences – developer productivity can be stifled when having to deal with any browser incompatibility. This incompatibility results in a large amount of development effort being wasted on testing and the development of workarounds. Google’s frustration with IE6 in particular has been in the news recently. Its public abandonment of IE6 is a clear recognition of the diminishing returns of adapting new web technology to incompatible platforms. Understandably, this news has been greeted with a large sigh of relief by much of the web community’s developers, who have been gnashing their collective teeth for the last 5 or so years. In many respects this is contributing to the drive for compatibility and has resulted in the dominance of an open browser. (Firefox’s current share of web browser usage is estimated to be 46%.)
This drive for compatibility is a clear demonstration that every business and
corporation in the world is now very dependent on open standards and Open
Source software. And, something most businesses probably do not realise is that the majority of their business systems are already heavily dependent on Open Source software. It is not just browsers: there are the application servers with their arrays of plugins, BI and SOA frameworks, and messaging systems, not to mention the software development tools, such as Subversion, WireShark, SOAPUI, Eclipse, JUnit, etc. The list is almost endless – without them no modern software application could be written, debugged, tested and sent into production. So what is particularly encouraging is to see large software companies actually join the effort with a view to supporting and enhancing compatibility, rather than rushing in to acquire and dominate a technology.
Given this, it is somewhat baffling that there is still a belief in the IT industry that open source and open initiatives are to be feared. Paradoxically, if the CIOs of our large companies were to actually sit down and assess all their IT systems and their dependencies, I am sure they would be elated to learn that they have already embraced open source.