The Next Big Thing
Posts about next generation technologies and their effect on business.

Facilitating a session on the Next Generation CIO

This past week, I facilitated a session at a CIO conference in LA. The focus of the session was The Next Generation CIO. Before we got started, I gave a brief introduction about the changes taking place, from my perspective as a chief technologist. Here is a summary of my kick-off comments:

 

It seems today that you can’t pick up an IT magazine or listen to a conference keynote without someone lamenting the state of the relationship between the CIO and the business or IT’s capabilities to generate new value for corporations.

Let’s face it, things have changed in recent years. For the past few decades we've been successful deploying and maintaining the systems of record that have been the backbone of decision-making for organizations. We’ve built up layer upon layer of successful projects to the point where we’re calcified by our own success. Unfortunately, this means it is common to hear people talk about having 80% of their budget consumed before the year even starts (just keeping the lights on), with little to nothing left over to add new business value. What we stop doing may be as important as what we start.

 

Having stated that we’ve had all this success, it is good to recognize that almost all the solutions in production today were built with a scarcity assumption. There was never enough data, storage, network or computing capacity.

In many cases, those limitations have been overcome and we live in a world of abundant IT capabilities. We now can take that abundance of data and computing capacity and use analytic techniques to perform complex tasks like context recognition and sentiment analysis – tasks that just a few years ago were the domain of human knowledge workers. We can now begin to recognize ‘normal’ situations and automate them, freeing up people to focus on the anomalies and turn them into opportunities.
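
To make the "recognize the normal, escalate the anomalies" idea concrete, here is a minimal sketch (mine, not from the session) of the kind of baseline-and-threshold check that could route routine readings to automation and hand only the outliers to a person. The data and threshold are purely illustrative.

```python
from statistics import mean, stdev

def find_anomalies(samples, z_threshold=2.5):
    """Flag values that deviate strongly from the 'normal' baseline.

    Anything inside the threshold is routine and could be handled
    automatically; the rest gets escalated to a person.
    """
    baseline = mean(samples)
    spread = stdev(samples)
    return [
        (i, value)
        for i, value in enumerate(samples)
        if spread > 0 and abs(value - baseline) / spread > z_threshold
    ]

# Illustrative data: response times (ms) with one obvious outlier.
response_times = [102, 98, 101, 99, 103, 97, 100, 430, 101, 99]
print(find_anomalies(response_times))  # -> [(7, 430)]
```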


Infrastructure as a Service is an example of a business process we're all familiar with. At its core, it is the business process of instantiating and monitoring virtual machines, and today it has been automated to a large extent. What we can do today is just the tip of the iceberg of change headed our way, as even greater IT capabilities allow us to take these techniques and apply them throughout the business. Instead of automating VM instantiation, we should be able to automate hiring personnel, or even much of the middle-management role in some organizations.
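
As a sketch of what "IaaS as an automated business process" looks like, here is a minimal, hypothetical example: it requests a virtual machine from a generic provisioning API and polls until the machine reports it is running. The endpoint, payload fields and status values are invented for illustration and do not reflect any particular vendor's API.

```python
import time
import requests

API = "https://iaas.example.com/v1"            # hypothetical provisioning endpoint
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

def provision_vm(name, cpus=2, memory_gb=4):
    """Request a VM, then poll until the provider reports it running."""
    resp = requests.post(f"{API}/vms", headers=HEADERS,
                         json={"name": name, "cpus": cpus, "memory_gb": memory_gb})
    resp.raise_for_status()
    vm_id = resp.json()["id"]

    # The request/approve/build cycle that once took people days is now a polling loop.
    while True:
        status = requests.get(f"{API}/vms/{vm_id}", headers=HEADERS).json()["status"]
        if status == "running":
            return vm_id
        if status == "failed":
            raise RuntimeError(f"provisioning of {name} failed")
        time.sleep(10)
```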

 

This abundance perspective can fundamentally shift how value is generated and the role of IT within organizations. If we don’t understand and capitalize on these technology shifts to address the business shifts underway, others will come in and eat our lunch.

 

With this as a starting point, we had a very active discussion covering a wide range of topics, some of which were:

  •          Can it really be called Shadow IT if the CIO helps the business by applying their expertise to help steer, rather than running alongside and trying to slow it down?
  •          What can we do to help our people transition from traditional IT to a newer more flexible and business centric approach? Unfortunately, not all of them will be able to make the transition.
  •          What do CIOs need to do to sharpen the sword, for themselves and their people? One of the key points of this discussion was spending time with the business. Live it.
  •          Don’t strive for perfection – be flexible and enable the business to adjust as needed.

I had to draw the session to a close when time ran out, but afterward a number of clusters were still talking, and those discussions were likely more important than the discussion of the bigger group.

What’s the difference between SDN and NFV?

I was in a discussion the other day with someone focused on the networking services space, and they kept using the acronym NFV without really defining it. I dug in a bit, and this is what I found.

 

Network Functions Virtualization aims to address the issue of having a large and increasing variety of proprietary hardware appliances. Its approach is to leverage standard IT virtualization technology to consolidate many types of network equipment onto industry-standard, high-volume servers, switches and storage. These more standard devices can be located in datacenters, network nodes or at end-user premises. NFV is applicable to any data plane packet processing and control plane function in fixed and mobile network infrastructures.


I’ve mentioned Software Defined Networking (SDN) in this blog before. NFV and SDN are mutually beneficial but are not dependent on each other, which was one of the confusions I had during the initial conversation. NFV is focused on consolidating functions and reducing hardware costs. Although these virtualized devices could be managed using techniques like SDN, they don’t have to be.
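
One way I found helpful to keep the two apart: NFV is about where a network function runs (software on standard servers instead of a dedicated appliance), while SDN is about how forwarding behavior is programmed (a controller pushing rules down to switches). The toy sketch below makes that distinction explicit; the names and structures are purely illustrative.

```python
# NFV: the network *function* (say, a firewall) is just software deployed
# onto standard virtualized servers instead of a proprietary appliance.
vnf_descriptor = {
    "function": "firewall",
    "image": "vendor-firewall-vm.qcow2",  # illustrative image name
    "placement": "any-x86-compute-node",  # commodity hardware, not an appliance
    "vcpus": 4,
    "memory_gb": 8,
}

# SDN: the *forwarding behavior* is programmed centrally; a controller
# pushes match/action rules down to switches (OpenFlow-style, simplified).
flow_rule = {
    "switch": "edge-switch-01",
    "match": {"dst_port": 443},
    "action": "forward:uplink-2",
    "priority": 100,
}

def deploy(vnf):
    print(f"NFV: boot {vnf['function']} as a VM on {vnf['placement']}")

def program(rule):
    print(f"SDN: install rule on {rule['switch']}: {rule['match']} -> {rule['action']}")

# Complementary but independent: you can deploy a VNF without an SDN
# controller, and you can program flows to physical appliances without NFV.
deploy(vnf_descriptor)
program(flow_rule)
```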

 

The concepts of NFV are not really new. Even so, a more formalized approach with PoCs … will hopefully help accelerate the changes taking place in the communications industry, allowing for reduced operational complexity, greater automation and self-provisioning, much as is happening in the cloud space (through either public or private techniques) for the rest of IT.

 

Just as I was finishing up this post, I saw that Dave Larsen (of HP) put out a post about what HP is doing in both SDN and NFV. Expect to see more about this when HP releases an HP Industry Edge e-zine devoted entirely to NFV in the near future.

HP ConvergedSystem 100 for Hosted Desktops

One thing I am always looking for at a conference like HP Discover is products that might be underappreciated at first glance. The ConvergedSystem 100 may be one.

 

A while back HP came out with Moonshot. This massive application of computing capability was significantly cooler and more efficient than other options in the marketplace. It had one big issue: commercial software vendors didn’t have a licensing model that aligned with the number of cores this box could bring to bear on a problem. So for most organizations, the choices were either to write the software themselves or to use open source.

 

Now there is a solution that takes advantage of the cartridge approach used in Moonshot to tackle a problem that many organizations have: the need for a low-cost, no-compromise PC experience. This solution (with the m700 cartridge) provides up to 6x faster graphics frame rates than similar solutions and up to 90% faster deployment (e.g., up and running in about 2 hours with Citrix XenDesktop, with no SAN or virtualization layer to complicate things). It also has 44% better total cost of ownership while consuming 63% less power.

 

Combine that with the HP t410 All-in-One, a Power over Ethernet thin-client solution, and there are some real power-savings and flexibility possibilities.

HP Storage announcements at Discover

I don’t do posts about HP product announcements, but since this is the week of HP Discover, why not. There definitely are some changes in the wind for storage.

 

One of the first things out of the chute today at HP Discover was a whole new set of capabilities in the area of software-defined storage. With all the options in storage today (SSD, low end, backup…), the complexity of managing the environment is reducing organizations’ flexibility. Storage solutions are becoming complex, inefficient and rigid.

 


The family of converged storage solutions announced today should help address many of these issues by providing a single architecture that enables a flexible and extensible approach, embracing block, object and file storage on devices ranging from HDDs to SSDs/flash, and providing:

  • Performance acceleration – eliminating system bottlenecks
  • Efficiency optimization – extending the life and utilization of existing media
  • System resiliency – providing constant information/application access
  • Data mobility – federating across systems and sites

The HP 3PAR StoreServ family ranges from the low-end 7200 to the high-performance 7450 and the high-scaling 10800, all running on a single architecture and interface.

 

The 7450 that was announced today is:

  • Accelerated: over 500,000 IOPS and less than 0.6 ms latency through a massively parallelized architecture, flash-optimized cache algorithms and QoS
  • Efficient: reduces capacity requirements by up to 50% and extends flash lifespan through multi-layered, fine-grained virtualization with hardware-accelerated data compaction
  • Bulletproof: eliminates downtime for performance-critical applications with a quad-controller design, persistent cache and ports, peer failover, and multi-site replication
  • Futureproof: allows organizations to move data seamlessly to balance performance and cost, enabling a simple solution across Tier 1, midrange, and flash with federated data mobility

Also announced was HP StoreOnce VSA, a software-defined approach to information protection that enables more agile and efficient remote data protection.

 

The breadth of these announcements should enable greater flexibility for organizations that maintain their own infrastructure.

Network Fabric for Cloud

Today, HP launched the industry’s most complete software-defined network fabric for cloud. This network fabric is built on the HP FlexNetwork architecture, enabling business agility for clients by delivering two times greater scalability and 75 percent less complexity than current network fabrics, while reducing network provisioning time from months to minutes.

 

This is made possible by:

  • Improving IT productivity by unifying the virtual and physical fabric with the new HP FlexFabric Virtual Switch 5900v software, which, in conjunction with the HP FlexFabric 5900 physical switch, delivers advanced networking functionality such as policies and quality of service to a VMware environment. Integrated Virtual Ethernet Port Aggregator (VEPA) technology provides clear separation between server and network administration to deliver operational simplicity.
  • Reducing data center footprint with the HP Virtualized Services Router (VSR), which allows services to be delivered on a virtual machine (VM), eliminating unnecessary hardware, by leveraging the industry's first carrier-class software-based Network Function Virtualization (NFV).

As organizations move to software-defined networks, some fundamental changes in approach will be required, and these products are a start down that path. Here is a video with a bit more high-level discussion and some details:

 
