Today HP launched the industry's most complete software-defined network fabric for cloud. This network fabric is built on the HP FlexNetwork architecture, enabling business agility for clients by delivering two times greater scalability and 75 percent less complexity than current network fabrics, while reducing network provisioning time from months to minutes.
This is made possible by:
- Improving IT productivity by unifying the virtual and physical fabric with the new HP FlexFabric Virtual Switch 5900v software, which, in conjunction with the HP FlexFabric 5900 physical switch, delivers advanced networking functionality such as policy and quality of service to a VMware environment. Integrated Virtual Ethernet Port Aggregator (VEPA) technology provides a clear separation between server and network administration to deliver operational simplicity.
- Reducing the data center footprint with the HP Virtualized Services Router (VSR), which allows services to be delivered on a virtual machine (VM), eliminating unnecessary hardware by leveraging the industry's first carrier-class, software-based Network Functions Virtualization (NFV).
As organizations move to software-defined networks, some fundamental changes in approach will be required, and these products are a start down that path. Here is a video with a bit more high-level discussion and some details:
This week I had the opportunity to attend one of Leon Kappelman's classes at the University of North Texas and discuss the students' senior projects and presentations. The teams of students covered a number of topics, such as BYOD, cloud adoption, and biometric-based security… all topics where I felt fairly comfortable.
One presentation focused on data management in the age of big data, and the team clearly understood one concept that many analysts miss:
The opportunity for better decision making.
The team focused on five key issues, each a lack of:
- Data Governance
- Data Quality standards and management
- Data Architecture and Security
- Operations support
- Business buy-in
We had quite a discussion about the business buy-in issue, since I needed them to explain how an effort could get this far without buy-in. They explained that the issue orbited around business culture and the implications advanced analytic techniques would have on that culture.
I was happy to see that these students had internalized these concepts, and I hope the organizations they move into after graduation are ready for their perspective.
When was the last time we saw truly new thinking in the area of architecture for business and IT systems? The Zachman Framework, 1984? TOGAF, the 1990s? OMG Model Driven Architecture, 2001? Has there been any breakthrough thinking other than continuing to provide more clarification and more decomposition about how to document enterprise and IT architecture? There has, but change is painful, even for a profession that prides itself on facilitating change.
The HP Moonshot System is a leap forward in infrastructure design that addresses the speed, scale, and specialization needed for a bold new style of IT.
HP ProLiant Moonshot servers are designed and tailored for specific workloads to deliver optimum performance. The servers share management, power, cooling, networking, and storage. This architecture is key to achieving 8x efficiency at scale, enabling a 3x faster innovation cycle, and bringing thousands of cores to bear on a project. The system uses 86 percent less energy, takes 80 percent less space, costs 77 percent less, and is significantly less complex to install and maintain.
After talking with other technologists, I believe it is the start of a path that will change both how software is written and how solutions are envisioned. When I look at the initial product data sheet, I see a 4.3U chassis that can hold up to 45 server cartridges. As processing capability improves, so can the cartridges. A full rack of these will replace the computational capability of whole data centers from just a few years ago. Granted, it excels at certain types of computing needs.
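As a rough back-of-envelope on that rack claim, here is a sketch assuming a standard 42U rack and the data-sheet figures of a 4.3U chassis holding 45 cartridges (the rack size is my assumption, not from the announcement):

```python
# Back-of-envelope: server cartridge density for a rack of Moonshot chassis.
# Assumptions: a standard 42U rack; 4.3U chassis; 45 cartridges per chassis.
RACK_UNITS = 42
CHASSIS_UNITS = 4.3
CARTRIDGES_PER_CHASSIS = 45

chassis_per_rack = int(RACK_UNITS // CHASSIS_UNITS)            # whole chassis that fit
cartridges_per_rack = chassis_per_rack * CARTRIDGES_PER_CHASSIS

print(chassis_per_rack, "chassis,", cartridges_per_rack, "server cartridges per rack")
```

At those figures, roughly nine chassis and a few hundred server cartridges fit in a single rack, which is what makes the data-center-replacement comparison plausible for the workloads it targets.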
As the HP Pathfinder Innovation Ecosystem improves and continues to bring together leading partners, a broader set of problems can be addressed:
This means having access to the latest technology and solutions at a groundbreaking time-to-market pace, measured in months rather than years. I can't wait to see what next big thing will spring from this.
Lately, I've been in discussions with people working in the architecture and software development space about increasing their delivery quality and capability. Different organizations have different problems, but this team is fairly mature and has defined a great deal of process and many project work products.
For me, governing development efforts has a number of parallels with managing an economy. In economics, you can control an economy from the supply side or the demand side. Some organizations spend a great deal on the supply side, making sure work products and processes are defined and creating a great deal of documentation. They may even have process owners who are focused on ensuring that the processes are up to date. In this case, there was no shortage of supply.
Teams may overlook the need to focus on the demand side. How should leadership use these work products to make better decisions? Do the leaders understand their value and how to use them? Are they actually being used? How can we prove it? … If we get the leaders to actually use the materials (increasing demand), the work products will be created more consistently, the processes will be followed, and the whole effort will generate greater value. It is about leadership, not about creating more stuff.
That is one area where Agile techniques have greater focus. Agile approaches are zealous about ensuring work output is understood and used. Unfortunately, they may give up a level of delivery consistency in the process.
Understanding your organization's supply of process work products, and how they are actually used, can go a long way toward increasing performance.