“Wikibon found that the Intelligent Rack contributes about 70% of the benefit, for less than 6% additional cost (hardware and implementation cost). This makes the installation of Intelligent Racks as a standard for the data center a “no brainer” decision – it is cost-justified even if only 10% of the equipment in the rack can take full advantage of the environments.”
As a result, a typical enterprise with $1 billion in revenue, an IT budget of $30 million, and 40 racks in its data center would spend $85,000 per year upgrading to intelligent racks over four years and gain annual benefits of about $300,000. Even at this relatively slow replacement rate, the analysis shows that the Net Present Value of the investment would still be a cost reduction of $1.8 million (on a total IT spend of $30 million per year).
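For a rough feel of the arithmetic, here is a minimal sketch of an NPV calculation using the headline numbers above. The 10% discount rate is my own assumption, and this simple net-cash-flow version will not reproduce Wikibon's $1.8 million figure, which presumably comes from a longer horizon and a more detailed model:

```python
# Back-of-the-envelope NPV of the rack upgrade described above.
# The discount rate is an assumption; the cost and benefit are the
# headline numbers from the post, not Wikibon's full model.
ANNUAL_COST = 85_000        # upgrade cost per year over the rollout
ANNUAL_BENEFIT = 300_000    # annual benefit from intelligent racks
DISCOUNT_RATE = 0.10        # assumed cost of capital
YEARS = 4

npv = sum(
    (ANNUAL_BENEFIT - ANNUAL_COST) / (1 + DISCOUNT_RATE) ** year
    for year in range(1, YEARS + 1)
)
print(f"Four-year NPV of the net cash flow: ${npv:,.0f}")
```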
The report states that only the HP version identifies physical assets in a simple way and provides a framework for improved asset management, power management, equipment management and utilization, and software management. HP’s solution was Wikibon’s recommendation as best-in-class at the time the report was created.
Here is a video about the Gen8 servers:
We need the education system to provide platforms for teachers and students that enable Education 2.0. And it’s not just about the technology: Education 2.0 must support the knowledge economy, so there must be a focus on creativity and innovation.
I was talking to a group the other day about the need to align Big Data activities with automation to take latency out of organizational response, and then came across a post titled Computers that predict the future, which describes a number of real-world examples of using Big Data techniques to overcome scarcity.
A sample from the article is:
“Helberg built a system to model highway traffic, which showed that reducing the distance between vehicles resulted in an end to stop-go delays. The only proviso was you had to reduce the space between moving vehicles so much that you needed self-driving cars to make it work.”
Cloud computing deployments today do much of this kind of automation within IT processes. The manufacturing industry has been doing this for decades using modeling and analog feedback loops.
“We can look for patterns and anomalies and these may be used to predict outcomes or create policy. We are also on the cusp of a world with unparalleled volumes of data and analytical sophistication.”
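That pairing of pattern detection with automated response can be sketched very simply. The following is a minimal illustration, not any particular product’s approach; the window size, threshold, and respond() hook are made-up placeholders:

```python
from collections import deque
import statistics

WINDOW, THRESHOLD = 30, 3.0   # assumed tuning values

def respond(value, score):
    # Placeholder for an automated action (scale out, reroute, page someone).
    print(f"anomaly: value={value:.1f} z={score:.1f} -> acting now")

def monitor(stream):
    """Flag values that sit far outside the rolling baseline."""
    history = deque(maxlen=WINDOW)
    for value in stream:
        if len(history) >= 10:                    # wait for a baseline
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history) or 1e-9
            z = (value - mean) / stdev
            if abs(z) > THRESHOLD:
                respond(value, z)                 # act without waiting
        history.append(value)

# Example: a steady metric with one spike
monitor([100 + i % 5 for i in range(50)] + [400] + [100] * 10)
```

The point is the shape of the loop: the model watches continuously and the response fires in the same cycle, so the organization’s reaction time drops from human speed to machine speed.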
In a Twitter-based discussion yesterday (hashtag #convcloud), we were firing tweets back and forth about cloud, DevOps, and business. A few of the posts dealt with how the automation issues IT faces today are going to expand throughout the business. The artificial separation of “IT” and “the Business” may actually be part of the problem with IT implementations today, rather than part of the solution. In order to develop the deep insight that Big Data demands (let alone automate the enterprise response), the business needs to be deeply understood. These are not standalone skill sets.
Jim Duffy wrote an article about Internet2, stating that the consortium that supports it is nearing completion of its OpenFlow-enabled 100G Ethernet software-defined network. This month, Internet2 network engineers will meet to launch what they call the “Innovation Platform,” which will be the United States’ first open software-defined network.
The consortium hopes to eventually deliver an enhanced network that can handle the transfer of large data sets in response to today’s abundance of data. Having this high-speed network crossing the US will provide a solid test bed for the future networking needs of those initially involved, as well as everyone else down the road – just like the original Internet.
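For readers new to the concept, the core idea that OpenFlow standardizes is a central controller installing match/action rules into switch flow tables. Here is a minimal, self-contained sketch of that idea; the class names and fields are simplified stand-ins for illustration, not the actual OpenFlow protocol:

```python
from dataclasses import dataclass, field

@dataclass
class FlowRule:
    match: dict        # header fields, e.g. {"ipv4_dst": "10.0.0.2"}
    actions: list      # e.g. ["output:2"]
    priority: int = 0

@dataclass
class Switch:
    table: list = field(default_factory=list)

    def install(self, rule: FlowRule):
        """A controller pushes rules down into the switch."""
        self.table.append(rule)
        self.table.sort(key=lambda r: -r.priority)

    def forward(self, packet: dict):
        for rule in self.table:   # highest-priority match wins
            if all(packet.get(k) == v for k, v in rule.match.items()):
                return rule.actions
        return ["send-to-controller"]   # table miss

# A controller steering one large data transfer onto a dedicated port
sw = Switch()
sw.install(FlowRule({"ipv4_dst": "10.0.0.2"}, ["output:2"], priority=10))
print(sw.forward({"ipv4_dst": "10.0.0.2"}))   # ['output:2']
print(sw.forward({"ipv4_dst": "10.0.0.9"}))   # ['send-to-controller']
```

This separation is what lets Internet2 treat the whole 100G backbone as programmable: paths for big data transfers can be set up in software rather than by reconfiguring individual devices.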
I was recently pulling together a presentation on memristors and the breadth of HP Labs research activities. As I was looking at the material to cover, I noticed a great deal of research taking place on memristors outside of HP.
I do believe (contrary to what some may say) that this is because the technology will fundamentally change the IT industry going forward.
At their heart, memristors are non-volatile switches with great possibilities (a simple behavioral sketch follows the lists below). They:
- Scale to a lateral size of just a few nanometers
- Can be stacked in multiple layers for even greater total density
- Switch in under 10 ns while dissipating only a few picojoules
- Hold their state (stored value) for at least a few decades
The opportunities where they can be applied are:
– As a replacement for storage technologies like flash, solid-state and magnetic hard drives, and even dynamic random access memory
– Possibly even as a replacement for many of the functions of custom integrated circuits and field-programmable gate arrays
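To make the switching behavior concrete, here is a minimal simulation of the linear ion-drift model that HP researchers published in 2008 (Strukov et al.). The parameter values are illustrative assumptions, not measured device data:

```python
import math

# Linear ion-drift memristor model (after Strukov et al., 2008).
# All parameter values are illustrative assumptions.
R_ON, R_OFF = 100.0, 16e3   # ohms: fully doped / fully undoped resistance
D = 10e-9                   # m: device thickness (~10 nm)
MU_V = 1e-14                # m^2/(V*s): dopant mobility

def simulate(v_amp=1.0, freq=1.0, steps=100_000, t_end=2.0):
    """Drive the device with a sine wave and watch its resistance move."""
    w = 0.1 * D             # state variable: width of the doped region
    dt = t_end / steps
    for n in range(steps):
        t = n * dt
        v = v_amp * math.sin(2 * math.pi * freq * t)
        r = R_ON * (w / D) + R_OFF * (1 - w / D)   # series mix of regions
        i = v / r
        w += MU_V * (R_ON / D) * i * dt            # linear ion drift
        w = min(max(w, 0.0), D)                    # state stays bounded
        if n % (steps // 10) == 0:
            print(f"t={t:.2f}s  v={v:+.2f}V  R={r / 1e3:6.2f} kOhm")

simulate()
```

The resistance depends on the history of the current that has flowed through the device, and it stays put when the power is removed – which is exactly what makes a memristor usable as non-volatile memory.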
Research in this area is still underway, but the following numbers, which I was able to gather from a variety of sources, are representative of the difference in capability between the flash memory everyone is so excited about and the memristor-based equivalent.
[Table: flash vs. memristor comparison. The rows covered switching element size, switching time (ns), set voltage (V), and reset voltage (V); the surviving cell values are -3 to -5, 10^2 to 10^7, and 10^5 to 10^10.]
So in the coming few years, as this technology is productized, it will shift our expectations. For example, how would cloud be implemented differently if there were a terabyte of fast, static memory on our mobile devices? What if it were a petabyte?
Recently, Abbie Lundberg put out a post about The Influential CIO. It points out the shifts taking place with the integration of IT into all parts of the business as well as the consumerization effect of bringing IT into all parts of our lives.
She goes on to talk about the importance of credibility, trust and relationships – which is probably true for all leaders, not just CIOs. It reinforces the servant leader concept, applied not just to employees but to the business as well. As I read the post, it reminded me of the post on the shifting CIO role I wrote a few weeks back: CIOs: Don’t try to hold on to your hat.
I continue to try to expand my understanding of gamification concepts and recently read Playful Design.

This book is a good overview of the design issues of games and gameplay. I was looking for a book focused on the application of game techniques in business, and this is definitely not that book.
Playful Design does a great job covering the foundational issues of game design, but it is in the context of traditional gaming. The book is written from the perspective of maximizing the player’s experience playing the game and discusses the various reasons people come back for more.
The book is definitely worth reading as background material. It will help expand your view of how gaming can be applied effectively, since it covers a varied range of implementations over the history of gaming.
NewScientist had an article titled Glasses-free 3D screens let you see the wider picture, which shows some work done at MIT to send different video to each eye, enabling a 3D display. The video included in the article shows the technique used.
The concept of sitting in “sweet spots” to enable a 3D display is especially applicable to a computer monitor, since the person using the device is almost always sitting (or standing) in the exact same spot.
The solution mentioned can even adjust for eye prescriptions and other vision problems. It is only applicable to one person at a time, though – which is another reason why it may be best suited to computer monitor applications.
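The MIT display uses a more sophisticated technique than this, but the classic two-view parallax barrier is the simplest way to see where “sweet spots” come from: a slit mask a small distance in front of the panel sends alternating pixel columns to each eye. Here is a quick geometry sketch; all of the numbers are illustrative assumptions:

```python
# Classic two-view parallax-barrier geometry (not MIT's method).
# All values below are illustrative assumptions.
PIXEL_PITCH = 0.09e-3   # m: pixel column pitch (~282 ppi panel)
EYE_SEP = 0.065         # m: average interpupillary distance
VIEW_DIST = 0.5         # m: typical monitor viewing distance

# Similar triangles: two adjacent pixel columns (pitch p) seen through
# one slit must land on the two eyes (separation e) at distance z,
# which puts the barrier at gap g = z * p / e in front of the pixels.
gap = VIEW_DIST * PIXEL_PITCH / EYE_SEP

# The slit pitch is slightly under two pixel pitches so the viewing
# zones converge on the viewer instead of running parallel.
barrier_pitch = 2 * PIXEL_PITCH * VIEW_DIST / (VIEW_DIST + gap)

print(f"barrier gap:   {gap * 1e3:.3f} mm")            # ~0.692 mm
print(f"barrier pitch: {barrier_pitch * 1e3:.4f} mm")  # ~0.1798 mm
```

Move your head outside one of those zones and the left/right images swap – which is why a fixed, single viewer in front of a monitor is the natural fit.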
Image: Camera Culture Group/MIT