Cloud Source Blog
In this HP Cloud Source blog, HP expert Christian Verstraete examines cloud computing challenges, discusses practical approaches to cloud computing and suggests realistic solutions.

The Machine, a view of the future of computing


The second week of June is traditionally the week of HP Discover, which brings me back to hot and sunny Las Vegas. This year was no exception. What made this HP Discover special, though, is that HP is celebrating its 75th anniversary. Yes, it was 75 years ago that Bill Hewlett and Dave Packard tossed a coin and called the company Hewlett-Packard. What would have happened if the coin had fallen the other way and we had become PH?

 

These have been 75 years of innovation, and although over the last decade analysts seemed to feel HP had stopped innovating, things are definitely back on track. HP is back. I believe that’s the key phrase of this HP Discover. So let me give you a feel for what I’m talking about. We introduced many products during the week, and I will come back to some of those, but there is one thing in particular I’d like to tell you about. It’s called “The Machine.” It’s not a product (yet); it’s an effort. And in the usual HP approach, we are calling on our partners to work with us on it. But what effort am I talking about?

 

Re-inventing the computing industry

Since 1945, computers have followed a clearly defined architecture, the “Von Neumann” architecture. It consists of a control unit, arithmetic processors, memory and input/output devices. Some special processors have been created with other architectures, but all our general-purpose computers have followed this one. Yes, there have been major improvements, one of the key ones being the Reduced Instruction Set Computer (RISC) systems introduced in the 1980s. (By the way, HP played a major role in that.)

 

“The Machine” is changing the paradigm. Until now, all computers have worked with electrons. Sure, you’ll tell me there are experiments with quantum computers and the like, but frankly these are still in their infancy. Electrons have this nasty issue that at any given moment you never know exactly where they are. So you need a group of them to ensure, statistically, that you have opened or closed a gate. Over the years we have made silicon traces smaller and smaller, working with fewer and fewer electrons. So we are doomed to reach a limit: the moment we can no longer be sure a gate is open or closed, because we don’t know whether an electron has hit it.

 

Quantum computers try to solve that by using photons rather than electrons.

 

The Machine takes a completely different approach. Realizing that most computers spend up to 80 percent of their time on tasks that manage the environment rather than perform the task at hand, The Machine simply gets rid of those tasks, making computers far more effective. How does it do that?

 

Well, in current environments, two kinds of tasks take the majority of the effort. On one hand, instructions and data keep being shuffled between persistent storage, memory and cache. Things go up and down through these layers all the time, and in the process they cross multiple communication buses, each managed by its own software stack. What if we got rid of them? On the other hand, there are all the mechanisms we have developed to use the capacity of our current CPUs efficiently: virtualization layers, multitasking and so on. Again, what if we could get rid of those?

 

Cache, memory and storage: make them one

If we had a single technology with the speed of cache and the persistence of storage, we could combine storage, memory and cache into a single device that keeps all the information and instructions. There would no longer be any need to boot computers up, shut them down, or hibernate them. All instructions and data would simply be there when we needed them.

 

Is this a dream? Actually, no. HP has developed a technology, called the memristor, which allows us, at affordable prices, to have cache, memory and storage functionality in the same system for the environments we will need in the near future. And these include storing the “big data” we have talked about in previous blog entries.

 

Memristance is a property of an electronic component. If charge flows in one direction through a circuit, the resistance of that component will increase; if charge flows in the opposite direction, the resistance will decrease. If the flow of charge is stopped by turning off the applied voltage, the component will “remember” the last resistance it had, and when the flow of charge starts again, the resistance of the circuit will be what it was when it was last active. More information on HP’s memristor activities is available on Wikipedia.
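To make that behavior concrete, here is a minimal sketch of the widely cited linear ion-drift model that HP Labs used to describe its memristor. All parameter values here are illustrative assumptions, not device data; the point is only to show that resistance drifts with the direction of charge flow and holds its last value when the current stops.

```python
# Linear ion-drift memristor model: a doped (low-resistance) region of
# width w and an undoped (high-resistance) region share a device of
# thickness D; current moves the boundary between them.
R_ON, R_OFF = 100.0, 16000.0   # ohms: fully doped / fully undoped (illustrative)
D = 10e-9                      # device thickness in meters (illustrative)
MU_V = 1e-14                   # dopant mobility in m^2/(s*V) (illustrative)

def step(w, i, dt):
    """Advance the doped-region width w under current i for time dt."""
    w += MU_V * R_ON / D * i * dt   # ion drift shifts the boundary
    return min(max(w, 0.0), D)      # the boundary stays inside the device

def resistance(w):
    """Series combination of the doped and undoped regions."""
    x = w / D
    return R_ON * x + R_OFF * (1.0 - x)

w = 0.5 * D                         # start half doped
r0 = resistance(w)
for _ in range(1000):               # drive charge one way: resistance drops
    w = step(w, 1e-4, 1e-6)
r_forward = resistance(w)
for _ in range(1000):               # no applied current: the value is "remembered"
    w = step(w, 0.0, 1e-6)
r_idle = resistance(w)
assert r_forward < r0 and r_idle == r_forward
```

Reversing the sign of the current in the first loop would raise the resistance instead, matching the two directions of charge flow described above.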

 

Let me point out one thing to help you grasp the change. Using memristor technology, HP prototyped a crossbar latch memory that fits 100 gigabits in a square centimeter, and proposed a scalable 3D design (consisting of up to 1,000 layers, or 1 petabit per cm³). In 2012, the device achieved a read time of 90 nanoseconds (if not faster), approximately one hundred times faster than contemporaneous flash memory, while using one percent of the energy.

 

Connect the memory and the processor

It’s great to have huge memory/storage space. But how do we get the quick access to all that memory that we will need to do the job? We could use copper, but then we would need huge cables and consume a great deal of energy. So why not go for fiber optics? Let’s attach the memristor technology directly to the central processing unit. Not only can you transfer information at up to 6 TB per second, you do it with very low power consumption. So you win on two fronts: speed and energy. Lower consumption also means lower heat dissipation, which reduces the need for cooling and allows denser environments, so you use less space.
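A quick back-of-envelope calculation gives a feel for these numbers. The 6 TB/s link and the petabit memristor density are the figures quoted in this post; treating the link as a sustained, fully utilized rate over a one-petabit pool is my own illustrative assumption.

```python
# Back-of-envelope: how long would it take to sweep an entire
# one-petabit memristor pool over a 6 TB/s photonic link?
pool_bits = 1e15                 # one petabit of memristor storage
link_bytes_per_s = 6e12          # 6 TB per second over fiber optics

pool_bytes = pool_bits / 8       # 125 terabytes
sweep_seconds = pool_bytes / link_bytes_per_s

print(round(sweep_seconds, 1))   # prints 20.8
```

In other words, even a full scan of such a pool would take on the order of twenty seconds, while random access to any part of it sits directly on the processor’s optical link.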

 

And what about the CPU?

Well, since you no longer need compute power for the 80 percent of work that disappears, you don’t need the CPU power either. So you can easily work with low-energy chips, such as the ones used in cell phones. We already did that with Moonshot. Moving to such an architecture allowed us to shrink the IT environment powering hp.com, our own website, from 25 racks to 3, while handling 300 million hits per day. And power consumption is down to 720 watts! Taking this one step further, HP Labs is looking at servers the size of a credit card. And, as with Moonshot, the designs also use specialized servers for specific tasks. In an Internet of Things world, this makes a lot of sense. Because the servers are so small and consume so little power, there is no longer any need to virtualize the environment. You can easily dedicate one server to one task, particularly if you tune the server to that task. Dropping virtualization reduces overhead and leaves more CPU power for the real job at hand.

 

Pulling it all together

Using ions to store, photons to communicate and electrons to compute: that’s the vision of The Machine. Simplicity is its motto. We have the opportunity to build an environment that is very different from what we have today. But current operating systems are built to handle all that overhead, and it would be very difficult to strip those functions out of them, so it makes sense to create a new one. That’s what we want to do. But we do not want to do it alone, so we are creating an open-source development project. Listen to how Martin Fink described “The Machine” at HP Discover in Las Vegas.

 

 

HP is back

I started this blog entry by pointing out that “HP is back.” I’d like to add that innovation is back at HP. I hope you agree with me that this project is really exciting. But that’s not all. We believe this new technology will allow us to create mesh clouds. You probably wonder what that means; I’ll come back to it in my next blog entry. But be aware, this was not the only news from HP Discover.

There’s plenty more.

 

Labels: cloud | CloudSource
Comments
John Henry McMills Warrington (anon) | 07-15-2014 10:07 AM

Great stuff !

StephenCoda | 07-20-2014 05:03 PM

Hello,

 

Where do we find the open source project?
