Over the weekend I was reading a spy novel written by Robert Ludlum in 1982 and I was struck by how much technology has changed in the last 30 years. The novel was the story of a worldwide intelligence manhunt for a mole inside the US government. The tools they were using were payphones, teletype machines, logic, deductive reasoning, and lots and lots of paper files. There was no GPS, Internet, cellular, DNA testing, or big data analytics. As it happens, 1982 was the year I graduated from high school, and it doesn't really seem that long ago.
I got to thinking about how much things had changed in IT just in the last ten years. Ten years ago I had a very cool StarTAC cell phone, but the only thing I could do with it was call people. I had about twenty CDs in my car, no MP3s, no iPod, Bluetooth or GPS. I had a Palm V organizer with a 16MHz processor, 2 MB of RAM, a little stylus, and a black-and-white screen. At that time Palm was a division of 3Com. Funny how things work out… Ten years ago I was also connecting to the Internet over a 56K dial-up modem. I had never used online banking, eBay, Google, Twitter, or MapQuest. I had seen HDTV, but only at the Museum of Science in Boston.
How about five years ago? A lot had changed by 2006, but we still did not have the technology and infrastructure in place for fast, reliable, high-bandwidth delivery of digital content. Today we no longer have the kind of unpredictability or contention we would have experienced only five years ago. We get high-quality delivery of massive amounts of content at incredibly affordable costs. Literally billions of people have Internet-connected handheld devices and are sending and receiving digital images and video, and unlike five years ago, we now take all this for granted. The world has changed radically in a very short period of time.
The impact on business and IT is significant in many ways. The one I want to focus on is that this rate of change means that technology and application requirements have become highly unpredictable over much shorter time horizons than in previous decades. Suppose I went back in time ten or even five years and spoke to the top one thousand CIOs in the world. Imagine that I told them that in 2011 all of them would be:
- Handling Skype or similar communications across their network from every desktop,
- Thinking about virtualizing all of those desktops and bringing them behind the firewall,
- Letting all their employees access corporate e-mail from an iPod-like device and just about any other device the employees felt like bringing in to work,
I would wager that they would've all laughed at me and sent me back home in my time machine. But that's where we are. Every business is dealing with unpredictable business requirements, many of which are driven by technology change, and many of which are driven by innovations within their own industries. Financial companies are now tackling high-frequency trading applications they didn't have five years ago. The movie industry is producing digital, high-definition and 3D movies. Healthcare companies rely on electronic medical records that have to be available anywhere. The music industry has been completely transformed.
Let's think about the next five years. What's going to change next? Nobody really knows. That's why it's called unpredictable.
What this means for the storage industry, and for the people who have to buy storage today, is that if they continue to use traditional storage technologies they are playing a very high-risk game. They're betting that what worked in the past will continue to work in the future. That used to be a good bet, but those days are gone.

The problem is that most of the storage systems installed in data centers today were architected 15 or 20 years ago. They were designed for a world where application requirements were predictable. If you had an OLTP application, you implemented a storage device that was designed for OLTP, and you set it up in such a way that it would support that workload. Typically e-mail ran on a different storage system that was set up and tuned for e-mail.

Today the storage system you install needs to handle a tremendously wide variety of application and workload requirements with little or no tuning. It has to handle many more applications because nearly all systems today are virtualized. Rather than having one or two applications per system, you frequently have hundreds or thousands of different apps hosted on the same storage array, and a wide variety of workloads. It also has to handle whatever magical new application is thrown at you three or five years from now. You can no longer deploy traditional storage, plan and tune every app, and try to manage it all with a whole bunch of people and monitoring software. IT organizations that keep doing it that way will fail.
This unpredictable world is where HP 3PAR shines
3PAR was designed to work in IT-as-a-service environments, and has been widely deployed by IT service providers that host many applications of all types from many different customers on shared infrastructure. To accomplish this, a 3PAR system had to manage itself autonomically, and it had to be architected for a shared-everything world. 3PAR is the only major storage platform that was delivered to market within the timeframe I described in the first part of this article. It is the only one that contemplated the rate of change and unpredictable workload requirements that companies have to deal with today. If you are concerned about what the next five or ten years will bring, you need to look at HP 3PAR now.