You won’t be surprised to hear that data continues to grow at an alarming rate. In 2011 alone, 1.8 zettabytes of data were created, and that figure is projected to rise to 35 zettabytes by 2020.
Today’s 1.8 zettabytes are coming from many places: social media sites, digital images and video, online transaction records and beyond. Multiple devices, platforms and data sources make capturing, storing and managing data difficult and complex. Existing approaches weren’t designed to support the volume of data companies generate today, and they are even less equipped to turn this information into actionable business insight.
So, inevitably, customers struggle with how to manage this Big Data, and how to do so cost-effectively. To address these challenges, many organizations are looking to open source technologies, and in particular to Apache Hadoop, an open-source distributed data processing technology. According to IDC, Big Data technology and services will grow at 40 percent annually to reach $17 billion by 2015. The worldwide Hadoop-MapReduce ecosystem software market is growing even faster, at over 60 percent, and is expected to cross $800 million in 2016.
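For readers new to Hadoop, the sketch below illustrates the MapReduce programming model it is built around, using the canonical word-count job: the map phase tokenizes input text in parallel across the cluster, and the reduce phase aggregates the counts. The class and path names are illustrative only and are not part of the HP solutions discussed here.

```java
// Minimal word-count sketch of Hadoop's MapReduce model (illustrative only).
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every word in each input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the per-word counts produced by the mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // pre-aggregate locally to cut shuffle traffic
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory on HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory on HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Once packaged as a jar, a job like this is submitted to the cluster with the `hadoop jar` command, pointing at input and output directories on HDFS; the framework handles splitting the data, scheduling the map and reduce tasks, and recovering from node failures.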
However, while the Hadoop platform offers many advantages, Hadoop infrastructure can be challenging to configure, manage and scale, and in many cases customers lack the expertise required to deploy Hadoop clusters with the best performance and resource utilization.
That’s why the latest announcement from HP at Discover 2012 in Las Vegas on Monday 4th June is so exciting.
New HP Solutions for Apache Hadoop take the guesswork out of Hadoop deployment for customers. The solutions distill thousands of hours of expertise and experience gained by HP Big Data experts into three optimized, pre-tested, pre-configured reference architectures and a factory-integrated system, delivering the best customer experience for Hadoop environments.
The new solutions have many highlights, including the following:
- Simple push-button deployment of Hadoop clusters, deploying thousands of nodes in minutes rather than months with the unique HP Cluster Management Utility
- Optimized solution health and performance with 3D real-time and historical monitoring of infrastructure and Hadoop metrics
- World-record Hadoop performance to help customers make faster business decisions: HP delivered the world’s fastest 10 TB Hadoop TeraSort benchmark result
- Quality assurance through partnership with the top three Hadoop distribution vendors
- Deep synergy for end-to-end analytics with Vertica and Autonomy
- Risk-free scalability backed by HP Technology Consulting Services
Over the coming days you will find a wealth of information on our Apache Hadoop solutions at our new Rethink BI blog, including videos, podcasts and blog posts that will give you the deep insight you need to make decisions in the world of Big Data.
If you use Apache Hadoop, or are still figuring out your next steps, please get involved in the discussion at the site and let us know what you think.