By 2015, it is expected that there will be eight zettabytes of information in the digital universe, according to research by IDC; that’s 8,000,000,000,000,000,000,000 bytes (21 zeros). I shared this statistic at a recent presentation for HP channel partners in Australia; the response was one of amazement and surprise.
This explosion of data signals a new era that will leave CIOs not only amazed, but also perplexed about how to manage, secure and optimise big data, should they decide not to implement a big data management strategy.
So what is big data?
Wikipedia refers to Big Data as “a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, manage, and process the data within a tolerable elapsed time. Big data sizes are a constantly moving target, as of 2012 ranging from a few dozen terabytes to many petabytes of data in a single data set.” (Retrieved Monday 23rd July 2012)
Another angle on Big Data is offered by an IDC report: “Big data technologies describe a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data, by enabling high-velocity capture, discovery, and/or analysis.”
What is fuelling this beast? From social networking, to thin clients, to streaming video, the trend we see here is not unusual or unexpected. It is quite the opposite – it is driven by the desire for always-on, available-anywhere access to this rich content.
IDC also reports that unstructured data is set to grow at approximately 60 per cent year on year, compounded; at this rate organisations will quickly outgrow their storage capacity. How does one combat this phenomenal growth?
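To see why compounded growth is so punishing, consider a quick projection. The sketch below is illustrative only: the 100 TB starting estate is a hypothetical figure, and the 60 per cent rate is the IDC estimate cited above.

```python
# Illustrative sketch: project storage demand at a 60% year-on-year
# compound growth rate. The 100 TB starting point is a hypothetical
# assumption, not a real customer figure.

def project_growth(initial_tb: float, annual_rate: float, years: int) -> float:
    """Return projected capacity needed after `years` of compound growth."""
    return initial_tb * (1 + annual_rate) ** years

# A hypothetical 100 TB estate growing at 60% per year:
for year in range(1, 4):
    print(f"Year {year}: {project_growth(100, 0.60, year):.0f} TB")
```

At 60 per cent compounded, the estate more than doubles in two years – exactly the dynamic that makes "buy more disk" a losing race.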
Throw more hardware at the problem?
Traditionally this was the CIO/IT manager’s quintessential answer to growth, but it is no longer feasible given the exponential rate at which storage is growing: demand will outpace the rate at which hardware can be procured.
More hardware also leads to higher data centre expenses – the more shelves of disks bought, the more space, power and cooling are required to keep the lights on. And the cost of power has been rising consistently in many countries; for example, a report released by the Energy Users Association of Australia (EUAA) shows average electricity prices have grown by as much as 40 per cent in the past five years.
Thankfully, the concepts of data deduplication and thin provisioning exist to relieve some of the pressure that this growth imposes; but unless freed capacity is unallocated and reclaimed back to the storage array, the data will still feed into the overarching big data statistic.
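The arithmetic behind these two techniques can be sketched briefly. The ratios and pool sizes below are hypothetical illustrations, not vendor figures; real deduplication ratios vary widely by workload.

```python
# Hedged sketch of the capacity arithmetic behind deduplication and
# thin provisioning. All figures here are hypothetical assumptions.

def effective_logical_tb(physical_tb: float, dedup_ratio: float) -> float:
    """Logical data a physical pool can hold at an N:1 deduplication ratio."""
    return physical_tb * dedup_ratio

def thin_provisioned_free_tb(pool_tb: float, written_tb: float) -> float:
    """Physical space still free when volumes consume only what is written."""
    return pool_tb - written_tb

# A hypothetical 50 TB pool with 4:1 deduplication holds 200 TB of
# logical data; if applications have written only 30 TB so far, 20 TB
# of physical capacity remains free to be reclaimed or reallocated.
```

The point of the sketch is the asymmetry: both techniques stretch physical capacity, but neither shrinks the logical data itself – the big data keeps growing underneath.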
Push it to the cloud?
CIOs can decide to adopt a cloud initiative, making their own big data phenomenon someone else’s problem rather than an internal infrastructure investment.
It seems that more CIOs are deciding to take the plunge – IDC’s updated IT Cloud Services Forecast estimates that public cloud computing will account for $17.4 billion worth of IT purchases and grow to a $44 billion market by 2013.
The move to the cloud can offer organisations benefits such as flexibility, scalability and lower costs – certainly appealing on the face of it. But it may also re-introduce risks in the shape of security and legal obligations. Cloud computing is not necessarily insecure, but it does require a heightened level of policy integration, such as access control and resource provisioning.
There isn’t one solution to suit all; any strategic IT decision and/or investment requires the right inputs and a sound understanding of big data before execution.
Business IT leaders must not sit back, paralysed by indecision; they need to start thinking about this sudden exponential growth of data at two levels: the traditional store-and-restore (backup) requirement, and the new requirement to extract pertinent information from external stores.
Learn how HP Big Data Consulting can help you realize all the benefits of your data.