Some might call it hullabaloo, some would even say ballyhoo, but in any parlance there seems to be an awful lot of hoopla over Hadoop security.
In the film “Big,” Tom Hanks’s character forgot who he was when he became big. The lesson here is that once small data becomes big data, it should never forget the types of security that protected it when it was small.
Identifying risk is much like decoding a cipher: with the proper key, the answer is quickly revealed. As Ralphie discovered using his Ovaltine secret decoder ring in the classic movie “A Christmas Story,” the answer can actually be hiding in plain sight.
One of the main considerations for moving to a cloud service provider is vastly improved uptime over what one could achieve in house. However, is that consideration falling by the wayside?
Twenty-six percent of data breaches occur at externally hosted facilities, accounting for more than 45 million compromised records in 2011. If we assume for a moment that this represents the known universe of cloud computing attacks, that picture is alarming in and of itself. Moving data to a public cloud provider requires one to relinquish a substantial amount of control over the data as well as the assets in which it resides. This loss of control is where the risk gap expands dramatically.