Hyperscale Computing Blog
Learn more about relevant scale-out computing topics, including high-performance computing solutions from the data center to the cloud.

How will you analyze 1,600,000,000,000 GB of information?

IDC estimates that the global economy produced a constant flow of information totaling 1.6 ZB (1.6 trillion gigabytes) in 2011. [1]  Now that's Big Data!  This trend has driven rapid growth in a new breed of big data applications that analyze and monetize this information, but traditional architectures aren't scaling to meet the need.
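As a quick sanity check on the headline number, the zettabyte-to-gigabyte conversion can be sketched in a few lines of Python (an illustrative example using decimal SI units, not anything from the IDC report itself):

```python
# SI (decimal) unit sizes in bytes
ZB = 10**21  # one zettabyte
GB = 10**9   # one gigabyte

# 1.6 ZB expressed in gigabytes
total_gb = 1.6 * ZB / GB
print(f"{total_gb:,.0f} GB")  # on the order of 1.6 trillion GB
```

This confirms that 1.6 ZB and 1,600,000,000,000 GB are the same quantity, since a zettabyte is a trillion (10^12) gigabytes.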



About the Author(s)
  • Hello! I am a social media manager for servers, so my posts will be geared towards HP server-related news & info.
  • HP Servers, Converged Infrastructure, Converged Systems and ExpertOne
  • WW responsibility for development of ROI and TCO tools for the entire ISS portfolio. Technical expertise with a financial spin to help IT show the business value of their projects.
  • Luke Oda is a member of HP's BCS Marketing team, with a primary focus on marketing programs that support HP's BCS portfolio. His interests include all things mission-critical and the continuing innovation that HP demonstrates across the globe.
  • I am part of the integrated marketing team focused on HP Moonshot System and HP Scale-up x86 and Mission-critical solutions.