By Frank Wagner
Business Manager, HP Technology Services for SAP, EMEA
SAP HANA turbocharges the analysis of business data. In fact, SAP has customers seeing performance improvements of 140,000:1, meaning that analysis that used to take days can now take less than five seconds.
Consider what this kind of capability makes possible. In the financial services industry, for example, SAP HANA enables near-real-time analysis of disparate, “moving target” data, such as fluctuations in government bonds, currencies, emerging-market projections and country credit ratings, all of which are critical to deciding where to invest in a financial portfolio. If your business is advising clients where to invest, and those investments are worth many millions of dollars, it is critical that you have the most accurate and most immediate information available.
Before the introduction of SAP HANA, analysis could take days to complete, in which time the calculations would be effectively obsolete. Now, with SAP HANA, analysis can be discussed, revised and recalculated over the phone with a client as it happens, with greater accuracy and more potential for being a step ahead of the rest of the market.
When SAP HANA is deployed, it both streamlines the data warehousing architecture and raises expectations of analytics throughout the organisation. The flip side is that if something goes wrong with the solution, the actual and perceived impacts are far greater than before.
Let’s have a look at the evolution that led to SAP HANA. Before Data Warehouses or Business Warehouses, data was analysed inside the various source systems themselves, such as the ERP system. Collecting data inside such warehouses inevitably creates redundancy, yet allows for much faster analysis. Conventional tuning of SAP Business Warehouses is done through “aggregates”: duplications of commonly required data that enable significantly improved performance (20-second query times as opposed to minutes or hours). The two downsides of aggregates are that they take up a huge amount of storage and that they must be kept up to date, which is a significant effort. In many customer cases we know of, only 20% of the database is original data, with the other 80% taken up by aggregates.
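The trade-off behind aggregates can be sketched in miniature. The following is illustrative Python, not SAP code, and the fact table and field names are invented: a summary is pre-computed once so that queries become simple lookups instead of full scans, at the cost of redundant storage and the ongoing effort of keeping the summary in sync with new data.

```python
from collections import defaultdict
import random

# Hypothetical fact table of (region, product, revenue) rows,
# standing in for the raw records in a Business Warehouse.
random.seed(42)
regions = ["EMEA", "AMER", "APJ"]
products = [f"P{i}" for i in range(100)]
facts = [(random.choice(regions), random.choice(products), random.uniform(10, 1000))
         for _ in range(100_000)]

def query_without_aggregate(region):
    """Scan every fact row at query time (the slow path)."""
    return sum(rev for r, _, rev in facts if r == region)

# Building the "aggregate": summarise once, ahead of time. This duplicates
# information already in `facts` (extra storage) and must be updated
# whenever new facts arrive (maintenance effort).
revenue_by_region = defaultdict(float)
for r, _, rev in facts:
    revenue_by_region[r] += rev

def query_with_aggregate(region):
    """Look up the pre-computed total (the fast path)."""
    return revenue_by_region[region]

# Both paths return the same answer; only the query cost differs.
assert abs(query_without_aggregate("EMEA") - query_with_aggregate("EMEA")) < 1e-6
```

An in-memory column store such as SAP HANA removes the need for this duplication by making the full scan itself fast enough, which is why aggregates (and their maintenance) can be dropped entirely.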
Now, by implementing SAP NetWeaver Business Warehouse Accelerator, customers save significantly on the capital and operational costs of over-specified storage and improve their analytic capability to sub-second query response times (typically by double-digit factors, i.e. sub-second response times instead of more than 20 seconds). Customers save even more effort if, in addition, they delete the aggregates and stop maintaining them altogether.
The downside emerges when three circumstances coincide:
1) SAP NetWeaver BW Accelerator is introduced to speed up reporting
2) Aggregates are removed (to release equipment and to save the maintenance effort)
3) The SAP NetWeaver BW Accelerator then fails.
If these three circumstances come together, response times in the Business Warehouse can sky-rocket to hours or even run into timeouts.
SAP NetWeaver BW Accelerator can be regarded as SAP HANA’s in-memory “predecessor”. While SAP NetWeaver BW Accelerator accelerates Business Warehouse reporting, SAP HANA replaces the Business Warehouse database itself, rendering the Accelerator obsolete. Reporting performance improvements of 1,500x were out of reach for SAP NetWeaver BW Accelerator: double-digit factors, yes, perhaps low triple-digit factors as well, but not the levels seen with SAP HANA. So the downside potential of an SAP HANA failure is dramatically bigger than that of an already dramatic SAP NetWeaver BW Accelerator failure.
When SAP HANA is the database for SAP NetWeaver BW, a failure of SAP HANA creates a truly dramatic impact, because the data is now persistently mastered within the SAP HANA in-memory database. This is a major problem, particularly if the expectation of consistently reliable, mission-critical availability and real-time analytics has led to changes in an organisation’s business processes, such as product manufacture being guided minute-to-minute by analytics of retail sales.
To summarize, the hierarchy with regard to speed when analyzing data is:
- Business Warehouses without aggregates. This is (often too) slow.
- Business Warehouses with aggregates. This is faster, yet still slow but often acceptable.
- Business Warehouses combined with SAP NetWeaver BW Accelerator (aggregates either kept or deleted, this has no impact on response times). This is fast.
- SAP HANA (everything is in memory by design, no aggregates possible anywhere). This is so hyper-fast that the term paradigm shift is truly appropriate.
The only way to manage this risk is to ensure that the right level of service and support is in place to proactively avoid SAP HANA downtime.
This is where HP Mission Critical Services comes into play. In the next blog on this topic I’ll discuss how HP can de-risk SAP HANA and ensure constant uptime for customers. To learn more about how new solutions and services enable SAP application users to rapidly access, store and analyze data, read HP Delivers Scalability, Availability, Expertise for SAP HANA.