Cloud Source Blog
In this HP Cloud Source blog, HP expert Christian Verstraete examines cloud computing challenges, discusses practical approaches to cloud computing, and suggests realistic solutions.

Keep data under control while benefiting from hybrid cloud

(Image source: Cordis.europa.eu)

In my last blog entry, I talked about the importance of understanding how data is managed within a cloud environment to ensure data protection and compliance with local legislation. In an earlier entry, I discussed a different way of approaching cloud and introduced the concept of “Service as a Service,” pointing out how, by separating the business process from the elementary transactions, business people can adapt their processes and achieve the appropriate level of agility for the enterprise.

 

Could I use such an approach to resolve my data headaches? Probably, but before discussing that, let me explain why I speak of data headaches. By now we agree that cloud is not one size fits all, and that most companies see the need to combine multiple cloud environments and services to deliver the appropriate functionality at the pace the business requires. To do this, companies are going back to a “best of breed” approach, developing core services themselves and sourcing all others from a variety of service providers.

 

Data management in hybrid cloud

As the logic and data of each service are hosted together on the provider’s cloud platform, the company ends up with its data scattered across multiple, often incompatible, environments. I’m not going to discuss the legal and security implications again (see my last blog entry for those); however, I want to draw your attention to the headache this becomes when you want to consolidate the data, for analysis purposes for example.

 

Let me take a classic example. Many enterprises use Salesforce.com and have their entire customer data located in the Salesforce environment. Let’s assume for a moment they close an order that was tracked in the forecasting system. The order now needs to be entered in SAP, and let’s assume that the SAP environment sits behind their firewall. Most of the information required is already available in Salesforce, as it was entered during the sales process. So, what I’d like to do is press a button, have the information transferred to SAP, add the missing pieces and place the order.

 

That sounds logical and simple, doesn’t it? Well, actually it isn’t. Salesforce takes care of CRM, SAP of order management (amongst other things). Sure, I can write the necessary scripts using the Salesforce PaaS environment, Force.com, to extract the data, transfer it to the SAP system, and use IDocs or any other technique to feed the information into SAP.
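To make the point concrete, here is a minimal sketch of what such a hand-crafted integration script could look like. It uses the standard Salesforce REST query API, but everything else is an assumption for illustration: the instance URL, the access token, the field mapping, and the internal gateway endpoint (sap-gateway.internal.example) that would turn the payload into an IDoc are all placeholders, not a real integration.

```python
import requests

# Illustrative placeholders only -- not real endpoints or credentials.
SF_INSTANCE = "https://myorg.my.salesforce.com"
SF_TOKEN = "00D...SESSION_TOKEN"  # OAuth access token (elided)
SAP_GATEWAY = "https://sap-gateway.internal.example/orders"  # hypothetical

def fetch_closed_opportunity(opp_id: str) -> dict:
    """Pull the won opportunity from Salesforce via its REST query API."""
    soql = (
        "SELECT Id, Name, Amount, Account.Name, Account.BillingCountry "
        f"FROM Opportunity WHERE Id = '{opp_id}' AND IsWon = true"
    )
    resp = requests.get(
        f"{SF_INSTANCE}/services/data/v58.0/query",
        params={"q": soql},
        headers={"Authorization": f"Bearer {SF_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"][0]

def push_order_to_sap(opp: dict) -> None:
    """Hand the mapped order to an internal gateway that builds the IDoc."""
    order = {
        "customer": opp["Account"]["Name"],
        "country": opp["Account"]["BillingCountry"],
        "net_value": opp["Amount"],
        "reference": opp["Id"],  # keep the Salesforce Id for traceability
    }
    resp = requests.post(SAP_GATEWAY, json=order, timeout=30)
    resp.raise_for_status()

if __name__ == "__main__":
    push_order_to_sap(fetch_closed_opportunity("006XXXXXXXXXXXX"))
```

Every field mapping in a script like this is something you now own: when Salesforce renames a field or the SAP interface changes, the script breaks and you fix it.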

However, it is my responsibility to maintain that link and update it every time a new version of either environment is released. As this data is sensitive, it’s also my responsibility to ensure it remains secure during the transfer. Do you see now why I speak of a headache?

 

Now think about what happens if you want to perform some business intelligence analysis without knowing exactly what you are looking for. How do I search the Salesforce environment and correlate what I find there with the corresponding information in SAP? It can be done, and companies are doing it, but it isn’t easy or straightforward. Do you now understand why I used the term “headache”?

 

Data in a service-oriented world

Let me go back to the “service-oriented world” I described earlier. Each elementary function is triggered by the orchestration engine and provided with the data it requires to perform its task. As soon as the task is done, the service returns its outcome status and the associated data to the orchestrator, which can store the data for later use by another elementary function. Using the outcome status (succeeded, failed or a specific code), the orchestrator then initiates the appropriate next step by triggering another service. Step by step, we work through the business process instance.
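Here is a toy sketch of that pattern: elementary services are modelled as plain functions that receive only the data they need and hand back a status plus results, while the orchestrator stores the results and uses the status to pick the next step. All service names, statuses and data fields are invented for the example.

```python
# Minimal orchestration sketch. Each elementary service receives only the
# data it needs and returns (status, data); the orchestrator keeps the data
# and routes on the outcome. All names are illustrative.

def check_credit(data):
    ok = data["amount"] <= data["credit_limit"]
    return ("succeeded" if ok else "credit_refused"), {"credit_ok": ok}

def reserve_stock(data):
    return "succeeded", {"reservation_id": "R-1001"}

def notify_sales_rep(data):
    return "succeeded", {}

# The workflow: step -> (service, data keys to pass, {outcome: next step})
WORKFLOW = {
    "check_credit": (check_credit, ["amount", "credit_limit"],
                     {"succeeded": "reserve_stock",
                      "credit_refused": "notify_sales_rep"}),
    "reserve_stock": (reserve_stock, ["amount"], {"succeeded": None}),
    "notify_sales_rep": (notify_sales_rep, [], {"succeeded": None}),
}

def run(process_data, step="check_credit"):
    while step is not None:
        service, keys, transitions = WORKFLOW[step]
        payload = {k: process_data[k] for k in keys}  # only what it needs
        status, results = service(payload)
        process_data.update(results)                  # store for later steps
        step = transitions[status]                    # pick the next step
    return process_data

print(run({"amount": 5000, "credit_limit": 10000}))
```

Notice that changing the process means editing the WORKFLOW table, not the services themselves.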

 

If some changes are required to the process, the design tool is used to adapt the orchestration workflow to the new situation. This may include:

  • Changing the order in which services are executed
  • Adding or removing steps
  • Changing decision criteria
  • Providing or retrieving different data items
  • Etc.

That can easily be done through a graphical designer and does not require programming. The designer needs access to two things: a library of services available to perform the process steps, and a description of the data structure that will hold the information. For each service, the input and output data should be documented, as well as the possible outcome statuses. With that, the process can be designed.
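What might an entry in such a service library look like? Here is one possibility, sketched as a simple descriptor; the schema is an assumption of mine, not a standard, but it captures the three things the designer needs: inputs, outputs and possible outcomes.

```python
# A sketch of one entry in the designer's service library. The descriptor
# documents inputs, outputs and outcome statuses so a process can be wired
# together without reading the service's code. Schema invented for
# illustration.
CHECK_CREDIT_DESCRIPTOR = {
    "name": "check_credit",
    "description": "Verify the order amount against the customer's credit limit",
    "inputs": {
        "amount":       {"type": "decimal", "unit": "EUR"},
        "credit_limit": {"type": "decimal", "unit": "EUR"},
    },
    "outputs": {
        "credit_ok": {"type": "boolean"},
    },
    "outcomes": ["succeeded", "credit_refused", "failed"],
}
```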

 

What’s the difference?

The main difference lies in the data and how it is handled. In the first scenario, the data is located with the service and is therefore distributed across the different service locations, as in the Salesforce example. If I want to re-use that data on another platform, I have to create a synchronization mechanism.

 

Remote access, or duplicating the data and keeping the two sources in sync, are the typical methods used when remote services are consumed.
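A naive version of the second option would periodically pull records changed since the last run and upsert them into a local copy. The sketch below assumes the remote provider exposes a “changed since” query of some kind; the table name, record shape and timestamps are all illustrative.

```python
import sqlite3

def sync_changed_accounts(fetch_since, db_path="local_copy.db"):
    """Pull records modified since the last sync and upsert them locally.

    `fetch_since` is any callable returning [(id, name, modified_at), ...]
    for records changed after the given timestamp -- e.g. a wrapper around
    the remote provider's API. Everything here is illustrative.
    """
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS accounts
                   (id TEXT PRIMARY KEY, name TEXT, modified_at TEXT)""")
    row = con.execute("SELECT MAX(modified_at) FROM accounts").fetchone()
    last_sync = row[0] or "1970-01-01T00:00:00Z"  # first run: fetch all
    for rec in fetch_since(last_sync):
        con.execute("INSERT OR REPLACE INTO accounts VALUES (?, ?, ?)", rec)
    con.commit()
    con.close()
```

Even this toy version hints at the cost: you now own conflict handling, deletions, schema drift and the scheduler that runs the job.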

 

In the service-oriented world, things are different. The data is maintained in one central environment, and only the required information is sent to the service for execution. Obviously, that data can be encrypted before being sent, ensuring its security and confidentiality. The advantage is that the persistent data is kept in a single place, addressing many of the fears and needs discussed in my previous blog entry. We will, however, need sufficient bandwidth and low latency to ensure the steps in our processes run as fast as they need to. This will also result in increased traffic, which some service providers charge for. So, you need to be aware of what you are getting into, but frankly I believe the benefits outweigh the drawbacks.
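Encrypting the payload before dispatch is straightforward with standard tooling. Here is a minimal sketch using symmetric encryption via the Fernet recipe from Python’s `cryptography` package; key management is deliberately glossed over, and generating the key inline is for illustration only.

```python
import json
from cryptography.fernet import Fernet

# In practice the key would come from a key-management service shared with
# the target service; generating it inline is illustrative only.
key = Fernet.generate_key()
fernet = Fernet(key)

def dispatch(payload: dict) -> bytes:
    """Encrypt the step's input data before it leaves the central store."""
    return fernet.encrypt(json.dumps(payload).encode("utf-8"))

def receive(token: bytes) -> dict:
    """What the remote elementary service would do on arrival."""
    return json.loads(fernet.decrypt(token).decode("utf-8"))

token = dispatch({"amount": 5000, "currency": "EUR"})
print(receive(token))  # {'amount': 5000, 'currency': 'EUR'}
```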

 

Conclusion

Separating business process logic from elementary transactions may be as foreign to our current software development thinking as separating code and data was 30 years ago, when the database was invented. I believe the technology is there to take this next step, and I hope I have given you some good reasons to do so. But there will be strong push-back from the leading ISVs. Some of them are only reluctantly moving to the cloud, so they will resist converting their crown jewels into a series of elementary services from which I can pick and choose. Yes, they would be paid every time I use one of those services, but such a financial model is far removed from the current licensing model.

 

But the market will push them, as centralized data is so much easier to manage. And by the way, if the existing ISVs don’t move, new start-ups will see this as an opportunity to build a business.


If you’re interested in HP’s building blocks for a service-oriented world, click here.

If you’re looking at developing applications for a service-oriented world, click here.
