Modern applications, it seems, must have social, mobile and analytic capabilities. As companies strive for greater flexibility, both for their employees and for their business model, on-the-road access to corporate information is an expectation. The BYOD movement takes it a step further and places application delivery on employee-selected hardware -- another level of flexibility.
These changing expectations are taking place within the existing application portfolio, as well as expanding into the thin air of marketing strategies and non-traditional IT solutions. Most businesses have a search underway for additional techniques that can drive strategic business process changes, shifting the behavior of employees, partners and consumers. Gamification is one of those.
Today's consumers increasingly access social networks on mobile devices that are rarely under the control of the business. Sentiment analysis of public forums, and the aggregation of that information with demand-creation systems (for promotions, product feedback and the like), can provide significant insight into how consumers view products and services. As the way mobile devices are used shifts, a greater contextual understanding of consumers and their interactions can be gained, further refining an organization's understanding of its consumer base.
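To make the sentiment-aggregation idea concrete, here is a minimal sketch in Python. The product names, posts and the tiny hand-built word list are all illustrative assumptions; a production system would use a trained sentiment model and a real social-media feed.

```python
# Minimal lexicon-based sentiment sketch (illustrative only): score each
# post, then aggregate net sentiment per product so it could feed a
# promotions or product-feedback system.
from collections import defaultdict

# Tiny hand-built lexicon -- a real system would use a trained model.
POSITIVE = {"love", "great", "fast", "reliable"}
NEGATIVE = {"hate", "slow", "broken", "buggy"}

def score(text: str) -> int:
    """Net sentiment of one post: +1 per positive word, -1 per negative."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def aggregate(posts):
    """posts: iterable of (product, text) pairs -> {product: net sentiment}."""
    totals = defaultdict(int)
    for product, text in posts:
        totals[product] += score(text)
    return dict(totals)

posts = [
    ("phone-x", "I love how fast this phone is"),
    ("phone-x", "battery life is broken and buggy"),
    ("tablet-y", "great screen and reliable build"),
]
print(aggregate(posts))  # → {'phone-x': 0, 'tablet-y': 2}
```

The interesting part is the aggregation step, not the scoring: once sentiment is rolled up per product, it becomes a signal other business systems can consume.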
Some people ask why all this detailed analysis is needed. Is there something different about business today than in the past? I say it is the “need for speed”, since the lifespan of products (let alone companies) is steadily declining. Any way to eke out a bit more life or revenue is becoming important.
If an organization can develop a support network on the Internet (or within a large company), engage customers and build communities – all the better. This means that products that are flexible, configurable and able to integrate into other business systems have a definite advantage. Do you look at your systems from an integration point of view?
There is another change underway at the foundations of how mobile devices are used. Flurry reports that mobile time spent on social networks increased by 60% between Q1 2011 and Q1 2012. This is significant because gaming and entertainment have always been a key focus for mobile apps, and now people are using their devices for more serious concerns.
For those organizations exploring this space, there are great possibilities, but it will be important to ensure that each application provides a real, cost-effective benefit. Know what you expect and then measure against it. If you see something different, at least you learned something. If you confirmed your expectations, you’ve moved from supposition to validation.
Last week, I was part of a panel discussing innovation and technical adoption with a number of CEOs in the Dallas, Texas area. During the discussion we talked about the opportunities that exist around us and the new type of business models that will be driving organizations forward.
I was asked what kind of research is needed for organizations today to match the new service opportunities of tomorrow. After the meeting, some other folks in the HP Services and Solutions lab went through a few iterations to come up with a short paragraph that captures the essence of our thinking:
“Staying aligned with rapidly evolving business needs will require future enterprises to be agile and dynamic. The ability to identify and link related data, establish the right information flow, connect people and information, and provide insights on information is crucial in enabling decision making from an ever increasing stream of information. Research is needed to reduce the time to action for the enterprise, and streamline the organizational changes necessary to proactively react to the competitive landscape of the firm. In the enterprise of the future, not only employees but also customers influence success, it is important to establish the relationships and foster the collaborative culture among employees, customers, suppliers and the enterprise, and engage this ecosystem in generating value. Enabling this vision will require automated capture of digital information, technologies for connecting people-to-people and people-to-information, platforms for data analysis, response automation, context recognition, dynamic configuration capabilities, innovative collaborative technologies and knowledge enabled decision-making. As business becomes more digital (and social), these advances will be the foundation and measure for the value of IT in the enterprise.”
Tomorrow I’ll have another post about the vision implied by this research.
McKinsey just released a report titled: Are you ready for the era of ‘big data’? The article asks a few big questions to help people think about their business and the future in a world abundant with data and the analytics to make sense out of it. Those questions are:
- What happens in a world of radical transparency, with data widely available?
- If you could test all of your decisions, how would that change the way you compete?
- How would your business change if you used big data for widespread, real-time customization?
- How can big data augment or even replace management?
- Could you create a new business model based on data?
These are the kinds of questions organizations need to ask when shifting from a world based on scarcity to one based on abundance. (Wow, I just noticed I started talking about this issue back in 2005.) Notice they are not asking questions about where the data comes from – they assume it exists.
This will affect the kind of employees organizations will need as well as the types of demands that will be placed on the ecosystem that surrounds the organization. Partners will need to share more than just their work output, or maybe it is a shift in thinking what constitutes the output of an organization. Its data and insight will likely become a primary output, instead of an afterthought.
Forrester recently released an analysis of the concept of a Cloud Broker. Although there was much to agree with, I actually view it a bit differently. Their view is too hardware-centric for my taste. This is not something new, and we (in HP) have been talking about it for years. My view is that the various components of a cloud approach build on top of each other, as I’ve described before.
The various market components can interact in different ways to provide value to the marketplace.
- Cloud Infrastructure provider – Infrastructure operators have been around for decades. Organizations that specialize in this area focus on the automation, security and performance needed to take what has traditionally been a business-by-business approach into a multi-tenant solution that can be charged “by the pound”. This is the core of the IaaS space.
- Software vendors – These players are critical to the cloud since their solutions are what actually add the business value on top of the lower-cost solutions provided by the Cloud Infrastructure providers. Their software (and just as importantly their licensing) needs to change to enable the cloud shift for organizations. Taking advantage of the many cores available in the new leveraged environments, and having solutions fail over gracefully with no downtime, are things the software vendors need to continue to address.
- Industry consulting – As more of the infrastructure operations and management functions are provided by third parties, the need to integrate it all into a unified, business-value-generating solution still remains. Consultants still play a role, bringing deep expertise to bear on the strategic cloud move.
Businesses specialize at the intersection of these IT industry components:
- SaaS – this is where the software intellectual property mentioned earlier meets the cloud infrastructure. Business application functionality is offered as a service by subscription. The consumer does not normally see the IaaS issues that may reside under the SaaS layer.
- Cloud Integrator – This is where the IaaS capabilities are used by consultants to help modernize both the infrastructure and the application portfolio and make them operate effectively in this new environment. Since the value needs to be generated against the organization’s business model, integrating the various components can be critical to effective use of the cloud.
- BPO – Business process outsourcing has been around for decades and it usually involves access to both the software IP as well as the personnel with the industry (or at least process skills) required to take on specific business functions. It allows organizations to concentrate on areas where they want to focus and offload other business functions to experts in that area. Some organizations have relabeled this to BPaaS so it can align to the XaaS abbreviation model, but BPO has always been cloudlike.
Finally, at the hub of these intersections is the cloud broker. This function may do all of the items previously mentioned, or just perform third-party administration, ensuring that a unified solution meets the needs of the business.
They need to have expertise across a wide range of hardware capabilities, from servers through desktop and mobile platforms – after all, the computer you have with you all the time is likely a smartphone.
They must also understand the analytics and user interface issues to weave together all these possibilities into a solution that is coherent and effective for the business. No one wants to be distracted by user interface or data inconsistencies. The elimination of latency through the use of automated workflow and techniques that allow people to focus on the anomalies and automate “normal” is key here as well.
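The "automate the normal, focus on the anomalies" idea can be sketched very simply. The sketch below uses a z-score band over recent history; the threshold, the sample data and the triage labels are all illustrative assumptions, not a description of any specific product.

```python
# Sketch of "automate normal, surface anomalies": handle values within a
# z-score band automatically, and queue outliers for human review.
import statistics

def triage(history, new_value, threshold=3.0):
    """Return 'auto' if new_value is within `threshold` standard
    deviations of the historical mean, otherwise 'review'."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z = abs(new_value - mean) / stdev if stdev else float("inf")
    return "auto" if z <= threshold else "review"

history = [102, 98, 101, 99, 100, 103, 97]
print(triage(history, 101))   # typical value -> handled automatically
print(triage(history, 250))   # outlier -> escalated to a person
```

The design point is that the workflow engine only interrupts a human for the "review" cases; everything else flows straight through, which is exactly the latency elimination described above.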
There can be a range of capabilities in this cloud broker space, from the simple sourcing manager all the way through the business model integrator who deeply understands the business and technology objectives of the organization and is constantly on the lookout for technology solutions that disrupt the status quo. They need to be experts on the kinds of problems that usually hurt cloud deployments. Some of the descriptions of the cloud broker function that are out there make it seem like an IT management function, but to me its roots are deeper in the business needs and expectations than that.
What is common across this range of cloud brokers is that the IT complexities of managing the workload and the vendors should be hidden from the end user – and to some extent the end business. After all, if they wanted to know these details and manipulate the controls, they would have someone on staff performing the function instead of purchasing it as a service. Many CIOs are going to spend much of their future effort in the cloud broker function, among other things.
Although this cloud broker function may seem like a vision, there are a few organizations that HP supports today where this level of performance is expected and delivered.
There are some recommendations I can agree with though:
1) Understand your window of opportunity – We can all see this happening. Know what is happening in the marketplace and, if you retain your infrastructure, what the price point is at which you would change to a different model. If you don’t understand this point, you are passively deciding to stick with the status quo.
2) Build trust with the right partners – I used the term partners here because I’m talking about a close relationship. Understand where they are headed. What are the lock-in issues? How does that make you feel? The farther out your headlights shine, and the sooner the direction and issues are understood, the less likely you are to over-steer later.
3) Plan to invest – There are many skills involved, and you may not have them in-house. You will likely need to do some pilots to understand the financial and behavioral implications, as well as the impact on the staff itself.
UML (Unified Modeling Language) provides a powerful capability for modeling software systems. The various modeling elements and views, along with an extensibility mechanism, can be applied for modeling other systems. So why not use UML to model any system? Users would only need to buy one modeling tool, and the market for UML tools would be expanded.
Standard extensions of UML for specialized applications are called "UML profiles." OMG (Object Management Group) has defined a number of UML profiles. For example, there is a UML profile for modeling physical systems called SysML, a UML profile for modeling software development processes called SPEM (Software Process Engineering Metamodel), a UML profile that extends UML for specification and integration of services called SoaML, and a UML profile for DODAF (Department of Defense Architecture Framework) and MODAF (Ministry of Defense Architecture Framework) called UPDM.
A UML profile can be imported into a standard UML tool to create a specialized modeling environment. UML tools support views (graphical displays) that are appropriate to a variety of modeling applications. The extensibility mechanisms of UML enable existing UML elements to be adapted as new types of model elements and attributes. While a UML tool may be able to support useful diagrams, in the long term users will suffer from a limited modeling capability.
There are three basic limitations: (1) the UML modeling elements were designed to represent the concepts of a software system design, (2) the graphical representations are not designed for most effective human understanding of the models, and (3) a UML tool does not provide the computational support for structural consistency and functional analysis.
One of the problems encountered in the design of a UML profile is the design baggage that restricts the use of UML elements. Design of a UML profile (a specialized language) requires deep knowledge of the design of UML in order to work around these restrictions. The result is a compromise that provides a less than ideal representation of the problem domain. The semantics of the profile are in the profile documentation and minds of the users. The tool only reflects the semantics of software design.
UML does not provide good quality graphical representations for modeling software, and this does not get any better for UML profiles. An excellent paper by Daniel L. Moody, "The Physics of Notation: Towards a Scientific Basis for Constructing Visual Notations in Software Engineering," refers to UML for examples of the ineffective use of graphical notation. Users should be able to quickly associate the graphical representation of a concept with its semantics. In many cases UML makes very limited use of visual variables, thus obscuring semantic distinctions, and in many cases combinations of these variables create complexity. For example, combinations of three visual variables (shape, brightness and texture) are used to distinguish 20 different types of relationships in a UML class diagram. In a UML profile, a text label is often used to identify a concept. Text requires less efficient, serial human interpretation (reading) while distinctions using graphical shapes are interpreted through parallel, image recognition. As a result, good graphics make it easier and faster for users to grasp the meaning of a diagram.
Perhaps the most important limitation of a UML profile is that the modeling tool does not provide computational support for structural consistency and functional analysis of the model. Enforcement of structural consistency of the model using a UML profile is based on consistency of UML software design models.
A good modeling tool will help the user develop a consistent and complete model. Furthermore, a robust tool will assist in the analysis and improvement of the model. For example, BPMN 2.0 (not a UML profile) defines a modeling language for specification of business processes. This includes modeling of executable processes and choreographies – interactions between participants supported by their executable internal processes. A good BPMN tool will support analysis of whether an internal process complies with the interactions specified by a choreography. A robust tool might also provide simulation capabilities to analyze the flow of business transactions.
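To give a feel for what such a compliance check involves, here is a deliberately simplified sketch: real BPMN tools analyze full process graphs with branching and timing, but the core idea can be reduced to checking that the choreography's message sequence appears, in order, among a process's send/receive events. The event names and the `complies` helper are hypothetical, invented for illustration.

```python
# Toy choreography-compliance check: an internal process complies if the
# choreography's required message exchanges occur in order among its
# events (private internal steps may be interleaved freely).

def complies(process_events, choreography_msgs):
    """True if choreography_msgs is an ordered subsequence of process_events."""
    it = iter(process_events)
    # `msg in it` advances the iterator, so order is enforced.
    return all(msg in it for msg in choreography_msgs)

# Choreography: buyer sends Order, receives Invoice, then sends Payment.
choreography = ["send:Order", "recv:Invoice", "send:Payment"]

# Internal buyer process, with extra private steps interleaved.
process = ["check_stock", "send:Order", "log", "recv:Invoice",
           "approve", "send:Payment", "archive"]

print(complies(process, choreography))        # → True
print(complies(["send:Order"], choreography)) # → False: exchanges missing
```

This is exactly the kind of structural/functional analysis the surrounding text argues a generic UML profile tool cannot provide, because the check depends on domain semantics (message ordering) that live outside the UML metamodel.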
These functional capabilities are not defined by the standard, but will be developed as differentiators by the modeling tool vendors. Addition of such functional capabilities for a UML profile would require implementation of functionality that may not be consistent with the use of the tool for generic UML. The resulting profile language would still not be the best representation and visualization of the problem domain.
The development of UML profiles may provide a quick and cheap solution to modeling new problem domains, but, in the long term, it provides less effective domain representation and visualizations, and it undermines the market incentives for development of functionality that improves modeling efficiency and the quality of results.