The OMG (Object Management Group) is in the early stages of developing a Case Management Process Modeling (CMPM) specification. This post considers the relevance of case management to project management. Case management involves business processes that cannot be defined ahead of time, but are instead defined and adapted as the particular undertaking evolves.
I first talked about case management in Case Management: The Missing Link in BPM. More recently I discussed Case Management for Managers in which I explored how automation might assist managers in managing and coordinating activities to achieve business objectives.
For case management, a process is not designed for repeated use. Instead a process is planned interactively for the particular case as it evolves. An initial plan for a case may specify everything that is needed to reach an objective assuming everything goes as expected, or it may be as little as the things currently being done and some consequential action such as the next activities or a session to plan what to do next. In either case, the actual plan is expected to be adapted in response to actual needs and events. This is similar to the actual process of many project management efforts except that project management usually relies on a pre-defined project plan with corrective action for deviations.
Modeling for case management includes the specification of various patterns, what we have been calling process fragments in discussions of the CMPM specification. These fragments have dependencies on inputs and may produce outputs that subsequent activities depend upon. A case management effort will involve the specification of these fragments along with specific activities and events to evolve what needs to be done and when. The model for a type of case may define the high-level phases that are typical of such cases. The pre-defined fragments will incorporate insights and best practices at a finer level of granularity so that they can be leveraged for many undertakings. Over time these fragments can be refined and extended to improve the efficiency, timeliness and quality of case management.
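As a rough illustration (not CMPM syntax; all names here are invented for the sketch), a process fragment can be thought of as a unit with declared input dependencies and produced outputs, so a case plan can determine at runtime which fragments are ready to be applied:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fragment:
    """A reusable process fragment: depends on inputs, produces outputs."""
    name: str
    inputs: frozenset   # artifacts that must exist before this fragment runs
    outputs: frozenset  # artifacts it contributes for subsequent activities

def ready_fragments(library, available):
    """Fragments whose input dependencies are satisfied by available artifacts."""
    return [f for f in library if f.inputs <= available]

# A tiny fragment library for an imaginary case type
library = [
    Fragment("draft-plan", frozenset(), frozenset({"plan"})),
    Fragment("review-plan", frozenset({"plan"}), frozenset({"approved-plan"})),
]

available = frozenset()
print([f.name for f in ready_fragments(library, available)])  # only "draft-plan"
available |= library[0].outputs
print([f.name for f in ready_fragments(library, available)])  # "review-plan" is now ready too
```

The point of the sketch is only the dependency structure: as the case evolves and artifacts accumulate, additional pre-defined fragments become applicable without the whole process having been laid out in advance.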
I'll use the Eclipse Process Framework (EPF) and the Open Unified Process (OpenUP) as a basis for considering the impact of case management on project management. EPF is a set of open source tools developed by the Eclipse Foundation for managing development processes, and OpenUP is a methodology and associated process patterns for software engineering that is supported by EPF. EPF implements SPEM (Software and Systems Process Engineering Metamodel), a development process modeling specification from the OMG.
In case management, the primary process structure of OpenUP can be used as case phases supported by the OpenUP patterns as process fragments the same as with EPF. The difference is that the process can be continuously updated with the addition or modification of process fragments at runtime, as the project progresses and evolves and as more becomes known. Case management will also track project activities including the addition or repetition of activities that actually occur in a project. Tracking will provide insight for further improvement of the planning fragments for future projects.
EPF would allow the development process to be adapted, but adaptation is expected to be an off-line activity. Case management can exploit the availability of the Internet and mobile devices to go beyond traditional project management with monitoring and responding to events and engaging participants more directly. Management of the activities and adaptation of the process involves on-going interaction with participants. The activities may also be more fine-grained to track progress and changes more closely. Activities in the case management process may engage outside services and track their completion. In addition, the case file will maintain access to information artifacts that define the state of the project and flag events that drive subsequent tasks, planning and decision-making.
I believe these added capabilities will foster project management that is more responsive, more closely tracks project progress, improves collaboration and communications, reduces delays through prompting based on events, and provides history to reduce delays and improve efficiency and quality of results.
UML (Unified Modeling Language) provides a powerful capability for modeling software systems. The various modeling elements and views, along with an extensibility mechanism, can be applied for modeling other systems. So why not use UML to model any system? Users would only need to buy one modeling tool, and the market for UML tools would be expanded.
Standard extensions of UML for specialized applications are called "UML profiles." OMG (Object Management Group) has defined a number of UML profiles. For example, there is a UML profile for modeling physical systems called SysML, a UML profile for modeling software development processes called SPEM (Software and Systems Process Engineering Metamodel), a UML profile that extends UML for specification and integration of services called SoaML, and a UML profile for DODAF (Department of Defense Architecture Framework) and MODAF (Ministry of Defence Architecture Framework) called UPDM.
A UML profile can be imported into a standard UML tool to create a specialized modeling environment. UML tools support views (graphical displays) that are appropriate to a variety of modeling applications. The extensibility mechanisms of UML enable existing UML elements to be adapted as new types of model elements and attributes. While a UML tool may be able to support useful diagrams, in the long term users will suffer from a limited modeling capability.
There are three basic limitations: (1) the UML modeling elements were designed to represent the concepts of a software system design, (2) the graphical representations are not designed for most effective human understanding of the models, and (3) a UML tool does not provide the computational support for structural consistency and functional analysis.
One of the problems encountered in the design of a UML profile is the design baggage that restricts the use of UML elements. Design of a UML profile (a specialized language) requires deep knowledge of the design of UML in order to work around these restrictions. The result is a compromise that provides a less than ideal representation of the problem domain. The semantics of the profile are in the profile documentation and minds of the users. The tool only reflects the semantics of software design.
UML does not provide good-quality graphical representations for modeling software, and this does not get any better for UML profiles. An excellent paper by Daniel L. Moody, "The Physics of Notations: Toward a Scientific Basis for Constructing Visual Notations in Software Engineering," cites UML for examples of the ineffective use of graphical notation. Users should be able to quickly associate the graphical representation of a concept with its semantics. UML often makes very limited use of visual variables, obscuring semantic distinctions, while in other cases combinations of these variables create complexity. For example, combinations of three visual variables (shape, brightness and texture) are used to distinguish 20 different types of relationships in a UML class diagram. In a UML profile, a text label is often used to identify a concept. Text requires less efficient, serial human interpretation (reading), while distinctions using graphical shapes are interpreted through parallel image recognition. As a result, good graphics make it easier and faster for users to grasp the meaning of a diagram.
Perhaps the most important limitation of a UML profile is that the modeling tool does not provide computational support for structural consistency and functional analysis of the model. Enforcement of structural consistency of the model using a UML profile is based on consistency of UML software design models.
A good modeling tool will help the user develop a consistent and complete model. Furthermore, a robust tool will assist in the analysis and improvement of the model. For example, BPMN 2.0 (not a UML profile) defines a modeling language for specification of business processes. This includes modeling of executable processes and choreographies: interactions between participants supported by their executable internal processes. A good BPMN tool will support analysis of compliance of an internal process with the requirements of interactions specified by a choreography. A robust tool might also provide simulation capabilities to analyze the flow of business transactions.
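To make the idea of such a compliance check concrete, here is a toy conformance check (a sketch only; real BPMN 2.0 choreography semantics are far richer): it verifies that the messages a choreography requires appear, in order, within the trace of messages an internal process actually exchanges:

```python
def complies(internal_trace, choreography):
    """True if the choreography's required message sequence occurs,
    in order, within the internal process's message trace.
    (Toy check: ignores branching, timing and participant roles.)"""
    it = iter(internal_trace)
    # Each required message must be found after the previous one.
    return all(msg in it for msg in choreography)

choreography = ["order", "confirm", "invoice"]
print(complies(["order", "log", "confirm", "invoice"], choreography))  # True
print(complies(["order", "invoice", "confirm"], choreography))         # False
```

A real tool would check the full interaction contract (gateways, roles, correlation), but even this subsequence test shows the kind of structural analysis a profile-based generic UML tool would not provide out of the box.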
These functional capabilities are not defined by the standard, but will be developed as differentiators by the modeling tool vendors. Addition of such functional capabilities for a UML profile would require implementation of functionality that may not be consistent with the use of the tool for generic UML. The resulting profile language would still not be the best representation and visualization of the problem domain.
The development of UML profiles may provide a quick and cheap solution to modeling new problem domains, but, in the long term, it provides less effective domain representation and visualizations, and it undermines the market incentives for development of functionality that improves modeling efficiency and the quality of results.
There was a question the other day by some folks (in the Enterprise irregulars) about the capabilities and the pitfalls of modeling and code generation -- something that's been dear to me since the 1980s. We have definitely moved the ball toward the goal, but there is still a long way to go.
EDS actually wrote quite a bit of model-based and model-consuming software back in the 1980s. There were also some fairly powerful VAX implementations of CASE tools (Software through Pictures was one) and some fairly lame mainframe-based solutions (e.g., Telon, which is now owned and maintained by CA and is hopefully much better). Here is my off-the-cuff response about model-based efforts:
A couple of areas of concern:
1) lock-in: EVERY tool I have ever seen has been proprietary in how it stored its models, and the company that created the modeling software (or at least the project) has been on the edge of being canceled. When that happens, all the code generated is relatively unsupportable (at least at the level of productivity expected when you bought the tool). Will you move the model to another tool manually?
2) standards: there are a number of standards (e.g., BPML and BPMN) for modeling, and they continue to churn every year. Just because you modeled something today does not mean the model will be as valid in a couple of years; it will likely need to be reworked every few releases as the generating software advances.
3) level of automation: The question for me is: "Will the tool actually model the issues that are causing the organization pain over the long haul?" Or is it modeling something that is done once and unlikely to change? Does it support round-trip engineering? It is relatively straightforward to model the generation of the user interface or the data connection, but what about 3-5 years from now? Is that the area of the environment likely to change the most and need increased productivity to address the change? Basic modeling capabilities advance as products advance as well -- many of the capabilities of MS Access today (or even IDEs like Visual Studio) exceed the capabilities of the CASE tools of the 1990s. Each generation of a development environment tries to improve code creation. Few do anything about maintenance, and yet that is where the most significant costs of the product life cycle reside.
4) Some developers hate it: modeling business processes is not what they signed up for. They want to solve technical problems, not business problems. A business analyst uses modeling tools.
5) Modeling makes certain assumptions that support the abstraction. If the problem does not fit the assumptions, then the solution may be more difficult and less satisfactory. You can't design an electrical system with a mechanical engineering CAD tool.
On the other hand there are tremendous needs to be addressed:
1) Software creation is expensive: Taking people out of the development process limits the need to take work off-shore. Modeling tools significantly widen the communications conduit between the developer (modeler) and the customer, which should reduce the depth and breadth of rework. These tools are part of a strategic approach to addressing the high cost of code creation and maintenance.
2) Software creation is error prone: Automatically generating the code should significantly reduce the number of errors since the software will be generated the same way each time.
3) Software creation is increasingly complex: A code generator can crank out WSDL and read SOAP quickly, and portray it in a visual context far better for most developers than staring at XML ever will. Most developers are near average, and average developers may not handle the complexity of today's enterprise environments -- especially as we move to parallel processing in the cloud. Mere mortals don't write code for many cores in a 3rd-generation language.
4) Modeling can enable applications to survive technology change, so the code can be re-generated when the supporting technology is upgraded or the application is transferred to a new platform.
5) Modeling opens the door to automated analysis at a higher level, so the modeling tools can have built-in assists and quality checks.
6) Modeling usually provides better support for rapid prototyping and collaboration on the solution with the customer.
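Points 2 and 4 above rest on the same mechanism: the generator expands a declarative model through a fixed template, so every entity comes out the same way every time, and the whole thing can be regenerated when the target technology changes. A minimal template-based sketch (the model and template here are invented for illustration):

```python
# A declarative "model" and a fixed template: every entity is expanded
# identically, which is why generation reduces hand-coding errors.
MODEL = {
    "Customer": ["name", "email"],
    "Order": ["customer_id", "total"],
}

def generate(model):
    """Expand each model entity into a Python class definition."""
    parts = []
    for cls, fields in model.items():
        assigns = "\n".join(f"        self.{f} = {f}" for f in fields)
        parts.append(
            f"class {cls}:\n"
            f"    def __init__(self, {', '.join(fields)}):\n"
            f"{assigns}\n"
        )
    return "\n".join(parts)

namespace = {}
exec(generate(MODEL), namespace)   # the generated classes are immediately usable
customer = namespace["Customer"]("Ada", "ada@example.com")
print(customer.email)              # ada@example.com
```

Swapping the template (to a different language or framework) regenerates the whole application from the same model, which is the survive-technology-change argument in point 4.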
I am sure there are a number of other aspects of the issue that I've left out.
I listened to a podcast by Setrag Khoshafian of Pegasystems in which he talked about "Smart Case Management and CRM." It struck me that case management has potential to improve continuity of service, efficiency and reliability in a number of contexts that may not be considered case management.
I like the following definition of case management presented by Singularity in a white paper entitled "Case Management: Combining Knowledge with Process":
"Case management is the management of long-lived collaborative processes that require co-ordination of knowledge, content, correspondence and resources to achieve a goal or objective. The path of execution cannot be pre-defined. Human judgment is required in determining how to proceed and the state of a case can be affected by external events."
The goal of case management technology is to provide computer support for managing these ad hoc processes. Henk de Man of Cordys provides an extensive review of past approaches to case management modeling in his BP Trends article Case Management: A Review of Modeling Approaches.
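The Singularity definition can be paraphrased as a data structure: a long-lived case carries a goal, a case file of content and correspondence, and a history, while external events and human decisions, rather than a pre-defined path, drive its state. A toy sketch (the class and its names are illustrative, not drawn from any standard):

```python
class Case:
    """A long-lived case: knowledge, content and history toward a goal."""
    def __init__(self, goal):
        self.goal = goal
        self.state = "open"
        self.file = {}      # content and correspondence, keyed by name
        self.history = []   # audit trail of everything that happened

    def record(self, name, artifact):
        """Add content to the case file."""
        self.file[name] = artifact
        self.history.append(("recorded", name))

    def on_event(self, event, new_state=None):
        """External events can affect the state of the case."""
        self.history.append(("event", event))
        if new_state:
            self.state = new_state

    def decide(self, actor, action):
        """Human judgment determines how to proceed."""
        self.history.append(("decision", actor, action))

case = Case("resolve billing complaint")
case.record("complaint-letter", "...")
case.on_event("customer escalated")
case.decide("supervisor", "offer refund")
case.on_event("refund accepted", new_state="closed")
print(case.state, len(case.history))  # closed 4
```

Note what is absent: there is no flow graph. The sequence of calls above is one possible path through the case, not a process definition, which is exactly the distinction from conventional BPM.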
In Setrag's discussion of smart case management, he describes using case management to manage a customer support interaction through participation of various organizations that otherwise might each provide their contribution within the scope of their own silo. Case management can provide a unified view of the customer, the problem to be solved and the cumulative knowledge about that problem without each silo starting their analysis from scratch. This also provides some opportunity to personalize the customer relationship as the solution is developed. However, I see a greater opportunity that, I think, goes beyond Setrag's solution.
I see the opportunity for case management to provide continuity and personalization of service for the on-going customer relationship: an on-going case. This is more like a primary physician's patient care, which retains a patient history and incorporates knowledge of not only the patient's current malady but a complete file of the patient's problems, treatments and responses, potentially over years.
This opens up many more possible applications of case management. Look again at the Singularity definition. Isn't this what managers do? The difference is that, unlike the common perception of business processes, management processes go on and on. A manager has a collection of documents pertaining to his or her responsibility and the state of the organization, as well as a history of problems solved and initiatives completed: a manager's case file. He or she manages various problems and initiatives to fulfill that responsibility, engaging the participation of others. The focus of the manager's case is the functional capability of the organization. The manager collaborates with others, applies business knowledge and experience, and performs ad hoc planning and decision-making in pursuit of his or her business goals. At the same time, the manager engages other employees and services, applies business policies and regulations, and must respond to various events, both external and those that arise in the operation of the organization.
Couldn't we approach any business responsibility in a similar manner? It seems that this characterizes the job of a quality assurance manager, an information security manager, a vendor relationship manager, a strategic planning facilitator, the chair of a task force and many other on-going business responsibilities involving cross-enterprise collaboration.
The Object Management Group (OMG) has issued a Request for Proposals (RFP) for a Case Management Process Modeling specification. Work on this specification is in the early stages. I don't know if we can address this case-management-for-managers capability, but I think it's worth considering.
In a recent article, Jack Vaughan quoted Jan Baan as saying, "The successor of ERP is BPM....ERP is becoming the model of complexity. It has become too complicated." Baan is CEO of Cordys and former head of Baan Corporation, an ERP vendor. BPM (Business Process Management) is the leading edge of a major change in enterprise systems.
Much has changed since the heyday of ERP. The Internet and internet technology have changed communications and integration. Businesses are on-line and accessible from anywhere, any time. The marketplace has become global for all enterprises, not just large companies. The pace of change has accelerated, and information technology is pervasive in the operation of the enterprise and in society in general. Business services are accessible, ad hoc, over the Internet, and service oriented architecture is changing not only the design of computer systems, but the design of enterprises.
ERP systems are traditionally monolithic solutions for automation of business operations. They provide a solution for a particular way of doing business and typically require a major investment in implementation and adaptation to align the solution and the business operation.
The future requires enterprises, as well as systems, designed for change. BPM, supported by business process management systems (BPMS), enables flexible automation of business processes. ERP systems traditionally embed business processes in the code, so that changes to business processes become IT projects. A BPMS provides the opportunity to model and automate business processes in a way that is visible and adaptable by business people. The BPMN 2.0 standard from OMG (Object Management Group) defines the notation and modeling elements for defining and automating repeatable business processes.
Not all business processes can be pre-defined and repeatable. Some processes require ad hoc planning and decision-making by humans where actions are driven by the state of a case and records related to the case. We characterize such processes as case management processes. Some ERP systems provide record-keeping for case management. Many other case management processes remain paper-based and manual because they don't fit the definable, repeatable model supported by most BPMSs. OMG has initiated development of a Case Management Process Modeling (CMPM) standard for design and automation support for these processes, thus moving beyond the automation capabilities of ERP systems.
BPM, including case management, will improve the effectiveness and agility of business processes, but it does not necessarily improve the agility and overall efficiency of the enterprise. Enterprise agility and efficiency involves changes to the design of the enterprise. It requires the ability to rapidly adapt or create business capabilities and the ability to engage existing capabilities in new ways. ERP systems are a barrier to such changes. This agility is enabled by exposing business capabilities as sharable services.
In Business Capability Mapping: Staying Ahead of the Joneses, Denise Cook describes how business capabilities, at an appropriate level of granularity, represent stable components of the enterprise. As services, these capabilities can be engaged by different business processes to meet the needs of different lines of business and adapt to changing business strategies. These services must be loosely coupled so that they are independent of the individual lines of business to which they contribute or may contribute in the future. In Why SOA is a Business Necessity, I describe the importance of this service oriented architecture (SOA).
The identification and management of business capabilities as services requires a specialized discipline beyond conventional BPM. This need is addressed by value chain modeling. Value chain modeling was introduced by Michael Porter in 1985. For more on value chain modeling and its various incarnations, see Value Chains and Other Processes, by Paul Harmon. Value chain modeling brings a focus on the delivery of customer value and the capabilities that are required. This analysis has been used at the executive level in strategic planning, but it typically lacks the detail to identify the stable units of capability that should be engaged as services. This more detailed analysis as well as the design and management of shared services require a modeling environment to manage the complexity.
This need will be addressed by a specification for value chain modeling being developed at OMG in response to the Value Delivery Metamodel RFP. A value chain model will define the network of activities that contribute to the delivery of customer value. These activities represent uses of business capabilities. Different lines of business may have different value chains that define the uses, or potential uses, of shared capabilities. In addition to consolidation of capabilities for economies of scale and agility, this perspective supports consideration of investments in improvement as well as outsourcing and configuration of joint ventures. Optimization will shift to an enterprise perspective, to reflect the cross-enterprise impact of shared capabilities that serve the needs of multiple lines of business.
This architecture is quite different from conventional ERP systems. The business processes will no longer be embedded in program code. The functionality that supports the stable business capabilities may be much the same, but it will be carved out to support loosely coupled services. BPM is the beginning of the end for ERP systems as we know them.
We are in a state of transition to a new business paradigm. The full transformation of enterprise systems and the enterprise will involve BPM, loose coupling of business capabilities and additional modeling capabilities that together will provide an enterprise architecture model from a business perspective. Cordys and Hewlett-Packard are at the forefront of defining the business modeling specifications discussed above and the development of future enterprise architecture modeling capabilities.