Business Process Management, or BPM, is on the rise again, even though it has been around for a long time. Service orientation makes BPM easier, but there are still challenges. Modeling data as part of the process is one of them: designing the process data (too) late results in rework and potentially loss of data, when we only discover the missing data after go-live.
Implementing processes with Business Process Management is different from what we are used to when implementing, for instance, a transaction system or a CRM system. Those systems are mostly targeted at one department, and we deal with one set of stakeholders. BPM implementations work across applications, departments and sometimes even across companies. While aligning stakeholders and trying to get a grip on the process functionality, we tend to postpone discussions about the data flowing through the processes.
Data flows through processes, but what data do we need? It may sound like a contradiction of the title of this blog, but the answer is "As little as possible...". There are two simple reasons.
First, the process is not the owner of the data; it acts as an intermediary between applications, departments and companies. Application data can be needed in a process to provide context, for example in a Human Task, but it should not stay there for long.
The second reason is performance. The data load in a process has an impact on the performance of the application server. Most BPM tools off-load the data from the application server to a database, both to avoid clogging the internal memory with process load and to avoid loss of process data in case of a server failure. High-volume process traffic results in a lot of data traffic between the application server and the database, which in turn results in bad performance.
What types of process data can be recognized?
Key data represents the unique IDs of the data flowing through the process, and acts as a reference to data stored in a back-end system. The approval process for a book order, for instance, should contain the unique ID of the customer; the entire customer data object stays in the CRM system.
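As a minimal sketch of this idea: the process payload carries only the customer's key, and the full object is fetched from the owning back-end system only when a task actually needs it. The CRM lookup, the customer ID and the field names below are invented for illustration.

```python
# Hypothetical in-memory stand-in for the CRM back-end that owns the data.
CRM = {"CUST-4711": {"name": "J. Doe", "email": "j.doe@example.com"}}

def start_book_order_process(customer_id: str) -> dict:
    """Create a process payload that references, not copies, the customer."""
    return {"process": "BookOrderApproval", "customer_id": customer_id}

def resolve_customer(payload: dict) -> dict:
    """Fetch the full customer object only when a task needs the context."""
    return CRM[payload["customer_id"]]

payload = start_book_order_process("CUST-4711")
print(resolve_customer(payload)["name"])  # -> J. Doe
```

The process instance stays small; the CRM system remains the single owner of the customer record.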
Content data is the data travelling between services, where it serves data integration. This data acts as an Esperanto between applications and is based upon a canonical model. Status values in one application are typically not the same as in another, for instance 'Active' in system A versus 'A' in system B, so the BPM solution, or the connected service bus solution, should contain transformation functionality. Another use of content data is to present context within human tasks: when an exception occurs, the end user is not interested in the customer ID but wants to know who to talk to. Content data typically does not flow through the entirety of the process but is attached to a couple of tasks within it.
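The 'Esperanto' role of the canonical model can be sketched as a two-step translation: source value to canonical value, canonical value to target value. The system names and status values below are invented examples, not a real canonical model.

```python
# Hypothetical mappings between application-specific status values
# and a canonical status model ("ACTIVE").
TO_CANONICAL = {
    ("systemA", "Active"): "ACTIVE",
    ("systemB", "A"): "ACTIVE",
}
FROM_CANONICAL = {
    ("systemA", "ACTIVE"): "Active",
    ("systemB", "ACTIVE"): "A",
}

def translate_status(value: str, source: str, target: str) -> str:
    """Translate a status via the canonical model, never point-to-point."""
    canonical = TO_CANONICAL[(source, value)]
    return FROM_CANONICAL[(target, canonical)]

print(translate_status("Active", "systemA", "systemB"))  # -> A
```

Routing every translation through the canonical form means each new application only needs one mapping pair, instead of a mapping to every other system.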
Steering data helps the process determine which route to take or to whom to assign a task. Gateways in processes use process data to determine the route. Often a business rules engine is fed with content data and returns an outcome that is used in the gateway. Human Tasks, in turn, can use business rules to determine the assignee based upon content available in the process: based upon a specialism, for example, the rules derive from the data who should get a certain task.
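Both uses of steering data can be sketched in a few lines: a rule function feeding a gateway decision, and a specialism-based task assignment. The threshold, route names and specialist table are invented for illustration, not taken from any particular rules engine.

```python
# Hypothetical specialism-to-assignee table used for task assignment.
SPECIALISTS = {"insurance": "alice", "mortgage": "bob"}

def gateway_route(order_amount: float) -> str:
    """Business rule feeding the gateway: large orders need manual approval."""
    return "manual_approval" if order_amount > 1000 else "auto_approve"

def assign_task(specialism: str) -> str:
    """Pick an assignee based on the specialism found in the content data."""
    return SPECIALISTS.get(specialism, "default_queue")

print(gateway_route(2500))      # -> manual_approval
print(assign_task("mortgage"))  # -> bob
```

In a real BPM suite the rule logic would live in the rules engine rather than in the process definition, so the business can change thresholds and assignment rules without redeploying the process.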
KPI data is used to give insight into the behavior of a process. Measurements at certain locations in the process record content data and process behavior. For instance, we would like to know the number of times the load of a batch has failed, what the reason was, and the amount of time involved per retry.
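The batch-load example can be sketched as measurement points that append a small record per attempt; questions like "how often did the load fail, and why?" are then simple aggregations. The field names and values are invented; a real BPM suite would write these measurements to its own process store.

```python
from collections import Counter

measurements = []  # stand-in for the BPM suite's measurement store

def record(step: str, outcome: str, reason: str = "", duration_s: float = 0.0):
    """Measurement point: capture outcome, failure reason and duration."""
    measurements.append({"step": step, "outcome": outcome,
                         "reason": reason, "duration_s": duration_s})

record("batch_load", "failed", reason="timeout", duration_s=30.0)
record("batch_load", "failed", reason="timeout", duration_s=31.5)
record("batch_load", "ok", duration_s=12.0)

failures = [m for m in measurements if m["outcome"] == "failed"]
print(len(failures))                                      # -> 2
print(Counter(m["reason"] for m in failures).most_common(1))
```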
At what level do we need the KPI data? We can obtain information from the process and content data at different levels.
- Tactical: Information gathering over the long run; what happened in (finished) processes over the last year. This is 'good old' Business Intelligence (BI) as we have known it for more than 10 years.
- Governance: This is about monitoring 'live' processes, those that are not finished and are in a certain state. We would like to see where the bottlenecks are and whether we can meet our KPIs. For instance, what is the throughput of processes being executed in a call center? If the number of incoming calls increases and calls start piling up at one call center employee, we would like to take proactive action. This type of functionality is typically delivered in dashboards and supported by event handling.
- Operational: Taking real-time action upon a situation in a process. A certain condition is met, and an event is sent out to an application or person. A couple of examples in this area: (1) A couple of years ago I used my credit card in a pizza restaurant in Paris. Within an hour the credit card was charged in New York, Singapore and Tokyo. The credit card company flagged this behavior as abnormal and gave me a call to check its validity. (2) In the delivery process of a truckload of meat (which needed to stay below -20°C), the crates were accidentally put in a location at room temperature. The manager needs to be informed a.s.a.p. in order to move the crates to the correct location.
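The meat-delivery example above boils down to a threshold check that emits an alert event the moment a reading crosses the limit. The crate ID, event name and sensor reading below are invented for illustration.

```python
from typing import Optional

def check_temperature(crate_id: str, temp_c: float,
                      threshold_c: float = -20.0) -> Optional[dict]:
    """Emit an alert event if a frozen-goods crate is above the threshold."""
    if temp_c > threshold_c:
        # In a real setup this event would be pushed to the manager's
        # dashboard or messaging channel, not just returned.
        return {"event": "TEMPERATURE_ALERT",
                "crate": crate_id, "temp_c": temp_c}
    return None  # reading is fine, no event

print(check_temperature("CRATE-7", 18.5))   # alert event
print(check_temperature("CRATE-7", -21.0))  # -> None
```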
The figure above was created by Ard Jan Vethman in work we did four years ago in the area of RFID.
When we are creating our BPM application, we should know what level of information we are interested in and what data is needed. The sooner, the better!
Léon Smiers, Capgemini Oracle BPM Thought Leader