multiprocessmining.org is an initiative to gather use cases and develop technologies for process mining over multiple objects and entities.
By Dirk Fahland.
Here is a list of activities by the process mining vendors at ICPM that I have seen in the program and that – in my view – relate to or show use cases for Multi… Process Mining.
It seems that Thursday, 9 October, 12:30–14:00 is high noon for Multi… Process Mining among the process mining vendors at ICPM.
I may have missed some relevant events – if I did, let me know in the comments.
By Dirk Fahland.
Analyzing processes supported by an ERP system, such as Order-to-Cash or Purchase-to-Pay, is one of the most frequent use cases of process mining. At the same time, it is one of the most challenging, because these processes operate on multiple related data objects such as orders, invoices, and deliveries in n:m relations.
Event log extraction for process mining always flattens the relational structures into sequences of events. The top part of the following poster illustrates what goes wrong during this kind of event log extraction: events are duplicated and false behavioral dependencies are introduced.
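To make the flattening problem concrete, here is a minimal, self-contained Python sketch. The event table and attribute names are hypothetical, not taken from any real ERP extraction: flattening the same four events once per order and once per item duplicates events and fabricates orderings.

```python
# Hypothetical mini event table: each event relates to one order and one
# or more items. Flattening forces every event into exactly one case notion.
events = [
    {"activity": "create order", "order": "o1", "items": ["i1", "i2"]},
    {"activity": "pick item",    "order": "o1", "items": ["i1"]},
    {"activity": "pick item",    "order": "o1", "items": ["i2"]},
    {"activity": "ship items",   "order": "o1", "items": ["i1", "i2"]},
]

def flatten(events, case_notion):
    """Extract one trace per case object; an event relating to several
    objects of that type is copied into every matching trace."""
    traces = {}
    for e in events:
        for obj in ([e["order"]] if case_notion == "order" else e["items"]):
            traces.setdefault(obj, []).append(e["activity"])
    return traces

print(flatten(events, "order"))
# {'o1': ['create order', 'pick item', 'pick item', 'ship items']}
#   -> 'pick item' appears twice in a row: a false repetition for the order
print(flatten(events, "items"))
# {'i1': ['create order', 'pick item', 'ship items'],
#  'i2': ['create order', 'pick item', 'ship items']}
#   -> 'create order' and 'ship items' are duplicated into both item traces
```

Neither flattened log is wrong per se, but each one either duplicates events or invents a behavioral dependency ("pick item directly follows pick item") that no single object ever exhibited.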
A possible way to prevent this flattening is to extract one event log per object or entity in the process: one log for all orders, one log for all invoices, one log for all deliveries. The result is a so-called artifact-centric process model that shows one “life-cycle model” describing the process activities per data object.
But analyzing the process over all objects also requires extracting event data about how objects “interact”. Technically, this can be done by extracting one event log per relation between two related data objects (or tables). From these, we can learn the flow and behavioral dependencies over different data objects.
Decomposing the event data in this way into multiple event logs ensures that event sequences either follow one concrete data object or follow a concrete relation between two related data objects. The resulting model only contains “valid flows”.
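The decomposition can be sketched as follows, again on a hypothetical mini event table (the schema is illustrative): one log per concrete object captures its life-cycle, and one log per (order, item) relation captures the interaction, so every extracted trace follows either one object or one relation.

```python
# Illustrative events; each relates to one order and one or more items.
events = [
    {"activity": "create order", "order": "o1", "items": ["i1", "i2"]},
    {"activity": "pick item",    "order": "o1", "items": ["i1"]},
    {"activity": "pick item",    "order": "o1", "items": ["i2"]},
    {"activity": "ship items",   "order": "o1", "items": ["i1", "i2"]},
]

def log_per_object(events, object_type):
    """One trace per concrete object: the life-cycle of that object,
    without mixing in the behavior of any other object."""
    traces = {}
    for e in events:
        for obj in ([e["order"]] if object_type == "order" else e["items"]):
            traces.setdefault(obj, []).append(e["activity"])
    return traces

def log_per_relation(events):
    """One trace per (order, item) pair: how these two related objects
    interact, independent of all other pairs."""
    traces = {}
    for e in events:
        for item in e["items"]:
            traces.setdefault((e["order"], item), []).append(e["activity"])
    return traces

order_log = log_per_object(events, "order")   # life-cycles of all orders
item_log = log_per_object(events, "items")    # life-cycles of all items
relation_log = log_per_relation(events)       # one trace per order-item pair
```

Each trace in `relation_log` now contains only the events shared by one concrete order and one concrete item, so no false dependency between two different items can be learned from it.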
By Dirk Fahland.
In this post, I show how simple data visualizations help in understanding the multi-dimensional nature of processes. Even the simplest classical business processes have important dynamics that cannot be understood by looking at cases in isolation.
The Performance Spectrum was originally designed to deliver a useful process map for analyzing logistics processes over time. When applying the same technique to business process event data, the performance spectrum proves equally useful, as it unveils process characteristics that have so far remained hidden in existing process mining tools.
To support and trigger further research in this area, we released a smaller-scale version of the Performance Spectrum Miner enabling the analysis of business process event logs. The platform-independent tool (requiring JRE 8) is available as a ProM plugin and as a standalone tool.
Below, we show some performance spectra of publicly available event logs.
In the figure below, we see the typical process map or model of the Road Traffic Fines Management Process event log on the left. It describes the possible behavior of handling a single case (a traffic ticket). The arc width and annotations tell how long it takes a case to go from one activity to the next.
On the right we see the Performance Spectrum of this process. Each horizontal stripe describes the transition between two activities – called a segment. Each colored diagonal or vertical line is a case passing through this segment over time. The longer and more diagonal the line, the longer the case took.
We can immediately spot very different patterns in each of the segments, clearly showing that the cases are not handled in isolation, but that something manages their progress.
Looking at a single event log, we can see that even a classical process over a single entity (a traffic ticket) is subject to dynamics beyond the scope of a single case.
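The construction of such a spectrum is conceptually simple. Below is a minimal Python sketch (not the actual Performance Spectrum Miner implementation; the log and timestamps are made up): every pair of directly-following events in a case contributes one line to the segment named by the two activities.

```python
from collections import defaultdict

# Hypothetical (case, activity, timestamp) events in the style of the
# Road Traffic Fines log; timestamps are abstract time units.
log = [
    ("c1", "Create Fine", 0), ("c1", "Send Fine", 5),
    ("c2", "Create Fine", 1), ("c2", "Send Fine", 9),
    ("c3", "Create Fine", 2), ("c3", "Send Fine", 6),
]

def performance_spectrum(log):
    """Group events per case, then map each pair of directly-following
    events onto its segment as one (case, start, end) line."""
    by_case = defaultdict(list)
    for case, act, ts in sorted(log, key=lambda e: (e[0], e[2])):
        by_case[case].append((act, ts))
    segments = defaultdict(list)
    for case, trace in by_case.items():
        for (a, t1), (b, t2) in zip(trace, trace[1:]):
            segments[(a, b)].append((case, t1, t2))
    return segments

spec = performance_spectrum(log)
lines = spec[("Create Fine", "Send Fine")]
# c3 enters the segment after c2 but leaves before it: even in this tiny
# example, cases are not handled strictly in order of arrival.
```

Rendering each `(case, start, end)` triple as a line from `(start, top)` to `(end, bottom)` of a horizontal stripe yields exactly the picture shown above; the crossing of lines is what makes batching, overtaking, and queueing visible.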
What can you find in the Performance Spectra of the other public event logs, or your own data? Get the Performance Spectrum Miner from http://www.promtools.org/ or from https://github.com/processmining-in-logistics/psm and try it out!
One of the core challenges of process analytics from event data is to enable an analyst to get a comprehensive understanding of the process and where problems reside.
In business process mining such an overview is obtained with a process map. It can be discovered from event data to visualize the flow in the process and highlight deviations and bottlenecks.
Process maps of logistics processes do not give these insights: they are too large to comprehend, the maps do not visualize how processing of materials influences each other, and – as they show an aggregate of all event data – they fail to visualize how performance and processing varies and changes over time.
The image below shows the performance spectrum of a baggage handling system along a sequence of check-in lines over time. Bags are put into the system at point a1 and then are moved via conveyor belts to point a2. Each blue or orange line in the top-most segment a1:a2 in the performance spectrum shows the movement of one bag from point a1 to point a2 over time. The angle (and color) of the line indicates its speed.
As shown in the layout schema below, further bags enter the system from another check-in and are also moved to point a2, where both flows merge on the segment a2:a3, etc. All bags eventually reach the point “s”, from where they are routed further into the baggage handling system. In the performance spectrum, we can follow the movement of a bag over these segments through the consecutive lines.
As bags cannot overtake each other on a conveyor belt, we can immediately identify several behavioural patterns in the performance spectrum.
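One such pattern is strict FIFO order within a segment. Here is a small sketch of how it could be checked automatically (illustrative data and function, not part of the released tool): any pair of crossing lines, i.e. a bag that enters the segment earlier but exits later than another, violates the no-overtaking assumption and points at a sensor or routing anomaly.

```python
def fifo_violations(lines):
    """lines: list of (bag_id, t_enter, t_exit) for one segment.
    Returns pairs where the first bag entered earlier but exited later."""
    ordered = sorted(lines, key=lambda l: l[1])  # sort by entry time
    return [(a[0], b[0])
            for i, a in enumerate(ordered)
            for b in ordered[i + 1:]
            if a[2] > b[2]]  # entered earlier, yet exited later

# Hypothetical lines of one segment: bag2 is overtaken by bag3.
segment = [("bag1", 0, 4), ("bag2", 1, 6), ("bag3", 2, 5)]
print(fifo_violations(segment))  # [('bag2', 'bag3')]
```

On a single belt such a crossing should be physically impossible, so every reported pair deserves investigation; an empty result confirms the expected FIFO behavior of the segment.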
The visualization allows process managers and engineers to quickly locate the cause of a problem and prevent it from happening in the future. In particular, the briefly-visible performance problem in a2:a3 prior to the halt of the conveyor belt can serve as an early warning signal for detecting possible performance problems in the future, and for understanding and improving system recovery behavior.
We realized this technique in a high-performance visualization tool which we call the Performance Spectrum Miner. It has proven itself reliable in practice.
We released a smaller-scale version of the tool (as a ProM plugin or as standalone tool) together with a manual on https://github.com/processmining-in-logistics/psm.
I am trying to sketch the landscape of describing, analyzing, and managing processes outside the well-established paradigm of a “BPMN process” where a process is executed in instances, and each instance is completely isolated from all other instances.
Let me introduce the term “process thinking”.
Process thinking is the fundamental paradigm for understanding, designing, and implementing goal-oriented behaviors in social and technical systems and organizations of all kinds and sizes.
Process thinking structures the information flow between various actors and resources in terms of processes: several coherent steps designed to achieve common and individual goals together.
Throughout a process, multiple actors, resources, physical objects and information entities interact and synchronize with each other.
The scope of process thinking varies with the system and dynamics under study: “how many dynamics to consider?” (outer scope) and “how many entities describe these dynamics?” (inner scope).
BPMN and classical process mining focus primarily on describing and analyzing information handling dynamics as they are found in many administrative procedures, for instance in insurance companies or universities.
Processes are scoped in terms of individual cases (or documents) whose information is processed along a single process description independent of other cases, often in a workflow system. In terms of scoping, such processes encompass a single-dimensional inner scope (information processing) structured into a single-dimensional outer scope (along a single case).
Most organizations operate multiple processes sharing data or materials, which requires considering multiple processes and objects and their interlinked dynamics together.
Process thinking around dynamics in manufacturing and retail organizations, such as Order-to-Cash or Purchase-to-Pay processes, is often supported by complex Enterprise Resource Planning (ERP) or Customer Relations Management (CRM) systems.
Processes here are centered around updating and managing a collection of shared and interlinked documents by various actors together, leading to mutually dependent and interconnected dynamics of multiple objects and processes (multi-dimensional outer scope) with a focus on information processing (single-dimensional inner scope).
While information processing is the dominant behavior analyzed in process mining, the dynamics of a process may also be characterized and analyzed in other dimensions.
For example, we can ask how actors and resources, physical materials, and the underlying systems participate in the processing of cases of the same process (the inner scope of process thinking).
In most processes, these different factors of processing are not independent but influence each other: the progress of a case depends on the availability of information, actors, and corresponding materials alike, and is subject to the limited availability of processing resources and the physical limitations of the supporting systems. Characterizing a single dynamic therefore requires multiple dimensions (inner scope).
Processes for manufacturing and logistics, such as baggage handling at airports, combine information handling with material flows.
Physical items are processed along a logical process flow – and at the same time have to be moved around a physical environment of conveyor belts, carts, machines, and workers. Steadiness of flow is the central process objective.
In such processes, the processing of one material item depends not only on the logical process it has to go through but also on all the other items that surround it: together they determine whether work accumulates at a particular machine, whether work can be completed at the desired quality, and whether target deadlines are met. Did your bag reach the flight?
Call centers and hospitals are other examples where the processing of one case highly depends on what happens with other cases. A long waiting time in a queue can make a customer service contact go very differently. The quality and next steps in a medical treatment depend on how well the medical staff can focus on your case.
These phenomena cannot be observed, analyzed, and improved when studying each case in isolation.
More advanced logistics operations, such as warehouse automation and manufacturing systems, also involve material flows that are merged together through batch processing and manufacturing steps.
Analyzing and improving processes in such systems requires both a multi-dimensional inner scope and a multi-dimensional outer scope.
What are your thoughts on this? Feel free to join and post a response here!