A visual approach to support process analysts in working with process improvement opportunities

Kateryna Kubrak (University of Tartu, Tartu, Estonia)
Fredrik Milani (University of Tartu, Tartu, Estonia)
Alexander Nolte (University of Tartu, Tartu, Estonia) (Carnegie Mellon University, Pittsburgh, Pennsylvania, USA)

Business Process Management Journal

ISSN: 1463-7154

Article publication date: 3 April 2023

Issue publication date: 18 December 2023


Abstract

Purpose

When improving business processes, process analysts can use data-driven methods, such as process mining, to identify improvement opportunities. However, despite being supported by data, it is the process analysts who decide which changes to implement. Analysts often use process visualisations to assess and determine which changes to pursue. This paper explores how process mining visualisations can aid process analysts in their work to identify, prioritise and communicate business process improvement opportunities.

Design/methodology/approach

The study follows the design science methodology to create and evaluate an artefact for visualising identified improvement opportunities (IRVIN).

Findings

The study suggests a set of principles that facilitate the visualisation of process mining outputs for analysts working with improvement opportunities. In particular, it outlines insights into identifying, prioritising and communicating process improvement opportunities from visual representations.

Originality/value

Prior work focuses on visualisation from the perspectives – among others – of process exploration, process comparison and performance analysis. This study, however, considers process mining visualisation that aids in analysing process improvement opportunities.

Citation

Kubrak, K., Milani, F. and Nolte, A. (2023), "A visual approach to support process analysts in working with process improvement opportunities", Business Process Management Journal, Vol. 29 No. 8, pp. 101-132. https://doi.org/10.1108/BPMJ-10-2021-0631

Publisher: Emerald Publishing Limited

Copyright © 2023, Kateryna Kubrak, Fredrik Milani and Alexander Nolte

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

Well-designed and customer-oriented business processes increase service quality and improve process efficiency (Dumas et al., 2018). Therefore, organisations continuously seek to identify improvement opportunities in their processes. Process analysts have previously identified improvement opportunities by manually modelling and analysing business processes. In the past decade, data-driven methods, such as process mining, have gained traction for process discovery and analysis (Milani et al., 2022). However, while data-driven methods facilitate the discovery of process models, analysts still have to examine the process models to identify improvement opportunities and determine which ones to address.

Current research in the field of process mining focuses on topics such as visualising information on process models (Dani et al., 2019), comparing processes (Low et al., 2017; Pini et al., 2015; de Leoni et al., 2016), process performance (Bachhofner et al., 2017; Gulden, 2016; Pika et al., 2014; Low et al., 2017; Chiò et al., 2021), predictive and prescriptive process monitoring (Conforti et al., 2013; Fahrenkrog-Petersen et al., 2022; Khan et al., 2021) and frameworks for developers of process mining software (Sirgmets et al., 2018; Wynn et al., 2017). However, these works do not consider the identification of improvement opportunities from a visualisation perspective.

Several process mining vendors (e.g. Celonis [1], MPM [2], Minit [3]) have included functionalities for working with process improvement. However, they mostly focus on highlighting potential improvements for one case rather than considering improvement opportunities that could influence future executions of the process. To the best of our knowledge, there is no existing approach that supports analysts in using visualisations of process executions to identify improvement opportunities. This impedes process analysts from making data-driven decisions to implement process changes.

To address this gap, our research objective is to develop a visualisation that can aid process analysts in working with process improvement opportunities. To achieve this objective, we formulate three research questions. First, process analysts have to identify the improvement opportunities in the process before analysing them. Therefore, we ask:

RQ1.

How do process analysts use process mining visualisations to identify improvement opportunities?

Second, there might be several improvement opportunities in a process that could be addressed together or separately. For instance, an analysis of a hospital emergency process log revealed three improvement opportunities (Erdogan and Tarhan, 2022). Therefore, analysts use specific criteria, such as performance metrics, to assess which improvement to implement. Thus, we ask:

RQ2.

How do process analysts use process mining visualisations to prioritise identified improvement opportunities?

Third, we explore how analysts communicate identified improvement opportunities. Often, process analysts present their findings to stakeholders who decide whether to implement the proposed changes. Therefore, we seek to understand whether visualisations used for communication differ (e.g. adjusted, simplified) from those used for process analysis. As such, we ask the following question:

RQ3.

How do process analysts use process mining visualisations to communicate identified improvement opportunities?

To answer these research questions and achieve the research objective, we follow a design science methodology (Hevner et al., 2004) to create and evaluate an artefact – a mockup of a process improvement opportunities visualisation. We first explore the relevance of the problem through qualitative research involving practitioners from industry. Based on the findings, we elicit requirements for the artefact. Then, we develop a mockup – IRVIN (ImpRovement opportunities VisualIsatioN) – and, finally, we evaluate it with its potential users (i.e. process analysts).

Thus, the contribution of this paper is threefold. First, we create a mockup of a process improvement opportunities visualisation. Second, we provide insight into how process analysts use process mining visualisations to analyse improvement opportunities. Third, we formulate principles for process mining visualisation to aid in analysing process improvement opportunities. These contributions can be useful for developers of process mining tools and process analysts.

Developers of process mining tools benefit from these contributions through an increased understanding of how analysts use process mining tools to work with process improvement opportunities. As such, they can gain insights into how to better develop their tools to be more supportive of process analysis. Process analysts, on the other hand, benefit from understanding practices of other analysts, as well as from improved visualisations for process improvement opportunities. Insights into how other analysts use process mining tools can aid in finding better ways to identify improvement opportunities.

The rest of this paper is structured as follows. Section 2 presents background and related work while Section 3 outlines the research method. In Section 4, we present the results. Then, in Section 5, we discuss the findings. Finally, Section 6 concludes the work.

2. Background and related work

Process mining uses event logs to discover process models (van der Aalst, 2016). As such, process mining tools use data recorded in event logs for data-driven process analysis and aid in identifying improvement opportunities (Milani et al., 2022). Process mining has gained popularity as an analysis tool (Dumas et al., 2018). For instance, process mining is used in telecommunications (Mahendrawathi et al., 2015), IT management services (Vázquez-Barreiros et al., 2016), library information systems (Kouzari and Stamelos, 2018), agile software development (Marques et al., 2018) and logistics (Kedem-Yemini et al., 2018). These studies report on results obtained and illustrate the value of process mining in industry (Corallo et al., 2020). However, an important aspect impacting how an organisation accepts such results (Grisold et al., 2021) relates to how the results are visualised (Sirgmets et al., 2018; Basole et al., 2015).
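
To make this concrete, the following sketch (ours, not drawn from the cited studies) shows how an analyst might discover a model from an event log with the open-source pm4py library; the log file name is hypothetical, and the calls assume a recent release of pm4py's simplified interface.

```python
import pm4py

# Hypothetical XES event log of a claim-to-resolution process.
log = pm4py.read_xes("claims.xes")

# Discover a Petri net (inductive miner) and a directly-follows graph.
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)
dfg, start_activities, end_activities = pm4py.discover_dfg(log)

# Render the discovered model; the analyst then inspects it for bottlenecks,
# rework loops and other improvement opportunities.
pm4py.view_petri_net(net, initial_marking, final_marking)
```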

Several studies have explored different aspects of visualising process mining outputs, such as discovered process models (Agostinelli et al., 2019; Dani et al., 2019) or differences between process variants or event logs (Gall et al., 2015; Bolt et al., 2016). For instance, Dani et al. (2019) provide a review of methods and principles used to visualise business process models. Another work (Stefanini et al., 2020) proposes an approach to visualise unstructured processes. In Gall et al. (2015), a visualisation is presented that uses colour-coding and symbols to help find redundancies or inconsistencies between two process variants. Similarly, Bolt et al. (2016) present a visualisation for comparing event logs. The authors use shades of colours and thickness for nodes and edges to highlight statistically significant differences between the two event logs. An approach by de Leoni et al. (2016) creates animations of process behaviour, which help process analysts obtain a holistic view of the process execution from different perspectives. Kaouni et al. (2021) propose using visual analytics to identify bottlenecks in a process. More specifically, the authors use different plots to visualise the start and end activities. These works aim at highlighting differences between processes from various perspectives. We, on the other hand, explore how process analysts can be supported in identifying and analysing specific improvement opportunities derived from such comparisons. Furthermore, in an interview study of process mining practices (Zerbato et al., 2022), the authors investigate common strategies analysts use during the process analysis stage. However, the discussed strategies relate to the analysis itself, not to its visualisation or how analysis can be facilitated with visualisations.

To some extent, the visualisation of improvement opportunities is addressed in commercial process mining solutions. For example, Celonis [1] offers a solution named Action Engine that can identify “Signals” in a process. Similarly, MPM's [2] eXecution Suite provides data-driven alerts to users, who can then take a relevant action. However, such notifications concern minor issues in the process and do not relate to more significant improvement opportunities. Additionally, in both cases, the signals and alerts are text messages describing the issue. In another example, Minit [3] provides a visualisation of root-cause analysis to facilitate the discovery and investigation of process issues. These insights can be used as input for setting up business rules to continuously check how compliant the process is with the defined rules. Finally, in a comparative study, the authors examined 16 process mining tools and compared them using a taxonomy of 55 distinct features (Loyola-González, 2022). Their analysis shows that visualisation, as a means to facilitate analysts in identifying improvement opportunities, is only partially supported by a minority of tools. For instance, some tools support simulation. However, the results of the simulations are not visualised in a manner that considers the analysts' needs when identifying improvement opportunities. Thus, while existing commercial tools provide solutions that contain visualisation elements related to improvement opportunities, the information is fragmented and incomplete. Furthermore, such visualisations are not necessarily scientifically derived. In this paper, on the other hand, we focus on exploring how such information could be unified and structured in visualisations that facilitate working with improvement opportunities.

Visualisation is also widely applied in process performance analysis. In Bachhofner et al. (2017) and Gulden (2016), the authors present tools to analyse time-related process behaviour, while Pika et al. (2014) and Low et al. (2017) analyse human resource behaviour over time. Similarly, Pini et al. (2015) propose techniques for multi-perspective process visualisation of process variants and their performances. In Chiò et al. (2021), the authors propose using a combination of plots to analyse variations in process performance. Thus, these studies focus on enhancing discovered process models with performance data. In this paper, we take a step further and build on such visualisations to help process analysts identify, prioritise and communicate possible process improvements.

Another area where visualisations are used is predictive process monitoring. Such methods predict how ongoing cases will evolve (di Francescomarino et al., 2018). For instance, Conforti et al. (2013) extend a visual plug-in that maps risk-based metrics to support decision-making. Prescriptive process monitoring also adopts visualisations. Prescriptive process monitoring methods recommend interventions during the execution of a case that, if followed, improve the case outcome with respect to one or more performance indicators (Kubrak et al., 2022). For example, Detro et al. (2020) develop visualisations for recommendations of patients' treatment in a hospital. Therefore, such visualisations provide insight for managing ongoing process cases. In this paper, we focus on visualisations of strategic improvement opportunities, that is, changes that impact the process structure and affect all future cases.

Finally, there are frameworks for visualising process mining outputs. For instance, Sirgmets et al. (2018) present a framework for guiding developers of process mining techniques in designing process diagrams. In Wynn et al. (2017), the authors propose a visualisation framework for process performance comparisons. These frameworks provide guidance on designing descriptive representations of event logs. In this paper, we build upon such work and examine how visualisations can, beyond being descriptive, aid analysts in identifying and analysing improvement opportunities.

3. Method

To address the objective of developing a visualisation that can aid process analysts in working with process improvement opportunities, we first explore how process analysts currently work. More specifically, we assess how process analysts currently identify improvement opportunities using visual representations of process mining tools (RQ1), how they prioritise improvement opportunities (RQ2) and how they communicate their findings (RQ3). To answer these questions, we use the design science research methodology (Hevner et al., 2004). Design science provides guidelines on how to create, evaluate and improve artefacts to achieve organisational goals (Hevner et al., 2004). To this end, we explore the problem relevance and define the objectives through qualitative research involving practitioners from the industry. As a result, we elicit requirements for the artefact. Based on these requirements, we develop an artefact (a visualisation mockup) and evaluate it with potential users (process analysts) to assess the extent to which the artefact can solve the problem of identifying, prioritising and communicating improvement opportunities. The research process is depicted in Figure 1.

3.1 Phase 1: Defining the objectives

3.1.1 Step 1: Exploration interviews

As the design science methodology prescribes, the artefact must solve a relevant business problem (Hevner et al., 2004). We, therefore, started by conducting a qualitative study to explore the relevance of the problem. We recruited seven participants for the interviews (Table 1). We selected them across two main dimensions: (1) their business role (working internally at their company or as a consultant) and (2) the variability of the domain. We chose this differentiation as it can be expected that approaches to identify improvement opportunities vary between individuals familiar with the processes they are tasked to improve (internal process analysts) and those brought in as external experts (consultants). Moreover, we also selected our participants from different domains to cover various contexts and use cases, since we aim for our approach to be useful beyond a single domain. We conducted individual online interviews with the seven selected participants; each interview lasted between 29 and 46 min.

The interviews were semi-structured (Harrell and Bradley, 2009). At the beginning of each interview, we asked the respective interviewee to think about a recent process improvement project they had been involved in. The motivation for this is three-fold. First, we sought to have the interviewees discuss real-life examples that were familiar to them. Second, having all the information about one project facilitated further analysis: we explored a specific case in depth and asked detailed questions about it. This helped us to understand process analysts' motivations when making decisions in a specific improvement project, rather than collecting scattered data across different cases. The semi-structured format made such deepening of questions possible. Third, by discussing one recent project, we ensured that the interviewees would remember detailed information about what they did and why.

We developed an interview guide based on the research questions outlined in the introduction. First, we focused on how the process analysts identify and assess an improvement opportunity (RQ1). Examples of questions asked in this regard were: “What was the specific improvement opportunity identified?” and “How was the improvement opportunity identified?” This helped us to map the prevalent improvement opportunities the analysts work with and understand the process of finding them. Next, we asked the participants how they prioritised the identified improvement opportunities (RQ2). For instance, we asked: “Were there any alternatives to the selected improvement opportunity? How was it decided which one to select? Who made this decision?” These questions aimed to clarify which characteristics made some identified improvement opportunities more important than others. Last, we sought to understand how process analysts communicate identified improvement opportunities to clients or business users (RQ3). To this end, we asked questions like “Who were the results presented to?” and “How were the results presented (images, dashboards)?” The motivation behind such questions was to assess whether the visualisations used for communication purposes differ (e.g. adjusted, simplified) from those utilised by the process analysts themselves. At each stage of the interviews, we asked which visualisation methods process analysts already use, to gain an overview of the current state. For instance, we asked: “Were any visualisations used to help decide on the improvement opportunity? What was important to see visually to compare them?” The full interview guide can be viewed in Appendix 1.

All interviews were recorded and transcribed using Otter.ai [4]. After manually reviewing and correcting the transcripts, we conducted a thematic analysis (Braun and Clarke, 2006) to analyse the interviews. We first familiarised ourselves with the data and created initial codes derived from the research questions. To do that, we referred back to the questions we asked in the interviews and reviewed the transcripts for the answers related to the RQs. For example, for RQ1, initial codes such as improvement opportunity and visualisation for analysis related to the interview questions that we asked about the improvement opportunity that was identified and the means of visualisation used for that, respectively. Similarly, one of the interview questions that we asked with regard to RQ1 was about the input data used in the project, which led to the code data. Following the same procedure, we applied codes such as improvement prioritisation when means of assessing and prioritising improvement opportunities were mentioned (RQ2), and codes such as methods for communication and purpose of communication for data concerning reporting the findings (RQ3). After discussing the coding results within the research team, several adjustments were made. For example, upon reviewing the transcripts, we noticed a range of different improvement opportunities that the interviewees mentioned. Thus, different codes describing them emerged, such as reworks and bottlenecks around waiting times. Consequently, the initial code improvement opportunity was transformed into a theme named improvement opportunity description that incorporated the codes related to the range of improvement opportunities mentioned. The final coding scheme can be viewed in Appendix 2. During the previously described procedure, we also used documents provided by the participants (screenshots, videos, slides) as additional context information to aid our understanding of the responses. Findings from this step provided the basis for requirement elicitation.

3.1.2 Step 2: Requirements elicitation

We summarised the findings and, based on them, elicited user stories. We chose user stories as a method since they help to keep the user in mind and prevent adding irrelevant elements to the visualisation (Cohn, 2004). The user stories were then reviewed to ensure that they relate specifically to identifying, prioritising and communicating improvement opportunities and not to the process improvement project in general.

We also compared the elicited requirements with the functionality available in existing process mining tools (based on the review of Loyola-González (2022)). The aim was to understand the extent to which the required functionalities, as perceived by the interviewed process analysts, are already implemented in the tools (see section 4.1, step 2).

3.2 Phase 2: Design & development

Based on the requirements identified, we designed IRVIN (ImpRovement opportunities VisualIsatioN). First, we used the wireframing technique (Hamm, 2014) and created static mockup screens for the artefact with the Balsamiq [5] tool. The static screens were then uploaded to the InVision [6] tool and connected to provide interactivity for future demonstrations.

In developing the artefact, we relied on existing visualisation guidelines from literature (e.g. Gall et al. (2015), Munzner (2014), Shneiderman (1996)). Additionally, for further evaluation with practitioners, we adopted examples of real-life improvement opportunities and their redesign options from the Improvement Opportunity Framework (Lashkevich, 2020). We chose the improvement opportunities to show based on the discussions with interviewees in the first round of interviews. A detailed description of the artefact is given in section 4.2.

3.3 Phase 3: Evaluation & refinement

With the evaluation, we aim to assess whether the artefact satisfies the requirements. Moreover, the evaluation allows us to collect feedback on the initial design of IRVIN. This helps us to assess IRVIN at an early stage before investing resources into creating a working prototype. In particular, the evaluation helps to identify what IRVIN lacks and how it can be improved.

3.3.1 Step 1: Evaluation interviews

We aim to evaluate IRVIN from the perspective of its potential users, that is, process analysts. We recruited eight practitioners for the evaluation (Table 2). We used the same criteria to recruit the participants as before. Namely, we looked for representatives of different domains and different roles in the company. Moreover, we invited individuals who were not part of the previous interviews, as well as those who had already participated (see column “Repeated” in Table 2). The aim of this approach is twofold. First, it allows us to confirm whether or not our interpretation of the interviews conducted in phase 1 is indeed correct. Second, it broadens our perspective beyond the initial interviews. Repeating interviewees have the same code as they had in the first round. New participants received codes with numbers following the previous batch (from I-08 onward). The evaluation interviews lasted between 32 and 49 min.

To evaluate IRVIN, we used the contextual interview approach. This approach blends traditional interviews and observation and, therefore, helps to gain insights into how process analysts could use IRVIN in the context of their work (Holtzblatt and Jones, 1995). At the beginning of each evaluation interview, we demonstrated IRVIN to the interviewee. For the demonstration, we used a specific scenario. This time, we did not ask the participants to think of a recent project but, instead, created a scenario beforehand. The reason is that the first interviews aimed at discovering information, whereas in the second round we sought to evaluate an existing artefact developed to showcase specific improvement opportunities that the interviewees frequently referenced during the prior interviews (see section 3.2). Additionally, at the end, we asked the interviewees to comment freely on the visualisations in IRVIN.

The referenced scenario in IRVIN is as follows. Imagine you are analysing a claim-to-resolution process in an insurance company. The process mining tool you are using has identified four improvement opportunities in the process: Order queues, Unnecessary job handovers, Activity rework and Low resource capacity. Your task is to assess the identified improvement opportunities and choose which one to proceed with. During the demonstration of the artefact, we showed all screens based on the scenario (see section 4.2 for sample figures). With that, we introduced all aspects the artefact was built for, for example a detailed view of a single improvement opportunity, an overview of all the improvement opportunities together and the prioritisation of improvements. After that, the participants received access to IRVIN to try it themselves. During this stage, we asked the interviewees three questions related to each of the RQs. First, we sought to understand which visualisations in IRVIN aid the participants in identifying and analysing an improvement opportunity. We asked: “How do you proceed with identifying an individual improvement opportunity? What is important to see to assess it?” (RQ1). Second, we asked the participants to prioritise the identified improvement opportunities to explore which visualisations in IRVIN they use. Therefore, we asked: “How do you proceed with deciding which improvement opportunity to address? What is important to see to decide on a specific improvement opportunity?” (RQ2). Third, we focused on understanding whether IRVIN helps them in communicating improvements. To this end, we asked: “How do you proceed with communicating the chosen improvement opportunity to the business people/clients?” (RQ3). The full interview guide can be viewed in Appendix 3.

To analyse the interviews, we transcribed them using the same tool as before. Then, we proceeded to review and correct the transcripts manually. At this stage, we used the approach of thematic analysis again (Braun and Clarke, 2006). As previously, initial codes focused on the RQs. For example, there was a code understanding the current problem, which relates to the participants explaining how they work with understanding what issue they are trying to solve (RQ1). Similarly, the code parameters for prioritisation incorporated all interview sections where the participants spoke about the parts of the visualisations in IRVIN that helped them prioritise the identified improvements (RQ2). As to the communication part (RQ3), codes such as methods for communication and things communicated were used. There were also codes such as interface when the interviewees made general comments about the artefact. After composing the first themes, we iterated over them and either merged some or broke others down into smaller parts. For instance, we elicited additional code labels from the previous code interface to mark the parts of interviews where the participants specifically referred to names of elements used in the artefact. The final coding scheme can be viewed in Appendix 4.

3.3.2 Steps 2 & 3: Additional requirements elicitation & Artefact refinement

We summarised the findings from the evaluation on how process analysts use process mining visualisations to identify improvement opportunities (RQ1) and prioritise them (RQ2). Again, we used user stories to specify seven additional requirements. We consider a new requirement to be a visualisation need we had not identified during the exploration interviews. Again, we reviewed the user stories to ensure they describe needs that could be addressed with the visualisation of improvement opportunities, thus discarding details such as the working environment. We then iterated on the artefact and either edited existing elements or added new ones based on the new requirements.

4. Results

4.1 Phase 1: Defining the objectives

4.1.1 Step 1: Exploration interviews

This section outlines the findings from the exploration interviews (Table 3). These findings serve as the basis for eliciting requirements for the artefact. We describe how process analysts identify improvement opportunities (RQ1), how they prioritise and select which one to address (RQ2) and how they communicate them (RQ3).

4.1.1.1 Improvement opportunity identification (RQ1)

As to RQ1, we identified the need to use process models as part of visualising improvement opportunities (F1). Five participants expressed that they rely on process models when analysing processes to identify improvement opportunities (I-02, I-03, I-04, I-05, I-07). For instance, one analyst said: “I use the actual process visualisations to find bottlenecks” (I-07). Another analyst gave an example of identifying an improvement opportunity by comparing models of the same process from different years: “[I] did the analysis of the process and found out that the steps of the process in fact stayed the same. However, there was one difference. The duration between the steps changed” (I-02). The interviewees use available features in the process mining tools, such as filtering, variants analysis and comparison. They gave examples of comparing process models for different attributes, such as different types of claims (I-05), and to compare the process across countries (I-01). Additionally, insights from both outputs of process mining software and business constraints should be considered in the visualisation (F2). The interviewees highlighted the importance of approaching process analysis from both perspectives to have a holistic view of the findings (I-04, I-05, I-06). As one interviewee said: “There are also some findings that may be very dominant and very interesting from process mining, but they have zero interest for business” (I-06). Another analyst also provided an example: “Oftentimes, if you don't understand how the data is generated and where the data comes from, you might misinterpret the visualisation of the process mining software. And oftentimes, if you don't understand the business process, you might overreact to exceptions that are shown on the process map” (I-05). Last, analysts decompose problems into smaller and more manageable sizes to investigate them separately (F3). For instance, an internal analyst expressed that “I think it's better to define one or two improvement areas and just help to improve there” (I-01). As another analyst explained, what helps him/her is breaking an analysis problem into smaller parts to make it manageable (I-06).
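
As a hypothetical illustration of the comparison I-02 describes (all file, column and activity names below are ours, not from the study), a few lines of pandas suffice to compute the median waiting time between consecutive process steps per year; a shift between years would surface exactly the “duration between the steps changed” situation.

```python
import pandas as pd

# Hypothetical event log: one row per event with case_id, activity, timestamp.
log = pd.read_csv("claims_log.csv", parse_dates=["timestamp"])
log = log.sort_values(["case_id", "timestamp"])

# Waiting time before each step and the hand-over (transition) it belongs to.
log["wait"] = log.groupby("case_id")["timestamp"].diff()
log["transition"] = (
    log.groupby("case_id")["activity"].shift() + " -> " + log["activity"]
)
log["year"] = log["timestamp"].dt.year

# Median waiting time per transition and year; a jump between years flags a
# change in the duration between steps even when the steps themselves stayed
# the same.
waits = (
    log.dropna(subset=["transition"])
       .groupby(["transition", "year"])["wait"]
       .median()
       .unstack("year")
)
print(waits)
```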

4.1.1.2 Improvement opportunity prioritisation (RQ2)

We noted several aspects in prioritising identified improvement opportunities (RQ2). First, the participants assess the impact on the process in terms of its location, the number of cases and the variants involved (F4). As one analyst put it, “[we] see in the process where this would actually have an impact on” (I-02). The impact is used to measure “the importance of that finding” or “the terms of the impact on the overall process” (I-06). Second, the interviewees consider whether the improvement opportunity can be addressed within the organisation or requires external entities (F5). As one consultant, when discussing a particular improvement opportunity, put it, “but that's external to the process, so you can't do anything with that. So, then you have to specify that the recommendation I'm making is intrinsic […] or extrinsic to the [process]” (I-05). Thus, an improvement opportunity might not be addressed because it requires the intervention of entities outside the process scope (extrinsic to the process). Addressing such improvement opportunities is, therefore, assessed as unfeasible.

Our findings also indicate the importance of visualising the potential financial impact of the improvement opportunities (F6). In the end, “it's always about the money” (I-01). The savings that can be realised must be estimated. As one internal analyst described, in order to estimate the savings, “we build some sort of a business case on how much we can save” (I-01). Additionally, several participants (I-02, I-05, I-06) mentioned that they prioritise those improvement opportunities that are in line with the company's objectives (F7). As one interviewee explained, different improvement opportunities can be found when analysing the process with respect to different performance measures, for example: “Are you looking at it from efficiency perspective? Are you looking at it from a risk perspective? Are you looking at it from a cash flow perspective?” (I-06). Another analyst provided an example of using KPIs: “it's important to know the concrete KPI from the beginning of the project but it happens that they need to be clarified with the managers and have to be readjusted, so what could change in the process using what specific KPI” (I-02). Last, our findings suggest that process analysts consider process performance change when addressing improvement opportunities (F8). One process analyst gave an example of assessing “how much the process would be improved from what performance measure; if the KPI is time, how much time would be saved” (I-02) if a particular improvement opportunity is addressed. As another interviewee explained, he/she is interested in “how the average throughput would change or how the resource utilization would change and that kind of things” (I-05).
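
The quantitative criteria behind F4 and F8 (how many cases an opportunity touches, and how much a KPI such as time could change) can be estimated directly from an event log. The sketch below is ours, with hypothetical column and activity names: it flags cases with rework of one activity, then reports the share of affected cases and the cycle-time gap relative to unaffected cases as a rough proxy for the achievable saving.

```python
import pandas as pd

log = pd.read_csv("claims_log.csv", parse_dates=["timestamp"])

# Cases touched by the improvement opportunity, here: rework of "Assess claim".
rework = log.groupby("case_id")["activity"].apply(
    lambda acts: (acts == "Assess claim").sum() > 1
)
affected = rework[rework].index

# F4: impact in terms of the number of cases involved.
share_affected = len(affected) / log["case_id"].nunique()

# F8: rough proxy for the time that addressing the opportunity could save.
cycle = log.groupby("case_id")["timestamp"].agg(lambda s: s.max() - s.min())
gap = cycle.loc[affected].median() - cycle.drop(affected).median()

print(f"{share_affected:.0%} of cases affected; "
      f"median cycle-time gap vs. unaffected cases: {gap}")
```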

4.1.1.3 Improvement opportunity communication (RQ3)

Our findings show that the interviewed participants developed several strategies to communicate their findings from process improvement projects (RQ3). We found that process analysts commonly communicate their findings by developing a story supported by data and visualisations (F9). In the words of an internal analyst, communicating the findings is “like storytelling for managers with process mining” (I-02). For instance, one consultant explained that “when presenting something to management, you don't have time, and they don't have the attention span to listen to the whole thing and to understand all the details” (I-05). To this end, to make the findings digestible and relatable, “you take that piece of information or data and try to put it in a simplified context that works as a narrative and easy enough to understand” (I-05). Furthermore, when communicating findings, the interviewed analysts adjust them to the audience and select a level of detail that suits the audience's level of knowledge (F10). For instance, one consultant shared that he/she uses specific visuals for the analysis but creates a new, simplified visual representation when communicating the findings: “since I'm not sure anybody else would understand it, […] then I create another one that is very specifically targeted to communicating a message” (I-05). As another analyst expressed, “different clients have different modes of communication. Some clients require very formal approaches” (I-04).

4.1.2 Step 2: Requirements elicitation

We elicited user stories from the interview findings (see Table 4). We discarded one finding (F3) because it related to understanding how the interviewed process analysts use process mining for process improvement projects. As such, the third finding (F3) describes how the interviewed process analysts structure the work in a process improvement project (i.e. decompose given problems for further investigation) but does not describe or analyse a specific improvement opportunity. Therefore, the finding does not directly relate to visual representations.

In developing the artefact, we applied the knowledge of how process analysts work with improvement opportunities in their current setting. In other words, we obtained findings on how they identify, prioritise and communicate process improvement opportunities using the outputs of process mining tools. In the artefact, we specifically aim at visualising process improvement opportunities based on the inputs from the interviewed process analysts. Therefore, user stories stem from the findings (see Table 3) but are translated into the context of visualising process improvement opportunities. For example, finding F1 showcases that process analysts rely on process models when analysing the process for improvement opportunities. The artefact, however, presents the visualisation of improvement opportunities. In other words, it helps the analysts compare and prioritise them from the perspectives of their current impact on the process and what can change if they are addressed. Thus, we translate the finding F1 to user story US1, which is to visualise process models in current and projected states. Similarly, F8 showcases that process analysts consider changes in process performance when analysing improvement opportunities. User story US5, which is based on this finding, considers current and projected process performance.

We then compared the elicited user stories with functionalities existing in commercial process mining tools. We considered the descriptions provided in Loyola-González (2022) to understand which user story they correspond to. Then, we marked the degree to which a respective user story is supported in the tools in Table 4. For example, if the functionality a user story refers to is supported by all tools, it is marked as “Fully” (e.g. US1). User stories supported by at least 10 tools are marked as “Mostly”, and those supported by fewer than 10 as “Partially”. Functionalities not available in any tool are marked as “None”.

4.2 Phase 2: design & development

IRVIN was developed as static mockup screens connected for interactive step-by-step exploration (see section 3.2 for detail). We made IRVIN look like part of a web version of a hypothetical process mining tool to make it more realistic for interview participants. Thus, every screen of IRVIN is visualised as a browser window and has standard elements that a process analyst would encounter in such a process mining tool, such as a user profile and a help button (see Figures 3–5) [7].

Some visualisations of IRVIN were adopted from existing literature. For example, the highlighted parts of the process models are based on Gall et al. (2015), where the authors use colour-coding and symbols to emphasise the differences between process models (e.g. elements were added or removed). We also applied general visualisation guidelines, such as Munzner (2014) and Shneiderman (1996). For example, the mockup does not use any colours except white, grey and black to ensure that the “Get It Right in Black and White” rule is followed (Munzner, 2014). The only exception is that the colours mentioned above emphasise differences in the process models to depict improvement opportunities. Based on the discussions with interviewees in the first round, we selected improvement opportunities and redesign options to show in IRVIN from the Improvement Opportunity Framework (Lashkevich, 2020) – a collection of improvement opportunities that can be identified in a process. Thus, having reviewed what improvement opportunities the interviewees referred to the most, we adopted their descriptions from the framework to showcase in the mockup. For example, several interviewees (I-03, I-07, I-05) described analysing rework loops (improvement opportunity #3 in IRVIN) in the cases they work with. Similarly, interviewees I-06, I-03 and I-01 referred to long waiting times in the process, which served as the basis for selecting improvement opportunity #1 in IRVIN.

The mockup comprises 13 screens that together form a hierarchical structure (Figure 2). The highest element of the structure, Improvement Opportunities (level one), is a separate tab of the hypothetical process mining tool where a user can explore the improvement opportunities. All other screens are visualised as part of this tab. Particularly, user stories US1-US7 correspond to the screens under the element “Overview” (level two), and US8-9 correspond to the screens under the element “Comparison”, with US10 related to both. In example screens (see Figures 3–5) some user stories (see Table 4) are circled in yellow. In level three for Overview, the user can view the details of the identified improvement opportunity (Summary) and assess what changes it brings to the process if it is addressed (Impact). For Comparison, level three elements allow for comparing the improvement opportunities with regard to the original process separately (Separate) and if they are addressed together (Combined). All screens of level two can be viewed both from the point of view of process models and process performance (level four). In Figure 3, the different levels are circled in purple.

In level two, the user can explore the details of a particular improvement opportunity in the Overview. Figure 3 displays an example of the screen with details of improvement opportunity #1 in its current state and after it is redesigned if the user follows the suggestion. The left sidebar provides an overview of all improvement opportunities identified in the given process. The current setting is switched to the activities perspective, and the improvement opportunity is highlighted in red.

Another element (Comparison) in level two allows for comparing several improvement opportunities between each other. In Figure 4, a comparison view of two improvement opportunities (#1 and #3) from a process performance perspective is provided. As described in the evaluation scenario (see section 3.3), the user can compare the identified improvement opportunities with each other and decide how to proceed.

4.3 Phase 3: Evaluation & refinement

4.3.1 Step 1: Evaluation interviews

In this section, we present the results of the user evaluation study. First, we outline the findings on how IRVIN supports the identification of improvement opportunities (RQ1). Then, we present the results regarding the prioritisation of identified improvement opportunities (RQ2) and, finally, the results on the evaluation of IRVIN for communicating findings (RQ3). In addition, we captured general comments on IRVIN. The evaluation resulted in identifying additional findings related to RQ1 and RQ2 (see Table 5). We did not identify additional findings for RQ3.

4.3.1.1 Improvement opportunity identification (RQ1)

The interviewees used all elements of IRVIN to analyse improvement opportunities. They found the overview of identified opportunities and the related process metric useful. However, the data displayed was not at a sufficient level of granularity, that is, it did not provide sufficient insight. Therefore, more detailed data needs to be displayed. The evaluation also identified two additional findings regarding RQ1 that IRVIN did not support. The interviewees expressed the need to see an overview of the baseline process performance related to the improvement opportunity (F11). One process analyst explained that he/she needs to understand the motivation: “Before I start looking at solutions, I would want to know why. Why bother about it?” (I-04). As another analyst expressed it, “So, okay, it takes a long time. So what? Does anyone complain? Do we lose customers? Do we get our money later?” (I-01).

Furthermore, the analysts we interviewed highlighted the need to explore the data behind the improvement opportunity (F12). There are two reasons for this. First, one analyst expressed concern that the data underlying a specific improvement opportunity, such as a bottleneck, may be incorrect: “it's at least our experience, that some of the cases are […] not real data, that is connected with real cases that have been processed by our workers” (I-08). Second, process analysts commonly require a lower level of data granularity to, for example, understand the work of a particular resource: “some details that let you understand deeply the work of the [resource] alone” (I-11).

4.3.1.2 Improvement opportunity prioritisation (RQ2)

While assessing an improvement opportunity (RQ2), the analysts found visualisations of the impact of the improvement opportunity on process KPIs relevant. For the process KPIs, the interviewees focused on how each improvement opportunity impacted the KPIs. As one analyst explained: “I would focus on changes in KPIs. So I would try to look [at] what exactly changes, and if it's significant” (I-08). Another analyst provided an example that, for deciding on the specific improvement opportunity, “the important thing for me to understand is how they change the process performance” (I-01). To this end, we added the requirement to “review all KPIs that are changed if the improvement opportunity is addressed” (F13).

We also found that the interviewees needed to see the detailed data behind the diagrams (F14). This is because there can be mistakes in the data or the calculations. Therefore, interviewees wanted to understand whether they can “trust” the graphs because “it's very easy to get this [calculation] wrong” (I-04). As the interviewee explained, in order to ensure that the displayed numbers are correct, he/she would want to be able to double-check them: “Why do you think it's going to go by 10%? Show me the underlying data” (I-04). As another analyst put it, he/she would want more clarity: “what exactly is meant by a certain KPI, or how did certain suggestion come about?” (I-05).

The interviewees also highlighted the need to assess the cost and effort of addressing an improvement opportunity. In the words of one analyst, “the most important is to understand, okay, if I have these root causes, what will it cost, like, what will it cost me to implement things” (I-12). The participants also highlighted the lack of information about how easy it is to implement the change: “That's the one thing the difference between the output or the result of improvements, but also I'm thinking of some effort that I have to put to resolve this issue” (I-10). Therefore, we identified the need to visualise the cost and ease of addressing improvement opportunities (F15). Finally, our findings indicate that the interviewed process analysts prioritise improvement opportunities that can address or resolve other process problems (F16). As one analyst put it, “we'll try to see how the different combinations between these opportunities work” (I-08). Commenting on this possibility, a consultant said that “we always have to consider the solutions, if they could actually be solved […] concurrently, in the same time” (I-12).

4.3.1.3 Improvement opportunity communication (RQ3)

Our findings suggest that, when communicating the findings, the focus is on presenting the current state of the process, diagrams that depict improvement in KPIs and alternatives to the improvement opportunity. These needs are currently satisfied with the visualisations in IRVIN. Six interviewees (I-08, I-09, I-01, I-10, I-05, I-12) commented that they use graphs to communicate changes in process KPIs. As one interviewee commented, he/she “use[s] images from the application […] to support my decision by data” (I-10). The participants mentioned that “what's important that you can just screenshot the graphs from here” (I-01). As to the available graphs, the interviewed analysts expressed that graphs should be simple to help convey the message and focus on the essential things. The current visualisations in IRVIN serve that purpose: “I think that KPI this could be just the screenshot because this is quite simple, and I think it's good” (I-09). In addition to the graphs, analysts communicate the current process state and alternative improvement opportunities: “First, you're telling them, here's your process as it is, here is your most, like, top root causes, what we could found, we could find. Here are opportunities, what we can do better” (I-12).

4.3.1.4 General comments

The participants commented on their general impression of IRVIN. The feedback was positive. For example: “Even though it's early, it's very clear” (I-08), and “with this visualisation, I would say I'm actually fine. […] the less objects are better, because then you can stay focused. So for me, it's perfect” (I-12). The interviewees highlighted the usefulness of such visualisations since they would make their daily jobs easier because “while you are thinking about how to redesign a process, you are very, very concentrated […], you have a lot of information in your brain. And to have everything in front of you is very, very useful” (I-11).

One source of confusion was the naming of certain elements of IRVIN. For instance, three interviewees (I-04, I-01, I-10) commented on the use of the word “impact” for the change in process models and performance between the current and the improved processes. This is because the analysts we spoke to relate impact to how the performance of the process changes. As one interviewee explained: “Why I asked you about impact, because […] the word impact suggests that I will see […] performance indicators. Because I would like to see what changes for me, what impact makes this change?” (I-01).

Additionally, the interviewees proposed several suggestions for improving IRVIN. Such suggestions were mentioned once or at most by two out of eight participants. For instance, I-08 and I-01 suggested presenting different KPIs by default and letting the user select others: “And then we'll just show at least maybe three key KPIs, they are already visible here. And maybe you can leave room for two others, which the people can select” (I-01). Another participant suggested having the improvement opportunity description constantly visible to ensure everyone working on it has the same understanding: “I would want it to be always visible, and not have people interrupt and ask me or even worse, not ask at all and assume that they understood what order queues are but actually they didn't” (I-04). One analyst suggested using different colours for resources: “maybe this is not quite important, but colour, for example, a senior specialist would be one colour, and junior another, it would be easier to follow the graph” (I-09).

In conclusion, the general feedback on IRVIN was positive. The participants did not demonstrate difficulties with understanding the visualisations. Moreover, those who participated twice did not give any indication that we had misinterpreted their interviews from the first round. On the contrary, they expressed being satisfied with the visualisations. For example, one analyst said that he/she has no questions because “it seems to be quite straightforward” (I-05). Another participant, when commenting on the screen that allows for comparing the projected performance of different improvement opportunities (see Figure 4), confirmed: “the good thing that, obviously, you always need to have your KPIs and comparison between them” (I-01). Last, the interviewees also proposed some concrete suggestions for improvement, but these appear to reflect personal preferences.

4.3.2 Steps 2 & 3: Additional requirements elicitation & artefact refinement

Based on the new findings, we formulated a set of additional user stories (Table 6). We only considered new findings when writing the additional user stories as these present new visualisation requirements not identified during the exploratory interviews. We again compared them with existing tools based on Loyola-González (2022). Minor comments, such as the naming of “impact”, were also considered and incorporated in IRVIN.

Based on the new user stories, we improved IRVIN. An example of the improved version of IRVIN is depicted in Figure 5. The yellow colour depicts the implementation of new user stories, while the blue colour depicts minor improvements based on comments from participants but not converted to user stories (e.g. names of elements that were found to be unclear). This example figure is the improved version of the previously referenced part of IRVIN (Figure 4). To this end, two findings from the evaluation interviews were that the interviewed process analysts require information about the ease (US13) and cost (US14) of addressing the identified improvement opportunity (see Table 6). Therefore, we added new elements, such as a chart with projected cost and resource utilisation (that depicts the effort required to address the compared improvement opportunities), to the screen in the mockup where the user can compare identified improvement opportunities (Figure 5).

5. Discussion

In this section, we present our findings that provide indications on how process analysts identify improvement opportunities from visual representations (RQ1), how they prioritise (RQ2) and communicate them (RQ3). We then formulate five principles for process mining visualisations that aid analysts in analysing improvement opportunities (see section 5.1). Last, we discuss the limitations of our study (see section 5.2).

As to RQ1, our findings indicate that process analysts require a holistic overview of the problem in the process rather than only focusing on a single improvement opportunity. To this end, when exploring the problem, analysts commonly consider the process model and current process performance as equally important. We found that process analysts mainly focus on time and quality measures. Thus, process models are used to note apparent bottlenecks. At the same time, performance analysis helps assess whether the bottleneck has a crucial impact on the outcomes of the process. Therefore, the visualisation should concurrently capture the process and its performance to provide a holistic view of improvement opportunities. This finding is in line with Pini et al. (2015) who argue for the need to understand different perspectives (e.g. time, resources) together with the model.

Furthermore, in the context of our study, process analysts need to see how much of the process is affected by the improvement opportunity. This is motivated by trying to understand whether the problem is worth considering. The basis for such a decision is the number of cases and variants involved. This finding is aligned with results reported in process mining case studies (e.g. Mahendrawathi et al. (2015)). It can be expected that organisations are interested in addressing those issues that have more impact.

In the context of our study, process analysts commonly consider an improvement opportunity's relevancy based on whether it addresses the organisation's business needs or objectives. This is expected since process analysis prescribes qualitative and quantitative analysis, that is consulting both the data and the domain experts (Dumas et al., 2018). We note that such an approach also extends to automatically identified improvement opportunities. Our findings also indicate that process analysts pay attention to the quantitative data but, at the same time, consider whether such improvements address the company's business needs. Therefore, we suggest combining technical and business approaches to the analysis when visualising process improvement opportunities.

With regard to RQ2, in the context of our study, it was important for process analysts to understand the potential benefits of addressing the improvement opportunity. As such, process analysts evaluate the improvement opportunity with specific KPIs in mind, that is which KPIs of the process can be improved if the opportunity is addressed. As companies define process KPIs with consideration to their business needs, this helps evaluate the relevancy of the improvement opportunity. This finding is in line with previous research of Grisold et al. (2021), stating that the application of process mining in an organisation should be aligned with its strategy. However, our study provides additional insight into specific factors and representations to consider when applying process mining for identifying and visualising process improvement opportunities. We found that the prioritisation of opportunities is complex. Analysts commonly compare KPI graphs of different improvement opportunities and consider the cost and effort of required changes. Previous works have provided recommendations on visualising process performance with regard to, for example, time (Gulden, 2016; Bachhofner et al., 2017; Kaouni et al., 2021) and resource behaviour (Pika et al., 2014; Low et al., 2017) which can be used to analyse a process to find improvement opportunities. However, while an improvement opportunity can change process performance, it can be challenging to implement the required changes due to a lack of financial, technological or human resources. This is confirmed by previous works (Bitomsky et al., 2019; Grisold et al., 2021) that highlight the need for organisations to consider multi-dimensional effects of improvement projects, that is to confirm that the investments required for the changes will yield sufficient returns. Therefore, while visualising the impact of improvements on the process performance, it is also advisable to consider other factors, such as cost and effort of implementation.
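
Purely as an illustration of such multi-dimensional prioritisation (this scoring is not part of IRVIN, and all numbers below are invented), a weighted score can combine the projected KPI gain, the share of cases affected, and normalised cost and effort estimates; the opportunity names are taken from our evaluation scenario.

```python
import pandas as pd

# Illustrative, invented estimates for the four scenario opportunities.
ops = pd.DataFrame({
    "opportunity": ["Order queues", "Unnecessary job handovers",
                    "Activity rework", "Low resource capacity"],
    "kpi_gain": [0.18, 0.07, 0.12, 0.25],  # projected cycle-time reduction
    "share":    [0.60, 0.35, 0.45, 0.20],  # fraction of cases affected
    "cost":     [0.30, 0.10, 0.20, 0.80],  # normalised implementation cost
    "effort":   [0.40, 0.20, 0.30, 0.90],  # normalised implementation effort
}).set_index("opportunity")

# Benefits count positively, cost and effort negatively; the weights are a
# choice the organisation would make in line with its objectives (F7).
weights = {"kpi_gain": 0.4, "share": 0.3, "cost": -0.15, "effort": -0.15}
ops["score"] = sum(w * ops[col] for col, w in weights.items())
print(ops.sort_values("score", ascending=False))
```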

As to RQ3, our findings suggest that process analysts adjust their means of communication based on what will help stakeholders better understand the results. In other words, process analysts simplify the results because the audience is commonly not sufficiently acquainted with process mining. This confirms the findings of Dani et al. (2019), who report that managers need an overview while process analysts require a detailed view of the process model. Our study contributes to these findings, as it suggests that process analysts also simplify the conclusions from their analysis, often extracting data from process mining tools and creating custom graphics that are less complex and contain fewer data. When it comes to communicating process improvement opportunities, managers commonly require a more standardised communication approach, such as slides, simpler process models, and additional notes and comments on the graphics. Thus, it might be helpful to enable analysts to edit visualisations for simplification in process mining tools.
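As an illustration of such a simplified, stakeholder-facing graphic (not a screen from IRVIN), the sketch below plots exported before/after KPI values as a plain bar chart; the KPI names and values are invented.

```python
# Sketch: a deliberately simple before/after chart built from data exported
# out of a process mining tool. KPI names and values are invented.
import matplotlib.pyplot as plt

kpis = ["Cycle time (days)", "Rework rate (%)"]
before = [14, 22]
after = [9, 12]

fig, ax = plt.subplots(figsize=(5, 3))
x = range(len(kpis))
ax.bar([i - 0.2 for i in x], before, width=0.4, label="Current")
ax.bar([i + 0.2 for i in x], after, width=0.4, label="After change")
ax.set_xticks(list(x))
ax.set_xticklabels(kpis)
ax.set_title("Opportunity #1: expected KPI impact")
ax.legend()
fig.tight_layout()
fig.savefig("opportunity_1_summary.png")  # ready to paste into slides
```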

Additionally, we found that process analysts are concerned about the trustworthiness of the results. In other words, regardless of the visualisation presented (for example, a process model with the improvement opportunity highlighted or a comparison graph of KPIs), process analysts commonly need to explore the data behind the visualisation. The motivation for this is twofold. First, the findings indicate that process analysts tend to have concerns regarding the quality of the original data and, thus, the credibility of the identified improvement opportunity. Second, process analysts tend to double-check the findings, as they might have specific insights about the data or the process that process mining tools do not capture. In conclusion, we suggest including a link to the original data behind each visualisation.

5.1 Five principles for process mining visualisations to analyse improvement opportunities

Based on the discussion above, we elicit five principles for process mining visualisations to analyse process improvement opportunities. First, in the context of our study, process analysts referred to the process model as their starting point for analysis. They explained that they consult the process model in every analysis stage. This is confirmed by the extensive body of research on the visualisation of process models, for example, a review of developments in process model visualisation over ten years (Dani et al., 2019). However, we note that process performance diagrams should be provided along with process models to help analysts assess the impact of improvement opportunities. Therefore, we conclude that combining process models with performance diagrams is valuable when analysing a particular improvement opportunity.

Second, another indicator that helps to assess the impact of an improvement opportunity on the process is the proportion of cases that are affected by implementing a change. Although case studies on process mining (e.g. Mahendrawathi et al. (2015), Vázquez-Barreiros et al. (2016)) report on addressing issues that can entail large savings, we note that process analysts specifically consider the proportion of cases affected by the change. In other words, it is essential to explicitly denote the proportion of cases that would be affected by addressing the improvement opportunity.

Third, to aid in the prioritisation of improvement opportunities, there should be a clear indication of how the improvement opportunities relate to the organisation's objectives, that is what potential they have to improve the defined KPIs. As highlighted by Martin et al. (2021), process mining helps to assess the state of the business processes and, therefore, supports the planning of strategic changes. We note that visually highlighting how an improvement opportunity impacts the process facilitates aligning it with the objectives.

Fourth, the cost and ease of implementation of alternative changes are among the criteria that process analysts consider when deciding which improvement opportunities to address. The importance of cost-benefit assessments of process mining initiatives has been reported by Grisold et al. (2021). However, we note that the process analysts we interviewed highlighted analysing the ease of implementation in addition to the cost-benefit assessment of process improvement opportunities.

Last, we conclude that including the data behind the visualisations increases trust. As reported previously, process mining increases the transparency of business processes in the organisation (Martin et al., 2021). Based on the findings of our study, we note the importance of enabling analysts to explore the data behind each visualisation and the calculations it rests on.
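A drill-down from an aggregate figure to the cases behind it could be as simple as the following sketch, which reuses the event-log DataFrame from the earlier examples; the function name and the transition are hypothetical.

```python
# Sketch: expose the raw events behind an aggregated transition so analysts
# can verify how a number was computed. Names are hypothetical.
def drilldown(log, activity, next_activity):
    """Return the events of every case that passes through the transition."""
    cases = log.loc[
        (log["activity"] == activity) & (log["next_activity"] == next_activity),
        "case_id",
    ].unique()
    return log[log["case_id"].isin(cases)]

evidence = drilldown(log, "Review", "Rework")
print(evidence.groupby("case_id").size().describe())  # sanity-check the basis
```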

We found that certain principles are, to some extent, incorporated by existing commercial process mining tools. For instance, generating process models is the core functionality of every tool, and many provide the possibility to review process performance with respect to various metrics. At the same time, functionalities such as estimating the potential cost and effort of addressing improvement opportunities and comparing the benefits they could bring are not widespread. These functionalities are, however, considered important by analysts and could be achieved by means of, for example, what-if analysis (López-Pintado et al., 2022).
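A full what-if analysis would rely on a process simulator such as Prosimos (López-Pintado et al., 2022). The sketch below is only a crude, back-of-the-envelope approximation on top of the earlier event-log DataFrame: it scales the waiting time before one hypothetical activity and recomputes case cycle times, ignoring resource contention and control-flow effects that a real simulator would capture.

```python
# Crude what-if sketch (a simulator such as Prosimos would be used in
# practice): scale the waiting time before one activity and recompute an
# approximate cycle time per case. Activity name and factor are invented.
import pandas as pd

def what_if(log: pd.DataFrame, target: str, wait_factor: float) -> pd.Series:
    adjusted = log.copy()
    mask = adjusted["next_activity"] == target
    adjusted.loc[mask, "wait"] = adjusted.loc[mask, "wait"] * wait_factor
    # Approximate cycle time as the sum of waits along each case.
    return adjusted.groupby("case_id")["wait"].sum()

baseline = what_if(log, "Rework", 1.0)
scenario = what_if(log, "Rework", 0.5)  # halve queueing before "Rework"
print(f"median cycle time: {baseline.median()} -> {scenario.median()}")
```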

5.2 Limitations

To achieve our research objective, we applied the design science methodology, which has certain limitations (Hevner et al., 2004) that we discuss in this section. As a starting point, we considered input from industry practitioners who use process mining in their work. Namely, we conducted interviews with seven process analysts to elicit requirements. Interviewing other participants, working in other domains or on other projects, could yield different results. We reduced this limitation by selecting interviewees from different domains that commonly utilise process mining tools.

When analysing qualitative data, there is a risk of misinterpretation due to bias or subjectivity. To reduce this threat, we discussed the data collected in the first round of interviews as well as the feedback and opinions collected during the evaluation round. We also used additional materials the interviewees presented (screenshots, images shown directly in the process mining and analysis software used, and videos). The analysis of such additional materials aided the interpretation of the interviews. To ensure we did not misinterpret the findings, we invited three interviewees from the first round to participate in the evaluation round. When analysing the data from the second stage, we did not find any significant differences between the input from new and repeated participants. Furthermore, the findings from interviewees participating in both rounds confirmed that the visualisation needs were adequately elicited in the first round. Finally, inherent to the design science methodology is the limited extent to which the results can be generalised beyond the scope of this research. This limitation was, to some extent, reduced by selecting both internal and consultant analysts working across different industry domains.

6. Conclusion

In this paper, we explore how process mining visualisations can aid process analysts in their work to identify, prioritise and communicate business process improvement opportunities. To achieve our research objective, we applied the design science research methodology. As such, we elicited requirements for visualising improvement opportunities through interviews with process analysts. More specifically, in the interviews, we focused on how process analysts work with process improvement opportunities using the outputs of available process mining tools.

Our findings provide insights into how process analysts identify, prioritise and communicate process improvement opportunities from visual representation. As a result, we found that process analysts look for visualisation elements that relate to process models and process performance graphs. Furthermore, analysts use such visualisations to understand the current problem, its influence on the process, the benefits of addressing an improvement opportunity and alternatives for process improvement.

Based on our findings, we developed and evaluated IRVIN, a mockup to visualise improvement opportunities. The results of the evaluation indicated the usefulness of IRVIN. The evaluation also enabled us to identify improvements to IRVIN, which we incorporated. The contribution of this paper is a set of guidelines to visualise process mining outputs representing identified improvement opportunities that stem from IRVIN. They are particularly relevant for process analysts who identify improvement opportunities and communicate their findings to relevant stakeholders. Furthermore, our findings are also helpful for developers of process mining tools. They can use our findings to guide their efforts to identify what to include in their visualisations and visualise aspects related to improvement opportunities. Our findings have limitations that are inherent to the design science methodology. Although we reduced the threats to validity, the main limitation is the extent to which the results can be generalised.

For future work, we aim to develop a tool that can be used to visualise improvement opportunities derived from process mining. Namely, the tool will provide a readily available visualisation of improvement opportunities based on the findings of this paper. Moreover, we found evidence that process analysts consider combinations of changes when assessing improvement opportunities, that is, they try to introduce changes that solve several problems at once. However, our findings also indicate that process analysts have concerns that such changes may contradict each other or be challenging to implement. One direction for future work is to research how possible process changes can impact each other and how this interplay can be visualised.

Figures

Figure 1. Research process

Figure 2. Hierarchy of IRVIN screens

Figure 3. IRVIN example screen (overview of improvement opportunity #1)

Figure 4. IRVIN example screen (comparison of improvement opportunities #1 and #3)

Figure 5. Improved IRVIN example screen (comparison of improvement opportunities #1 and #3)

Exploration interview participants

Code | Domain | Business role
I-01 | Electrical engineering | Internal process analyst
I-02 | Insurance services | Internal process analyst
I-03 | Public services | Internal process analyst
I-04 | Data science | Consultant
I-05 | Auditing | Consultant
I-06 | Process mining | Consultant
I-07 | E-commerce | Internal process analyst

Source(s): Authors' own work

Evaluation interview participants

Code | Domain | Business role | Repeated
I-01 | Electrical engineering | Internal process analyst | *
I-04 | Data science | Consultant | *
I-05 | Auditing | Consultant | *
I-08 | Banking services | Internal process analyst |
I-09 | Banking services | Internal process analyst |
I-10 | Consultancy services | Consultant |
I-11 | Public services | Consultant |
I-12 | Robotics | Consultant |

Source(s): Authors' own work

Exploration interview findings

# | Finding | Exemplary quote

RQ1 How do process analysts identify improvement opportunities from visual representation?
F1 | Rely on process models when analysing the process for improvement opportunities | "I use the actual process visualisations to find bottlenecks" (I-07)
F2 | Consider insights from both process mining and the business side of the analysis | "Oftentimes, if you don't understand how the data is generated and where the data comes from, you might misinterpret the visualisation of the process mining software. And oftentimes, if you don't understand the business process, you might overreact to exceptions that are shown on the process map." (I-05)
F3 | Decompose problems into smaller and more manageable sizes to investigate them separately | "Dividing the problem into smaller chunks of the problem to make it manageable is something which I believe works because you […] do not want to start with a very complex process" (I-06)

RQ2 How do process analysts prioritise improvement opportunities from visual representation?
F4 | Assess the impact of the finding on the process regarding its location, number of cases and variants involved | "[We measure] the importance of that finding or the terms of the impact on the overall process" (I-06)
F5 | Analyse the dependency on entities outside of the process or the organisation | "But that's external to the process, so you can't do anything with that. So, then you have to specify that the recommendation I'm making is intrinsic […] or extrinsic to the [process]." (I-05)
F6 | Assess the financial gain of the finding | "We build some sort of a business case on how much we can save" (I-01)
F7 | Analyse whether improvement opportunities can improve the metrics from business objectives | "It's important to know the concrete KPI from the beginning of the project but it happens that they need to be clarified with the managers and have to be readjusted, so what would change in the process using what specific KPI." (I-02)
F8 | Consider process performance change when addressing improvement opportunities | "How much the process would be improved from what performance measure; if the KPI is time, how much time would be saved." (I-02)

RQ3 How do process analysts communicate improvement opportunities from visual representation?
F9 | Use storytelling to present the finding(s) and select visual representations according to the story | "You take that piece of information or data and try to put it in a simplified context that works as a narrative and easy enough to understand." (I-05)
F10 | Adjust the communication to the client needs (i.e. including more technical or business details) | "Different clients have different modes of communication. Some clients require very formal approaches." (I-04)

Source(s): Authors' own work

User stories elicited from exploration interviews. Columns "Degree of support" and "Tools" show examples of process mining tools that have the functionality that corresponds to the respective user story

# | User story ("As a process analyst, I want to …") | Based on finding | Degree of support | Tools
US1 | … see the process models of the old process and the improved process, so I can assess changes in the activities and the paths | F1 | Fully/Partially | All 16 ("As-is process visualisation" feature)
US2 | … see waiting times between activities, so I can improve the long waiting times | F7 | Fully | All 16
US3 | … see the processing times of activities, so I can investigate what slows processing times | F7 | Fully | All 16
US4 | … understand how many cases of the process are involved in the improvement opportunity, so I can assess its impact on the process | F4 | Mostly | 13 ("Case and activity list" feature: 1–3, 5–7, 9–11, 13–16)
US5 | … see the statistics of the old process and the improved process, so I can assess how much a certain improvement would change the process | F8 | Partially | 5 ("Scenario simulation" feature: 4, 5, 7, 11, 15)
US6 | … adjust KPIs when I clarify them with management, so I suggest improvements that help reach the KPIs | F2 | Mostly | 10 ("Custom metrics/KPIs" feature: 1, 3, 5, 7, 10–12, 14–16)
US7 | … understand whether the improvement opportunity is internal to the process, so I recommend changes that can be implemented within the organization or department | F5 | None |
US8 | … prioritize improvement opportunities, so I can choose the ones where the financial gains are the biggest | F6 | None |
US9 | … see the most valuable improvement opportunities for different KPIs, so I can present them from the perspective of those KPIs | F7 | Mostly | 10 ("Custom metrics/KPIs" feature: 1, 3, 5, 7, 10–12, 14–16)
 | | | None | Automatic improvement opportunity identification (none)
US10 | … communicate the performance information in a self-explanatory way, so the users have an overview of the improvement opportunities without additional comments | F9, F10 | Mostly | 14 ("Custom dashboards" feature: 1–5, 9–16)

Note(s): Tools list in Appendix 5

Source(s): Authors' own work; Based on Loyola-González (2022)

Evaluation interview findings

# | Finding | Exemplary quote

RQ1 How do process analysts identify improvement opportunities from visual representation?
F11 | Evaluate process performance with regard to the improvement opportunity before evaluating the redesign options | "So, okay, it takes a long time. So what? Does anyone complain? Do we lose customers? Do we get our money later?" (I-01)
F12 | Consider background data of the identified improvement opportunity | "We can maybe have the opportunity to […] see some data, some details that let you understand deeply the work of the [resource] alone" (I-11)

RQ2 How do process analysts prioritise improvement opportunities from visual representation?
F13 | Review all KPIs that are changed if the improvement opportunity is addressed | "I would focus on changes in KPIs. So I would try to look what exactly changes? And if it's significant?" (I-08)
F14 | Consider background data of the graphs included in the visualisation of improvement opportunities | "[…] because it's very easy to get this wrong. So rather than just show me like we saw here, just a quick comparison - this is definitely nice - but the next question is, okay, why do you think it's going to go by 10%? Show me the underlying data." (I-04)
F15 | Evaluate the cost and ease of addressing the improvement opportunity | "That's one thing the difference between the output or the result of improvements, but also I'm thinking of some effort that I have to put to resolve this issue." (I-10)
F16 | Assess the possibility of addressing multiple improvement opportunities in parallel | "If I can by solving the problem two, probably I can already solve half of problem number one. So we always have to consider the solutions, if they could actually be solved, how to say, concurrently, at the same time" (I-12)

Source(s): Authors' own work

User stories elicited from evaluation interviews. Columns "Degree of support" and "Tools" show examples of process mining tools that have the functionality that corresponds to the respective user story

# | User story ("As a process analyst, I want to …") | Based on finding | Degree of support | Tools
US11 | … understand the background data of the identified improvement opportunity | F12 | Fully | All 16
US12 | … understand the data behind a certain KPI improvement | F14 | Partially | 5 ("Scenario simulation" feature: 4, 5, 7, 11, 15)
US13 | … understand the ease of addressing the identified improvement opportunity | F15 | None |
US14 | … understand the cost of addressing the identified improvement opportunity | F15 | None |
US15 | … understand which KPIs in the process the identified improvement opportunity can change | F13 | Partially | 5 ("Scenario simulation" feature: 4, 5, 7, 11, 15)
US16 | … understand how the identified improvement opportunity influences my process performance | F11 | Partially | 5 ("Scenario simulation" feature: 4, 5, 7, 11, 15)
US17 | … understand whether I can solve several improvement opportunities with one redesign pattern | F16 | None |

Note(s): Tools list in Appendix 5

Source(s): Authors' own work; Based on Loyola-González (2022)

Coding scheme for exploration interviews

ID | Code | Example(s) | Short description

1) Improvement opportunity description
1a | Bottlenecks around waiting times | "Yeah, so a lot of the applications are approved, and the point is to cut down the waiting times maximum possible." | Description of bottlenecks related to waiting times
1b | Reworks | "when the transaction makes a full circle, and how many circles it takes, and that kind of things."; "or if there are any bottlenecks or like reworks" | Description of bottlenecks related to reworks
1c | Order rejection rates | "from a standard organization, they say like, okay, 10% of orders ought to be rejected. And we see countries where it's more than 10%, then we start speaking with them" | Description of bottlenecks related to order rejection rates
1d | Separation of duties | "the most kind of easier to tackle low hanging fruit is this four eye principle, the segregation of duties" | Description of bottlenecks related to separation of duties
1e | Data | "it was almost like a semantic analysis of the data because the format they provided us, it was suitable for regulatory purposes, but wasn't ideal for process mining"; "you always have challenges with the data sources. Not so much technical. About our understanding of the data" | Data directly related to the process that is to be improved

2) Improvement opportunity identification
2a | Approach to process improvement projects | "follow the Celonis approach mostly"; "I don't use a framework but simply try to find out what each project needs" | Methodology used in process improvement projects
2b | Analysis methods | "Dividing the problem into smaller chunks to make it manageable"; "so you always have a big problem and how to divide it to smaller problem" | Methods to identify a specific improvement opportunity in the analysis part of the project
2c | Visualisation for analysis | "I mostly used all of the different types of filterings and, and graphs that you can make in Apromore"; "I use the actual process visualisations to find bottlenecks"; "and we found that by looking at the process diagram" | Mentions of using different visualisations for process mining in the analysis part of the project
2d | Context | "if you don't understand how the data is generated and where the data comes from, you might misinterpret the visualisation of the process mining software and, oftentimes, if you don't understand the business process, you might overreact to exceptions that are shown on the process map" | Considering technical and business sides of the analysis when identifying improvement opportunities

3) Prioritisation criteria
3a | Improvement opportunity impact | "see in the process where this would actually have an impact on"; "with the total number of variance that we see, what is the ratio between the total number of variances and ratio of the process population"; "the importance of that finding or the terms of the impact on the overall process" | Mentions of assessing the impact that the improvement opportunity has on the overall process
3b | Stakeholder feedback | "I presented to them everything that I found based on the data, […] and then they gave me feedback about what they would implement and would not implement and why"; "it happens that they [KPIs] need to be clarified with the managers and have to be readjusted, so what would change in the process using what specific KPI." | External (stakeholder) feedback that can help with prioritisation
3c | Financial considerations | "[we] want to focus on the ones that will bring us a lot of savings"; "it's always about the money"; "financial, the materiality of that, that's of course, very important" | Mentions of prioritisation criteria related to finance
3d | Dependency on other entities | "but that's external to the process, so you can't do anything with that."; "is it the fundamental problem that can be solved within the process or department that is responsible for that process? Or is it something that has links as I said to other parts of the organisation" | Considering dependencies on other entities when making changes to the process
3e | Projected process performance change | "how much the process would be improved from what performance measure; if the KPI is time, how much time would be saved"; "the difference between certain statistics of the old process and try to calculate the same statistics with the modified process" | How much process performance can change if the improvement opportunity is addressed

4) Communication
4a | Things communicated | "typically, I show them [end-users] things interactively while explaining in parallel"; "usually, it's just screenshots from this tool" | In which form the findings are presented
4b | Methods for communication | "you take that piece of information or data and try to put it in a simplified context that works as a narrative and easy enough to understand" | What methods are used to communicate the findings
4c | Purpose of communication | "we send the preliminary findings to the paying agency to confirm"; "the main use of this dashboard is for people working with the cases themselves" | Description of purposes for which the findings are communicated
4d | Audience | "different clients have different modes of communication. Some clients require very formal approaches" | Indications of using different communication strategies for different audiences

5) Tools
5a | Tools | "[process mining finding] is much more valuable when you [also] have business intelligence capabilities"; "You can do it better with the script" | Any technical tools used to support the identification of improvement opportunities, e.g. process mining tools

Coding scheme for evaluation interviews

ID | Code | Example(s) | Short description

1) General comments
1a | Labels | "another thing that you call this performance dimension, and then you have here process performance, and that's super confusing. Which performance we're talking about?"; "I found that a little bit the naming of things wasn't very intuitive to me." | Issues with understanding the names of some elements in the prototype
1b | Interface | "This is very clear, this red and green colours are used sparsely. And it's good. It's good, because too much colours are horrible when you try to analyze something."; "I think the visualisation is perfect, because it's clear, it's concise, it's simple." | Comments related to the interface
1c | Suggestions | "So you can see all KPIs compared. And you can, instead of like - yeah, I see that it's not clickable now - but instead of clicking, selecting, if you just have three, four KPIs, just show them."; "In fact, I want to be able to put a custom dimension, 'This is what time means to me in this process. This is what matters.'" | Direct suggestions from interviewees on what to change in the visualisation

2) Improvement opportunity identification
2a | Understanding current problem | "But before I go there, I just want to familiarize myself with the actual process and some of the parameters."; "but before I would like to see the redesign opportunities, I would kind of go through the identified issues here."; "if this redesign option does not address that problem, then it doesn't help." | Voicing the need to see the problem that is dealt with first before going to evaluate the redesign options
2b | Understanding context data | "Well, it would be nice if I could see the data underlying that. You know, when I click the arrow, for example here, that I can see the specific cases, perhaps a list of specific cases that are affected by this potential improvement."; "We can maybe have the opportunity to put a click on the flow and see some data, some details that let you understand deeply the work of the secretary alone." | Voicing the need to understand what the suggestions in the visualisation of improvement opportunities are based on

3) Parameters for prioritisation
3a | Change in KPIs | "So if I can see that there is something that can reduce the rework - great, I would look at that."; "for deciding on the specific improvement opportunity, the important thing for me to understand is how they change the process performance"; "I would focus on changes in KPIs. So I would try to look what exactly changes? And if it's significant?" | Evaluating the change in KPIs (process performance) to prioritise improvement opportunities
3b | Feasibility | "That's the one thing, the difference between the output or the result of improvements, but also I'm thinking of some effort that I have to put to resolve this issue."; "First, how many cases are affected? So when it's just 5%, maybe I will not do anything. So this is important. So when I go to overview, I will look at this. And then costs that I would pay for improvement for this suggestion for this opportunity."; "I will try to see if I can, how do I make the biggest improvement without devoting extra resources" | Analyzing the trade-off between the gains and the feasibility of implementation (cost, effort)
3c | Combined impact of improvement opportunities | "we'll try to see how the different combinations between these opportunities work"; "If I can by solving the problem two, probably I can already solve half of problem number one. So we always have to consider the solutions, if they could actually be solved concurrently, at the same time" | Assessing if it's possible to address several improvement opportunities at once
3d | Consulting context data | "So rather than just show me like we saw here, just a quick comparison - this is definitely nice - but the next question is, okay, why do you think it's going to go by 10%? Show me the underlying data."; "I would want more clarity, how things, what do they mean, what exactly is meant by a certain KPI, or how did certain suggestion come about?" | Consulting with the context data on which the calculations for changes were made before making the decisions

4) Communication
4a | Things communicated | "Oh, certainly use images from the application and therefore, it's to support my decision by data."; "but what's important that you can just screenshot the graphs from here." | In which form interviewees would present the findings from the prototype
4b | Methods for communication | "So in this case, I would, I would have just a few slides, maybe like one or two slides showing the current process and the improvement proposal or the improvement opportunity identified by the software, and the alternative that I like the most or the alternative that I would like to sell."; "The best way, it's not a PowerPoint presentation but directly in the tool, directly this data. Fact based, it's top one." | What methods interviewees would use to present the findings from the prototype

Appendix 1

Exploration Interview Guide

The aim of this interview is to understand what process analysts need when they use process mining tools. In particular, we are interested in their approach to identifying, prioritising and communicating improvement opportunities. The gathered information will be used as a basis to develop a tool that visualises improvement opportunities.

Introduction:

  1. First of all, please describe your position and responsibilities in this position in a few words.

  2. We are interested in a specific process improvement project that you carried out and that you remember well. Please try to think about one specific project, for example, if you had to improve the performance of one concrete process in your company, or if there was something interesting that happened in the project.

Introductory questions:

  1. Can you share some of the materials for the project with me? (for example, a process model, a screenshot from the tool that you used, some dashboards, slides, etc.) Can you send them to me or share your screen and show them live?

  2. What was the case for which the process improvement project was initiated?

  3. Why was this case of interest?

  4. Is it a typical case? If not, what is different between this case and usual case for such a project?

  5. What tool was used in this project? (i.e. a specific process mining tool)

Theme 1: Identifying improvement opportunities

  1. What was the specific improvement opportunity identified?

  2. How was the improvement opportunity identified?

    a. What were the criteria/measures to identify the improvement opportunities/bottlenecks?

    b. Were any visualisations used to help identify the improvement opportunity? (if yes, c)

    c. (if yes) What was important to see visually?

    d. What data was used in the case? Where was the data extracted from?

    e. What challenges did you have when you tried to analyze the process? (e.g. too much information to see, can't distinguish the types of bottlenecks, etc.)

Theme 2: Selecting improvement opportunities

  1. Were there any alternatives to the selected improvement opportunity? (if yes, a and b)

    a. (if yes) How was it decided which one to select? Who made this decision?

    b. (if yes) Were any visualisations used to help decide on the improvement opportunity? What was important to see visually to compare them?

Theme 3: Communicating improvement opportunities

  1. Who were the identified improvement opportunities communicated to?

  2. How were the results presented (images, dashboards, etc.)?

Concluding questions:

  1. What else do you wish you could know to make a decision regarding the improvement opportunity?

  2. Generally, can you name one thing that worked well in this project and one thing that did not?

Appendix 2

Table A1.

Appendix 3

Evaluation Interview Guide

The aim of this interview is to evaluate the mockup of the visualisation of improvement opportunities identified from an event log. We will particularly focus on identifying additional requirements for such visualisation and means for its improvement.

Scenario: Imagine you are analysing a claim-to-resolution process in an insurance company. The process mining tool you are using has identified four improvement opportunities in the process: Order queues, Unnecessary job handovers, Activity rework and Low resource capacity. Here on the left, you can see that the identified improvement opportunities are sorted into performance dimensions, such as time, cost and quality. You can also choose to view them all together like it is now.

Evaluation procedure:

  1. Part 1. We did a walk-through presentation using the scenario, and the participants could ask questions if something was unclear.

  2. Part 2. The participants received access to the mockup, and we asked how they would use these visualisations in their work using the following questions.

  • Using this visualisation …

  • How do you proceed with assessing an individual improvement opportunity? What is important to see to assess it?

  • How do you proceed with deciding which improvement opportunity to choose? What is important to see to decide on a specific improvement opportunity?

  • How do you proceed with communicating the chosen improvement opportunity to the business people/clients?

Post-interview:

  1. What issues did you have while interacting with the mockup?

  2. How could we improve this mockup so that it would better suit your needs?

  3. What is missing in the mockup?

  4. Please tell me one thing that you liked about the mockup and one thing that you did not like.

Appendix 4

Table A2.

Appendix 5

Tools list (based on Loyola-González (2022))

  1. ABBYY Timeline

  2. Apromore

  3. ARIS

  4. Business Optix

  5. Celonis

  6. Disco

  7. Everflow

  8. Logpickr

  9. Mehrwerk

  10. Minit

  11. myInvenio

  12. PAFnow

  13. ProDiscovery

  14. QPR

  15. Signavio

  16. UiPath

References

Agostinelli, S., Maggi, F.M., Marrella, A. and Milani, F. (2019), “A user evaluation of process discovery algorithms in a software engineering company”, EDOC, IEEE, pp. 142-150.

Bachhofner, S., Kis, I., di Ciccio, C. and Mendling, J. (2017), “Towards a multi-parametric visualisation approach for business process analytics”, CAiSE Workshops, Springer, pp. 85-91, Vol. 286 of Lecture Notes in Business Information Processing.

Basole, R.C., Park, H., Gupta, M., Braunstein, M.L., Chau, D.H. and Thompson, M. (2015), “A visual analytics approach to understanding care process variation and conformance”, VAHC, ACM, pp. 6:1-6:8.

Bitomsky, L., Huhn, J., Kratsch, W. and Roeglinger, M. (2019), “Process meets project prioritization - a decision model for developing process improvement roadmaps”, ECIS.

Bolt, A., de Leoni, M. and van der Aalst, W.M.P. (2016), “A visual approach to spot statistically-significant differences in event logs based on process metrics”, CAiSE, Springer, pp. 151-166, Vol. 9694 of Lecture Notes in Computer Science.

Braun, V. and Clarke, V. (2006), “Using thematic analysis in psychology”, Qualitative Research in Psychology, Vol. 3 No. 2, pp. 77-101.

Chiò, E., Alfieri, A. and Pastore, E. (2021), “Change-point visualization and variation analysis in a simple production line: a process mining application in manufacturing”, Procedia CIRP, Vol. 99, pp. 573-579.

Cohn, M. (2004), User Stories Applied: For Agile Software Development, Addison Wesley Longman Publishing Co, Boston, MA.

Conforti, R., de Leoni, M., la Rosa, M. and van der Aalst, W.M.P. (2013), “Supporting risk-informed decisions during business process execution”, CAiSE, Springer, pp. 116-132, Vol. 7908 of Lecture Notes in Computer Science.

Corallo, A., Lazoi, M. and Striani, F. (2020), “Process mining and industrial applications: a systematic literature review”, Knowledge and Process Management, Vol. 27 No. 3, pp. 225-233.

Dani, V.S., Freitas, C.M.D.S. and Thom, L.H. (2019), “Ten years of visualization of business process models: a systematic literature review”, Computer Standards & Interfaces, Vol. 66, 103347.

de Leoni, M., Suriadi, S., ter Hofstede, A.H.M. and van der Aalst, W.M.P. (2016), “Turning event logs into process movies: animating what has really happened”, Software and Systems Modeling, Vol. 15 No. 3, pp. 707-732.

Detro, S.P., Santos, E.A.P., Panetto, H., Loures, E.D.F.R., Lezoche, M. and Barra, C.M.C.M. (2020), “Applying process mining and semantic reasoning for process model customisation in healthcare”, Enterprise Information System, Vol. 14 No. 7, pp. 983-1009.

di Francescomarino, C., Ghidini, C., Maggi, F.M. and Milani, F. (2018), “Predictive process monitoring methods: which one suits me best?”, International Conference on Business Process Management, Springer, pp. 462-479.

Dumas, M., la Rosa, M., Mendling, J. and Reijers, H.A. (2018), Fundamentals of Business Process Management, 2nd ed., Springer Berlin, Heidelberg.

Erdogan, T.G. and Tarhan, A.K. (2022), “Multi-perspective process mining for emergency process”, Health Informatics Journal, Vol. 28 No. 1, 146045822210771.

Fahrenkrog-Petersen, S.A., Tax, N., Teinemaa, I., Dumas, M., de Leoni, M., Maggi, F.M. and Weidlich, M. (2022), “Fire now, fire later: alarm-based systems for prescriptive process monitoring”, Knowledge and Information Systems, Vol. 64 No. 2, pp. 559-587.

Gall, M., Wallner, G., Kriglstein, S. and Rinderle-Ma, S. (2015), “Differencegraph - a prom plugin for calculating and visualizing differences between processes”, BPM (Demos), CEUR-WS.org, pp. 65-69, Vol. 1418 of CEUR Workshop Proceedings.

Grisold, T., Mendling, J., Otto, M. and vom Brocke, J. (2021), “Adoption, use and management of process mining in practice”, Business Process Management Journal, Vol. 27 No. 2, pp. 369-387.

Gulden, J. (2016), “Visually comparing process dynamics with rhythm-eye views”, Business Process Management Workshops, pp. 474-485, Vol. 281 of Lecture Notes in Business Information Processing.

Hamm, M.J. (2014), Wireframing Essentials, Packt Publishing, Birmingham.

Harrell, M.C. and Bradley, M.A. (2009), “Data collection methods. Semi-structured interviews and focus groups”, Technical report, Rand National Defense Research Institute, Santa Monica.

Hevner, A.R., March, S.T., Park, J. and Ram, S. (2004), “Design science in information systems research”, MIS Quarterly, Vol. 28 No. 1, pp. 75-105.

Holtzblatt, K. and Jones, S. (1995), “Conducting and analyzing a contextual interview (excerpt)”, in Baecker, R.M., Grudin, J., Buxton, W.A.S. and Greenberg, S. (Eds), ‘Readings in Human–Computer Interaction’, Interactive Technologies, Morgan Kaufmann, pp. 241-253.

Kaouni, A., Theodoropoulou, G., Bousdekis, A., Voulodimos, A. and Miaoulis, G. (2021), “Visual analytics in process mining for supporting business process improvement”, NiDS, IOS Press, pp. 166-175, Vol. 338 of Frontiers in Artificial Intelligence and Applications.

Kedem-Yemini, S., Mamon, N.S. and Mashiah, G. (2018), “An analysis of cargo release services with process mining: a case study in a logistics company”, Proceedings of the International Conference on Industrial Engineering and Operations Management, pp. 726-736.

Khan, A., Le, H., Do, K., Tran, T., Ghose, A., Dam, H.K. and Sindhgatta, R. (2021), “Deepprocess: supporting business process execution using a mann-based recommender system”, ICSOC, Springer, Cham, pp. 19-33, Vol. 13121 of Lecture Notes in Computer Science.

Kouzari, E. and Stamelos, I. (2018), “Process mining applied on library information systems: a case study”, Library & Information Science Research, Vol. 40 No. 3, pp. 245-254.

Kubrak, K., Milani, F., Nolte, A. and Dumas, M. (2022), “Prescriptive process monitoring: Quo vadis?”, PeerJ Computer Science, Vol. 8, e1097.

Lashkevich, K. (2020), “Business process improvement opportunities: a framework to support business process redesign”, Master’s thesis, University of Tartu.

López-Pintado, O., Halenok, I. and Dumas, M. (2022), “Prosimos: discovering and simulating business processes with differentiated resources”, (EDOC) Workshops, Lecture Notes in Business Information Processing, Springer, Cham, Vol. 466, pp. 346-352.

Low, W.Z., van der Aalst, W.M.P., ter Hofstede, A.H.M., Wynn, M.T. and Weerdt, J.D. (2017), “Change visualisation: analysing the resource and timing differences between two event logs”, Information Systems, Vol. 65, pp. 106-123.

Loyola-González, O. (2022), “Process mining: software comparison, trends, and challenges”, International Journal of Data Science and Analytics, pp. 1-14.

Mahendrawathi, E.R., Astuti, H.M. and Nastiti, A. (2015), “Analysis of customer fulfilment with process mining: a case study in a telecommunication company”, Procedia Computer Science, Vol. 72, pp. 588-596.

Marques, R., da Silva, M.M. and Ferreira, D.R. (2018), “Assessing agile software development processes with process mining: a case study”, CBI (1), IEEE Computer Society, pp. 109-118.

Martin, N., Fischer, D.A., Kerpedzhiev, G.D., Goel, K., Leemans, S.J.J., Röglinger, M., van der Aalst, W.M.P., Dumas, M., la Rosa, M. and Wynn, M.T. (2021), “Opportunities and challenges for process mining in organizations: results of a delphi study”, Business & Information Systems Engineering, Vol. 63 No. 5, pp. 511-527.

Milani, F., Lashkevich, K., Maggi, F.M. and di Francescomarino, C. (2022), “Process mining: a guide for practitioners”, RCIS, Springer, pp. 265-282, Vol. 446 of Lecture Notes in Business Information Processing.

Munzner, T. (2014), Visualization Analysis and Design, A.K. Peters Visualization Series, A K Peters, CRC Press, Boca Raton, FL.

Pika, A., Wynn, M.T., Fidge, C.J., ter Hofstede, A.H.M., Leyer, M. and van der Aalst, W.M.P. (2014), “An extensible framework for analysing resource behaviour using event logs”, CAiSE, Springer, pp. 564-579, Vol. 8484 of Lecture Notes in Computer Science.

Pini, A., Brown, R. and Wynn, M.T. (2015), “Process visualization techniques for multi-perspective process comparisons”, AP-BPM, Springer, pp. 183-197, Vol. 219 of Lecture Notes in Business Information Processing.

Shneiderman, B. (1996), “The eyes have it: a task by data type taxonomy for information visualizations”, VL, IEEE Computer Society, pp. 336-343.

Sirgmets, M., Milani, F., Nolte, A. and Pungas, T. (2018), “Designing process diagrams - a framework for making design choices when visualizing process mining outputs”, OTM Conferences (1), Springer, pp. 463-480, Vol. 11229 of Lecture Notes in Computer Science.

Stefanini, A., Aloini, D., Benevento, E., Dulmin, R. and Mininno, V. (2020), “A process mining methodology for modeling unstructured processes”, Knowledge and Process Management, Vol. 27 No. 4, pp. 294-310.

Vázquez-Barreiros, B., Chapela, D., Mucientes, M., Lama, M. and Berea, D. (2016), “Process mining in IT service management: a case study”, ATAED@Petri Nets/ACSD, CEUR-WS.org, pp. 16-30, Vol. 1592 of CEUR Workshop Proceedings.

van der Aalst, W.M.P. (2016), Process Mining - Data Science in Action, 2nd ed., Springer-Verlag, Heidelberg.

Wynn, M.T., Poppe, E., Xu, J., ter Hofstede, A.H.M., Brown, R., Pini, A. and van der Aalst, W.M.P. (2017), “Processprofiler3d: a visualisation framework for log-based process performance comparison”, Decision Support System, Vol. 100, pp. 93-108.

Zerbato, F., Soffer, P. and Weber, B. (2022), “Process mining practices: evidence from interviews”, BPM, Springer, pp. 268-285, Vol. 13420 of Lecture Notes in Computer Science.

Acknowledgements

This research is supported by the Estonian Research Council (PRG1226) and the European Research Council (PIX Project).

Corresponding author

Kateryna Kubrak can be contacted at: kateryna.kubrak@ut.ee
