Zarządzanie i Finanse / Journal of Management and Finance Vol. 12, No. 3/2/2014

Bartosz Marcinkowski*
Bartłomiej Gawin**

Enterprise Processes Simulated – Prerequisites, Data Acquisition Methods and Tool Support

Introduction

Computer simulations of business processes apply both to newly created processes, introduced by analysts in response to the needs of a new business or a particular project, and to processes that are already in operation in commercial environments. In both cases, simulation aims to evaluate process behavior under the selected organization's business conditions on the basis of digitalized process models elaborated with visual notations such as BPMN [Object Management Group, 2013], eEPC [Scheer, 1992], IDEF3 [Mayer et al., 1995], the UML Profile for Business Modeling [Johnston, 2004] or BPMS [Karagiannis, Junginger and Strobl, 1996], enhanced with quantifiable data and subjected to appropriate algorithms. Depending on the data set feeding a computer model of the process, it may be possible to apply simulation algorithms that enable, inter alia, verification of the logical model's accuracy, estimation of the process's time requirements and costs, establishment of staffing needs, identification of bottlenecks, as well as calculation of the loads on the resources with which the company intends to carry out the process. If the results of the computer simulations clearly support the need for the implementation of modifications and innovations within the process, then the analyst, with access to the model's parameters, may attempt to execute them virtually in real time. Additional options include re-shaping business processes or changing resource allocation to process components in an opportunistic fashion. After restarting the simulation, the impact of the implemented changes on the overall results is discernible.
By testing different variations of the process models, even prior to completion of the computer simulation stage, a solution may be developed that meets the organization's business requirements.

* PhD, Business Informatics Department, Faculty of Management, University of Gdańsk, Piaskowa 9, Sopot
** PhD, Business Informatics Department, Faculty of Management, University of Gdańsk

As a result, the optimized model makes a strong basis for the workflow
system. Adapting this new process definition to the workflow engine leads to the organization being run in accordance with a developed, validated computer model. The procedure described is part of the management framework used in business for the control and continuous improvement of processes and products, which has been formalized by E.C. Deming and W. Shewhart as the PDCA method [Moen and Norman, 2009]. Its name originates from the first letters of the words Plan, Do, Check and Act. The PDCA method may be used as a basis for practical solutions, typically developed by software vendors, supporting complex business process management. Lastly, the PDCA cycle has been enriched with a business process simulation (BPS) stage. The stage is usually supported by IT simulation tools and has become an important part of the evaluation of (re)designed business processes [Suzuki et al., 2013]. Computer simulation of business processes using dedicated tools is among the most efficient ways to reduce companies' costs. Testing newly designed processes on live organizations without prior simulation and verification is likely to fail and lead to significant financial losses. The most popular tools for business process modeling and simulation on the Polish market include ARIS, ADONIS and iGrafx. Some companies, e.g. Wipro [Srivastava, 2010], Qwest [Teubner, 2008] and Toyota Motor Company [Hauser and Paper, 2007], provide case studies of projects involving simulation tools, and a large number of these projects have proved successful. The case studies illustrate that numerous simulation iterations led to the elaboration of highly useful workflow models, and the simulation outcomes achieved allowed for the development of an effective strategy for implementing business processes.
It is worth emphasizing that, regardless of the success or failure of attempts to simulate processes, the challenge of obtaining data for the simulation algorithms is always one of the major obstacles encountered by process optimization teams. The execution of a computer simulation requires the provision of a set of data, information and models that are not easily acquired and, additionally, should be complete and reliable. It is the quality of the quantifiable data input for the simulator that determines whether or not the results provide a solid basis for implementing processes in real environments. This paper seeks to identify, systemize and elaborate implementable techniques of data acquisition that enable the execution of process simulation for the purposes of improving existing (as well as creating new) patterns of workflows in enterprises. The first section provides an overview of Workflow Management Systems architecture and discusses data collection methods using a variety of IT/IS sources. One of the exploration methods, workflow mining, is described in greater detail as an innovative method of collecting data for the purpose of conducting computer-based simulations of business processes. The section also outlines the types of algorithms in the ADONIS simulation engine and the data inputs necessary for running them. The second section provides an overview of traditional data collection methods that rely on conducting observations of business process flows as well as interviews with staff involved in the supervision and execution of activities within processes. The article concludes by summarizing the main findings and making recommendations for selecting data sources for simulation-oriented process models.

1. Data acquisition from information systems

Workflow Management Systems (commonly known by the acronym WfMS) facilitate the management of workflow, information and documentation in business organizations. Business process models elaborated by analysts can be used (once supplemented with additional information) as execution models for workflow systems, which ensure the allocation of tasks to certain resources (humans, machines) in a defined sequence. Workflow systems operate in a client-server architecture, where the client side is usually a web browser (rarely a separate application) that can be operated from an employee's computer or terminal. WfMS server software is usually called a business process engine; it can interpret process definitions transposed from graphical models to computer code. Based on the process definition, workflow engines execute process models as flows of forms and documents that contain information regarding the tasks for employees.
The aforementioned forms, called Transaction Sets, are managed in accordance with a defined route, which is a standard feature of every workflow system. For example, once an employee performs tasks from the list, the process engine routes the task to the next participant in the process, whose computer screen will display information regarding the assigned task. In commercial use, dedicated business processes (such as collecting information about a bank's clients) are run on the engine numerous times, but based upon a single process definition. Workflow engines can support many business processes, and each of the processes can have a separate model allocated.
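The routing behavior described above can be sketched in a few lines; the process definition, activity names and performer roles below are purely illustrative assumptions, not an excerpt from any real WfMS.

```python
# Minimal sketch of a workflow engine routing work along a process
# definition. All identifiers are illustrative assumptions.

PROCESS_DEFINITION = {
    "collect_client_data": "verify_documents",
    "verify_documents": "approve_application",
    "approve_application": None,  # end of the process
}

ASSIGNMENTS = {
    "collect_client_data": "clerk",
    "verify_documents": "back_office",
    "approve_application": "manager",
}

def complete_task(current_task):
    """Return the next (task, performer) pair, or None when the route ends."""
    next_task = PROCESS_DEFINITION[current_task]
    if next_task is None:
        return None
    return next_task, ASSIGNMENTS[next_task]

print(complete_task("collect_client_data"))  # ('verify_documents', 'back_office')
```

Running the same definition many times yields many independent instances, which is exactly what the engine records in its database.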
The resources involved in business task execution are accountable for completing assigned tasks, which, in accordance with the process model, flow through the structure of the company. Each event that involves executing a task and assigning workflow to the consecutive organizational units, until the process is completed, is recorded in WfMS databases as a business process instance (see figure 1). Each business process instance can be differentiated by a unique identification number called InitialProcessInstanceID [Workflow Management Coalition, 1998].

Figure 1. Common Workflow Management System architecture
Source: Own elaboration.

Running the same process numerous times means that workflow systems sequentially record process instances in the database. Each process instance is described by attributes whose values reflect the real execution of events. Descriptive data includes, for example: the IDs of users involved in process execution, the duration of individual activities, waiting times for event completion, IDs of external applications invoked, as well as condition sets assigned to decision points that affect the further flow of the process instance. Process instances, recorded over the years in the event log, contain a huge amount of workflow data. Exploration of the data, also known as Knowledge Discovery in Databases, is a multi-step process that involves the transformation of raw data from the event log into actionable knowledge about the organization. In addition to providing an actual projection of the company's operations, the data may also be used to feed simulation-oriented computer process models [WfMC, 2013]. The mapping of actual processes into parameterized computer process models is the purpose of model simulation (fed with real-world data).
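As a minimal illustration of how such instance attributes can be turned into simulation inputs, the sketch below derives the mean execution time of an activity from a hypothetical event log; the column layout and figures are assumptions, not a real WfMC audit-data format.

```python
from datetime import datetime

# Hypothetical event-log rows: (instance ID, activity, start, end, user).
event_log = [
    ("P-001", "Register request", "2014-03-01 09:00", "2014-03-01 09:20", "u17"),
    ("P-001", "Approve request",  "2014-03-01 10:05", "2014-03-01 10:15", "u02"),
    ("P-002", "Register request", "2014-03-01 09:30", "2014-03-01 10:00", "u17"),
]

def mean_duration_minutes(log, activity):
    """Average duration (minutes) of one activity across all instances."""
    fmt = "%Y-%m-%d %H:%M"
    durations = [
        (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 60
        for _, act, start, end, _ in log
        if act == activity
    ]
    return sum(durations) / len(durations)

print(mean_duration_minutes(event_log, "Register request"))  # 25.0
```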
By performing numerous simulations and adjusting the parameters of the model, a set of implementable process variants can be developed. It is also possible to estimate workloads and identify bottlenecks and deadlocks. The aforementioned actions lead to the development of new business process definitions that, after being implemented within the structure of the company, will improve its operation. The procedure for the incremental re-design of business processes begins with data mining and, through the simulation of business processes, leads to the development of new models for the WfMS, as illustrated in figure 2.

Figure 2. Incremental re-design of business process models
Source: [Gawin, 2011].

The line of action illustrated in figure 2 is a recurrent cycle which, like the Deming wheel, provides for the continuous monitoring and improvement of business processes in companies. Data processing and analysis is the key stage of the procedure. The stage addresses both the issue of preparing the data for the computer process simulation model and the acquisition of knowledge for analyzing the current business process situation. At this stage, it is worth using an innovative method of data processing: process mining. While process mining uses conventional IT mechanisms such as SQL, its algorithms for discovering knowledge on business processes treat processes multi-dimensionally and provide data detailed enough for computer model simulation.

1.1. Database exploration

The primary, and most common, way to extract data from electronic repositories is using SQL (Structured Query Language). Other methods, such as OLAP (On-Line Analytical Processing), BI (Business Intelligence), Data Mining and Process Mining, have additional mechanisms that go beyond exploration and include direct data analysis features. Rankings, reports and statistics created using the aforementioned methods are used to monitor business processes in the organization, and any modern workflow system has built-in modules for process observation and analysis. Process monitoring tools feature graphical user interfaces that allow users to create complex database queries without any knowledge of SQL. Many basic reports are predefined by WfMS administrators. More sophisticated analytical methods, such as Business Intelligence systems, are not just tools for exploring databases (first-generation BIs) and data warehouses (second-generation BIs). Contemporary BI applications feature decision-making mechanisms (third-generation BIs), which help employees manage processes without advanced knowledge of analytical techniques. Many researchers predict the development of future generations of BI systems that will eventually automate the entire decision-making process based on the compilation of business rules, leading to the automatic implementation of business process improvements. As public sources contain extensive knowledge on BI systems, OLAP as well as Data Mining, it is appropriate to examine a less popular, but highly effective, method of data collection for the analysis and simulation of processes: process mining algorithms. The process mining method involves discovering knowledge about business processes, which are, by nature, dynamic phenomena.
Business process analysis is reasonable provided that observations refer to attribute values not at specific moments, but over the long term. The discovery process involves the transformation of raw data into useful information, which is used at a later stage to improve the systems and processes that generate the data. The most valuable knowledge on organizations involves the hidden patterns, rules, trends and correlations in data structures, which are formed automatically over a long time period during data archiving in database systems. Discovery of these non-trivial relationships between attributes provides unique insight into the operation of the company, inaccessible using less refined methods of assessing business processes [van der Aalst, 2007].
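A minimal sketch of the SQL-based exploration mentioned above, run against an in-memory stand-in for a WfMS database (the table and column names are assumptions):

```python
import sqlite3

# In-memory stand-in for a WfMS repository; schema is an illustrative assumption.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE process_instance (
    instance_id TEXT, activity TEXT, duration_min REAL)""")
conn.executemany(
    "INSERT INTO process_instance VALUES (?, ?, ?)",
    [("P-001", "Register", 20.0), ("P-002", "Register", 30.0),
     ("P-001", "Approve", 10.0)],
)

# An aggregate query of the kind underlying basic process-monitoring reports.
rows = conn.execute("""
    SELECT activity, COUNT(*), AVG(duration_min)
    FROM process_instance
    GROUP BY activity
    ORDER BY activity
""").fetchall()
print(rows)  # [('Approve', 1, 10.0), ('Register', 2, 25.0)]
```

Queries of this shape answer "how often and how long" questions, but, as argued above, they do not by themselves reveal the multi-dimensional structure of the process.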
Discovering knowledge on business processes is achieved using algorithms that are the subject of research at many scientific centers around the world. One of the most recognized practical applications of process discovery methods is based on algorithms developed by a research team from the Eindhoven University of Technology in the Netherlands. The concept is called workflow mining. Workflow mining is a set of procedures based on mathematical knowledge and designed for the purpose of browsing the instance registry of business processes. For practical reasons, workflow mining algorithms have been implemented as Java plug-ins, which can be run in the ProM (Process Miner) environment. This computing environment does not incur licensing costs and is compatible with different operating systems. Workflow mining algorithms allow detailed analysis to be carried out taking into account four perspectives, each dedicated to the discovery of different aspects of process-related knowledge. Anticipated activities within individual perspectives include:
1. Control Flow Perspective: involves the discovery of workflow maps that reflect possible paths of process execution in the company.
2. Organizational Perspective: reflects the company's organizational structure, estimates tasks carried out by employees, explores paths of cooperation between employees and identifies the degree of participants' involvement in process tasks.
3. Case-Related Information Perspective: identifies all possible execution paths of a selected business process and consequently groups them into patterns. It establishes the most and least popular paths selected by participants within the process, as well as calculates process metrics, i.e. Key Performance Indicators (such as the number of running instances in a given period of time, the frequency of generating instances, activity timing parameters, probability values at decision points etc.).
4. Conformance Checking Perspective: supports comparative analysis of the theoretical business process model against the registry of real-world events. In other words, it provides verification of actual process execution in comparison to its definition within the business process model. Additionally, this perspective provides visualization of business process models identified within the Control Flow Perspective in order to determine the model's degree of complexity in terms of surplus or duplicated activities etc., as well as to assess the maturity of the discovered model, which depends on the execution of instances not provided for in the process definition.
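The Control Flow Perspective can be illustrated with a heavily simplified sketch: instead of the actual ProM plug-ins, a bare directly-follows relation is mined from a few example traces (the trace data is an assumption).

```python
from collections import Counter

# Traces: the ordered activities of individual process instances (illustrative).
traces = [
    ["A", "B", "C", "D"],
    ["A", "C", "B", "D"],
    ["A", "B", "C", "D"],
]

def directly_follows(traces):
    """Count how often activity x is immediately followed by activity y."""
    relation = Counter()
    for trace in traces:
        for x, y in zip(trace, trace[1:]):
            relation[(x, y)] += 1
    return relation

df = directly_follows(traces)
print(df[("A", "B")], df[("B", "C")])  # 2 2
```

Real discovery algorithms go much further (handling concurrency, loops and noise), but counting directly-follows pairs is the raw material from which workflow maps are built.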
1.2. Input datasets for process simulation

Business process analysis with workflow mining generates results which, in conjunction with a narrative description of the process situation in the company, can be used to construct computer simulation-oriented process models. Simulation-oriented models may be run in a dedicated IT simulation system. Workflow mining algorithms provide many types of results, such as the values of individual attributes within the process, the probability of following different paths or a model of the company's organizational structure. All types of feeds for the computer simulation-oriented process model are captured in Table 1 and referred to as inputs to the simulator. Parameterized computer models of processes allow for manual intervention in the inputs to the simulator throughout a series of simulations. This leads to the improvement/optimization of business processes and to the processes being tailored to current business conditions. ADONIS, provided by the BOC Group, is one of the commercially available simulation tools for business processes. The tool includes the following types of simulations [BOC Group, 2012]:
1. Times and costs: involves verification of the overall time values and costs linked to a business process. The estimation is carried out in a number of cycles (1000 by default), and the result is averaged over one cycle. The algorithm is based on a Monte Carlo method, which involves a pseudorandom number generator for modeling complex processes.
2. Accounting analysis: involves determining the average times and costs of a business process in a per-activity approach. The algorithm examines all possible paths within a business process and calculates their probabilities. Based on the provided activity times and costs, the algorithm calculates provisional time and cost results for each process component as well as for the entire process. The final results refer to all cycles.
3. Path analysis: estimates the times, costs and probability of each possible path's occurrence. Individual paths are described textually and also graphically, as highlighted on the model. Additionally, results may be ordered in many custom ways (from the most probable to the least probable, from the most expensive to the cheapest etc.).
4. Capacity analysis: estimates the demand for the staff required to execute a specified number of process runs in a given period of time (such as the number of instances per week, month or year), as well as enables the observation of workloads within the current organizational structure.
5. Workload analysis (steady state view): this algorithm generates instances of relevant processes based on so-called process calendars, registering the total duration of instance executions on a per-instance basis. Workload analysis allows for the identification of bottlenecks within the process as well as for the calculation of waiting times in the task execution queue. The algorithm estimates workers' utilization as the ratio of the execution time of all the activities performed by individual participants to the total duration of all runs of the process.
6. Workload analysis (fixed time period): workload simulation is performed over a predefined, fixed period of time. The algorithm registers how many times the process can be invoked within a specified time window, identifying bottlenecks and calculating employee utilization.

Table 1 summarizes the input datasets required to carry out the aforementioned types of simulations. The suggested method of collecting values for the individual parameters is specified in each case; the upper part of the table lists the inputs that can be extracted from the workflow system database, while the lower part contains the data to be collected using other, non-IT techniques for gathering information about processes.

Table 1. Data inputs for various simulation / analysis algorithms

Inputs extractable from the workflow system database:
– Business process model: required by all of the algorithms; recommended collection technique: process mining (Control Flow Perspective).
– Working environment model: required by selected algorithms; recommended collection technique: process mining (Organizational Perspective).
– Activity execution times: required by all of the algorithms; recommended collection technique: process mining (Case-Related Information Perspective).
– Frequency of occurrence of activity per unit of time: required by selected algorithms; recommended collection technique: process mining (Case-Related Information Perspective).
– Probabilities of initiating individual outflows of decision points: required by all of the algorithms; recommended collection technique: process mining (Case-Related Information Perspective).

Inputs collected with non-IT techniques (recommended collection technique: interview / observation):
– Cost indicators (per activity): optional for all of the algorithms.
– Cost indicators (per performer): optional for selected algorithms.
– Number of working hours per day: required by selected algorithms.
– Number of working days per year: required by selected algorithms.
– Process instance initiation calendars: required by selected algorithms.
– Performers' calendars: required by selected algorithms.
– Availability of staff: required by selected algorithms.

Source: Own elaboration.
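The Monte Carlo approach behind the times-and-costs algorithm can be sketched as follows; the process structure, branch probabilities, times and costs below are illustrative assumptions rather than ADONIS internals.

```python
import random

random.seed(42)  # deterministic pseudorandom generator, as in Monte Carlo runs

# Hypothetical process: one fixed activity followed by a decision point with
# two outflows; times in minutes, costs in EUR (illustrative figures only).
def simulate_cycle():
    time, cost = 15.0, 20.0            # "Register request"
    if random.random() < 0.7:          # decision point, P(standard path) = 0.7
        time += 30.0; cost += 40.0     # "Standard handling"
    else:
        time += 90.0; cost += 120.0    # "Escalated handling"
    return time, cost

CYCLES = 1000                          # the ADONIS default number of cycles
results = [simulate_cycle() for _ in range(CYCLES)]
avg_time = sum(t for t, _ in results) / CYCLES
avg_cost = sum(c for _, c in results) / CYCLES
print(round(avg_time, 1), round(avg_cost, 1))
```

With the assumed probabilities, the averaged cycle converges towards roughly 63 minutes and 84 EUR; which is exactly the kind of per-cycle estimate the times-and-costs simulation reports.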
Initiating the most appropriate simulation algorithm requires not only the collection of a complete set of inputs, but also the setting of certain technical constraints. These conditions impact the simulation results and determine their interpretation. For example, increasing the number of instances within the path analysis increases the accuracy of the results. However, improving the accuracy increases the CPU load on the computer the simulation is performed on. It is, therefore, essential to select the most appropriate parameters for the simulation algorithm in order to run the algorithm properly and provide actionable results. Please note that all the cost-related inputs specified in Table 1 are optional: while an algorithm will run correctly without applying these values to the simulation model, the results will not include valuable information on the overall process costs. Simulation results for each of the algorithms may be presented in various ways. For example, path analysis provides all possible combinations of activity sets that can be invoked within the process model. The interpretation of results can be carried out in terms of the duration of each path, its occurrence probability or the path costs (as a total of individual activities' costs). Business analysts may implement various ordering and filtering features that facilitate the analysis of business results.

2. Traditional methods of data collection

2.1. Observation

Observation is arguably the most flexible means of data collection for the purposes of simulation and subsequent enhancement of business processes. It enables business analysts to access information that is not provided by any class of information systems used in business organizations. One important advantage of this technique is the elimination of intermediate links in the process of data collection, which might otherwise increase the probability of misinterpretation.
Business analysts using this technique avoid actions that might cause changes in the behavior of the observed object. Observation as a tool of process-related data gathering may be used in a variety of ways. The most relevant categorizations of the technique include:
– participant observation, which assumes the observer's interference in the business process being analyzed or taking part in its execution, and non-participant observation, involving the collection of data from the perspective of the business process surroundings;
– controlled observations, carried out on the basis of pre-defined archiving and recording mechanisms, and non-controlled observations, involving a spontaneous, opportunistic approach;
– overt observations, in which participants of the business process are fully aware of being observed, and covert observations, aimed at eliminating the impact an observer may have on the process, thereby capturing the most genuine process flow.

The recommended combination of observation procedures which facilitates the effective enhancement of robustness and detail level in elaborated process models is presented in figure 3.

Figure 3. Combination of observation procedures for data acquisition
Source: [Gawin and Marcinkowski, 2013].

In the context of gathering quantitative time- and cost-related data, and calculating the probability of specific alternatives occurring in the process flow, the primary failing of observation as a research technique, i.e. subjectivity, is substantially reduced. The observer perceives the universe of discourse through the prism of his/her stereotypes, recording what a stereotype confirms while omitting what it contradicts; ipso facto, stereotypes not only simplify but also distort our perceptions [Sztumski, 2010]. Coefficients stored in order to run simulations are neutral in nature and leave no significant margin for misinterpretation or bias, i.e. that which may result from the individual, subjective attitude of the business analyst responsible for carrying out observations.
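A sketch of how such neutrally recorded coefficients might be aggregated into simulation inputs, with a rough uncertainty bound that makes the cost of a small observation sample visible (all figures are illustrative assumptions):

```python
import math
import statistics

# Waiting times (minutes) recorded during a limited series of observation
# runs; purely illustrative figures.
waiting_times = [12.0, 8.0, 15.0, 11.0, 9.0, 14.0, 10.0, 13.0]

mean = statistics.mean(waiting_times)
sem = statistics.stdev(waiting_times) / math.sqrt(len(waiting_times))

# A rough 95% interval (normal approximation) shows how few observations
# widen the uncertainty around a single simulation input.
low, high = mean - 1.96 * sem, mean + 1.96 * sem
print(round(mean, 2), round(low, 2), round(high, 2))
```

With only eight runs, the interval around the mean stays wide; the sample-size concern discussed below follows directly from this arithmetic.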
On the other hand, the number of instances of a business process that need to be traced in order to obtain reliable quantitative indicators is an issue when carrying out observations. Should a business analyst utilize corporate IT solutions, indicators might be calculated on the basis of results from hundreds, thousands or even greater numbers of business process runs. That is not the case when using traditional methods: analysts do not have the luxury of performing an unlimited number of observations, thus the number of documented business process runs is naturally limited. Adopting too restrictive constraints as part of the observation schedule may adversely affect the overall quality of certain indicator values, such as the waiting time for processing individual activities.

2.2. Interview

Interviews are a practical alternative to observation as a method of collecting data without the use of IT support. They involve approaching respondents with more or less formal questions within a particular issue area. Interviews boil down to the reciprocal flow of information and may be carried out using different procedures. The most popular interview types include [see Chodubski, 1996]:
– phone or online interviews, usually based on pre-designed questionnaire surveys, and face-to-face ones;
– structured interviews, carried out strictly according to a pre-prepared template, semi-structured interviews, involving addressing a particular set of problems in any form and order, as well as unstructured ones, conducted in an informal and maximally flexible manner;
– individual interviews, in each case involving a single respondent, group interviews, aimed at efficiently acquiring more reliable data through the mutual correction and supplementation of contributions, as well as panels, which explore the immutability of responses during a series of interviews.
Documenting interviews may involve the use of audio-visual recordings or traditional note-taking in various forms. It is vital to precisely capture and clearly classify responses, as well as to record additional descriptive attributes, such as the date the report was prepared and the name of, or comments by, the business analyst conducting the interviews. The recommended template for documenting the simulation parameters of individual activities, given the nature of the data collected, comes in a tabular form (figure 4).
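Such a tabular template can be mirrored in a simple record structure; the field names below are assumptions echoing typical simulation inputs, not the exact layout of figure 4.

```python
from dataclasses import dataclass

# One row of an interview-based parameter sheet; field names are assumed
# stand-ins for typical simulation inputs and descriptive attributes.
@dataclass
class ActivityParameters:
    activity: str
    performer: str
    min_time_min: float
    avg_time_min: float
    max_time_min: float
    cost_per_execution: float
    analyst: str
    interview_date: str

row = ActivityParameters(
    activity="Verify documents",
    performer="Back office clerk",
    min_time_min=5.0, avg_time_min=12.0, max_time_min=30.0,
    cost_per_execution=18.0,
    analyst="B. Analyst", interview_date="2014-05-12",
)
print(row.avg_time_min)  # 12.0
```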
Figure 4. Template for documenting simulation parameters of individual activities
Source: [Gawin and Marcinkowski, 2013].

When collecting data for the purposes of conducting simulations of a business entity's previously documented processes, business analysts are able to accurately identify the individuals who perform specific activities. Respondents in the vast majority of cases possess knowledge, acquired while running a number of instances of the business process, which greatly improves the reliability of the variables calculated. Often, more experienced interview participants are able to accurately describe the circumstances of potential deviations from the typical process instances and also suggest potential areas for optimization.

Conclusion

Business organizations' competitiveness is determined to a large extent by monitoring processes and carrying out simulations and optimization in a rapidly changing business environment. Process management is based on decisions about the need to change and proposals for improving
processes which, prior to implementation in business practice, should be simulated within dedicated environments. An in-depth analysis of task sequences and their mapping in the form of a computer model allow for the development of alternative process definitions. Such definitions may find future use in the form of new, improved standard workflows in the enterprise. In IT-rich enterprises, alternative definitions are adapted within WfMS environments which, in the context of process management, enable the observation and re-registration of instances in the database for future analysis and improvement. The use of Workflow Management Systems to feed business process models with quantifiable cost- and time-related data seems to be universally acknowledged as the most convenient way of providing a simulation-oriented layer to business models. In many cases, workflow systems allow for the creation of workflows on an ad-hoc basis, necessary in circumstances when a non-standard, custom decision regarding the ongoing flow of the process is required. Should a process instance registered in the database not coincide with the basic process definition (model), it is an indication that an ad-hoc decision was made during instance execution. Sometimes employees deliberately attempt to alter the flow of the process to make it easier to perform, avoid uncomfortable activities, or simply hide certain actions. Provided that the workflow system is designed in such a way that it backs up both instances carried out in accordance with the model, as well as those created by employees on an ad-hoc basis, the data collected is useful both to feed the process simulators and, equally, for reviewing the actual operation of the company. This feature may prove important in the ongoing quest to develop improved and more efficient business solutions.
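Detecting such ad-hoc instances amounts to a simple conformance check of recorded traces against the process definition; a minimal sketch under an illustrative model follows (the allowed transitions are assumptions, not a real WfMS export).

```python
# Allowed directly-follows transitions derived from a process definition
# (illustrative model).
ALLOWED = {("Register", "Verify"), ("Verify", "Approve"), ("Verify", "Reject")}

def conforms(trace):
    """True if every consecutive step of the trace follows the model."""
    return all((x, y) in ALLOWED for x, y in zip(trace, trace[1:]))

print(conforms(["Register", "Verify", "Approve"]))  # True
print(conforms(["Register", "Approve"]))            # False (ad-hoc shortcut)
```

Instances flagged as non-conforming are exactly the ad-hoc executions worth separating out before feeding the simulator.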
While conducting analyses and seeking ways to improve business processes, it should be borne in mind, however, that workflow systems, which are reliable sources of useful data, do not usually record the performance of such activities as phone calls, sending e-mails, meetings or business trips. Given that these activities have the potential to impact significantly on task performance, ever-increasing numbers of initiatives linked to registering each employee's activity in dedicated systems and databases have emerged. As a result, knowledge on the actual execution of business processes in business organizations is significantly expanding. There are limits, however, as to what can be automated, not only in a typical business entity, but equally within innovative, highly effective corporations equipped with modern IT tools. Ipso facto, teams set up with business process improvement in mind have to rely on traditional methods of detailing process models, among which interviews and observations play a critical role. Although such methods are considerably more time-consuming (and in practice expensive as well), they play an essential role in complementing quantitative knowledge harvested from IT sources.

References
1. BOC Group (2012), Adonis 5.0. Part 2: User Manual.
2. Chodubski A. (1996), Introduction to Political Science Research (in Polish), Gdansk University Press.
3. Gawin B. (2011), Analysis of the Digital Exchange of Information regarding Workflows within a Telecom Company (PhD thesis), University of Gdańsk.
4. Gawin B., Marcinkowski B. (2013), Simulating Business Processes. BPMS and BPMN Standards Applied (in Polish), Helion.
5. Hauser K., Paper D. (2007), Simulation of Business Re-Engineering Processes: Case Study of a United States Motor Manufacturing Company, International Journal of Management, Vol.
6. Johnston S. (2004), Rational UML Profile for Business Modeling, accessed
7. Karagiannis D., Junginger S., Strobl R. (1996), Introduction to Business Process Management System Concepts, in: Scholz-Reiter B., Stickel E. (eds.), Business Process Modelling, Springer-Verlag.
8. Mayer R.J., Menzel C.P., Painter M.K., de Witte P.S., Blinn T., Perakath B. (1995), Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report, pdf/idef3_fn.pdf, accessed
9. Moen R., Norman C. (2009), Evolution of the PDCA Cycle, com/files/na01moennormanfullpaper.pdf, accessed
10. Object Management Group (2013), Business Process Model and Notation (BPMN). Version 2.0.1, accessed
11. Scheer A.W. (1992), Architecture of Integrated Information Systems, Springer-Verlag.
12. Srivastava S. (2010), Process Modeling & Simulation. Wipro Technologies, -modeling-simulation.pdf, accessed
13. Suzuki Y., Yahyaei M., Jin Y., Koyama H., Kang G. (2012), Simulation Based Process Design: Modeling and Applications, Advanced Engineering Informatics, Vol.
14. Sztumski J. (2010), Introduction to Methods and Techniques of Social Research. 7th Edition (in Polish), Silesia Press.
15. Teubner C., McNabb K., Levitt D. (2008), Case Study: Qwest Uses Process Simulation To Move At The Speed Of Business Change, Forrester Research.
16. van der Aalst W.M.P. (2007), BPM and Workflow Analysis, bptrends.com/publicationfiles/04-07-art-bpmandworkflowanalysis-vanderaalst-final.pdf, accessed
17. Workflow Management Coalition (1998), Audit Data Specification. Document Number WfMC-TC, Version 1.1, accessed
18. Workflow Management Coalition (2013), Business Process Simulation Specification. Document Number WFMC-BPSWG, Version 1.0, accessed
Enterprise Processes Simulated: Prerequisites, Data Acquisition Methods and Tool Support (Summary)
Widely used methods for designing and optimizing business processes confirm that in numerous enterprises, analysts and managers are not aware of the feasibility of the solutions provided by computer simulations of workflows. It proves to be a challenge to satisfactorily answer the question of why the design of business processes is often reduced to creating maps and process models exclusively. The lack of adequate data for supplying process definitions is one of the main obstacles to the widespread adoption of process simulation methods, even when a company possesses adequate tools to simulate the processes. This paper seeks to identify, systemize and elaborate implementable techniques of data acquisition that enable the execution of process simulations for the purposes of improving existing (as well as creating new) patterns of workflows in enterprises. Both information system-related and traditional data acquisition methods are discussed. The authors overview the input datasets required to carry out simulation algorithms, map these datasets to suggested data acquisition methods, and propose best practices regarding the combination of observation procedures for data acquisition with the documentation of the simulation parameters of individual activities.
Keywords: business process, simulation, data acquisition, Workflow Management System
Bartłomiej Gawin, Department of Business Informatics, University of Gdańsk, Piaskowa 9 Street, Sopot, Poland, e-mail: firstname.lastname@example.org