Efficient transaction processing, lower operational cost, minimal risk and high customer satisfaction are the primary objectives of most organizations today. To accomplish this and manage their enterprise operations, they leverage the power of various information systems such as ERP and CRM systems, web applications, cloud-based solutions and many more.
Together these form a complex network of systems deployed to integrate the various business functions and streamline the business processes. The workflow, being the heart of these systems, manages the processes to be followed for different operations. Be it a user entering an order, a product shipped from a warehouse or a bank acknowledging a customer’s payment, everything is processed as per the workflow and business rules.
The data stored in these operational systems is then used to generate various operational reports and later transformed into business intelligence dashboards for strategic decisions. However, in this whole process of business automation, there is also an accumulation of logs generated from the different components of these systems, and these logs can likewise be analyzed.
Why analyze these logs?
The insights from logs can provide valuable information that might help streamline operational processes and impact revenue. Organizations can evaluate what has happened in the past and what is happening in the present, which helps them formulate strategies for the future. For example, transaction logs from an Order Capture form, logs from RFID devices used in inventory and clickstream data from an online store can provide various insights to improve an Order-to-Cash flow. These logs can also show whether the business process is being followed as specified.
Let’s look at an organization where a new enterprise system is implemented to organize and manage business processes. Whenever a process changes or a new system is implemented, some people are resistant to adopting the change. Some users continue to work as they did earlier and enter data into the system later in the day. Monitoring this manually is a cumbersome task: discovering such patterns requires viewing every instance of the process and visualizing what is going wrong. Process analytics can help here, discovering the problems associated with the processes so that corrective actions can be taken.
Yet another illustration:
Let’s look from another perspective, at an organization that wants to optimize its operations. The traditional approach is a standard interview-style discussion with business users, whose answers form the baseline, an ‘As-Is’ process, for process improvement. However, most of them describe an ideal process rather than what they actually do.
This is just the tip of the iceberg. Process deviations and anomalies of a much larger magnitude are possible in organizations. Process analytics tools and methodologies can help here, discovering and revealing the process that is actually being practiced.
Business process analytics is the methodology to monitor, discover and analyze business processes. It combines process mining with the real-time analytics capabilities we have today. With it, one can detect process deviations and bottlenecks, and also generate inferences for recommendations and predictions. The data from databases, as well as the logs, provide the essential raw material for building a process analytics solution.
Methodology and application:
Data Collection: Data is obtained from event logs and transactional databases. Logs are typically generated from certain events and are widely known as “Event logs”. A typical event log has the three attributes below, which are essential to reconstruct the process flow from the data.
| Attribute | Purpose |
| --- | --- |
| Transaction ID/Case ID | Identifies a particular case |
| Timestamp | Orders events into a sequence |
| Activity | The task accomplished in that event instance |
Apart from these three attributes, additional attributes can be used depending on the data and the objective at hand.
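As a minimal sketch of how the three essential attributes reconstruct a process flow, the snippet below groups a small event log by case ID and sorts each case by timestamp to recover its trace. The case IDs, timestamps and activity names are hypothetical, chosen only for illustration.

```python
from collections import defaultdict

# A hypothetical event log: each event carries the three essential attributes.
event_log = [
    {"case_id": "C1", "timestamp": "2023-01-01T09:00", "activity": "Create Order"},
    {"case_id": "C2", "timestamp": "2023-01-01T09:02", "activity": "Create Order"},
    {"case_id": "C1", "timestamp": "2023-01-01T09:05", "activity": "Check Stock"},
    {"case_id": "C2", "timestamp": "2023-01-01T09:10", "activity": "Check Stock"},
    {"case_id": "C1", "timestamp": "2023-01-01T09:30", "activity": "Ship Goods"},
]

def traces_from_log(log):
    """Group events by case ID, then sort by timestamp to rebuild each trace."""
    cases = defaultdict(list)
    for event in log:
        cases[event["case_id"]].append(event)
    return {
        case: [e["activity"] for e in sorted(events, key=lambda e: e["timestamp"])]
        for case, events in cases.items()
    }

print(traces_from_log(event_log))
```

In practice the log would come from a database or a file export, but the reconstruction step is the same: partition by case, order by time.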
Process Mining and Discovery: The current actual process is reconstructed from the event logs. The α-algorithm (van der Aalst et al., 2002) is used in process mining to reconstruct the workflow: it takes the event logs as input and calculates the ordering relations of the actual events. Like most mathematical algorithms, the α-algorithm has certain assumptions, namely that the event log is complete and free of noise. This can be taken care of at the data pre-processing stage.
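The ordering-relation step of the α-algorithm can be sketched as follows. This is only the first stage of the algorithm (deriving directly-follows, causality and parallelism relations from traces), not the full net construction, and the traces used are hypothetical.

```python
def ordering_relations(traces):
    """Derive the basic alpha-algorithm relations from a list of traces:
    directly-follows (a > b), causality (a -> b) and parallelism (a || b)."""
    follows = set()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            follows.add((a, b))
    # a -> b when a is directly followed by b but never the reverse
    causal = {(a, b) for (a, b) in follows if (b, a) not in follows}
    # a || b when both orders occur in the log
    parallel = {(a, b) for (a, b) in follows if (b, a) in follows}
    return follows, causal, parallel

traces = [
    ["Create Order", "Check Stock", "Ship Goods"],
    ["Create Order", "Check Stock", "Cancel Order"],
]
follows, causal, parallel = ordering_relations(traces)
print(sorted(causal))
```

From these relations the α-algorithm then builds the places and transitions of the discovered workflow net.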
Process Conformance: The reconstructed flow chart reveals how frequently each path through the events was followed. It can be used either to create the actual ‘As-Is’ business flow, if one is not available, or to compare against an already available desired process flow. Major deviations from the flow can then be scrutinized.
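A simple way to sketch this comparison is to encode the desired process as a set of allowed transitions and flag any trace that uses a transition outside it. This is a toy check, not a full conformance technique such as token replay, and the model and traces are hypothetical.

```python
def conformance_check(traces, allowed):
    """Flag traces that contain a transition absent from the desired model."""
    deviations = {}
    for case, trace in traces.items():
        bad = [(a, b) for a, b in zip(trace, trace[1:]) if (a, b) not in allowed]
        if bad:
            deviations[case] = bad
    return deviations

# Hypothetical 'To-Be' model expressed as allowed directly-follows transitions.
allowed = {("Create Order", "Check Stock"), ("Check Stock", "Ship Goods")}
traces = {
    "C1": ["Create Order", "Check Stock", "Ship Goods"],
    "C2": ["Create Order", "Ship Goods"],  # skipped the stock check
}
print(conformance_check(traces, allowed))
# C2 deviates: ('Create Order', 'Ship Goods') is not in the desired model
```

Each flagged case points the analyst at exactly which step deviated, which is where scrutiny should start.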
There are many process discovery and mining tools, such as ProM, which provides a framework for process mining algorithms. ProM is an open-source platform that is easy to use and to extend with process mining algorithms.
Real-Time Visibility: Big data and stream processing applications (such as Apache Kafka, Apache Spark and the Hadoop ecosystem) have made it possible to integrate, preprocess and analyze data in near real time as events occur. This includes, but is not limited to, web access logs, clickstream data, transaction data and logs from enterprise applications.
Creating a framework using these technologies would help organizations conduct compliance monitoring of their business processes and prevent anomalies or fraud.
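The streaming side of such a framework can be sketched in plain Python: a monitor keeps the last activity seen per case and raises an alert when an incoming event violates an "allowed next activity" model. In a real deployment the events would arrive from a broker such as Apache Kafka rather than an in-memory list, and the model below is a hypothetical example.

```python
# Hypothetical compliance model: which activity may follow which.
ALLOWED_NEXT = {
    "Create Order": {"Check Stock"},
    "Check Stock": {"Ship Goods", "Cancel Order"},
}

def monitor(stream):
    """Check each incoming event against the last activity seen for its case."""
    last_seen = {}
    alerts = []
    for event in stream:
        case, activity = event["case_id"], event["activity"]
        prev = last_seen.get(case)
        if prev is not None and activity not in ALLOWED_NEXT.get(prev, set()):
            alerts.append((case, prev, activity))
        last_seen[case] = activity
    return alerts

stream = [
    {"case_id": "C1", "activity": "Create Order"},
    {"case_id": "C1", "activity": "Ship Goods"},  # skipped the stock check
]
print(monitor(stream))
```

Because the check runs per event, the same logic drops naturally into a stream-processing consumer, turning conformance checking from a batch analysis into live compliance monitoring.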
“An organization’s ability to learn, and translate that learning into action rapidly, is the ultimate competitive advantage.”- Jack Welch
Business processes are the pillars of an organization. They can be managed effectively through business process discovery and monitoring, using business process analytics tools and techniques.