In this blog we explore what task mining really is, the key benefits it can deliver, and the lessons we have learned from our practical experience.

Many of us understand the value of leveraging process mining to provide large-scale process insights. However, as firms scale their process mining capability, they're faced with the challenge of limited insight into the dark data: the ad hoc tasks, infrequently used systems and less structured activities that occur during a process. The market needs something that can fill the data gaps to enable empirical, end-to-end process insights. To overcome this we use task mining.

Task mining is the deployment of AI-powered software to a desktop to capture every action, application, mouse click and keystroke that a user executes across a process. These actions are then intelligently sequenced together to create a fully transparent, end-to-end process analysis. Task (or desktop) mining technology has been evolving rapidly to provide insight into improvement opportunities for the parts of a process that sit outside major IT systems. The technology isn't new, but it is being increasingly adopted because of the high-value insights it can deliver.
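As a simple conceptual illustration of this capture and sequencing, the sketch below takes a handful of hypothetical desktop events (the users, applications and actions are all invented) and orders them into a per-user task sequence. Real task mining tools perform this capture and stitching automatically, and at far greater granularity.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical desktop events, similar in shape to what a task mining agent might capture.
events = [
    {"user": "agent_01", "time": "2024-05-01T09:02:11", "app": "Outlook", "action": "open_email"},
    {"user": "agent_01", "time": "2024-05-01T09:03:45", "app": "Excel",   "action": "copy_cell"},
    {"user": "agent_01", "time": "2024-05-01T09:04:02", "app": "SAP GUI", "action": "paste_field"},
    {"user": "agent_02", "time": "2024-05-01T09:01:30", "app": "Chrome",  "action": "open_portal"},
]

# Group events by user and order them by timestamp to reconstruct each task sequence.
sequences = defaultdict(list)
for event in sorted(events, key=lambda e: datetime.fromisoformat(e["time"])):
    sequences[event["user"]].append(f'{event["app"]}:{event["action"]}')

for user, steps in sequences.items():
    print(user, " -> ".join(steps))
```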

Joining process and task mining analysis enables a transparent end-to-end process view. At Deloitte we have a market-leading Digital Discovery™ product that joins the two and provides data-driven insights and solutions.

To demonstrate how task mining can uncover value in even the most complex of processes, throughout this blog we'll also look at an example process (Order to Cash, or OTC) to better understand how an end-to-end deployment could look. This is a typically complex process, especially at a regional or global level, where many different variations and deviations occur.

Figure 1 – A typical task mining deployment lifecycle


What are the benefits of task mining?

  • Enhanced end-to-end process insights

Because task mining is deployed directly to an agent's desktop, it captures a granular level of detail across a process. By recording every action an agent performs, it paints a transparent picture of what really happens.

In order to unlock the highest level of end-to-end process insight, it is hugely beneficial to deploy task mining and process mining together. Combining the two tools allows both 'front end' and 'back end' analysis; it unlocks the lion's share of a process by taking a vast array of historical data points from key systems, whilst supplementing this with real-time user activities to highlight any smaller, unseen applications or process deviations.

Linking this to the OTC example, task mining will shine a light on the vast number of process nuances typically expected: different processes by country, different document formats for POs and even different variations of processing an order based on factors such as shipping method.
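As a rough sketch of what a combined view might look like, the example below stitches a hypothetical back-end event log (the kind process mining extracts from core systems) together with hypothetical desktop events for the same orders. The order IDs, activities and timestamps are invented for illustration, and in practice linking desktop activity to a case reference requires careful matching.

```python
import pandas as pd

# Illustrative back-end event log, as process mining might extract it from an ERP system.
system_log = pd.DataFrame({
    "order_id":  ["ORD-100", "ORD-100", "ORD-101"],
    "activity":  ["Order created", "Invoice posted", "Order created"],
    "timestamp": pd.to_datetime(["2024-05-01 09:00", "2024-05-01 11:30", "2024-05-01 09:10"]),
    "source":    "system",
})

# Illustrative front-end desktop events captured by task mining for the same orders.
desktop_log = pd.DataFrame({
    "order_id":  ["ORD-100", "ORD-101"],
    "activity":  ["PO re-keyed from PDF", "Shipping method checked in portal"],
    "timestamp": pd.to_datetime(["2024-05-01 09:05", "2024-05-01 09:20"]),
    "source":    "desktop",
})

# Stitch the two views together per order to get one end-to-end sequence of activities.
combined = (
    pd.concat([system_log, desktop_log])
      .sort_values(["order_id", "timestamp"])
      .reset_index(drop=True)
)
print(combined)
```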

  • Task mining supports automation initiatives

Across the market, a key driver for task mining is to support automation discovery, which is often subjective, costly and time-consuming. To combat this, leading task mining vendors such as NICE and UiPath now include software development features that significantly expedite the time to automation. Within an analysis, these tools automatically produce sections of the required code (within their respective development environments) for the target process, based on the detailed steps they have captured. The tool then intelligently converts system data, agent activity and other key process information to pre-create parts of the solution. Whilst this functionality won't create an end-to-end, production-ready solution, it significantly reduces the overall effort and time required to automate a process.

Three important considerations from our first-hand experience

1) Ensure relevant governance approvals are identified and actioned early. 

Any software that records user activity needs to be carefully governed, and task mining is no exception. Governance and approval processes vary across industries and use cases, but some initial considerations should include:

  • HR approvals: Ensure that an HR representative has given approval for the employee to have their actions monitored, and that these approvals are stored to provide an audit trail. Be vigilant of differing HR regulations across countries (even if it's the same process); firms that get this wrong tend to make headlines for the wrong reasons.
  • GDPR approvals: By the very nature of capturing desktop data, personal information (PI) is likely to be present. Some key actions to mitigate this are: (1) assess records for any PI data; (2) agree the controls to be put in place to mitigate PI data risks; (3) seek data owner and data protection officer approval.

In the OTC example, a good first step would be to map out all the stakeholders involved in the process, both colleagues and suppliers, understand the relevant regional regulations and laws, and then seek approval from each appropriate stakeholder. Store these approvals for audit purposes to demonstrate that GDPR is being adhered to.

  • Data security configurations: It is best practice to work with the IT function to implement data security controls when configuring and packaging the software; examples include hashing data and blacklisting or whitelisting systems and fields to protect user and customer information (see the sketch below).
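To make the idea concrete, here is a minimal sketch of how such controls could work in principle. The application whitelist, field blacklist and event structure are all hypothetical; in practice these controls are set within the task mining vendor's own configuration rather than in custom code.

```python
import hashlib

# Hypothetical configuration: which applications to capture and which fields to hash.
APP_WHITELIST = {"SAP GUI", "Outlook", "Chrome"}      # only capture these applications
FIELD_BLACKLIST = {"customer_name", "iban", "email"}  # never store these fields in clear text

def protect(event):
    """Drop events from non-whitelisted apps and hash any blacklisted fields."""
    if event.get("app") not in APP_WHITELIST:
        return None  # unknown or blacklisted application: discard the event entirely
    cleaned = {}
    for key, value in event.items():
        if key in FIELD_BLACKLIST:
            # One-way hash so values can still be grouped and counted, but not read.
            cleaned[key] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        else:
            cleaned[key] = value
    return cleaned

print(protect({"app": "SAP GUI", "action": "paste_field", "customer_name": "Jane Doe"}))
print(protect({"app": "Teams", "action": "send_message"}))  # dropped: not on the whitelist
```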

2) Provide timely & clear employee communications 

In the early stages of a task mining deployment, ensuring that your employees understand, appreciate and 'buy in' to the project is vital. They will often have concerns and questions about the purpose and application of the tool, for example 'will it assess me?'. To bring employees onside, communications should be sent well before the deployment occurs, at least one to two weeks in advance. The communications should address all facets of the deployment, such as timelines, objectives, hypotheses and the technology used, so that employees have an appreciation of the wider implementation. Whilst there is often excitement and enthusiasm, there is also a natural scepticism, especially given the software will be deployed onto their desktops. Some key messages that need to be communicated clearly and early on are:

  • The tool is non-intrusive; it won't alter anything in day-to-day activities
  • Sensitive data can be masked, hashed and blacklisted, meaning no personal or sensitive information will be seen (subject to configurations)
  • Colleagues won’t be individually assessed on their job performance; their inputs feed into wider analysis

Assuring agents of these key messages up front builds trust, allows for faster deployments and ultimately delivers more accurate and valuable process insights.

3) ‘Bookend’ deployments with SME review sessions

During an implementation, it is best practice to ‘bookend’ a task mining deployment with SME reviews. This will:

  • Provide a process benchmark 

An SME walkthrough gives a benchmark view of the happy path, setting out what is expected within the process, and creates a platform for non-conformance and process deviation analysis by comparing the expected happy path against the actual insights generated (a minimal sketch of this comparison follows at the end of this section).

  • Test application feasibility 

It is crucial to test application feasibility early in the deployment, and holding a process walkthrough gives a clear view of the key applications used. This way, when configuring the tool, technical considerations and application requirements, such as Citrix details, can be identified and actioned early on. Whilst the walkthrough isn't expected to surface every application in the process (that is what task mining is intended to highlight), understanding the key expected applications will help expedite a deployment.

  • Identify data fields

Understanding what data will be captured is paramount to a successful deployment and should be established as early as possible. Only by holding initial sessions can appropriate actions be taken to protect sensitive information (such as blacklisting applications, hashing names or blanking data values). A process walkthrough with the SME should act as the platform from which key data security configurations are made, and it provides the opportunity to test the implications of each configuration with the SME, ensuring that protecting sensitive fields doesn't result in losing data needed for the analysis.

Linking back to the OTC example: different products need different details to process, and even the same supplier can have numerous PO formats containing different data. Putting the time in up front with an SME (e.g. walking through PO formats and their unique subsequent process steps) and documenting this will significantly increase both the security of the data handled by the task mining tool and the quality of the insights gained.
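To illustrate how the SME-agreed happy path can drive the non-conformance analysis referenced above, here is a minimal sketch that compares an invented OTC happy path against a couple of hypothetical observed order sequences and flags missing or extra steps. Real conformance checking in process and task mining tools also accounts for step ordering, timing and frequency.

```python
# Happy path agreed with the SME in the walkthrough (hypothetical OTC steps).
happy_path = ["Receive PO", "Validate PO", "Create sales order", "Ship goods", "Issue invoice"]

# Observed sequences per order, e.g. reconstructed from combined process and task mining data.
observed = {
    "ORD-100": ["Receive PO", "Validate PO", "Create sales order", "Ship goods", "Issue invoice"],
    "ORD-101": ["Receive PO", "Re-key PO from PDF", "Create sales order", "Ship goods", "Issue invoice"],
}

for order_id, steps in observed.items():
    missing = [s for s in happy_path if s not in steps]   # expected steps that never happened
    extra = [s for s in steps if s not in happy_path]      # observed steps outside the happy path
    if missing or extra:
        print(f"{order_id}: deviates (missing: {missing}, extra: {extra})")
    else:
        print(f"{order_id}: conforms to the happy path")
```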

It’s exciting to see the traction that task mining is getting with our clients, we’ll keep you posted in future blogs on how this capability is evolving.

Want to use task mining in your organisation? 

If you're ready to bring task mining into your business, please contact our leads: