Media businesses have grown very complex in recent years. Where once there was a simple linear process of “get programme in, check it, play it out to air”, now it is considerably more challenging. There is much more content, going to many more outlets, each with its own set of commercial and technical rules.

For the vast majority of broadcasters and media companies, the workflows to do all of this have evolved. In truth, bits have been added as they have become necessary, with the result that often there is no coherent overall design.

That means that it is difficult to be absolutely sure that you are doing things in the most efficient way. Do you have too many resources – equipment lying idle, or staff sitting round waiting for something to happen – or do you have serious bottlenecks? What is the true cost of delivering a service, and what revenue is it earning? Where is further investment needed, and will it make a return?

Wikipedia defines process mining as “a family of techniques… to support the analysis of operational processes, with the goal to turn event data into insights and actions”. It is a well-established technique in the IT industry. At Three Media, we recognised its potential in our industry, as a way of analysing the workflows that have grown up, to determine how best to optimise them.

There are three defined stages of process mining: process discovery, conformance checking and performance analysis. We have developed a simple set of tools to address each of these, implemented within the existing XEN:Pipeline content broadcast management platform.

In our terms, process discovery means setting out the status quo. What steps do you need to get a piece of content through your system? What hardware does it pass through, and what operator steps are required? A new piece of content might be transcoded to the house standard, quality checked (automatic or manual), metadata enhanced (again, automatic and manual), stored, archived and backed up, collected for delivery, metadata transferred to playlists and online discovery engines, and transcoded for each platform.
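The discovered workflow can be thought of as a chain of automatic and operator steps, each with a nominal time allocation. As a minimal sketch (not the XEN:Pipeline data model — the step names and timings here are invented for illustration):

```python
# A minimal sketch of a discovered workflow as a chain of processing steps.
# Step names and timings are illustrative, not real XEN:Pipeline data.

from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    automatic: bool                      # True for machine steps, False for operator steps
    minutes: float                       # nominal time allocated per item
    next: list = field(default_factory=list)

# Chain the steps described above: ingest -> transcode -> QC -> metadata -> store
ingest = Step("ingest", True, 5)
transcode = Step("transcode to house standard", True, 20)
qc = Step("quality check", False, 15)
metadata = Step("metadata enhancement", False, 10)
store = Step("store and archive", True, 8)

ingest.next = [transcode]
transcode.next = [qc]
qc.next = [metadata]
metadata.next = [store]

def total_minutes(step):
    """Walk a linear chain and sum the nominal processing time."""
    total = 0.0
    while step:
        total += step.minutes
        step = step.next[0] if step.next else None
    return total

print(total_minutes(ingest))  # 58.0 minutes end to end
```

Even this toy model makes the point: once every step is captured with its resource and time allocation, the end-to-end picture falls out of the data.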

Simple graphical tools with drag and drop help you model this on screen. You can mine each point in the process: how much operator time is allocated for a task; what processor resources are allocated for transcoding or transfer; and so on. A simple block diagram on screen can thus cover analysis of potentially thousands, if not millions, of data points.

Step two is conformance checking. This uses real data from all the devices and people which have been modelled, to compare the event log with the design goal. In our implementation, you can click on any point in the workflow to see that module’s performance and use a wide range of dashboard tools to give a comprehensive view of overall performance.

Conformance checking exists to show where there are problems. You may have a business operational goal of moving content through a workflow or process within a target time, for instance, and the dashboard tells you that only 72% of content hits that target. That is the point at which you need to mine down into the data to see where the bottlenecks are occurring.
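At its core, that dashboard figure is a comparison of the event log against the target. A hedged sketch of the calculation, using made-up events rather than real log data (the 72% in the text is a hypothetical reading):

```python
# Illustrative conformance check: given (content_id, start, end) events,
# compute the share that completed within a target turnaround time.
# The event data here is invented to show the calculation, nothing more.

from datetime import datetime, timedelta

TARGET = timedelta(hours=4)

event_log = [
    ("clip-001", datetime(2024, 1, 1, 9, 0),  datetime(2024, 1, 1, 12, 30)),  # 3.5 h: hit
    ("clip-002", datetime(2024, 1, 1, 9, 15), datetime(2024, 1, 1, 14, 45)),  # 5.5 h: miss
    ("clip-003", datetime(2024, 1, 1, 10, 0), datetime(2024, 1, 1, 13, 0)),   # 3.0 h: hit
]

def conformance_rate(log, target):
    """Fraction of logged items whose turnaround met the target."""
    hits = sum(1 for _, start, end in log if end - start <= target)
    return hits / len(log)

rate = conformance_rate(event_log, TARGET)
print(f"{rate:.0%} of content hit the target")  # 67% for this sample
```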

Conversely, if the dashboard says you are hitting the target 99% of the time, but you are not making a profit, process mining will identify where you are over-resourced.

Armed with all this information, you move to performance analysis. The simulation model allows you to change any parameter and understand how it impacts the overall workflow, and to look at performance over any period of time. It calls for a lot of processing, but our optimised software can analyse a three-month window for a typical content processing workflow in around two minutes.
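The essence of such a “what if” is varying one parameter and re-running the model. As a toy example, assuming a backlog of transcode jobs and a variable number of parallel transcode slots (all figures invented; a real model would be driven by the mined event data):

```python
# Toy "what if" simulation: vary the number of parallel transcode slots
# and measure how long a backlog of jobs takes to clear. Job durations
# are invented for illustration.

import heapq

def makespan(job_minutes, slots):
    """Greedy schedule: each job goes to the earliest-free slot.
    Returns the time (minutes) at which the last job finishes."""
    finish = [0.0] * slots
    heapq.heapify(finish)
    for m in job_minutes:
        t = heapq.heappop(finish)     # earliest-free slot
        heapq.heappush(finish, t + m)
    return max(finish)

jobs = [30, 45, 20, 60, 25, 50, 40, 35]   # transcode durations in minutes

for slots in (1, 2, 4):
    print(f"{slots} slot(s): backlog clears in {makespan(jobs, slots):.0f} min")
```

Doubling the slots roughly halves the clearance time here, but with real data the model would also reveal where adding capacity stops paying off — the point at which the bottleneck moves elsewhere.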

That is because we have consciously chosen to make performance analysis the powerful tool that delivers the real value of the process. Users can adjust any parameter, from staff rotas to cloud SLAs. Artificial intelligence and machine learning built into the software drive reasoned decision-making.

The power of modelling comes in performing “what ifs” across multiple workflows, processes and teams. Metrics show throughput by client, supplier, file type, process, user and more, to help you and the AI optimise the system.

If performance is set against cost, you can see at a glance where investment in a pinch point will unlock the workflow, and where you currently have resources that could be released. Analysing by file type, for instance, might show that a particular format costs significantly more to ingest than normal, in which case the conclusion might be to stop suppliers delivering in that format, other than as an agreed exception.
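That file-type analysis is, in effect, a cost aggregation over the event log. A small sketch with hypothetical formats and per-ingest costs (all invented for illustration):

```python
# Hypothetical cost-per-ingest breakdown by delivery format, illustrating
# the kind of analysis described above. Formats and costs are invented.

from collections import defaultdict

ingests = [                      # (format, cost of that ingest)
    ("ProRes", 4.20), ("ProRes", 4.50),
    ("XDCAM", 1.10), ("XDCAM", 1.30),
    ("MXF-OP1a", 1.00),
]

totals = defaultdict(float)
counts = defaultdict(int)
for fmt, cost in ingests:
    totals[fmt] += cost
    counts[fmt] += 1

for fmt in totals:
    avg = totals[fmt] / counts[fmt]
    print(f"{fmt}: average ingest cost {avg:.2f}")
```

A result like this, at scale, is what would justify pushing a costly format back to suppliers as an exception-only route.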

Analysing across not just individual workflows but the wider enterprise supports business decisions. Modelling an SLA might show that you can actually offer your clients a better service within existing resources, which in turn would increase revenue faster than costs. Wherever a need sits on the performance-to-cost map, you know precisely what you can offer a client while remaining profitable on the transaction.

The extension of this is that you could model a workflow which does not exist. If you are a service provider – a playout centre, for example – and you are bidding for a new contract, the model will show exactly what resources are required to meet the SLAs in the tender.

You can then propose a figure that you know, with a very high degree of confidence, will allow you to meet the client’s expectations and deliver a profit. If the potential client says that someone has bid significantly less, you have the model to demonstrate to them why you are sure what it will take to service the contract.

Process mining may be a relatively unknown term within the media industry. But there is no chance that the media creation and delivery chain is going to get simpler. To stay competitive, businesses have to know how to generate efficiencies, to increase throughput, to launch new services.

Backed with an in-depth understanding of the industry, and powerful AI processing, process mining will become a critical tool for your competitive edge.

This article first appeared online with the IABM.