Monday, February 08, 2010

Simulating business process chaos

If the flow of work in a troubled business process is on the edge of chaos (the analogy is a butterfly flapping its wings in one part of the process upsetting something seemingly remote), rather than following a nice bell curve of work and workers, how do you work out how to make your process better? This is the question a business process improvement discussion (BP Group on LinkedIn) has recently arrived at.

Most people would like to believe that mathematics and statistics can model their workflows, so that changes can be tried out in software to predict whether they would be good, better or terrible for the real process. Some of the common software simulation tools (components of BPM suites or standalone products) offer exactly this capability: take a known process and model how changes will improve it. You are in a good position if you already have a known process with known input patterns and statistically smooth, predictable workloads. But if you are still trying to escape process chaos, you need to work hard with experience and common sense to make the process better, repeatable and measurable before software simulation will offer value rather than more chaos.

When asked my opinion of whether a process is suitable for simulation, I answer the question with a question (or three):

1) is there a high enough volume of work for the inputs to be fairly represented by linear functions for workload?

2) are the actions performed by individuals and teams of workers predictable and repeatable enough in the time and effort required to be represented statistically (see the sketch just after these questions)?

3) is the process complex enough to require a computed model of the workloads, or, if it is simple, are you as likely to get a similar result from common sense?
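
To make the first two questions concrete, here is the kind of rough sanity check I have in mind, rather than anything a simulation vendor provides. It assumes a hypothetical work_items.csv with item_id, arrived_at and handling_minutes columns (the file and column names are made up for illustration) and simply asks how lumpy the daily volume and handling times really are.

```python
# Rough check for questions 1 and 2: is the arrival volume high and smooth
# enough, and are handling times regular enough, to be summarised statistically?
# Assumes a hypothetical CSV with one row per completed work item:
#   item_id, arrived_at (ISO date-time), handling_minutes
import csv
from collections import Counter
from datetime import datetime
from statistics import mean, pstdev

arrivals_per_day = Counter()
handling_times = []

with open("work_items.csv", newline="") as f:
    for row in csv.DictReader(f):
        day = datetime.fromisoformat(row["arrived_at"]).date()
        arrivals_per_day[day] += 1
        handling_times.append(float(row["handling_minutes"]))

daily = list(arrivals_per_day.values())
cv_arrivals = pstdev(daily) / mean(daily)                    # variability of daily volume
cv_handling = pstdev(handling_times) / mean(handling_times)  # variability of effort per item

print(f"average items/day: {mean(daily):.1f} (CV {cv_arrivals:.2f})")
print(f"average handling:  {mean(handling_times):.1f} min (CV {cv_handling:.2f})")
```

A very rough rule of thumb: if either coefficient of variation is well above 1, the smooth curves a simulator would fit to your inputs are already a fiction.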

Note, I'm way out of my depth on the actual functioning of the statistics applied (OK, rusty, not completely sunk). My fear about applying software simulation to many processes is this: at what point does the fact that work items are discrete, individual items of work start to affect the process calculations, rather than the nice curves and lines that the stats functions feed into the system? To use a common analogy, it's like driving on the highway: normally you could model the flow of cars with fairly simple flow functions, but when there are too few cars on the road the flow is dominated by individual drivers' actions, and when there are too many cars the flow stutters and stops intermittently before bursting back up to 50 mph for 20 seconds. I don't think the software simulation tools can account for this type of discrete or over-capacity modeling, which is often where the benefit would come from (really, don't you have bigger problems in your organization to focus on than the process that is running well?).
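
To show what I mean by discrete and over-capacity behaviour, here is a toy sketch of my own (not how any particular BPM suite works): a single worker handling discrete items that arrive at random. The numbers are invented, but the pattern is the point.

```python
# Toy illustration of the highway analogy: one worker handling discrete work
# items that arrive individually and irregularly. Queues and waits appear long
# before the "smooth" arithmetic says the worker has run out of capacity.
import random

def average_wait(arrival_rate, service_time, n_items=10_000, seed=1):
    random.seed(seed)
    clock, free_at, total_wait = 0.0, 0.0, 0.0
    for _ in range(n_items):
        clock += random.expovariate(arrival_rate)  # next item arrives
        start = max(clock, free_at)                # it waits if the worker is busy
        total_wait += start - clock
        free_at = start + service_time             # worker is tied up until then
    return total_wait / n_items

# A worker who needs 10 minutes per item can "in theory" handle 6 per hour.
for per_hour in (3, 5, 5.7, 5.9):
    wait = average_wait(arrival_rate=per_hour / 60, service_time=10)
    print(f"{per_hour} items/hour -> average wait {wait:.0f} minutes")
```

At half load the wait is a few minutes; at 95-98% of the theoretical capacity it runs to hours, even though the averages alone say everything should still flow.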

The approach that seems smart, but may not be viable in reality, is to trial a change with a representative portion of the process: a group of individuals performing a black-box section of the process differently from the way it is performed today, allowing a more or less true comparison of the improvement without jumping in with both feet. You commit resources to doing this, of course, and in physically constrained processes (such as a production line) it may not be possible at all. In some processes, though, this investment may be the only option.
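
If you do run such a trial, the comparison itself can stay simple. A minimal sketch, with placeholder numbers standing in for whatever measure you actually collect (here, end-to-end days per work item):

```python
# Compare a pilot group running the changed process against everyone else
# running the current one. The numbers below are placeholders, not real data.
from statistics import mean, stdev

pilot    = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7]   # days per item, changed process
baseline = [6.2, 5.8, 7.1, 6.6, 5.9, 6.4]   # days per item, current process

print(f"pilot:    {mean(pilot):.1f} days (sd {stdev(pilot):.1f})")
print(f"baseline: {mean(baseline):.1f} days (sd {stdev(baseline):.1f})")
print(f"apparent saving: {mean(baseline) - mean(pilot):.1f} days per item")
```

With a handful of items this is a directional reading rather than statistical proof, which is really the point of a pilot over a full simulation.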

For many use cases outside of those with masses of people and masses of predictable work (call centers, for example), it seems that the models required would be so specific to the process that you would need a custom development exercise just to model it, and that model would be invalidated the moment you fixed the next piece of the process.

For many businesses, what is needed now is a whiteboard, a person experienced in the current process (with an open mind), a person experienced in process change (often a consultant external to the business unit or company), and some way of measuring where work goes and how long it takes to get completed. Understanding your processes and making them 80% better is essential before reaching for the tools that will allow you a statistically correct improvement of another 7.3275%.
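
And the "some way of measuring" part can be as modest as a hand-off log. A minimal sketch, assuming a hypothetical handoffs.csv with item_id, step and entered_at columns, that shows where work actually spends its time:

```python
# Work out how long items sit at each step of the process, given one row per
# hand-off in a hypothetical handoffs.csv: item_id, step, entered_at (ISO date-time).
import csv
from collections import defaultdict
from datetime import datetime

events = defaultdict(list)                      # item_id -> [(time, step), ...]
with open("handoffs.csv", newline="") as f:
    for row in csv.DictReader(f):
        events[row["item_id"]].append(
            (datetime.fromisoformat(row["entered_at"]), row["step"]))

time_in_step = defaultdict(list)
for history in events.values():
    history.sort()                              # order each item's hand-offs by time
    for (t0, step), (t1, _) in zip(history, history[1:]):
        time_in_step[step].append((t1 - t0).total_seconds() / 3600)

# Print the steps, slowest first.
for step, hours in sorted(time_in_step.items(),
                          key=lambda kv: -(sum(kv[1]) / len(kv[1]))):
    print(f"{step:<20} avg {sum(hours) / len(hours):5.1f} h over {len(hours)} items")
```

The steps at the top of that list are usually where the whiteboard session should start.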


A post from the Improving It blog

To implement workflow and process automation in your business today, visit www.consected.com


1 comment:

Unknown said...

Brian, thanks for that. I will take a look at the book when I get a moment.

Thanks,
Phil