Probably no aspect of BPM has underperformed versus expectations more than simulation. It should be a valuable tool that is commonly used in the course of process analysis… but it’s just not. I’ve been thinking about why that is, and what it would take to make simulation useful in actual practice. It comes down to two basic things: better tools, and better-defined methodologies for deriving useful results from those tools. I haven’t tried every simulation tool out there, so there may be some that do what I ask already.
So what’s needed in a tool? To me, it goes without saying that the typical user of simulation tools is a business process analyst doing process redesign, not an implementation developer. Right off the bat, that implies several things that disqualify most of today’s tools:
1. Simulation should not be embedded in a BPMS. The results of simulation may be what motivates the purchase of a BPMS, but you should not have to buy an automation runtime just to quantify its expected benefits. In fact, I am not sure it should be tied to a particular modeling tool. I’d like a tool that can import any properly serialized BPMN 2.0 XML file in the analytic subclass. That means it should understand BPMN events, like the probability of a message arrival at a boundary event, and be able to follow error throw-catch. And, as you might expect, I’d like it to recognize my Method and Style conventions, such as matching subprocess end states with the gateway that immediately follows, so that simulated instances that hit the “OK” end state automatically follow the “yes” path out of the gateway labeled “OK?” (The tool I use in my training doesn’t do this, and it drives me crazy.)
2. Simulation should not require programming, not even JavaScript. Yes, you need expressions to set simulation parameters, so point-and-click expression builders are fine.
3. Simulation should address the kinds of structural improvements most common in process redesign, e.g. Lean principles, not just labor utilization efficiency in heads-down workflows. Almost all of the simulation tools I have looked at seem designed to solve the process bottleneck problem: how many widget-tweakers do we need at the widget-twiddling step to relieve the bottleneck? Optimizing that kind of heads-down work is, of course, a possible use of simulation, but it is rarely what process improvement folks are worried about. They are thinking about things like how much checking and rechecking is needed, and should it be up front or at the end of the chain? The tool needs to understand the difference between waiting time (or lead time), which does not consume the resource, and active time, which does. Rework is a key issue: if your model loops back, can the simulation apply different parameter values the second time around? (The first sketch after this list illustrates both points.) And what is the effect of assigning this task to role A vs role B or some outsourced service provider? There are time, cost, and quality tradeoffs… but most simulation tools cannot expose them.
4. Simulation should provide useful cost and quality metrics, in addition to time-based metrics. Time-based metrics come essentially for free with simulation, but cost and quality metrics, if you can get them at all, require processing the simulation output through Excel or database queries. If I have multiple scenarios that differ in terms of process structure and task assignment logic, I’d like to see not only differences in cycle time but differences in activity-based costing (including indirect costs) and differences in quality or risk metrics as well (see the second sketch after this list). The notion of process (and subprocess) end states plays a key role here, as it does generally in BPMN Method and Style. Here simulation methodology interacts with modeling methodology.
5. Simulation should actively assist the user with setting model parameters to match known aggregated metrics of the as-is process. One thing almost everyone can agree on about simulation is Garbage In, Garbage Out. So the first thing you MUST do is accurately model the as-is process and make sure that simulation matches known metrics in the aggregate. Knowing the outputs – e.g., 75% of loan applications of a certain type are approved, and it takes 3 days +/- 25% – does not mean it is easy for the user to set the branching ratios at every gateway, the probabilities at every event, or even the mean duration of each specific task. The tool should help you do that, at least as a starting point (the third sketch after this list shows one simple approach). Once you have a set of reasonable simulation parameters for an as-is model that matches known aggregate results, you need to justify any change to each parameter when you model the to-be.
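To make point 3 concrete, here is a minimal sketch of how a simulator might distinguish waiting time (queueing for a resource) from active time (consuming it), and apply different parameter values on a rework pass. It uses Python with the simpy discrete-event library; the reviewer capacity, durations, and rework probability are invented for illustration, not taken from any actual tool.

```python
import random
import simpy

REWORK_PROB = 0.30       # hypothetical: 30% of first-pass reviews loop back
FIRST_PASS_MEAN = 4.0    # mean hours of active work, first time through
REWORK_MEAN = 1.5        # rework is assumed faster the second time around

def handle_instance(env, reviewer, cycle_times):
    start = env.now
    passes = 0
    done = False
    while not done:
        passes += 1
        with reviewer.request() as req:
            yield req  # waiting time: in queue, does NOT consume the resource
            mean = FIRST_PASS_MEAN if passes == 1 else REWORK_MEAN
            yield env.timeout(random.expovariate(1.0 / mean))  # active time: consumes it
        # at most one rework loop here, with its own duration parameter
        done = passes > 1 or random.random() > REWORK_PROB
    cycle_times.append(env.now - start)

def arrivals(env, reviewer, cycle_times):
    while True:
        yield env.timeout(random.expovariate(0.5))  # ~1 arrival every 2 hours
        env.process(handle_instance(env, reviewer, cycle_times))

random.seed(42)
env = simpy.Environment()
reviewer = simpy.Resource(env, capacity=2)  # two reviewers share one queue
cycle_times = []
env.process(arrivals(env, reviewer, cycle_times))
env.run(until=2000)
print(f"{len(cycle_times)} instances, mean cycle time {sum(cycle_times) / len(cycle_times):.1f} h")
```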
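For point 4, the cost and end-state reporting I have in mind could be computed directly from the simulator’s instance logs, with no Excel detour. Another sketch; the labor rates, overhead allocation, end-state names, and record layout are all hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical per-instance records a simulator might emit:
# active hours by activity, plus the end state each instance reached.
instances = [
    {"activities": {"Review": 4.2, "Approve": 0.5}, "end_state": "approved"},
    {"activities": {"Review": 3.8, "Rework": 1.5, "Approve": 0.6}, "end_state": "approved"},
    {"activities": {"Review": 5.1}, "end_state": "rejected"},
]

HOURLY_RATE = {"Review": 60.0, "Rework": 60.0, "Approve": 90.0}  # assumed loaded rates
OVERHEAD_PER_INSTANCE = 12.0  # assumed indirect cost allocated per instance

cost_by_activity = defaultdict(float)
end_states = Counter()
for inst in instances:
    for activity, hours in inst["activities"].items():
        cost_by_activity[activity] += hours * HOURLY_RATE[activity]
    end_states[inst["end_state"]] += 1

total = sum(cost_by_activity.values()) + OVERHEAD_PER_INSTANCE * len(instances)
print("activity-based cost per instance:", round(total / len(instances), 2))
print("end-state mix:", {s: f"{n / len(instances):.0%}" for s, n in end_states.items()})
```

Run two scenarios through the same report and the cost and quality deltas fall out side by side.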
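And for point 5, calibration can begin as a simple guided search over the uncertain parameters until the simulated aggregates match the known ones – the 75% approval rate above, say. A toy sketch, with an invented one-parameter model standing in for a full simulation run:

```python
import random

def simulated_approval_rate(first_check_pass_prob, n=10_000):
    """Stand-in for a real simulation run: an application is approved if it
    passes the first check, or else passes a second-chance review (rate assumed)."""
    approved = sum(
        1 for _ in range(n)
        if random.random() < first_check_pass_prob or random.random() < 0.4
    )
    return approved / n

TARGET = 0.75  # known aggregate: 75% of applications are approved
random.seed(1)

# Sweep the uncertain branching parameter; keep the value whose
# simulated aggregate comes closest to the known one.
best = min((p / 100 for p in range(101)),
           key=lambda p: abs(simulated_approval_rate(p) - TARGET))
print(f"calibrated first-check pass probability: {best:.2f}")
```

The same idea scales to multiple parameters with a proper optimizer, but the principle is the point: the tool, not the user, should close the gap between guessed parameters and known aggregates.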
There is a lot more detail behind each of these, but it’s a good place to stop. I am working with a tool now in development that tries to do it right. I’ll be writing more about this topic as the tool gets better.
Hi Bruce,
I’m an Industrial Engineering master’s student who has recently become interested in BPM. The students in my lab routinely use advanced simulation software such as ExtendSim and Arena. When I recently took an MBA BPM course, I was a bit disappointed to discover that simulation would not be addressed at all. Similarly, after following numerous BPM blogs for the past year, this is the first article I’ve encountered that specifically addresses the issue.
My inexperienced perception is that simulation has been pushed to the back burner in many practitioners’ minds (maybe because it is so math intensive?). This seems a shame to me because many of the gains in process improvement can be made through simulation and optimization.
Of course many will argue that processes on the creative, human-centric end of the process spectrum can’t be optimized like a factory. But I think many people would be surprised at how predictable the majority of business processes can be when assigned appropriate probability distributions and close-enough parameters.
I’d like to know, to what extent do you think the principles of Operations Research are applicable to BPM? Also, which vendor(s) do you think provide the best simulation tools currently available?
Bruce,
Items #1 and #5 contradict each other, I’m afraid.
If simulation is embedded into a BPMS, then one can feed a simulation run with BPMS-collected statistics of attribute values, branches taken, etc.
Bruce,
As a consultant in the IBM FileNet space, I have seen only one job posting in the last 3-4 years that had a requirement for process simulation. Even process analytics are not used anywhere close to the extent that they could/should be by clients implementing BPM.
The first thing that comes to mind is that clients cannot model their business processes because they themselves do not have a good grasp of their own processes. Secondly, the pressure these days to deliver “something” does not allow for the time, money and resources to be spent on gathering the information needed to build and run simulation scenarios. It is all about meeting a usually unrealistic delivery date.
Lastly, a lot of business processes that clients are looking to automate are not of the “production” variety but more of the “author, review, approve, publish” type which will not benefit from simulation.
I would have phrased #1 slightly differently. Simulation within a BPMS is great (and doesn’t depend on things that aren’t there yet, like cross-platform BPMN 2.0 XML interchange), but you can get good simulation tools outside of a BPMS as well – so don’t just buy a BPMS to get simulation.
Richard,
I agree with your perception that one reason simulation isn’t used by BPM practitioners is it requires too much “math”, i.e. programming, Excel wizardry, SQL… to get useful results. Simple models don’t reflect the real world, and canned reports rarely give you the metrics you want. So there is a natural tension between usefulness and usability… something I’m struggling with in the new tool I mentioned. I don’t know a thing about Operations Research. It would be great if you could point to information on how that might be used.
–Bruce
Anatoly,
Good point on feedback of runtime values into the model. I agree that is best done with a BPMS, and it is an increasing point of emphasis by BPMS vendors. But that is after the fact of an automated (hopefully already-redesigned) process! What about processes that will not be automated in a BPMS? That’s probably 99% of process models. And even for those that will be automated, do you have to buy the BPMS before you do the redesign? You shouldn’t have to. That is the whole point.
–Bruce
Scott,
I understand your phrasing, but I’ll stand with what I said. BPMS vendors should expose simulation in a standalone modeling/analysis tool, not as a feature of the executable process designer. The latter has the benefit of feedback from runtime actuals, but… well, see my response to Anatoly on that. So, for example, TIBCO does it “right” on that score, but IBM’s move from WBM to Process Designer is the “wrong direction” for simulation (see my post on the Donut Hole). Bottom line: simulation should be a tool for business process analysts, not implementation designers.
–Bruce
I share David’s view that
… the pressure these days to deliver “something” does not allow for the time, money and resources to be spent on gathering the information needed to build and run simulation scenarios.
What-if analysis is never performed, and changes go directly to implementation while we cross our fingers and hope the improvements will have an effect. Maybe better tools will have an impact on how we practice simulation. We’re using ARIS for EPC/BPMN modeling.
Bruce –
I just want both 😉
Bruce,
You know more about OR than you think. I don’t use any online resources for OR because it’s all holed up in textbooks for me. But this resource guide from MIT OR looks pretty decent:
http://www.mit.edu/~orc/resources/index.html
Man, I wish the Khan Academy had some educational videos on this stuff, but they don’t (yet?).
Simulation and queuing theory seem to be the OR techniques most applicable to BPM, e.g. assigning distributions to task completion times or arrivals of people, documents, event occurrences, etc. Optimization could be useful as well, and its most common form is linear programming, e.g. “Minimize total time of process given these constraints of human labor hours available, etc.” (The toy queueing calculation below shows how quickly a distribution assumption turns into a waiting-time estimate.)
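For instance, here is a back-of-the-envelope M/M/1 queue calculation (single server, Poisson arrivals, exponential service; all numbers invented) showing how a distribution assumption turns directly into a waiting-time estimate:

```python
# Toy M/M/1 example: one reviewer, Poisson arrivals, exponential service times.
arrival_rate = 8.0   # applications per day (assumed)
service_rate = 10.0  # applications per day one reviewer can complete (assumed)

rho = arrival_rate / service_rate         # utilization = 0.80
wq = rho / (service_rate - arrival_rate)  # mean wait in queue = 0.40 days
print(f"utilization {rho:.0%}, average queue wait {wq:.2f} days")
# Push utilization toward 95% and the wait explodes -- the classic bottleneck insight.
```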
The reason I’m blabbering on about this is I’m worried that if not enough attention is paid to the mathematical modeling of reality, we are just building A solution to the problem, not the BEST solution to the problem. Just like the often-quoted Bill Gates statement:
“The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency.”
Personally, I’m a pretty mediocre mathematician. But I’m always willing to struggle with it because I know its immense potential to make things better.
Hi Bruce,
1. Gartner have defined simulation as a must-have feature to be declared a BPMS, so BPM vendors have stuck it in, even though customers won’t use it.
2. New customers can see the ROI in automating manual tasks, but cannot see the ROI in simulation.
3. The market for mature customers that have already automated their processes, built the BI around them, and now just need to optimise them is very small.
4. The odds of using simulation correctly are very low. Not just data issues (garbage in, garbage out), but also methodology (if you believe that dropping a task will improve the process, you will prove it). Being impartial is not a human characteristic.
5. The only added value a simulation engine would bring to a BPM project would be if it automatically generated a recommendation list for the manager.
It wouldn’t cover all the possibilities, but would add more value than it does today.
See BPM Simulation – missing a trick
Cheers,
Adam
Adam,
I pretty much agree with all your points, except to note that they come from a BPMS perspective, not a “process redesign” perspective, where automation is rarely the intended outcome. There is both a tools problem – the tools are almost universally geared to the BPMS “bottleneck” problem – and a methodology problem. BPMN needed a methodology, and I have been reasonably successful providing one. Maybe it’s foolish, but I’m now trying to figure one out for simulation as well.
And I agree with your last point. The tool should automatically recommend configurations that optimize selected metrics. Thanks for your comments.
–Bruce
Adam,
Thanks for your BPM Simulation blog link. Those articles are a good counterbalance to my normal mode of thinking at university. It’s refreshing to hear about the importance of practicality… something that is not often discussed by academics or intense analysts. Everything abides by the law of diminishing marginal utility, and business applications of complicated math techniques are surely no different.
Hi Bruce – trust you are well. As you probably know, I spent a lot of time researching the use of simulation in the 1990s … and I don’t think it has changed that much since then.
Firstly, I mostly agree with Adam … but would go further. Most simulation models don’t tell you much that you didn’t already know (assuming they are built with care).
Which brings me on to the real issue – most simulation models are deterministic in nature. That is, the model was constructed to prove a particular point of view (the modeler’s). If it doesn’t prove that point of view, then let’s tweak the input parameters in the model to make it seem right. Simulations tend to bury assumptions made by the modeler.
Then there is the amount of data that is needed to validate the model – to check that it reflects the real world. That tends to be expensive to collect (hence BPM Suite vendors bung it in there, because they can collect the data more easily). And just because Jim thinks simulation is the greatest thing since sliced bread (it demos well) doesn’t make it actually deliver the value.
And then there’s the problem of resources involved in multiple processes. How many people do you know that do just one thing repetitively, like a production line? Typically, resource A spends X% of their time on Process #1, Y% on P2, Z% on P3, etc. And tiny variations in the amount of resource availability tend to have a massive impact on the potential to predict bottlenecks etc. Which leads on to another issue… most simulation tools can only handle one process at a time. They do not look across all processes that a resource may be involved in and simulate them all in parallel (I am talking about most discrete event simulation tools, rather than systems thinking/dynamics style tools).
Finally, there is the issue that most “processes” are not actually one process (where everything is joined up in one chain of activities). I think we both know that this route leads to very complicated and brittle process architectures. A far better approach is normally to break the domain problem up into a set of cooperating processes that call/trigger each other. This is also synonymous with the Case Management pattern (Adaptive or Dynamic … who cares).
Having observed these health warnings, simulation can be very useful in scenarios where there are very large volumes and large numbers of workers involved in processes that do not change much. It can help identify optimal resource configurations and let managers assess the impact of changes. But even here the temptation is to make quick assumptions about the ability of workers to handle new work patterns (i.e. you need to check).
The only valid use of simulation is to test your assumptions (not bury them). If someone shows you a simulation model with a cash impact … it’s invariably invalid (about the same as me giving my bank manager a spreadsheet that shows I will be a multi-millionaire by year 3 of the business plan).
All,
It is interesting and illuminating to read these comments as the Product Director at Lanner, where we provide not only a leading simulation workbench in our WITNESS product, but also embed our simulation engine in some of the world’s leading BPMS offerings.
I have to agree, though, that many BPMS users are simply unaware of what they may have at their fingertips, as Adam suggests.
Now, the desire that Bruce raises in his first point, where simulation is not embedded but instead utilizes BPMN 2.0 interchange, is exactly in line with work we are doing now. Hopefully, in a short while I will be able to post where you can try this all out, live, on the web.
And I concur totally with Adam and Bruce on the aim of making it simple to get the answers that we really want; but that does rely on the process of reaching those answers being based on good practice, for which there are few (if any) shortcuts.
Thanks Simon. I look forward to checking it out. Take a look at my BPMN-I post if you are interested in BPMN 2.0 xml import.
–Bruce
Derek,
I agree with you (and others here) that simulation is mostly ignored, or abused to sell a predetermined result, or doesn’t tell you much that you couldn’t tell from static analysis (or even simple inspection). Partly that is the fault of the tools – particularly those built into a BPMS to meet the Gartner checklist – and partly the fault of the users, perhaps too much in a hurry (really?) to “do something”. Well, those folks get what they deserve. But there are good tools out there. I haven’t used Lanner but I imagine it’s pretty good. I’ve used ServiceModel and it’s very good… but a programming language, basically.

Putting aside the case management problem – not all cases are “unpredictable”, but they are certainly too hard to model using BPMN – my interest here is twofold: 1) Making simulation useful for “process improvement”, i.e. redesign, where the tasks are occasional, not heads-down. And 2) Making simulation more useful for work management, i.e. the heads-down bottleneck problem (where most simulation is used today… as optimization of already automated workflows, not upfront analysis).

We now have a new language – BPMN – that is quite expressive, deals with events, allows interchange, etc. I think the older tools that are good probably need to take advantage of that – see Simon’s comment here. But a cookbook methodology is needed as well, even with good tools. I think with better tools and methodologies, simulation could become a useful tool. But we’ll see…
Building off Anatoly’s comment, I agree there is value in simulation being attached to the BPMS (even though Global 360’s analystView does not require a suite).
Even in cases where businesses have automated a process to improve it, the work of improvement itself is never done. To think that you could improve the process once and never again would be a foolish approach. Therefore, for the BPMS owner, simulation that takes advantage of operational data captured in the execution of business processes is useful to the business process analyst. Additionally, using the operational data from a BPMS leaves one less at risk of GIGO results.
I understand that not all companies who have invested in a BPMS are using the simulation parts of their suites. That use comes as their maturity with the BPMS grows. For companies that are taking advantage of a BPMS’ reporting capabilities for operational analysis, a question should arise about how to further improve the automated process — an area where simulation may be an option to pursue.
Additionally, for automated processes within a BPMS, simulation provides value in helping to understand the impact of change. For example, if demand for Product X rises 20% in the next 3 months, do we have enough resources to handle the increase? Simulation can help assess the impact of changing business demands on such a process.
Amen on this, Bruce. I was disappointed in the simulation capabilities of certain BPM suites. I don’t know about Savvion, but they started as a simulation package, so they may be better than most.
I use ExtendSim for simulation, and I find a lot of process owners are unaware of the performance of their own operations. Lots of low-hanging fruit.