Even though the meat-and-potatoes of BPM 2.0 — business-oriented, top-down model-driven process implementation (without code), based on some form of SOA — hasn’t yet been finished, BPMS vendors want to skip ahead to dessert, with system-generated recommendations on how to optimize the process design. Lombardi was the first out with this, but now both BEA and Savvion are planning to offer a similar concept in their next major releases.
Lombardi’s Process Optimizer perspective blends simulation analysis and historical performance data to identify points in the process model where projected performance falls short of target KPIs, and then offers recommendations on how to bring those KPIs back in line. Some of the suggestions are fairly obvious: “To decrease wait time…”
But others are much cooler, for example, suggesting replacement of a human task, such as an approval, with an automated business rule for some subset of process instances. An analysis wizard looks for correlations between the approvalStatus value and some other variable, such as orderAmount. It might report, for example, that when orderAmount was less than 5275, approvalStatus was ‘approved’ 94% of the time, but only 47% of the time when orderAmount was greater than or equal to 5275, and suggest a business rule to automatically approve orderAmounts under 5275. If you agree with the suggestion, the wizard even creates the rule for you.
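To make the idea concrete, here is a minimal sketch (not Lombardi’s actual implementation) of what that wizard’s logic could look like: split the historical instances on a candidate threshold, compare approval rates on either side, and emit an auto-approve rule when the low side clears a cutoff. The field names (`orderAmount`, `approvalStatus`) come from the example above; the threshold and `min_rate` parameter are illustrative assumptions.

```python
def approval_rate_split(instances, threshold):
    """Approval rates for instances below and at/above the threshold."""
    below = [i for i in instances if i["orderAmount"] < threshold]
    above = [i for i in instances if i["orderAmount"] >= threshold]

    def rate(group):
        if not group:
            return 0.0
        approved = sum(1 for i in group if i["approvalStatus"] == "approved")
        return approved / len(group)

    return rate(below), rate(above)


def suggest_auto_approve_rule(instances, threshold, min_rate=0.9):
    """Suggest a rule auto-approving orders under the threshold when the
    historical approval rate there is high and clearly better than above it.
    Returns the generated rule as a predicate, or None if no rule is warranted.
    """
    below_rate, above_rate = approval_rate_split(instances, threshold)
    if below_rate >= min_rate and below_rate > above_rate:
        return lambda inst: inst["orderAmount"] < threshold
    return None


# Toy historical data, purely for illustration.
history = [
    {"orderAmount": 1200, "approvalStatus": "approved"},
    {"orderAmount": 3400, "approvalStatus": "approved"},
    {"orderAmount": 4800, "approvalStatus": "rejected"},
    {"orderAmount": 6100, "approvalStatus": "approved"},
    {"orderAmount": 9900, "approvalStatus": "rejected"},
]
rule = suggest_auto_approve_rule(history, threshold=5275, min_rate=0.6)
```

A real wizard would presumably search over many candidate variables and thresholds rather than testing one supplied by the user, but the shape of the analysis, correlate an outcome field against a numeric field and propose a threshold rule, is the same.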
What about the 6% that normally wouldn’t get approved? Isn’t it dangerous to auto-approve those? Maybe you can add another rule to bring that down to an acceptable level… I’m not sure exactly how it works. But it’s an intriguing approach. For me the cool part is the correlation analysis, more than the recommendation.
If you read James Taylor’s blog, you’ll recognize this as similar to something he writes about frequently, called “predictive analytics.” That’s a technique used in high-end decision management, such as underwriting loans and insurance, to predict risk from historical trends using a combination of business rules and business intelligence technology. It sounds like BPMS vendors are trying to take a ‘lite’ version of that and make it part of the suite. Could guided process optimization be Gartner’s next ‘must-have’ BPMS feature?
I agree that this is a kind of “light” predictive model – it might work well for simple cases, but what if there are more subtle reasons those instances were approved? How deep is the statistical analysis, I wonder? It also shows the problem of using analytics alone – the derived rule may not be a legal one, even if it is analytically sound. This is why “engineered” predictive models are important and why combining rules and analytics matters.
Cute though…
Too Cute for Words…
Bruce Silver’s post about process optimization garnered a comment by Fair Isaac’s James Taylor. Bruce has recently spent a lot of time with Lombardi’s product (TeamWorks) and presumably with Savvion and BEA. I don’t know if James has seen any of…
Well, I posted to Phil’s blog – I knew that flip little comment was going to get me in trouble. As I said there: “Sometimes I can improve a process, optimize it if you will, simply by evaluating the execution of the process. Sometimes I must consider broader implications. If I want to optimize an origination process, for instance, I probably need to consider what kinds of accounts end up in collections and a bunch of legal restrictions. Optimizing the process steps might help with certain things, and will definitely give me some good indications of where to focus my energies; it’s just not a substitute for a holistic assessment of rules and processes.”
There’s a lot more on my blogs about the relationships between rules and processes (especially in the BPM section) and how you can bring analytics to bear on a process by analytically enhancing its decision nodes. Check out this article on BPM Institute as well.
process simulation…
Hi. Very nice blog. I’ve been reading your other entries all day long..lol….