The capability to define and execute dynamic processes on the fly is really powerful, and we see a lot of progress in this area. Support for dynamic processes, or ad-hoc process definition, could also gain relevance in collaboration-driven businesses and in many new areas where BPM has traditionally not been actively pursued.
However, I faced a situation that keeps me wary of an undesired implication of such a capability. We had similar functionality built at one of our BPM implementations, where exceptions were supposed to be handled by kicking off a quick, dynamically created sub-process. When faced with a scenario that the defined process didn't handle (quite possible in the early cycles of an iterative and incremental BPM approach), the user could define a sub-process right there from the execution browser and kick it off. This was pretty innovative, and users really loved it.
And they loved it a little too much; this felt like freedom! We saw these ad-hoc processes kicking off everywhere. One of the business managers even came up and wanted that capability built into every activity throughout her processes. We had a hard time convincing users not to use it so indiscriminately, and rolling back this feature became nearly impossible even after the processes had matured in those particular areas. The problem is that none of those processes had any characteristics of being unstructured or dynamic.
Such dynamically created processes are difficult to manage: the metadata is inconsistent, benchmarking the processes and defining KPIs becomes very difficult, and process adherence becomes a huge challenge.
So one has to be cautious with such capabilities. Here are some of my suggestions:
- Strike a balance between process flexibility and process adherence objectives. You want user-friendly processes, but the rule books cannot go away. Process flexibility cannot come at the cost of process adherence.
- Consciously and carefully evaluate whether the processes in question are truly dynamic. Apply the dynamic process kit only to processes that show the characteristics of being dynamic.
- Question why those processes are dynamic. Is it due to a weak process-adherence culture? Are the variables that make the process dynamic outside your or the organization's control?
- Do you have a well-defined process exception handling approach? Sometimes a standard approach to exception handling helps tame the uncontrolled exceptions floating around.
- Define the roles that would be able to make changes to a running process. Simple and obvious, but still very important.
- Remember that just because users like something doesn't make it a good thing. Users know what they do and they know their processes, but they may not know what's good for their processes and for organizational efficiency.
With some caution, we can prevent the right and powerful solutions from being applied to the wrong problems. Dynamic process kits are meant for truly dynamic and unstructured processes only.
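Two of the suggestions above, restricting which roles may change a running process and offering the dynamic capability only on processes flagged as truly dynamic, can be sketched as a simple authorization check. This is a minimal illustration under stated assumptions, not any particular BPMS API; all names (`ProcessDefinition`, `can_launch_adhoc`, the role names) are hypothetical:

```python
# Hypothetical sketch: gate ad-hoc sub-process creation behind two checks,
# (1) the parent process definition is explicitly flagged as dynamic, and
# (2) the user holds a role authorized to extend it.
# Names and structure are illustrative, not taken from any specific BPMS.

from dataclasses import dataclass, field

@dataclass
class ProcessDefinition:
    name: str
    is_dynamic: bool = False                        # only truly dynamic processes opt in
    adhoc_roles: set = field(default_factory=set)   # roles allowed to extend the process

def can_launch_adhoc(defn: ProcessDefinition, user_roles: set) -> bool:
    """Allow an ad-hoc sub-process only when the parent process is
    flagged as dynamic AND the user holds an authorized role."""
    return defn.is_dynamic and bool(defn.adhoc_roles & user_roles)

# A genuinely dynamic exception-handling process opts in for specific roles.
order_exceptions = ProcessDefinition(
    "order-exception-handling", is_dynamic=True,
    adhoc_roles={"process_owner", "exceptions_team"})

# A structured, KPI-governed process never opts in, so no role can bypass it.
standard_invoicing = ProcessDefinition("invoicing")

print(can_launch_adhoc(order_exceptions, {"exceptions_team"}))   # True
print(can_launch_adhoc(standard_invoicing, {"exceptions_team"})) # False
```

The design point is that the flag lives on the process definition, not on the user: even a highly privileged role cannot spawn ad-hoc variants of a structured process, which keeps its metadata and KPIs analyzable.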
#1 by kswenson on March 9, 2010 - 2:01 am
So, could you include a little more detail on exactly what was wrong with the dynamic processes kicking off everywhere? From my point of view, if the users were doing this, then it is an indication that something was wrong with the process in the first place. It is well known that people have to work around limitations in software all the time. But we know that well-written software, and that includes well-written business processes, is such that people don't feel they have to work around it. Of course, I don't know anything about your particular situation, but if one assumes that they were working around problems, then the thought of you running to prevent people from fixing the processes is really quite amusing.
#2 by Ashish Bhagwat on March 9, 2010 - 2:46 am
Well, the processes we are talking about here were a little complex to begin with; there were multiple variations based on order attributes. Even with the deep involvement of the business owners, the variations could not be captured 100% while defining the process. So the idea behind giving users "some" freedom to kick off a dynamic sub-process on the fly was to allow them to proceed at that time.
In the next iteration of the process deployment, such changes would be incorporated, and these iterations were placed pretty close to each other. So there was no effort to "prevent the people from fixing the process"; the process changes that were really required were made pretty speedily. However, the process needs to be changed in the "defined" version. An ad-hoc or dynamically created process would not alter the base process every time a user does something to handle an unforeseen scenario, at least not right away.
The problem was that this capability could be misused on processes that weren't truly dynamic or unstructured. Certain structured processes need to be governed by strict KPIs, and process adherence is the key. The metadata and process attributes are used for dashboards and other analytics, and these become very difficult to assess and analyze if the process keeps coming up with unwarranted variations.
Hence the suggestions to keep some control there. There is nothing wrong with such a capability; if anything, we should encourage it. However, it should be used with caution, as pointed out in the post.
#3 by KSwenson on March 23, 2010 - 3:19 am
Yes, I completely agree. When defining a process, the Pareto principle says you can get most of the way there easily, but it takes a lot more trouble to get the last few percent. So start with a basic process that covers most of the actions, and let people fill in the details with the dynamic capability. This works really well.
Ideally, as you approach the “real” process, the amount of alterations will decrease to the point where ultimately the users will not feel the need to extend the process with dynamic capability. But in your case, this did not seem to happen, and it is worth asking why this does not happen.
You mention that the capability was (mis)used for processes that "weren't truly dynamic" and where "process adherence is the key". Again, I don't want to come across as critical, and I am familiar with how difficult it is to please everyone in every situation, but still I see a hint that the "ability to analyze the process" is being prioritized higher than the comfort of the person doing the job. What is more important here?
There is value in uniformity. Scott's comment helps to highlight this. It is very important in many cases that people do things the same way. Most workers, when this is pointed out, will purposefully conform to the standard way. One of the problems with Sharepoint is that there is no guidance, and people have to do it all on their own. For example, if the process were completely blank to start with, it would not be surprising that everyone did it differently. But in this case you have a standard process which you are evolving toward an optimal process. I know that it takes some effort to modify the process, and if users are going to that effort even when they know conforming is good, I can only assume that there is some good reason for them to go to this trouble.
In the end, I think I can see your point: a fixed process completely constrains people, while a dynamic one makes any change possible, and in some cases that can be too much. What you need is probably something that allows people some flexibility while constraining other dimensions, and finding the right combination is still a challenge for the future.
#4 by sfrancis on March 22, 2010 - 11:30 pm
Although I like to call the unintentional proliferation of unmanaged processes "The Sharepoint Effect" (http://www.bp-3.com/blogs/2009/09/the-sharepoint-effect-revisited/), it can happen with any technology backing; it just happens that Sharepoint seems to encourage this more than most 🙂
An intentionally dynamic process, or a dynamic portion of a process, can be a good thing, but when new processes are constantly being created it can cause end-user confusion (what am I obligated to do for THIS process vs. THAT process? And how do I fill out this form to your needs versus what the last guy wanted?). And the term "dynamic process" means so many things to so many people: while I might envision the creator having to write decent instructions for the participants of their dynamic process, someone else might envision rules deciding which activities to engage in the process, and someone else might envision me merely leveraging common building blocks for the process, not inventing my own instructions for those blocks.
And of course, at the end of the day, if the goal is to arrive at a standardized "best practice" process, then somehow this proliferation has to be reined in and targeted toward the right answer (as you point out in your comment). Thanks for sharing, it's a good read.
#5 by Max J. Pucher on March 23, 2010 - 9:05 am
I would agree with your experiences, as we have seen the same thing happen when BPM products were expanded to allow dynamic sub-processes, as is now en vogue. I have also seen that users LOVE this freedom. This is what we ought to be doing! I propose, however, that the problems are related to the products and to the orthodox BPM methodology involved. Neither KPIs nor benchmarks are required. Yes, some processes need to be governed, and that is in most cases easy with some simple rules. There is no need for process adherence if there is no rigid process. All that counts are the goals, which are either assessed by humans or encoded in goal rules.
With the right software, process flexibility does not interfere with goal achievement. All processes are dynamic until they are made rigid in the analysis process because no one knows better. No one needs a process adherence culture; where does this management concept come from? If the process owner and his team are given authority, then the processes are NEVER out of control. If management can audit process goal achievement, then that is all they need. Adherence is not a measure of quality in any book.
Yes, process metadata and business data are essential and have to be properly managed; otherwise there is no way to write sensible rules. If processes are not rigid, then there are NO process exceptions! There is just additional information that is handled by the people as needed. There is no need for hundreds of process variants and exception handling. Standard exception handling will make things even worse; there is no such thing.
Clearly, all actions within a process have to be role/policy authorized, and that includes those activities that change the process execution.
If users like something, they should be allowed to try it, because they do know their processes, as you said. If it turns out not to be such a good activity, then it is not a big deal, because it can be changed at any time. It would be the responsibility of the process owner to discuss with his actors the changes they make and select the ones that are the best. The system would then recommend those actions and only enforce those that are regulated.
Management can at all times audit each and every process and discuss the goals and handovers with the process owner. Changes are fast and local and adhere to a central metadata infrastructure that provides transparency up and down the line.
Therefore I do not agree that dynamic processes are a problem; it is BPM systems that are not flexible enough that are the cause.