Thursday, October 25, 2007

What makes IT better: ITIL, COBIT, or people?

The last time I thought about IT 'governance' in-depth was a while back. Of course everyone claims it's top of mind when thinking about SOA, since that same 'everyone' is trying to convince the business guy with the cash that business and IT have converged, and that he should spend his money on more tech stuff. And it's true that if you really design meaningful services they can reflect what the business does, which in turn enables a business leader to ensure he or she will meet the business objectives that demand a big fat bonus.

The last time I thought about IT 'governance' in-depth was prior to working on how SOA makes IT better, or beating business goals with process optimization. This was back when I concentrated on 'compliance'. In the good old days when SOX was new(-ish) and still hyped (although according to Google Trends, it never compared to the Red Sox or White Sox that dominated the sporting public's attention), I paid a lot of attention to the details of how organizations really became and remained compliant with the mass of legislation and regulation out there.

From a business standpoint it was easy - COSO was recommended by the SEC as a fine framework for documenting and testing your internal controls to ensure compliance, even though any sort of framework for defining how effectively you did business was considered radical and expensive.

IT, probably because it was always a left-brain discipline, had many frameworks that were favored and used extensively. Back when IT was expensive and often owned only by governments or the military, minimizing the risk of failure (or at least maintaining the bureaucracy around CYA) was considered essential, and that led to the use of frameworks such as ITIL. These were intended to ensure that every phase of telecoms and technology usage, from acquisition to deployment to eventual failure and fix, was defined (if not by the framework itself, then by the poor team attempting to implement and run a system).

COBIT, on the other hand, offered an apparently more focused approach to IT governance, since it limited its scope to IT controls - the automated parts of the business concerned with keeping systems making consistent decisions. The problem with all this was that most organizations had already adopted some form of framework. SOX compliance for IT often became a 600-page bound work of photocopies from other frameworks, printed screenshots, and a brief summary of the true processes the IT organization followed in putting new systems into production.

So in reality, which IT framework do most organizations use? How well does it tie back to the emerging governance requirements of SOA? What does the outside world do, and how does this vary between financial services organizations, government, software companies and others? Whatever the answer, people are core, and their consistent, effective communication is as important as any framework. Without good people, most frameworks will fail.

Wednesday, October 17, 2007

Jim Sinur - blogging at last

A very quick post to introduce readers interested in BPM, business problems and technology to Jim Sinur's new blog. Up until earlier this year Jim was the lead BPM analyst for Gartner (and had some extremely fancy title to go with it), so he has huge insight into the BPM industry, its customers and its technology.

Jim is now Chief Strategy Officer for Global 360, but as you'll see from his blog he still retains strong independent views on everything related to business process, which should make for some interesting reading. Let me encourage you to subscribe to Jim's blog now.

Tuesday, October 16, 2007

BPM or ERP - Stand out from the crowd

Many times I've heard BPM described as the panacea for almost every type of business problem, much in the same way that SAP (or Oracle) would describe ERP as the true way of implementing solutions to your organization's specific problems. Unfortunately, to me this seems like an "all or everything" situation: all business problems that SAP has ever bothered to consider valuable are covered by its ERP; everything that ever needed process automation that you couldn't justify using SAP for can be implemented using BPM.

It seems to me that there are a whole lot of business problems that are best solved by ERP simply because everyone else solves them that way. Any business process that requires industry best practices to be replicated across the board, or any application that offers your organization little differentiation, falls into this category. ERP has these coded straight out of the box, so why bother reworking them with BPM for your organization? If they represent good learning exercises on the way to BPM excellence, or you already have BPM expertise and technology in-house, perfect! Otherwise, BPM is far better reserved for the processes you want to differentiate.

Now imagine that you run a business unit of an insurance company. Does your claims process really differentiate you from the competition, or is it the speed at which you pay, or the customer service that supports the process, that really sets you apart? This is where BPM wins, even though ERP or boxed software could run the same process. The process optimization offered by the best BPM suites provides analysis of process information and the opportunity to improve performance based on real-time business metrics, something a restrictive ERP concentrated purely on business data cannot. Case-management-based BPM provides the contextual view of data across many systems that can make customer service truly differentiate your offering.

ERP and BPM both offer good solutions, often for the same problems. Pick the right solution for the right application - and when it matters, stand out from the crowd and do something different using a BPM-based enterprise application platform.

Technorati tags: Financial Services Technology New Account Opening BPM ERP

A post from the Improving New Account Opening blog

Tuesday, October 09, 2007

Facilitate or Automate?

Keith Swenson has been writing some great posts on the differences in modeling what he calls "Automator" and "Facilitator" processes. Automators aim to model and control every interaction between systems and potentially also human users. Facilitators, on the other hand, model at a much higher level, showing the key participants in a process and the overall flow, but do not need to delve much deeper than this, relying on the underlying system's abilities to assist users.

Keith's post from a week or two back, Human "Facilitator" Processes, really lays this concept out well. He shows how a Facilitator process models the interactions between a writer and an editor in the process of preparing an article for publication. It's a two-step process that I don't even need a graphical modeling tool to represent:

O ----> Writer: Write Article ----> Editor: Review Article ----> X

All in a single line we have represented the key participants and interaction between the writer and editor. According to Keith:

In order to explain this process to a person who was to participate in this, the two nodes would be enough: Betty will write the document, and when completed, George will review the document. Both Betty and George understand their role in the process, and can go about their work.

This only makes sense because most of us (especially Betty and George) understand the process this model is trying to represent. At this level, we don't upset the knowledge workers in the process by limiting their creativity with unnecessary administrative tasks or presuming that we know how to do their jobs better than they do. This is the classic 'paving the cowpath' approach to process improvement that really centers on the delivery of work between users (remember the document management workflow world?).

The Automator that Keith talks about focuses on the 'server' and what it does. It deals with the automatable pieces of the process and only presents work to Betty and George when it can no longer automate. It essentially represents the system as an execution and delivery model that saves a developer writing automation code, and relies little on the capabilities of the executing system. If your BPM tool wants to execute at this granular a level, great, but don't expect to benefit from new features it may release in version 6.x without explicitly building them into a process. Do expect it to be so restrictive that your users reject the system.
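To make the contrast concrete, here is a minimal sketch in Python of the same article process declared at the two levels of granularity. The Task and Process structures and the extra system steps are entirely hypothetical - they aren't taken from Keith's posts or from any particular BPM product.

# A minimal, hypothetical sketch contrasting the two modeling styles.
# The structures and step names are illustrative only.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    performer: str  # who (or what system) does the work
    action: str     # what they are asked to do

@dataclass
class Process:
    name: str
    tasks: List[Task] = field(default_factory=list)

# Facilitator style: two human steps; everything else is left to the
# participants and to whatever the underlying system already provides.
facilitator = Process("Prepare article", [
    Task("Writer", "Write article"),
    Task("Editor", "Review article"),
])

# Automator style: every system interaction is modeled explicitly, and
# people only appear where nothing more can be automated.
automator = Process("Prepare article", [
    Task("System", "Create document from template"),
    Task("System", "Assign writing task and set deadline"),
    Task("Writer", "Write article"),
    Task("System", "Run spelling and style checks"),
    Task("System", "Route document to editor"),
    Task("Editor", "Review article"),
    Task("System", "Publish approved article"),
])

for process in (facilitator, automator):
    print(f"{process.name}: {len(process.tasks)} steps")

The point of the sketch is simply the step count: the facilitator model says who does what, while the automator model hard-codes everything the engine happens to be able to do today.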

In his most recent post, BPMN & Methodology Agnosticism, Keith talks about how a modeling standard like BPMN, which is supposed to provide a common understanding of process models, still implicitly represents much of the underlying capability of the system that executes it. This is encoded by the underlying, and potentially unwritten, methodology of the analysts who designed the process - a methodology that is rightly (I believe) biased towards the available capabilities of the target system.

I like this. It feels like there is a place for me and 'my BPM'. It feels like I don't need to have two steps or 50 steps to represent a business process. If 'my BPM' handles assignments, deadlining, escalation and event handling in a meaningful and flexible way, I don't have to draw this stuff on a model. If some things actually don't fit well in a process but are better represented by other capabilities of my system, even better. As I talked about yesterday in Don't forget about the real end-users of BPM, process models shouldn't have to be the center of the world.
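One way to picture that last point (a purely hypothetical sketch, not the API of any real BPM product): deadlines and escalation become declarative attributes of a task definition, enforced by the engine, rather than extra boxes drawn onto the model.

# Hypothetical sketch: the engine, not the diagram, owns deadlines and escalation.
# None of these class or field names come from a real BPM product.

from dataclasses import dataclass, field
from datetime import timedelta
from typing import Optional

@dataclass
class TaskPolicy:
    deadline: Optional[timedelta] = None  # how long before the task is overdue
    escalate_to: Optional[str] = None     # who is told when it becomes overdue

@dataclass
class TaskDefinition:
    performer: str
    action: str
    policy: TaskPolicy = field(default_factory=TaskPolicy)  # engine-enforced, invisible on the model

# The model still only says "Editor: Review article"; the policy rides along underneath.
review = TaskDefinition(
    performer="Editor",
    action="Review article",
    policy=TaskPolicy(deadline=timedelta(days=2), escalate_to="Managing Editor"),
)

print(review.policy.escalate_to)  # "Managing Editor"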

With a mixture of Facilitator and Automator plus a whole lot more, my process model really represents the valuable work my people and my systems actually do. A process model drawn in a modeling tool at this level would be understandable to someone outside the organization (think of all the SOX documentation you have describing processes in your organization that an auditor must examine). At the same time I can ensure it is not restricting the knowledge worker but 'facilitating' his or her everyday tasks, while automating the dull meaningless stuff. I suppose that 'my BPM' is an Automacilitator...



A post from the Improving New Account Opening blog

Sunday, October 07, 2007

Don't forget about the real end-users of BPM

BPM is obviously very much about process, but I feel that an obsessive focus on the process model reinforces an abstract view of the business that misses a large part of the story. BPM software vendors are guilty of producing BPM suites that reinforce this view. They present their systems in terms of how pretty and easy to use their process modeling tools are, and how multiple process analysts can work on a process simultaneously - treating the analyst as the primary end-user.

The reality is that the handful of process analysts are not the most important end-users of BPM software - the primary end-user is the business user who participates in the process to do his or her job and contributes to the success of the organization's critical operations day in, day out. So why do BPM vendors rarely focus on the hundreds of 'true' end-users?

BPM software is often sold exclusively to business analysts. Analysts rightly need tools that facilitate the design of effective processes, and building pretty design tools is a sure way of selling software. After all, just as when buying a car, most business analysts can be expected to go for tools that appear powerful and have a nice shine.

A true business application design platform concentrates on the business end-users first: what will they see, what documents do they use and create, what other sources of information do they access, and what tasks do they perform? Process is obviously core, and often provides an orderly structure that many of these other things hang off. But with a platform that only focuses on process, rather than providing a framework for actually running all the components of critical business applications (including the business process), an organization is left to build a whole lot of technical infrastructure from scratch. That leaves the business users depending on applications that are limited by the skill of the organization's own software developers.

Now don't get me wrong: clean and accurate process design is an essential component of effective business applications. But failing to provide an equivalent focus on the sharp end - the actual execution of the newly perfected processes inside a meaningful business application - can leave an organization trusting its critical operations to software that is more worried about 10 process analysts than 1,000 concurrent users.


A post from the Improving New Account Opening blog