Monday, December 31, 2007

Loyalty for financial products

The festive season has given me a little time to start reading some blogs again and spend a little time thinking and writing. A couple of the interesting posts I came across relate closely to some of the research I have recently been doing around financial services and the value of financial products and their associated customer services.

To start with, there were several discussions around something that many of us have contact with around this time of year - the loyalty program. Love 'em or hate 'em, the loyalty program has found its way into every type of company that depends on repeat business. One of the longest-running types, the frequent flyer programs run by airlines (e.g. BA, United) and alliances (e.g. Airmiles, Star Alliance), has traditionally been designed to encourage frequent travelers to always fly with the same airline. In the past the benefit to the customer was twofold: first, to collect enough 'miles' to redeem for a free flight for pleasure; second, to gain status and receive upgrades and enhanced customer service. The cost of these programs could be justified by the airlines as providing a return in the form of constant repeat business from customers.

As the post The declining value of loyalty plans on The Bankwatch blog highlights, the loyalty program as experienced by many travelers is perceived as offering much lower value than in the past. In the post, Colin states:

So stepping up a level, loyalty points and plans are only of value if the customer feels value.  Fewer are feeling that value nowadays, and those Banks who pay for those plans, should think about that.  I think the loyalty plan model is broken.

The implication is that airline loyalty programs are failing to meet customer expectations and therefore may not lead to the repeat business that is built into the business model. The question is whether there are things that financial services institutions do for their customers that actually reduce repeat business.

Colin's post was actually in reaction to Customer Experience Matters More Than Points In Building Loyalty on Forrester's Marketing Blog, which relays Shar Van Boskirk's recent travel experience:

Except!  That I am really bothered by Linda's mantra "I don't care who you are or how much you travel."  Now the idea of a loyalty program is that you DEFINITELY care how much I travel and I've found that my fundamental weakness as a traveler is that I really want people to care who I am.

The loyalty program has degenerated from the ability to build status as a traveler to achieve enhanced customer service and rewards, to the one thing that may get you home on the same day you started traveling when the overbooked airline starts prioritizing the handout of the last remaining seats on the only flight that hasn't been canceled. This means that use of frequent flier benefits only reinforces in my mind how poorly an airline is operating since I'm only using the benefits to negate a failure on the airline's part. This is more of an 'anti-loyalty' program...

Does this relate to banking and financial services in general? As banks and other institutions realize that customer service is core to retaining and gaining market share, understanding customer loyalty and attracting new customers is key, while avoiding the reinforcement of negative perceptions should not be overlooked.

An oversimplification is that much of the model of retail banking is built on repeat business and that banks can just assume it will happen: a bank account ties you to doing business with the bank. If I want the bank to hold my cash, I have to pay their transaction fees to make a transfer. At least with my bank there is no explicit loyalty program, beyond them offering a reasonably competitive online banking service that doesn't annoy me every time I use it (i.e. being marginally better than other banks). As a standard customer, if I want the equivalent of premium service I basically have to pay for it through additional charges, and traditionally it's been hard to move bank accounts, so I'll not do it too often.

To a bank this may appear to make a lot of sense. In everyday operation there is little to differentiate me as a customer from almost every other customer that maintains a relationship with the bank; my salary is deposited into the account and sits there until I pay some bills and maybe move a little cash to a savings account - pretty much the same as every other professional person. Attempting to segment me based on this limited information does not help the bank offer me better services or do something that will help stop me moving to a new bank.

The best many institutions do is offer a basic form of loyalty bonus, such as better rates of interest on a new savings account for current customers. Brokerages can offer a sliding scale of charges based on volume, or cheaper trades for buying into their own investment products, as a loyalty incentive.

Customer service remains the new frontier of financial services as it becomes easier for customers to find and move their money to new institutions. For financial services loyalty and repeat business still have some major stumbling blocks:

  • No aggregated view of a customer is available that the bank can use for managing my relationship and offering reasonable customer service when I call a branch or call center
  • Information is not shared between business units, preventing me from easily signing up for new products online
  • A single, complete view of me as a customer is not available, so the bank cannot see whether I am actually a great customer who owns several of their products (and therefore worth working harder to retain) or just a mediocre customer with a savings account
  • With such limited information, segmenting customers for marketing new services and products is impossible
  • Credit agencies are often seen as the primary source of aggregated information about me. Does a FICO score really offer appropriate metrics for offering me suitable products and strong customer service?

Many of the issues relate back to opening a new account with an institution. Only with an accurate view of a customer from that point of 'on-boarding' through the customer lifecycle can a true relationship be effectively managed. In my opinion many financial services institutions have a lot of work to do to get an accurate and unified view of their customers, let alone measure and use the metrics that really identify the customers worth concentrating on.

Thursday, October 25, 2007

What makes IT better: ITIL, COBIT, or people?

The last time I thought about IT 'governance' in depth was a while back. Of course everyone claims it's top of mind when thinking about SOA, since that same 'everyone' is trying to convince the business guy with the cash that the business and IT have converged, and he should spend his money on more tech stuff. And it's true that if you really design meaningful services they can reflect what the business does, which in turn enables a business leader to ensure he or she will meet the business objectives that demand a big fat bonus.

The last time I thought about IT 'governance' in depth was prior to working on how SOA makes IT better, or beating business goals with process optimization. This was back when I concentrated on 'compliance'. In the good old days when SOX was new(-ish) and still hyped (although according to Google Trends, it never compared to the Red Sox or White Sox that dominated the sporting public's attention), I paid a lot of attention to the details of how organizations really became and remained compliant with the mass of legislation and regulation out there.

From a business standpoint it was easy - COSO was recommended by the SEC as a fine framework for documenting and testing your internal controls to ensure compliance, even though any sort of framework for defining how effectively you did business was considered radical and expensive.

IT, probably because it was always a left-brain discipline, had many frameworks that were favored and used extensively. Especially when IT was expensive and often owned only by governments or the military, minimal risk of failure (or at least the bureaucracy around CYA) was considered essential and led to the use of frameworks such as ITIL. This was intended to ensure that every phase of telecoms and technology usage, from acquisition to deployment, to eventual failure and fix was defined (if not by the framework itself, by the poor team attempting to implement and run a system).

COBIT, on the other hand, offered an apparently focused approach to IT governance, since it limited its scope to IT controls - the automated parts of the business concerned with keeping systems making consistent decisions. The problem with all this was that most organizations already had some form of adopted framework. SOX compliance for IT often became a 600-page bound work of photocopies from other frameworks, printed screenshots, and a little summary of the true processes that the IT organization followed in putting new systems into production.

In reality, what IT framework do most organizations use? And how well does this tie back to the emerging governance requirements of SOA? What does the outside world do? And how does this vary between financial services organizations, government, software or others? People are core and their consistent and effective communication is as important as any framework. Without good people, most frameworks will fail.

Wednesday, October 17, 2007

Jim Sinur - blogging at last

A very quick post to introduce readers interested in BPM, business problems and technology, to Jim Sinur's new blog. Up until earlier this year Jim was the lead BPM analyst for Gartner (and had some extremely fancy title to go with it), so he has huge insight into the BPM industry, its customers and its technology.

Jim is now Chief Strategy Officer for Global 360, but as you'll see from his blog he still retains strong independent views on everything related to business process, which should make for some interesting reading. Let me encourage you to subscribe to Jim's blog now.

Tuesday, October 16, 2007

BPM or ERP - Stand out from the crowd

There are many times I've heard BPM described as the panacea for almost every type of business problem, much in the same way that SAP (or Oracle) would describe ERP as being the true way of implementing solutions to your organization's specific problems. Unfortunately, to me this seems like an "all or everything" situation. All business problems that SAP has ever bothered to consider valuable are covered in its ERP. Everything that ever needed process automation that you couldn't justify using SAP for can be implemented using BPM.

It seems to me that there are a whole lot of business problems that are best solved by ERP simply because everyone else solves them that way. Any business process that just replicates industry best practices, or any application that offers your organization little differentiation, falls into this category. ERP has these coded straight out of the box, so why bother reworking them using BPM for your organization? If they represent good learning exercises on the road to BPM excellence, or you already have BPM expertise and technology in house, perfect! Otherwise, BPM is far better reserved for processes you want to differentiate.

Now imagine that you run a business unit of an insurance company. Does your claims process really differentiate you from the competition, or is it the speed with which you pay, or the customer services that support the process, that really make the difference? This is where BPM wins, even though ERP or boxed software could run the same process. Process optimization provided by the best BPM suites offers the analysis of process information and the opportunity to improve performance based on real-time business metrics in a way that a restrictive ERP, concentrated purely on business data, cannot. Case management based BPM provides the contextual view of data across many systems that can make customer service a real differentiator.

ERP and BPM both offer good solutions, often for the same problems. Pick the right solution for the right application - and when it matters, stand out from the crowd and do something different using a BPM based enterprise application platform.

Technorati tags: Financial Services Technology New Account Opening BPM ERP


Tuesday, October 09, 2007

Facilitate or Automate?

Keith Swenson has been writing some great posts on the differences in modeling what he calls "Automator" and "Facilitator" processes. Automators aim to model and control every interaction between systems and potentially also human users. Facilitators on the other hand model at a much higher level, showing the key participants in a process, and the overall flow, but do not need to delve down much beyond this, relying on the underlying system's abilities to assist users.

Keith's post from a week or two back, Human "Facilitator" Processes, really lays this concept out well. He shows how a Facilitator process models the interactions between a writer and an editor in the process of preparing an article for publication. It's a two-step process that I don't even need a graphical modeling tool to represent:

O ----> Writer: Write Article ----> Editor: Review Article ----> X

All in a single line we have represented the key participants and interaction between the writer and editor. According to Keith:

In order to explain this process to a person who was to participate in this, the two nodes would be enough: Betty will write the document, and when completed, George will review the document. Both Betty and George understand their role in the process, and can go about their work.

This only makes sense because most of us (especially Betty and George) understand the process this model is trying to represent. At this level, we don't upset the knowledge workers in the process by limiting their creativity with unnecessary administrative tasks or presuming that we know how to do their jobs better than them. This is the classic 'paving the cowpath' approach to process improvement that really centers on the delivery of work between users (remember the document management workflow world?).

The Automator that Keith talks about focuses on the 'server' and what it does. It deals with the automatable pieces of the process and just presents work to Betty and George when it can no longer automate. It basically represents the system as an execution / delivery model that saves a developer writing automation code, and relies little on the capabilities of the executing system. If your BPM tool wants to execute at this granular a level, great, but don't expect to benefit from new features it may release in version 6.x without explicitly building them into a process. Do expect it to be so restrictive that your users reject the system.
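To make the contrast concrete, here is a minimal sketch (plain Java, with a hypothetical Step type and made-up process definitions rather than any BPMS API) of the same article process described in each style:

import java.util.List;

// A minimal sketch contrasting the two modeling styles. The Step type and the
// process definitions are hypothetical illustrations, not any vendor's API.
public class FacilitatorVsAutomator {

    record Step(String participant, String action) { }

    // Facilitator style: just the key participants and the overall flow.
    static final List<Step> FACILITATOR = List.of(
            new Step("Writer", "Write article"),
            new Step("Editor", "Review article"));

    // Automator style: every system interaction is spelled out, and the
    // humans only appear where automation runs out.
    static final List<Step> AUTOMATOR = List.of(
            new Step("System", "Create document record"),
            new Step("System", "Assign writing task"),
            new Step("Writer", "Write article"),
            new Step("System", "Check document into repository"),
            new Step("System", "Notify editor"),
            new Step("Editor", "Review article"),
            new Step("System", "Publish or return for rework"));

    public static void main(String[] args) {
        System.out.println("Facilitator steps: " + FACILITATOR.size());
        System.out.println("Automator steps:   " + AUTOMATOR.size());
    }
}

The facilitator version is short enough to explain to Betty and George in one sentence; the automator version bakes the system's behavior into the model itself.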

In his most recent post, BPMN & Methodology Agnosticism, Keith talks about how a modeling standard like BPMN, which is supposed to provide a common understanding of process models, still implicitly represents much of the underlying capabilities of the system that executes it. This is something that is encoded by the underlying, and potentially unwritten, methodology of the analysts that designed the process - a methodology that is rightly (I believe) biased towards the available capabilities of the target system.

I like this. It feels like there is a place for me and 'my BPM'. It feels like I don't need to have two steps or 50 steps to represent a business process. If 'my BPM' handles assignments, deadlining, escalation and event handling in a meaningful and flexible way, I don't have to draw this stuff on a model. If some things actually don't fit well in a process but are better represented by other capabilities of my system, even better. As I talked about yesterday in Don't forget the real end-users of BPM, process models shouldn't have to be the center of the world.

With a mixture of Facilitator and Automator plus a whole lot more, my process model really represents the valuable work my people and my systems actually do. A process model drawn in a modeling tool at this level would be understandable to someone outside the organization (think of all the SOX documentation you have describing processes in your organization that an auditor must examine). At the same time I can ensure it is not restricting the knowledge worker but 'facilitating' his or her everyday tasks, while automating the dull meaningless stuff. I suppose that 'my BPM' is an Automacilitator...



Sunday, October 07, 2007

Don't forget about the real end-users of BPM

BPM is obviously very much about process, but I feel that an obsessive focus on the process model reinforces an abstract view of the business that misses a large part of the story. BPM software vendors are guilty of producing BPM suites that reinforce this view. They present their systems in terms of how pretty and easy to use their process modeling tools are, and how multiple process analysts can work on a process simultaneously - treating the analyst as the primary end-user.

The reality is that the handful of process analysts are not the most important end-users of BPM software - the primary end-user is the business user that participates in the process to do his or her job and contributes to the success of the organization's critical operations day in, day out. Why do BPM vendors rarely focus on the hundreds of 'true' end-users?

BPM software often sells exclusively to business analysts. Analysts rightly need tools that facilitate the design of effective processes. And building pretty design tools is a sure way of selling software. After all, just like buying a car, most business analysts should be expected to go for tools that appear powerful and have a nice shine.

A true business application design platform concentrates on the business end-users first: what will they see, what documents do they use and create, what other sources of information do they access, and what tasks do they perform? Process is obviously core, and often represents an orderly structure that many of these other things hang off of. With a platform that only focuses on process, rather than providing a framework for actually running all components of critical business applications (including business process), an organization is left to build a whole lot of technical infrastructure from scratch. This leaves the business users depending on applications that are limited by the skill of an organization's masses of software developers.

Now don't get me wrong, clean and accurate process design is an essential component of effective business applications. But failing to provide the equivalent focus on the sharp end, the actual execution of the newly perfected processes inside a meaningful business application can leave an organization trusting its critical operations to software that is more worried about 10 process analysts than 1000 concurrent users.


Tuesday, September 18, 2007

You can't manage what you can't measure

Last week was fun and informative, with a great Global 360 customer conference. With so many new releases timed for announcement, some would believe that the organization was just putting out new versions to conveniently appeal to the customers at the event. I know from being part of the release planning process that this was not the case - the new releases for BPM and process intelligence are incredibly strong and compelling, and offer a level of integration with the business on one side, and users' desktops on the other, that is unrivaled.

My colleague and relatively new blogger, Mike Letulle, talks about the importance of process intelligence in his post You can't manage what you can't measure. His discussion lays out a little of why the new Insight360 release is so important.

I'm going to let you enjoy his discussion, while I get back to work on planning some more product releases to help organizations deliver valuable solutions that differentiate their offerings with improved accuracy, faster response and customer information at their fingertips: the type of customer service that is more compelling than a smile and a 'have a nice day'.



Wednesday, August 29, 2007

Online security - fix the source of the problem

Randy Janinda responded to my post from a while back, Citibank Hardware Tokens Defeated..., at exactly the time that Bank Systems & Technology talks about how Bank of America is attacking the weakest link in online security - users' desktops:

Recognizing that defenses are only as strong as the weakest link, Bank of America has moved to shore up an area that largely is beyond its control: customers' desktops. In a move experts say is a step in the right direction toward improving online banking security, the Charlotte, N.C.-based bank announced a partnership with Symantec (Cupertino, Calif.) in which the bank will offer the security solutions provider's software to online banking customers.


This is certainly a step forward, although cynically it just looks like a great marketing ploy by Symantec at this point (just another channel for promoting a 90-day free trial). And to Randy's point, it does not address the hardest thing to protect against - phishing - fooling people into handing over their authentication credentials. How can banks help customers protect themselves so that they don't have to jump through hoops to detect scamming websites? In an environment where software can be installed on a customer's PC, as BoA is trying to do with Norton, there must be software that can reinforce the browser against phishing attacks. This may have to be the next thing that BoA offers.


Monday, August 27, 2007

Component stack - a simplified architecture for applications

It's been too long, with lots of work, travel and vacation apparently leaving me with no time to blog. I'm back, but a little rusty and jetlagged, so bear with me!

James Taylor posted today on Enterprise Decision Management Blog about a post by Roeland Loggen on his blog - BPM Suite as a component in a logical architecture. Roeland proposes a BPM-centric architecture that plugs into a range of other components to handle administrative type processes.

James suggests some enhancements to the architecture, largely to ensure that the Decision Platform can interact with the Complex Event Processing (CEP) and Business Activity Monitoring (BAM) components. I agree with his rationale, and would even take it a step further - every component in an architecture (that isn't pure technology) should be able to interact with every other, ensuring that really advanced business requirements can be considered, offering more and more business value.

As ever we should avoid point-to-point integration of every component in the architecture; instead components should offer services that can be consumed by one another, both for rigid (generic) use cases and to meet application-specific requirements. Having SOA technology and a strong integration platform underlying the architecture seems like an essential requirement. SOA technology also provides the ability to host the final application as business services that can be reused in other applications across an organization or by business partners.

A benefit of taking a services approach is that generic architectures like Roeland's are simplified - they are effectively flattened; every component can benefit from every other and the architecture doesn't need a hundred lines showing the explicit connections between all of them since the connections are handled by the SOA / integration backbone. This backbone may utilize a range of technologies, and may be effectively hidden from view. Without fancy drawings, the generic architecture becomes a simple list of available components that have been plugged into the backbone, for example:

  • Business Process Management (BPM) - incorporating BAM and process analytics
  • Decision Management
  • Case Management
  • Content Management
  • Customer Database / Information System
  • Document Capture
  • E-Forms Management
  • All plugged into a SOA/integration backbone

Taking this stack approach allows business analysts to concentrate on using the components providing higher level business value in the most effective way - not the forced integration of monolithic products. Any particular business application can reuse the same architecture and technical backbone, and will benefit from platforms that are more interconnected out of the box. Specific connections (drawn on an 'architecture' diagram) now represent business use scenarios, and the value of each interaction can be easily seen, especially where new integrations may need to be built. Here is a simple example:

The application here is a fairly generic application processing solution, which shows an architecture that highlights the interactions between the components, already relying on the implicit (and invisible) technology backbone and any available services that are already in place. A different application could show very different connections, allowing the 'architecture' diagram and implementation to closely reflect the business value of the application. The technical architecture would be no different from before - a plain old stack implemented once and used over and over.
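As a rough sketch of the 'flattened' idea (plain Java, with a hypothetical registry and made-up service names - not any particular product's API), the backbone can be thought of as a shared registry of named services that each component publishes into and any application consumes from:

import java.util.HashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

// A toy 'backbone': components publish named services into a shared registry
// and applications consume them by name, so no component is wired directly
// to another. All names and operations here are hypothetical.
public class FlattenedStack {

    static final Map<String, UnaryOperator<String>> BACKBONE = new HashMap<>();

    static void publish(String serviceName, UnaryOperator<String> operation) {
        BACKBONE.put(serviceName, operation);
    }

    static String call(String serviceName, String payload) {
        return BACKBONE.get(serviceName).apply(payload);
    }

    public static void main(String[] args) {
        // Each stack component plugs its services into the backbone once.
        publish("capture.scanApplication", doc -> "captured:" + doc);
        publish("content.store", doc -> "stored:" + doc);
        publish("decision.assessRisk", doc -> "risk-ok:" + doc);
        publish("bpm.startAccountOpening", doc -> "case-started:" + doc);

        // A specific application scenario is just a sequence of named calls -
        // the 'architecture diagram' reduces to which services it uses.
        String result = call("bpm.startAccountOpening",
                call("decision.assessRisk",
                        call("content.store",
                                call("capture.scanApplication", "app-form-123"))));
        System.out.println(result);
    }
}

The point of the sketch is only that the connections live in the application scenario, not in the components themselves.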

Picking a stack that can handle the high level business requirements, while being architecturally strong enough to hide much of the technology complexity, and still allowing the flexibility to meet specific application requirements across many different applications is hard work. Organizations looking to get the best value from their technology investments should look beyond simplistic 'drag and drop' and developer toolkits to ensure that new technology really can deliver complex applications fast, while allowing constant enhancement and new applications to be based on a common platform that can perform as needs grow.


Sunday, July 01, 2007

Decoupled services need BPM

One of the key advantages touted for SOA is the way that integrated systems are 'decoupled'. When I was chatting with a colleague last week about this I realized that the meaning of 'decoupled systems' is less obvious than I originally thought.

When I was reading Roeland Loggen's Process transformation - perspectives on "BPM" blog this morning something he says around the use of process really helped me put my thoughts about 'decoupled systems' into some sort of order.

The post was From interface spaghetti to process coordination layer. Roeland gives a little of the history we are used to hearing around integration: in the olden days we thought about integration purely as a technical and data-driven problem. Technically, how do we connect to each system and then join the operations and data elements of this system, to get another system to store, retrieve, update, delete or whatever some related but otherwise completely separate entity in its world? Through proprietary technical protocols, translating the requirements of one system and coding them directly to the data requirements of another. It works, but do this enough times and you have point-to-point spaghetti.

It's not a new story. What comes next though clearly pushes back on the enterprise application integration (EAI) and enterprise service bus (ESB) approaches to integration. Again, suggests Roeland, these approaches lead to integration spaghetti, just now this spaghetti is standards based (guaranteed pure durum wheat, made in Italy). Although I would suggest the ESB approach does remove point-to-point hassles, by at least handling the technical integration layer with each system once only. So rather than needing all the technical integration between each pair of systems, the problem is reduced to a transformation of data for each system-to-system connection. Doing this we swapped proprietary protocols for web services (unless you live in the real world of web service interoperability) and made integration more an effort of converting one format of XML document to another.

What Roeland argues nicely is that it is BPM and the management of a process in general, not pure EAI/ESB technology alone, that solves much of the integration problem, especially decoupling systems.

So, what is 'decoupling'? We use SOA principles to access information systems through meaningful business services, rather than system-specific APIs, which hides the complexities of the systems themselves. But rather than connecting the output of one service or system directly to the next service request, we decouple them, only making requests to services in the context of the business process - the business process becomes the orchestrator of all activity.
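Here is a minimal sketch of the difference (hypothetical service and method names, nothing vendor-specific): in the coupled version one service knows what happens next; in the decoupled version only the process does:

// Hypothetical services - stand-ins for real business services behind an SOA.
interface CreditCheckService { boolean approve(String customerId); }
interface AccountService     { String  open(String customerId); }
interface LetterService      { void    sendWelcome(String accountId); }

public class DecouplingSketch {

    // Coupled: the credit-check implementation knows what happens next,
    // so it cannot be reused in any other context or process variation.
    static class CoupledCreditCheck implements CreditCheckService {
        private final AccountService accounts;
        private final LetterService letters;
        CoupledCreditCheck(AccountService a, LetterService l) { accounts = a; letters = l; }
        public boolean approve(String customerId) {
            boolean ok = !customerId.isEmpty();                      // stand-in decision
            if (ok) letters.sendWelcome(accounts.open(customerId));  // next steps baked in
            return ok;
        }
    }

    // Decoupled: each service does one thing; the business process (a trivial
    // method here, a BPM-managed process in reality) owns the sequencing and
    // the state between steps.
    static void accountOpeningProcess(String customerId,
                                      CreditCheckService credit,
                                      AccountService accounts,
                                      LetterService letters) {
        if (credit.approve(customerId)) {
            String accountId = accounts.open(customerId);
            letters.sendWelcome(accountId);
        }
    }

    public static void main(String[] args) {
        accountOpeningProcess("cust-42",
                id -> true,
                id -> "acct-" + id,
                acct -> System.out.println("welcome letter for " + acct));
    }
}

In the decoupled version the credit check can be reused in any other process, because it no longer carries the process around with it.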

What are the advantages of doing this?
  • Process that was originally embedded inside the coding of business systems and services is extracted to a BPM and therefore becomes more manageable and can be updated as needed.
  • Integration code does not need to concentrate on a specific process or activity context, since the new business services rely only on the state of the business process to ensure correct operation, not on hard-to-maintain state information held inside another proprietary application being in sync with itself.
  • Complex dependencies between applications are broken, making maintenance easier and the reuse of services (and their underlying systems) far more likely.
  • Process that was previously embedded - mixing application-specific code, integration activities and true business process - can be split so that each concern is handled in the most appropriate tool: the business process in BPM; integration in code or an ESB; application-specific state in the application code.

So I think I can finally communicate what 'decoupling' is, and why it's so important...

Business-level interactions with other services should not be coded within the systems themselves, since this makes it impossible to use them in a different context or even a variation of the same process. Decoupling of systems is about removing the representation of business process state, or cross-service interactions, from the underlying applications, so they generally only interact as part of the business process managed by BPM.

Maybe someone out there has an even better description of decoupling. Let me know.

So after all that, we come back to the convergence of BPM and SOA. It seems that you cannot in fact decouple systems and services without BPM - true SOA cannot really exist without some form of business process orchestration. So why not use the strongest business process management technology, which enables the use and hosting of services straight out of the box? SOA and BPM need each other, and decoupling is just one example of this.


Thursday, June 28, 2007

Optimizing New Account Opening - Thanks for listening

I just presented a really enjoyable web seminar about Optimizing New Account Opening with Dennis Backer from Zarion, and talked about how Global 360 and Zarion can jointly help financial services organizations improve their new account opening processes.

If you were listening (and watching), thanks for spending the time and I hope you found it useful. You'll receive a confirmation from Shared Insights about where this webex recording is archived so you can play it again (Sam).

If you didn't get to join us, but would like to check out the discussion, follow this link to Shared Insights and sign up and they will also provide you information about the recording.

I'll also post a link here when it's live.


Wednesday, June 27, 2007

Optimizing new account opening

Tomorrow I have the pleasure of jointly presenting a web conference titled Optimizing New Account Opening: Next Generation Solutions for Enrollment.


The highlight of the show will be Dennis Backer from Zarion, talking about Zarion's best practices for new account opening and customer enrollment, based on many years' experience working with European financial leaders. Since some people would argue that European banks and financial institutions are ahead of North America in terms of providing customer-centric services, this should be an extremely insightful discussion.

Global 360 is sponsoring the event, which gives me the opportunity to talk a little about how the Global 360 technology can facilitate many of the best practices that Dennis talks about and how our business and technology focus complement one another.

If you would like to attend the event, please sign up on the Shared Insights website.


Saturday, June 16, 2007

Managing business processes outside the box

Finally I've had a bit of time to catch up again on some news and blog feeds. As I'm reading I'm noting some of the key areas that could benefit from BPM, ECM and associated analytics that I don't think are well addressed. Treat this as a little brainstorm from me that hopefully will lead to some comments, perhaps from vendors telling me how wrong I am and how they do all this stuff already. So to get going:

1. Basel II - using BPMS to measure the accumulated risk in business processes

Nancy Feig writes in Bank Systems and Technology a nice review of Basel II and its gradual adoption in North America. The spend on Basel II focused analytics software is big, with a large amount of attention being paid to cleansing and importing the data into enormous data warehouses.

I wonder if any thought has been given to the dynamic side of the business - the capital and risk floating in business processes like loan origination, new business accounts and large transactions, which the data warehouse is weeks away from incorporating into an institution's new credit and risk calculations. This takes true end-to-end process management and analytics that can deliver business metrics beyond what typical business activity monitoring (BAM) tools can - something closer to business intelligence for process than counting the number of workitems in a process and drawing a bar chart.
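As a crude illustration of the idea (made-up names and numbers only, not a Basel II calculation), process analytics could sum the exposure still sitting in in-flight process instances rather than waiting for the warehouse load:

import java.util.List;

// Illustrative only: total the exposure of in-flight process instances so the
// risk picture isn't weeks behind, waiting on the data warehouse.
public class InFlightExposure {

    record LoanApplication(String id, String stage, double requestedAmount) { }

    static double inFlightExposure(List<LoanApplication> instances) {
        return instances.stream()
                .filter(a -> !a.stage().equals("COMPLETE"))
                .mapToDouble(LoanApplication::requestedAmount)
                .sum();
    }

    public static void main(String[] args) {
        List<LoanApplication> open = List.of(
                new LoanApplication("L-1", "UNDERWRITING", 250_000),
                new LoanApplication("L-2", "DOC_REVIEW", 120_000),
                new LoanApplication("L-3", "COMPLETE", 90_000));
        System.out.printf("Exposure still in process: %.0f%n", inFlightExposure(open));
    }
}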

2. Project Management - using a BPMS to track progress and provide visibility

Admittedly there are more powerful tools than Microsoft Project out there for complex project planning and tracking, and there are many repetitively performed business processes, defined and tracked in Project or Excel, that don't need this level of functionality. These 'process projects' often fall into the bucket of work implemented by combining a number of disparate applications through a written or Excel-driven manual process.

Tracking, rather than delivery, is something that is often overlooked by BPM. A BPMS can act as an incredibly versatile tracking and status monitoring engine, with the ability to truly manage the monitoring of exceptions and 'illegal' or fraudulent activity, and with analytics provide a true view of the activity and performance of these projects compared to historical data. And when paired with more powerful modeling and simulation capabilities, Lean Six Sigma style process improvement can be easily applied to these under-represented processes.

3. Service oriented architecture (SOA) - for services that aren't automated

A great deal of focus has been put into automating the business processes in an organization that truly can be automated: handling B2B transactions; automated decisioning with rules engines; responding to online requests and orders with machine-generated information. This is valuable, since these 'process fragments' are often the most repetitive and error-prone when handled by human beings, who generally add little value to them.

Focusing on the human components of processes alongside the truly automated requires some powerful services, integration, content management, modeling and process execution capabilities, if the whole end-to-end process is going to be managed effectively. As many integration vendors wait for BPEL4People before really being able to work out how to model the human element of business processes (most treat humans as just another 'system'), and many human-centric process tools put an SOA tag on their current poorly performing web services, there is a need for systems that can cater to both.

In doing so 'super-solutions' that blend the requirements of pure automation, human workflow and collaboration alongside services technology enable more seamless systems to be delivered faster. And they can also deliver far better business metrics around what is going on across the whole system, not just the limited portion they manage.


So I've written about three ideas that offer food for thought around the underutilization of BPM. Many organizations need to handle these types of requirements and could benefit from BPM to do it. The issue is that most need a visionary leader to work this out for themselves, since it falls outside the standard pattern of an HR onboarding process, accounts payable or one of the other typical business processes that BPM vendors choose to market to.


Wednesday, June 13, 2007

Web service interoperability distracts from business problems

Oh the joys of SOA, web services and technology standards! They don't yet live up to their promise and probably distract businesses from the great improvements that they could make to their operations by having software systems that work together easily.

I've spent the last few days (and nights) brushing up on my understanding of the deepest, darkest technology around web services, and the technical underpinnings of Service Oriented Architectures. And at first look, if I ever thought that technology was made complex by self-serving vendors wanting to protect their patch, this is one of those times. Web services, and their promise of interoperable systems and simplified integration, have led business IT people to believe that software has reached a great nirvana. In fact web services standards just give people another excuse to delve deep into murky technology weeds and fight battles over often intractable problems that are largely meaningless to the performance of a non-technology business.

I had a great in-depth meeting today with the enterprise architects of a large and well known insurance company. It was challenging, since I've had to bring my knowledge of web service standards, XML, SOAP and so on to a level where I can actually understand why the technology works at all. These architects, and many like them, are actually translating the vision of XML for representing complex business data into reality. They are pushing XML applied to real business problems: the fact that an insurer represents a person one way during underwriting and a different way during a claim; the fact that many businesses have different ways of representing identical entities based on the systems they use and the acquisitions they have made. The fact is that XML can provide a powerful mechanism for unambiguously representing these types of entities, not just for the sake of storage, but to really start to enforce deep rules around the accurate usage and interchange of people's information.
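As a contrived sketch of that underlying problem (leaving the XML serialization itself aside, and using made-up field names rather than any insurer's real model), two systems can hand you the same person in different shapes, and a canonical representation is where the mapping and usage rules get enforced:

// Contrived shapes for the same person as two different systems might see them.
record UnderwritingApplicant(String fullName, String dob, String ssn) { }
record ClaimClaimant(String surname, String givenName, String dateOfBirth) { }

// One canonical form that both map into, so rules about usage and interchange
// can be enforced in a single place.
record CanonicalPerson(String givenName, String surname, String dateOfBirth) { }

public class CanonicalMapping {

    static CanonicalPerson fromUnderwriting(UnderwritingApplicant a) {
        String[] parts = a.fullName().split(" ", 2);   // naive split - illustration only
        return new CanonicalPerson(parts[0], parts.length > 1 ? parts[1] : "", a.dob());
    }

    static CanonicalPerson fromClaim(ClaimClaimant c) {
        return new CanonicalPerson(c.givenName(), c.surname(), c.dateOfBirth());
    }

    public static void main(String[] args) {
        System.out.println(fromUnderwriting(new UnderwritingApplicant("Jane Doe", "1970-01-01", "xxx-xx-xxxx")));
        System.out.println(fromClaim(new ClaimClaimant("Doe", "Jane", "1970-01-01")));
    }
}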

Web services build on the power of XML for complex data representation, offering mechanisms for effectively transferring it from one system to another, sometimes cleanly, sometimes not so cleanly. The problem is that since the beginning of the web service revolution the rules around the application of XML to the problem have been far too lax, and the flexibility of XML to represent almost anything has made this worse. Open standards frameworks like Apache's, and even Java's, built on their own chosen interpretations, now outdated, so they don't work with the approaches pushed by superpowers like Microsoft and IBM. The outdated approach was not necessarily wrong, but since it was not widely adopted outside its own space, web services are not interoperable today, and software with a lifespan (and customer base) that doesn't allow complete rearchitecting every 18 months (i.e. virtually every enterprise application) has struggled to keep up. Which means that many of these applications that promise web services are no more interoperable than those that just offer a Microsoft COM interface.

I am starting to believe, based on the discussions I have been having and the obvious buzz around SOA, that we are reaching a point where the basics are set and everyone is starting to agree. Now the enterprise technology guys will find deeper and more complex problems to prey on, in the constant equivalent of the Betamax v VHS wars of yesteryear and the HD-DVD v Blu-ray wars of today. Web services are just starting to show promise as enterprise applications catch up - but you can guarantee some extremely smart guys are currently creating more technology problems that in five years we will just about manage to resolve.

By the way, my employer, Global 360, is not likely any time soon to have a vested interest in pushing any particular web service standard. Instead it is an adopter of the most broadly used and valuable standards and supporting technology, so I hope this didn't just sound like a rant from a guilty party! Just the rant of a tired supporter of getting technology standards defined and adopted accurately up front, so the business knows they work and can benefit from them!



Saturday, June 02, 2007

Metrics for improving insurance

James Taylor has a nice post about the metrics he considers would show measurable improvement for insurance companies when deploying a business process management suite (BPMS) alongside a business rules management system (BRMS). I think that his metrics, or benefits to the business, extend across underwriting as well as claims, which is no bad thing - we know from experience that these technologies work well, so why not target them across the whole lifecycle of the customer?


The business benefits in James' post come after the technology benefits, which I think actually devalues them, but in summary the business should see measurable improvement in the following metrics:

  • Fines
  • Cost per transaction
  • Straight Through Processing rate
  • Consistency across Agents
  • Appeals
  • Reports ordered
  • Mail, fax and shipping costs
  • Cross-sell/Up-sell rates


James explains his thoughts around each of these in more detail, so take a look at his post for more information.

Now, imagine being able to demonstrate to business leaders the probable outcome of introducing new BPMS/BRMS technology before the fact. Pre- and post-implementation simulation alongside business metrics (not just time and activity metrics, but real dollar values) are core to this, and my colleagues at Global 360 are working on providing analytics to help businesses understand the impact of investments before they make them, as well as measuring their success and potential for improvement after the fact.


Thursday, May 31, 2007

New account opening: as good as it gets?

I've been talking with many coworkers and customers recently about the new account opening problem. It seems that people outside the financial services industry rarely identify a big problem with this phase of a customer relationship, until something goes wrong. After all, what could be so difficult about the process of opening a new brokerage account, buying an annuity, or getting life insurance?


Inside the industry, and depending on who you speak to, the view varies, fitting one of several categories:

  1. Help!!! I'm drowning in paper and my competitors seem like they are a million miles ahead of me.
  2. We've squeezed every last penny out of the process, centralizing our application processing, offshoring the data entry and outsourcing the credit and risk checking
  3. We've made a giant leap to online application forms, which spool out the back of a printer in the backoffice for keying into our current systems
  4. We're better than the others. Electronic forms are actually captured directly into a business application and delivered to a group of people in the back office for processing. We still need to handle paper for signatures and ID, but we think we're fairly electronic
So, if you're at the final stage, everything is great, right?! This has been the view of many US banks and financial institutions for a while - the nirvana of the electronic form and automated process. And other places outside the States are still striving to get to this point as well.

The fact is that Europe is leading a new wave in optimizing new account opening. They've already cut the waste in the process to the point that, in the leading organizations, there is little left to save. They've reduced the processing time to a level that customers are comfortable with. What's left?

As we've all seen, there has been a backlash against offshore customer service - we want to hear a voice on the phone line that has a familiar accent and is not distorted by ten thousand miles of cheap copper cable. In fact, many people have shown a desire to pay for personalized service, even face to face. Unfortunately lean, centralized, outsourced processes don't allow that type of familiarity or human interaction with the customer. Rarely, in fact, do they allow visibility into where in the world a customer's new account application actually is.

The optimization of new account opening for the European leaders - and the leading US banks are hot on their heels - is around customer service, personalized attention, and having the customer's information and application to hand instantly on request. This may be why we see a new Citi branch opening on every street corner in major US cities. But without the ability for agents in these branches to get involved in the business processes around their new and most impressionable customers, they'll just be another layer of annoyance between the customer and their new account getting opened correctly.

There is always more to do, so don't sit back on past successes. The rest of the world is getting ready to leapfrog the old-time US customer service reputation.

[UPDATE: By the way, I meant to say that these thoughts are often reflected by The Bankwatch Blog, a constant observer of and commentator on banking and financial services globally. For more evidence of what the leaders are doing, take a look.]


Wednesday, May 23, 2007

New blogger

A great coworker of mine has just started blogging. He is a smart guy with a strong technical and business background, which means his blog's perspective will be varied and interesting.


If you are looking for a new read for process improvement, six sigma and business problems, take a look at Mike Letulle's Full Leverage blog.



Saturday, May 19, 2007

Internal controls as important as front-line security

The Bankwatch blog has a quick note on how two-factor authentication and tokens are perceived as being synonymous, when in fact technology such as PassMark can provide the second form of authentication required to really judge the authenticity of a person performing a transaction.

Even better though, the post points to an easy-to-read paper by Ross Anderson, Professor of Security Engineering at Cambridge University. This talks not only about the different types of scams, like phishing, but also about the importance of the internal controls within financial services organizations that front-end technical security merely supplements.

Everyone that uses online banking sites understands the importance banks place on knowing the true identity of customers. Third-party authentication is the primary means to achieve this with a new customer, using trusted third-party identification (e.g. government-issued ID and credit checks) before issuing the customer credentials (username and password) to use the site. Primary authentication (a customer's new credentials plus a second factor of authentication) is then used to ensure that the person making a transaction really is the customer they claim to be.
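A minimal sketch of that layering (purely hypothetical checks and data, not any bank's actual scheme): primary authentication only succeeds when both the credential issued at enrollment and the second factor check out.

import java.util.Map;

// Hypothetical illustration of layered authentication: credentials issued
// after third-party identity checks at enrollment, then a second factor
// required on every transaction.
public class LayeredAuth {

    // Credentials issued at enrollment (after government ID / credit checks).
    static final Map<String, String> PASSWORDS = Map.of("jdoe", "s3cret");
    // Second factor expected for this session (token code, PassMark image, etc.).
    static final Map<String, String> SECOND_FACTOR = Map.of("jdoe", "482913");

    static boolean primaryAuthentication(String user, String password, String factor) {
        boolean knowsPassword = password.equals(PASSWORDS.get(user));
        boolean hasSecondFactor = factor.equals(SECOND_FACTOR.get(user));
        return knowsPassword && hasSecondFactor;   // both layers required
    }

    public static void main(String[] args) {
        System.out.println(primaryAuthentication("jdoe", "s3cret", "482913")); // true
        System.out.println(primaryAuthentication("jdoe", "s3cret", "000000")); // false
    }
}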

These forms of authentication are the first line of defense, and it seems that those banks with poor internal controls typically become the focus for online fraud. Since there is a much lower risk to the criminal that the bank will either notice a problem or be able to recover assets, the effort to get around the primary security is more likely to be rewarded. Cyber-crime moves to the easiest target. And when this becomes newsworthy, customers get the impression that their investment is not being well protected. Brand damage is a high price to pay when people trust you with their money.



Sunday, May 06, 2007

Six points for business improvement

For the last week I've been surrounded by colleagues from across the globe, many of them long-time experts in BPM, ECM and the many business issues that they often accompany. It was a great chance to hear feedback from sales, implementations, marketing and strategic thinkers, while also trying to communicate to best effect my own experience and visions for the products I manage. Mixed with the chance to announce the imminent release of the next version of one of the products, I've been pretty busy preparing. So that's my excuse for not blogging for a while.

And what came out of all this work? My view of what the biggest issues are around business processes and how to improve them. I might even discuss some of these over the next few weeks.

So in no particular order, here are six areas I believe every organization looking to improve its operations should focus on:

  1. Understand that business processes extend across organizational boundaries
  2. Gain visibility into process performance with meaningful metrics that reflect business goals
  3. Track and manage every piece of work that enters, leaves or is created in the organization
  4. Deliver processes, operations, resources and information as services
  5. Remove waste from processes, don't just "pave the cowpath"
  6. Employ technology that delivers ROI, not just lowest TCO

I'm sure there are many more - feel free to prioritize your own list.

Special learning for the week: technology comes last. It is people and their needs that matter.


Monday, April 16, 2007

Run better business processes with a BPMS, not a programmer

A few days back, my colleague Steve McDonald asked me a question that went something like: "Do companies need a BPMS to make their processes work better, or can you just write custom applications that represent the process?". Well, obviously we both have a bias towards the BPMS, but I believe it's for good reason - I have wielded a compiler at some point in my career and I can honestly say that the results were 'variable'. So we agreed that having the structure and tooling of a strong BPMS in place allows you to concentrate on the business problem you are trying to solve, rather than the mechanics of writing code and pure custom apps.

Look at any organization and you'll see that business processes do run without a BPMS, as loosely organized paper- and email-based processes built around a custom Access database application. Many low-volume processes that we see in organizations, in HR and Finance for example, are commonly run in this way. Making a business case to improve these processes is difficult since it depends mostly on soft metrics like enforceability, compliance and visibility, and taking them the extra step with a complete BPMS may be considered unnecessary.

Now consider a high-volume process handling thousands of insurance claims per day. Typically this is a process that has evolved to incorporate human interaction, system-to-system integrations, paperless operation and fully managed business processes. The business case for BPMS implementation, and the measure of its success, is metrics like reduced cost and time to process a claim, improved capacity, reduced errors, and fewer abandoned calls at the customer service call center. These are not metrics that can be achieved without a BPMS – actively executed and managed processes are essential to ensure that the claims process runs like a well-oiled machine. There is no room for unnecessary activities to be performed by the process users, either in receiving or delivering work to the right place, or in the oversight to ensure that the process runs as it should.

Attempting to put a process application together out of component parts, a bit of process execution here, a bit integration there, and some loosely defined reporting over the top, is a bit like the factory workers building a BMW attempting to build a luxury car using tools they had brought in from their home garages that morning. Even with a strong process, and great components, who knows whether they would have enough ratchets and the right size spanners to put the car together successfully. I'm sure there would be a lot of use of the 'leather hammer'.

The strength of a BPMS comes from being able to apply the management discipline of BPM to the best technology, to really ensure that a well-oiled process can roll high quality product out the door, and that at every stage you have the visibility and control to make this happen. Having great visibility over the crew of BMW workers with inadequate tools is probably as unsuccessful as knowing the crew have great tools but having no idea whether windshields, wheels and brakes have been delivered for them to actually assemble the car. A carefully selected BPMS (not the one you selected to handle 100 travel expense claims a day) can provide the tools and visibility to deliver the business objectives for highly efficient, cost effective and visible business processes.


Wednesday, April 11, 2007

Tying together BPM and SOA technology

BrainStorm has been a hot topic this week for the BPM and SOA crowd. Sandy Kemsley blogged about the BPMS vendor panel (which didn't appear to engage her, unfortunately), hosted by Bruce Silver. I would have loved to attend the event, but couldn't (being sat in my office in Boston at the time). But I was interested to weigh in with some thoughts around BPM and its relationship with SOA technology, a discussion that came up with the panel. This is where my coworker Steve McDonald and I mind-melded a bit.

I have been working through releasing an SOA edition of the process, content and case management product I manage, and in doing so have observed many pitfalls that face human-centric BPM products in an SOA technology world. My view is that a BPMS can get caught in the trap of becoming another component of the SOA infrastructure. The SOA view of BPM and process execution demands that a process engine interacts with other systems and executes branched business processes and not a lot else. The BPM gets sucked down into the SOA infrastructure, lost in a Bermuda triangle of web service acronym checklists. Steve constantly reminds me that this is a waste of the capabilities of a BPM and content based product.

BPM software customers agree that the purpose of a BPMS is to control real business processes and enable effective management and improvement of those processes. I believe that processes that have typically been implemented with BPM are readily identifiable as business services and should be treated as such. For example, large insurance companies that have not yet embraced SOA may run multiple variations of an otherwise identical property claims process in a BPM application for different business units. This claims process is a high-level business service that could be reused, and the BPMS should enable that. Within this top-level process there are services that should also be reusable, for example Fraud Investigation. The BPMS should clearly identify these services, allowing them to be extracted from the core process and reused, moved elsewhere in the organization (or even outsourced). Treating traditional process components as loosely coupled services provides this flexibility, but to achieve this the BPMS must understand SOA and be able to interact with the infrastructure.

To achieve this, the BPMS must have some strong SOA capabilities and link back to the infrastructure, without becoming overwhelmed by it. So I think three key requirements are:
  • A BPMS must be able to consume services published in service registries, and implemented through ESBs, custom applications and other BPM processes
  • A BPMS must be able to host its processes as services for reuse across the organization
  • A BPMS must provide visibility across services and processes that extend beyond the boundaries of its own process model. Pulling loosely coupled services into end-to-end business processes demands stronger oversight and visibility of performance than monolithic business processes executed by a single system
The infrastructure of SOA is important to an organization implementing services and business processes. A BPMS must consume the services of more technical components than ever before, should host process-based services, and should be an enabler for exposing the process (and content) services that common data-centric SOA components do not handle well.
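To make the insurance example a little more concrete, here is a minimal sketch of the idea in Python. It is not any vendor's API; the names (ServiceRegistry, fraud_investigation, property_claim_process) are invented purely for illustration. The point is that the claims process is itself published as a reusable business service, and the fraud investigation step is consumed from a registry rather than being baked into the process.

```python
# Illustrative sketch only: a claims process hosted as a reusable business
# service that consumes a separately registered 'fraud investigation' service.
# All names here are assumptions, not real product interfaces.

from typing import Callable, Dict

class ServiceRegistry:
    """Stand-in for a service registry / ESB lookup."""
    def __init__(self) -> None:
        self._services: Dict[str, Callable[[dict], dict]] = {}

    def publish(self, name: str, service: Callable[[dict], dict]) -> None:
        self._services[name] = service

    def lookup(self, name: str) -> Callable[[dict], dict]:
        return self._services[name]

def fraud_investigation(claim: dict) -> dict:
    # Reusable sub-service: could be swapped, moved elsewhere in the
    # organization, or outsourced without touching the calling process.
    claim["fraud_score"] = 0.9 if claim.get("amount", 0) > 100_000 else 0.1
    return claim

def property_claim_process(registry: ServiceRegistry, claim: dict) -> dict:
    # Top-level business service: the same process definition could be
    # published once and reused by several business units.
    claim = registry.lookup("fraud.investigation")(claim)
    claim["status"] = "refer" if claim["fraud_score"] > 0.5 else "approve"
    return claim

registry = ServiceRegistry()
registry.publish("fraud.investigation", fraud_investigation)
registry.publish("claims.property", lambda c: property_claim_process(registry, c))

print(registry.lookup("claims.property")({"amount": 250_000}))  # -> refer
```

The design point is simply that the process engine consumes and publishes named services, rather than hiding the logic inside a monolithic process definition.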

I think it's possible that integration of the BPMS into specific SOA platforms as a complete vendor stack leads to artificially tightly integrated architectures. SOA infrastructure relies on web service standards, and SOA as a methodology demands loose coupling of services and identification of components with separate roles. Integration of platforms into a proprietary stack leads the IT team back to a monolithic architecture, and away from the true objectives of SOA. BPMS and SOA platforms should play together, not be tied together.

---
I'd like to thank Steve for voicing some of our combined opinions around this on the panel. He certainly seemed quite excited by the chance to talk about this common theme in a way that obviously, and unashamedly, reflects the strengths of the products we both work with.


Sunday, April 01, 2007

What's in your wallet? The best surveys provide feedback

In the US at least, airline frequent flier miles are at the center of marketing efforts for credit cards, hotels, car rental and even restaurants. One of the big marketing campaigns for a credit card mileage program is 'what's in your wallet?' - maybe a credit card accumulating miles with no blackout dates. Since airlines' frequent flier programs have become renowned for making it almost impossible to get the free tickets you feel you are entitled to, when you want to use them, does this put people off actually trying to use their miles?

Try this short (one question) online survey to show how you use your miles, and see how you compare with hundreds of other respondents. Interestingly, there is no link back to the sponsoring website (Cheapflights) and nothing asking you to identify yourself. There is no fear of spam, so these guys are relying on word of mouth and good follow up PR to gain any benefit from this survey.

I was happy to do this survey, because it gave me some instant feedback. How many surveys are put out to solicit customer feedback, leaving the customer feeling unrewarded with a 'thanks for your time' and nothing more? Just seeing the current results for how I voted alongside everyone else gives a feeling of interactive satisfaction.

Now, I wish it was that easy for me as a product manager to get hundreds of data points to feed into product decisions and better marketing!

Thursday, March 22, 2007

Adding BPMN into the fiery mix

Bruce Silver picked up on Keith Swenson's discussion about XPDL v BPEL, which I blogged about yesterday. As expected, Bruce added BPMN into the mix. And rightly so - having a portable modeling notation that describes how processes should be drawn as well as run is an important way to go. BPMN helps business analysts communicate consistently, both with the business they are modeling and with each other.

Now when we get to a stage of evolution where execution engines can consume these BPMN models, through XPDL or some other process language, customers will have reached a nirvana in software buying. At this point they can get commodity (cheap) BPM engines that they can plug and play, then rip and replace to their heart's desire. Microsoft is waiting in the wings for this already, though they are proving that they need a little help along the way (see their Business Process Alliance).
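To illustrate the 'portable model, replaceable engine' idea, here is a toy sketch in Python. It deliberately does not use real BPMN or XPDL serialization; the process representation and both "engines" are invented assumptions. The point is only that once the model is plain, portable data, any engine that understands the notation can execute it, which is what makes rip-and-replace plausible.

```python
# Toy illustration of a portable process model consumed by interchangeable
# engines. Not BPMN/XPDL; all structures here are invented for the example.

process_model = {
    "name": "review_new_account",
    "tasks": [
        {"id": "collect_documents", "type": "human"},
        {"id": "verify_identity",   "type": "automated"},
        {"id": "approve_account",   "type": "human"},
    ],
}

def simple_engine(model: dict) -> None:
    """One hypothetical engine: runs tasks in sequence, prints who does what."""
    for task in model["tasks"]:
        performer = "user work queue" if task["type"] == "human" else "system call"
        print(f"[{model['name']}] {task['id']} -> {performer}")

def auditing_engine(model: dict) -> None:
    """A second hypothetical engine consuming the same model, adding an audit trail."""
    audit = []
    for task in model["tasks"]:
        audit.append(task["id"])
        print(f"executing {task['id']} (audited)")
    print("audit trail:", audit)

# The same model runs unchanged on either engine - the 'plug and play' idea.
simple_engine(process_model)
auditing_engine(process_model)
```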

Fortunately, by the time complete BPM commoditization happens, some of the real talent in the BPM space will have moved on to even more pressing issues in the business, like gaining visibility across this new multitude of 'appliance' and SaaS process engines. Or providing federation and process enforcement across disparate process engines that make up an end-to-end process. Or providing some other smart system that can help businesses meet ever tougher objectives.

At this point, maybe we'll give business process back to the business, and terms like BPMN, XPDL and BPEL will just be the underpinnings of stuff real business leaders, users and analysts can understand and use to get their jobs done - better, faster, easier. Oh, and all with minimal IT involvement.

By the way, it's worth checking out Bruce's post, since he has proven himself an expert in the BPMN realm. His discussion is based around the state of play with BPMN today, and makes enjoyable reading.



Tuesday, March 20, 2007

Manage end-to-end processes - don't argue about standards

On the Go Flow blog, Keith Swenson rekindles the debate about the success of XPDL, and obviously draws some comparisons and distinctions with BPEL. In my view both standards have their place, and vendors should avoid trying to quash the one they don't support just because their process engine or modeling tool has different objectives it's trying to meet. As Keith says:
The biggest misperception in the marketplace is that BPEL and XPDL are in some kind of a war. I have already covered elsewhere how this is silly, so I won’t duplicate it here. I think Jon Pyke’s response makes it clear how these very different standards serve very different purposes.
I put my own (rather lengthy) thoughts into a comment on Go Flow. The key point of my argument is here as well:

XPDL and BPEL have separate objectives and they will overlap at times. At these times it may be better for the organization implementing human and systems BPM to look at ways of pragmatically handling end-to-end processes. Managing the 'federation' of processes, rather than trying to shoehorn everything into a ‘standard’ optimized for different objectives, seems to be the best way to go. End-to-end processes exist all across an organization, and at this point most organizations seem to struggle with gaining even the slightest visibility into what is going on, let alone attempt to model and execute that process within a process modeling or execution language.

Easier said than done? Maybe - but 'process federation' is a real concept that can be attacked from several different directions. Global 360 (my employer) does this with end-to-end process analytics and optimization. There are other approaches that customers can employ to further improve the visibility and manageability of end-to-end processes that cross the system and human realms.
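As a rough sketch of what 'process federation' can mean in practice, here is a minimal Python example under assumed interfaces: status events from separate engines (say, a BPEL orchestration engine and a human-centric BPM engine) are correlated by a shared business key into one end-to-end view. None of these classes correspond to a real product API, Global 360's included.

```python
# Minimal sketch of federating status across disparate process engines.
# All classes and field names are illustrative assumptions.

from dataclasses import dataclass
from typing import List

@dataclass
class ProcessEvent:
    engine: str        # which engine reported the step
    business_key: str  # e.g. the application or claim number
    step: str
    status: str        # "complete", "active", "failed", ...

class FederatedProcessView:
    """Correlates events from many engines by a shared business key."""
    def __init__(self) -> None:
        self.events: List[ProcessEvent] = []

    def ingest(self, event: ProcessEvent) -> None:
        self.events.append(event)

    def end_to_end_status(self, business_key: str) -> List[str]:
        steps = [e for e in self.events if e.business_key == business_key]
        return [f"{e.engine}: {e.step} [{e.status}]" for e in steps]

view = FederatedProcessView()
view.ingest(ProcessEvent("bpel-engine", "APP-123", "validate data", "complete"))
view.ingest(ProcessEvent("bpm-suite", "APP-123", "underwriter review", "active"))

for line in view.end_to_end_status("APP-123"):
    print(line)
```

The value is in the correlation and visibility layer, not in forcing both engines to speak the same process language.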

By the way, don't forget to add your comments to the discussion on Go Flow.


Saturday, March 17, 2007

Permission to use free online services

Over on The Bankwatch blog, Colin picks up on a note on Seth's Blog about the evolution of advertising. In his discussion, Colin responds to Seth's idea that Google AdSense is effective because it is 'permission based':
Google AdSense is not permission based IMHO. Why? …. because no-one asked me if they could display those ad’s beside my search results. But lets go further … even if those ad’s are relevant, they are only relevant in the minds of the advertisers. I would consider myself a fairly active internet user, and I not only never click Google ad’s …. I don’t even notice them. Google App’s premium, which I subscribe to, does not display ad’s to me.

I think about this slightly differently. Although I wouldn't call AdSense 'permission based', it could be argued that by choosing to use any free service or information source, like Google, Improving New Account Opening, Hotmail or whatever, you are effectively giving your permission to the service to try and make some money from you through advertising.

The challenge for the service is to optimize how effective this is: balance the annoyance of the adverts against the need to get people to click through to the paying advertiser, so they can profit from running the service. This is what in my mind stops advertising becoming too much like TV and radio - with the Internet I always have the choice to go somewhere else.

This blog is admittedly an exception to the real commercial services in this respect. I do run AdSense on this blog and know I'm not going to get rich, but it certainly helps me understand the dynamics of free online services. I get a steady-ish stream of page views even when I don't post, and I get little bursts of ads being clicked some days - split between ads on my most popular posts (mostly from last year) and the home page. My guess is that it's not regular readers doing this, but rather people stumbling across the site from outside, looking for the next place to go for more relevant information. It works far more effectively than the prepaid Adify service on the right, which rarely shows anything but my own banner, and when it does, mostly unrelated commercial ads that are really just fluff.

Since I started my blog last June, I think I've been credited with approximately $65 US in clicks. The value of some of the ads seems to be reasonably high, matching the business nature of things discussed here. In line with my original aim, I have donated this 65 bucks, and a lot more, to Oxfam. And maybe I have helped the occasional reader find something they were looking for related to my posts.

I have posted in the past that AdSense is a useful model for corporate intranets and knowledge worker applications, helping target user actions with other useful information that may help them complete their tasks or research faster. It's an idea that I'm sure is already offered by information access vendors like Autonomy and FAST. I think it is almost better than pure search, since it provides a manageable list of results driven by value as well as by the content on the page, rather than by the three words entered into Google.
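For what it's worth, here is a deliberately naive sketch of that 'AdSense for the intranet' idea: match the content a knowledge worker is looking at against an index of internal resources and surface a short, ranked list of related items. The scoring (keyword overlap weighted by an assigned business value) is my own invention for illustration, not a description of any vendor's relevance engine.

```python
# Naive illustration of contextual recommendation inside a knowledge worker app.
# The data, scoring and names are invented assumptions.

internal_resources = [
    {"title": "Annuity suitability checklist", "keywords": {"annuity", "suitability", "compliance"}, "value": 3},
    {"title": "New account opening FAQ",       "keywords": {"account", "opening", "kyc"},            "value": 2},
    {"title": "Fraud referral procedure",      "keywords": {"fraud", "referral", "claims"},          "value": 3},
]

def suggest(context_text: str, resources: list, limit: int = 2) -> list:
    """Rank resources by keyword overlap with the current work context, weighted by value."""
    words = set(context_text.lower().split())
    scored = []
    for r in resources:
        overlap = len(words & r["keywords"])
        if overlap:
            scored.append((overlap * r["value"], r["title"]))
    return [title for _, title in sorted(scored, reverse=True)[:limit]]

# e.g. while a user works a new annuity application task:
print(suggest("reviewing annuity application for suitability and compliance", internal_resources))
```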

So everyone is encouraged to read my blog, and by doing so they have my permission to get mildly annoyed by the presence of AdSense adverts. I hope that regular readers do not find them too annoying, and that everyone may occasionally stumble across something related that may be of interest to them. I'm not necessarily a good linker to external stuff, and AdSense helps fill that gap!


Wednesday, March 07, 2007

Back to the (account opening) future

The thing that really kicked off this blog (apart from the feeling that I was going nowhere in Vignette) was back in front of me yesterday.

A little over a year ago, a colleague, good friend and extremely smart finance guy, Rob Hill, brought a big business problem to my attention. This was perfect, as we were looking for compliance problems that were a better fit for Vignette document and records management technology than SOX. The new business / new account opening processes around annuities were being focused on by an industry association called NAVA, with unusually strong buy-in from its member financial services institutions.

Rob and I started working on NAVA standards task forces, while planning a Vignette solution to address the New Business for Annuities (NBfA) problem. And I started writing this blog a little later, when I realized that we were going to get little attention from the (overloaded) Vignette marketing folks. About 9 months later, Vignette decided that the only solutions it was going to pursue were those that reflected its core web content management competencies, losing the stomach for back-office financial services business problems. I happily escaped Vignette back in October to Global 360, a company whose bread and butter is exactly this type of problem.

Anyway, back to the future: NAVA's STP for New Business for Annuities effort has come of age, approving a range of standards based on a legal framework for operations that supports a set of technical requirements for straight-through processing of new business applications. Yesterday, many interested software and services vendors met in a Dulles hotel to understand how they could play a role in helping the new standards become software and operational reality.

NAVA’s lead technology consultant, Tony Deakins, who has been central to much of the standards work, encouraged me to go along to the conference. Global 360 BPM and ECM products are a great fit for the current new business for annuities (NBfA) problem, and I was excited to be back in front of this smart group of people and interesting business problem.

Now that the activity of the day has sunk in, it feels like all the requirements have been shuffled together with a bunch of technology, consultants and systems integrators and thrown up in the air. With a bit of seeding, clouds will form as pilot solutions around natural technology synergies and needy customers. And everyone hopes that these clouds are silver-lined and will rain down something meaningful that actually meets the NAVA standards.

Over the last 15 months NAVA has achieved some magical results in building and approving its standards, getting regulatory approval from a good number of states, as well as federal and self-regulating bodies. Who knows where this will go when the clouds of vendors and customers form around the pilot systems NAVA is promoting. But I have a hope that improving new account opening will be something this blog can focus on again, with real tangible solution experience to relate.



Friday, February 23, 2007

Whistler - and Case Management

I'm off for a long-awaited trip to Whistler to ski myself into happy oblivion (and sore knees).


On a final email check before packing up the PC, I spotted that Bruce's BPMS Watch discussion on 'What is Case Management?' is still going strong. It shows how industry-specific the term seems to be, with valuable insights and opinions coming from all around (mine included, maybe!?). Since I'm not going to blog for a while, here's my view of the different focuses of case management:

In government, where repeatability and strict policy adherence are required, process is typically well defined up front, while allowing case managers the flexibility to ‘case manage’ within that structure. For this reason, many case management applications resemble high-level state or status tracking applications more than BPM tools. For them, the beauty is in a UI structure that is familiar and usable by the end-user.

In commercial environments I’ve seen case management applications fit into any repeatable business process that requires some human intervention and real knowledge-based decision making. A large amount of process state is based on the requirement and receipt of external documents and information, and the effective matching of this to current cases. Much of a granular BPM(N) process model could be overwhelmed with ‘rendezvous’ points, exceptions and collaboration represented as parallel processes, hence case management tools carry their own individual methodologies to abstract the common requirements into something repeatable and configurable.

To handle ad-hoc and collaborative work within structured process definitions, an easy-to-use approach is to present users with ‘task management’ (disclaimer: the Global 360 Case Manager application I product manage takes this approach). This is effectively a mini project plan that guides users down paths based on their decisions, tracking the tasks as they and their colleagues perform them (perhaps a little of Rafael’s project manager approach). It allows task templates to be applied under user or application control, or lets users create new tasks to meet their needs on an ad-hoc basis. This way you get tracking of processes, control over the tasks performed if necessary, and complete ad-hoc operation if required.
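A minimal sketch of that task-management idea, with invented names, might look like the following: a case carries a mini project plan of tasks, some created from templates, some added ad hoc by users, all tracked the same way. This is an illustration of the concept only, not the Global 360 Case Manager API.

```python
# Illustrative sketch of template-driven plus ad-hoc tasks within a case.
# All class and method names are invented for the example.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    source: str            # "template" or "ad-hoc"
    done: bool = False

@dataclass
class Case:
    case_id: str
    tasks: List[Task] = field(default_factory=list)

    def apply_template(self, template: List[str]) -> None:
        # Application- or user-driven template: a predefined path of tasks.
        self.tasks.extend(Task(name, "template") for name in template)

    def add_ad_hoc_task(self, name: str) -> None:
        # Users can add tasks to meet needs the template didn't anticipate.
        self.tasks.append(Task(name, "ad-hoc"))

    def complete(self, name: str) -> None:
        for t in self.tasks:
            if t.name == name:
                t.done = True

    def status(self) -> str:
        done = sum(t.done for t in self.tasks)
        return f"{self.case_id}: {done}/{len(self.tasks)} tasks complete"

case = Case("CLAIM-042")
case.apply_template(["request police report", "assess damage", "approve payment"])
case.add_ad_hoc_task("call claimant about missing photos")   # knowledge-worker decision
case.complete("request police report")
print(case.status())   # tracking still works across template and ad-hoc tasks
```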

I think that, depending on the industry, we’ll all have a different definition of ‘case management’. It is less of a technology definition than BPM, instead being something born out of real business requirements by users who focus less on technology and more on business goals.


Have a great week everyone.



Monday, February 19, 2007

Where do you find an answer to your burning questions?

I occasionally take an interest in the ideas that social-media websites come up with - sometimes they are plainly useful, other times I just wonder who on earth thought up the idea.

Anyway, one I almost missed was LinkedIn Answers. This is a way of asking people in your network a question and getting responses that may or may not be useful. Questions range from:
How can a one-person, Milwaukee-based business increase visibility?

to:
Can you ever take someone seriously who maintains that barbeque should be a pig in a pit instead of brisket on a grill? And shouldn't a barbeque sauce be sweet and tangy, not just some hot sauce thrown into shredded pork on Wonder Bread?

Not being qualified to answer either of these questions, you'll not see an answer from me. But the concept interested me from the point of view of finding answers to your burning questions in a business setting.

How do you find out information at work when you don't really know who to ask? In an office, typically you'll shout over to the guy or girl in the next cubicle, often not for the answer itself, but to see if they know who to ask. Your personal network has just been expanded, often getting you a new point of contact. When you work from a home office this type of informal network sharing is more indirect, and IM or a quick email becomes essential for quick questions like this to the virtual 'next cubicle'.

I'm sure that collaborative knowledge management applications provide 'expert groups' that can be leveraged. Even outside of KM, it would seem that all employees could benefit from central FAQ and Q&A resources, backed up by a healthy dose of search capabilities. And in business processes and case management? Providing strong mechanisms for sharing knowledge and decision-making information should help ensure faster ramp-up times for new employees and more consistent decisions. But the challenge is knowing who is qualified to provide that advice, and getting it without absorbing huge amounts of those experts' time.


Thursday, February 15, 2007

Case management - a definition

Bruce Silver has put together a comprehensive discussion of case management that I think I largely agree with, in his article on BPMInstitute.org. This is a snippet that encapsulates much of the definition:

What is case management, and in particular, how does it differ from conventional BPM? A case, like a conventional business process, involves a collection of activities or tasks. However, unlike BPM, the process from initiation to completion of the case is not easily constrained to a process diagram, certainly not one based on a single end-to-end orchestration, even with complex nesting and chaining logic. Which activities need to be performed in order to complete the case depends on the details of each instance. Typically the case manager, or perhaps any performer of a task in the case, makes these decisions. The “rules,” so to speak, are inside users’ heads, not codified in explicit business rules.

Moreover, users can add new tasks, data objects, documents - even new processes - to the case at runtime. The “model” defined by the case designer cannot anticipate all of these in advance. Case management inherently carries with it some fluidity of structure or ad hoc-ness. This is the part that many BPM suites have a hard time with: New tasks, processes, and data can be added on the fly, but you still want the process engine to execute those processes, track deadlines for those tasks, monitor case status end-to-end, and even measure performance. It's not easy.


This discussion shows just how complex a case management use case can be. Applications that successfully implement a case methodology have to have many seamless components (content, process, metadata management), while being open and flexible enough to be extensible to a specific customer's requirements.

It's these features that I believe really make case management enabled systems very strong for addressing a wide range of common use cases in financial services, where a business analyst would otherwise struggle with a formal BPM approach.


Tuesday, February 06, 2007

Real business solutions are more than a process

Kiran Garimella talks about The Arrogance of IT on his BPM blog on ebizQ. It made me smile, since it's obvious that the disconnect between IT and the business is so wrong, yet so pervasive that it makes me wonder why it always happens:
Consider that the IT-business divide is difficult to bridge precisely because IT keeps thinking of “special technical solutions” for what are essentially ‘end-to-end’ problems in business processes. Rules can’t be managed? Use a BRMS. Data can’t be managed? Use EDM. You don’t know what the data means? Use metadata. You don’t know what happened? Use BI. Need to manage customers and prospects? Use CRM. Can’t find all your documents? Use ECM. Need to comply with SOX? Buy some shrink-wrapped panacea.

Boxing things up into categories is one of the only ways that technology can be understood as it gets more complex, and there are more "special technical solutions" to more and more problems. Kiran warns of this categorization:
A few years later, this company is saddled with a BI application, an ECM suite, a CRM package, and a bunch of other applications. What’s more, there is now a huge IT staff maintaining all those applications. However, the executive is no closer to solving the original problem that brought on all this investment. To crack that problem, this hapless executive (or their equally frustrated employees) must now run to all of the above applications and try to make sense of them.

This is no surprise. We escaped the clutches of the monolithic mainframes to a world of well-separated and segregated application boxes that rarely integrate, let alone do something meaningful and sensible when they do. So, Kiran suggests something better:
It is called BPM. It is the one platform that ties all these functional capabilities together and gives them a complete business context. How it accomplishes all this, and how it should co-exist or coordinate with these compartmentalized solutions and legacy systems (which do have specialized uses), are deeper issues.

I agree with Kiran that "BPM isn’t just one more application package". It certainly provides a way for the business to define what they do and how they operate, as executable and enforceable business processes, while looking back to see that it's actually having the desired business effect.

Big but... BPM is not a panacea.

Sure, it can tie together the technology boxes we previously bought, but the business analyst is not going to be able to do that; these boxes present IT-friendly APIs, not something a business analyst can utilize. For that we need a business view of these boxes. This is why some smart people invented SOA - it presents technical solutions as meaningful and reusable business services, within a technical framework that IT can embrace. SOA is not an application package, but your BPM platform needs to support SOA, and better still, enable the backend boxes to expose business services that a business analyst can hook their processes up to.
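As a hedged sketch of that point, here is a tiny example in Python: a raw, IT-friendly interface (a fictional legacy customer lookup) wrapped as a named business service with inputs and outputs a business analyst can reason about, so a process step can simply invoke "Retrieve Customer Risk Profile". Every name here is invented for illustration.

```python
# Illustrative facade: exposing a technical 'box' as a business service.
# The legacy call and field layout are fictional assumptions.

def legacy_customer_lookup(cust_no: str) -> tuple:
    # The 'technology box': terse, technical, positional fields.
    return (cust_no, "SMITH,J", 720, "Y")

def retrieve_customer_risk_profile(customer_id: str) -> dict:
    """Business service facade: meaningful name, meaningful fields."""
    raw = legacy_customer_lookup(customer_id)
    return {
        "customer_id": raw[0],
        "name": raw[1],
        "credit_score": raw[2],
        "existing_customer": raw[3] == "Y",
    }

# A process activity defined by a business analyst can now call the business
# service without knowing anything about the legacy record layout.
profile = retrieve_customer_risk_profile("C-1001")
print("route to manual review" if profile["credit_score"] < 650 else "straight through")
```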

Then of course, many critical business problems involve people working together, producing documents, communicating with customers and partners, researching data, discussing issues, checking boxes to track what they've done. And it needs to be done within the context of process activities. BPM doesn't easily model or deliver this level of collaboration or unstructured process within the structured business process previously modeled by the business analyst. In my view, Case Management is an approach to this. Not another technology box, but a combination of many of the tools required to deliver meaningful and pragmatic end-user focused applications, alongside backend systems and human-centric business processes.

Kiran's is a great post, since it highlights the disconnect between IT and business. I'd just suggest that any possible solution to a business problem still requires the underlying technology boxes to be in place, remodeled to work together to provide meaningful business services, then tied together with business processes, collaborative case management and all of the other tools that business people might interact with to do their jobs better and faster.

Real business problems are now a combination of technology boxes, business processes and knowledge workers. Don't bring in a platform that only addresses one of these. So, as a business executive, when the IT manager says to you, “Sure, what you need is a BPM platform”, make sure it can demonstrate how it solves your real business problem: make sure it gets your technology boxes to talk to your people, within processes that make your business run better.
