Friday, June 30, 2006

SOA anti-patterns in New Account Opening

I, like many people, have been reading Steve Jones' SOA anti-patterns post. Anti-patterns are structured ways of illustrating common issues so that they can be identified, understood and fixed. Steve's article is targeted at organizations that have been trying, and probably failing, to implement Service-Oriented Architectures.

The benefits of SOA have been widely touted, and SOA has been considered a 'panacea' for organizations with a range of information and content systems spread around, grown up over the years through need or, more often, organizational churn, acquisitions and mergers. For the level of technical impact that SOA initiatives require, they carry a huge risk of IT upset if mismanaged.

For an organization to consider improving their New Account Opening processes, and work with business partners more effectively, they are going to have to consider SOA. In most cases the organization will have disjointed systems that will require attention to enable even the highest level of coordination with Business Process Management, let alone full Straight Through Processing.

So for that reason, I picked out this anti-pattern from Steve's list to highlight a significant issue that I can see happening as soon as organizations put down the 'process standards' PDF and start throwing people at the problem. Steve does a great job of describing the problem and how to solve it. I have added my own notes to put it into context as I have experienced the problem.

Antipattern: Percolating Process

Description

Organisations start with a detailed Process map and then attempt to "fit" this into a Service Oriented Architecture; this refactoring leads to process becoming the dominant feature and leads to a Process Oriented Architecture (POA) rather than SOA.

What does this mean?
Look at the way many business problem areas are scoped, described and often targeted for fixing. It's often with the "As Is" and "To Be" process diagrams. Most business people are comfortable with those Visio drawings as a way to communicate what they are doing, and how they would like to do it better (with some random technology).

IT people look at the As Is and look shocked that real people would put up with this type of crazy way of working.

Then they look at the To Be. In their technology world this new system has to be built around SOA, right? So they basically split up a bunch of systems that could be scattered around the place to perform different tasks/services. They change the shapes on the To Be diagram, add a few lines, and feel comfortable that the new project has some structure, with an SOA title to make them feel like they are using advanced technology. This is the refactoring in the anti-pattern description. It doesn't really make the To Be process SOA, but by believing it does the IT guys are going to incorrectly apply the concepts anyway.

Effect

An organisation's "Services" come in two basic types, firstly end-to-end processes that co-ordinate lots of individual steps, and secondly a large set of fine grained services that represent individual steps. Any hierarchy or structure is solely from the basis of process. The fine grained services proliferate and become difficult to manage while the large business process elements become difficult or impossible to change. The systems slowly, or quickly, stagnate and lead to solutions being built on top of the existing solution and the general treating of the process oriented system as a legacy application.

What does this mean?

The To Be process is typically an end-to-end process that may span a whole conference room wall when it's pinned up for project meetings. It shows all the people that get involved, the documents they consume and produce and the systems they touch.

Within some of the big steps there is an arrow that points to a completely separate piece of paper that details the fine-grained tasks performed at that step. That is the extent of the hierarchy - a drill-down into more detailed processes. The consultant on the project loves this, because he or she is the only one who can piece together all of the layers of paper, and can always see another step that must be refined.


And so the IT/BPM guys start building processes and integrations. Eventually, hopefully, an end-to-end, semi-automated process is rolled out. Changes to it are banned, because the complexity of the interactions is poorly understood, and the original consultant has left to pursue another, more lucrative process reengineering project.

Cause

The organisation has decided to map the processes end to end and in detail; the work has created a series of grand "end to end" process models that are characterised by their large number of steps and the lack of sub-processes that they use. Once this exercise has been done to a great level of detail it is decided to make it SOA. The problem is that the valid business services have not been identified and thus the process maps have been created without a proper service structure. This makes identifying valid services a tricky process, particularly when looking for cross-functional or horizontal services.

The challenge here is that the organisation has not started with clarity on its services and hence has created a Process Oriented Architecture. Retrofitting services into such an environment creates a sub-optimal system that is liable to have all the issues of other process based systems, such as COBOL.

What does this mean?
The To Be complex business process has not been designed around the people and tasks AND the systems/services that should support them. Nobody really thought that some of the back-end systems might actually need to change to support an SOA approach.

You can't just put a new process in place, plug in at various points to the back end and hope all will be fine. The back-end system wasn't designed to operate in that way - it probably controlled some of the process you have hacked apart, and working around that will produce strange side effects (like the system not working any more). And the work-arounds put in place instantly make the new process and systems look like the legacy code you were trying to get rid of.

Resolution

The first resolution is to create your services architecture independently of the process map. This will provide the structure for breaking down the processes and creating a clear hierarchy of use. Next, this service architecture should be over-laid onto the process map to understand where the cuts should be made. The current solutions can then be refactored to create a more service oriented solution by attacking the major inflexibilities in the system and then looking towards incremental change of the current systems.
What does this mean?
Not much needs to be said about this. When you understand what systems you have in place today, and how they can sensibly be split up into new services, you can see how the To Be process can best utilize those services. Some changes will be needed to the back-end systems. But at the same time, if one of the systems is not going to be open to change, maybe the To Be process should try to embrace it. Otherwise, be prepared for major investment.

End note

In writing these anti-patterns Steve has produced a set of very valuable indicators to the health of current projects and red-flags for future projects that involve process, and inevitably SOA. New Account Opening improvement projects will encounter these anti-patterns in many organizations, so beware!


[Update]

Bruce Silver has an interesting take on this anti-pattern as the new business-IT non-alignment anti-pattern.


So, what is New Account Opening?

So it's about time, you are going to say... I've been asked by a couple of people, 'What is New Account Opening?'.

Here is a step-by-step introduction to New Account Opening, presenting the main terms and explaining why the process is so complex that financial services organizations are starting to focus on it with technology solutions.

Back to Basics: Account

Everybody is familiar with a bank account as the way that your bank manages your details, the services they provide you, your rights and obligations, the transactions performed and your funds. The term Account can be applied to a customer relationship with any financial services organization for almost any type of service, including: insurance products, brokerage services, mutual funds, annuities, etc.

Account Opening

Account Opening is the process that a financial services organization goes through to set up an account offering a new service or financial product for a customer. For some products this can be a straightforward process, especially if the customer already has a relationship with the organization and the value of the product is relatively low - the customer can fund the account easily, and the organization can set up the account and link it to the customer details they already hold.

New Account Opening

Opening an account for a new customer who currently does not have a relationship with a financial services organization is a more difficult process, even for relatively low value products, and increases in difficulty to match the value and complexity of the product. New Account Opening is the process that the financial services organization uses to handle the potential new customer, from the point of the customer submitting an application through to the organization creating the account and issuing a Welcome Package to the customer.

Business Challenges

Traditionally New Account Opening has been a very paper-intensive process, and even online accounts struggle to avoid this. Combine this with the fact that many organizations run this process largely manually and you can see that there is a great potential for errors and inconsistencies to be introduced. This can lead to:

  • Low customer satisfaction
  • Customers abandoning the application process
  • Regulatory fines
  • Lawsuits for accounts managed incorrectly due to inaccurate information
  • Damage to brand

Process Complexity

The process is complex due to the fact that at the outset (the point of application) the financial services organization has little or no reliable information about the customer. The process controls the collection of that information to satisfy many financial and regulatory requirements, including:

  • Collecting and validating customer details, identity, etc
  • Generating a profile of the customer appropriate to the product being offered, such as credit history, employment, etc
  • Assessing the suitability of the type of product or products being offered
  • Completing anti-money laundering (AML) risk ranking and other AML program tasks
  • Funding the account
  • Generating business system accounts and records
  • Fulfilling the process with a Welcome Package to the customer

The process can be significantly more complex when the product being offered is composed of products and services from multiple business partners and offered by a third-party sales agent. Annuities are one example, where the product combines securities and insurance products in a complete annuity wrapper. Communicating customer information, ensuring suitability, and establishing the trust required between organizations are big challenges.

Regulatory issues add complexity to the process, with the requirements of the individual regulators for each type of financial organization, coupled with the legal requirements of the Bank Secrecy Act (BSA) and USA PATRIOT Act, especially around anti-money laundering (AML).

The Future

Financial services organizations are having to respond to the needs of the online customer and ever evolving regulatory and legal requirements. By utilizing a combination of technology components and solutions they can achieve this, while gaining these benefits:

  • Improved customer satisfaction
  • Improved brand image and loyalty
  • Reduced error rates
  • Reduced risk of financial loss from lawsuits and regulatory fines
  • Improved efficiency and speed of account setup


This description is linked from several places on this page for reference.


Thursday, June 29, 2006

Whitepaper: Efficient Business Processes Need More...

Just a quick post to say that my whitepaper, Efficient Business Processes Need More Than an Online Application Form, has been published on ebizQ. This is the original text that I condensed for my post: Online processes are more than a web-based form.

Thanks to Elizabeth for getting to it so quickly.



Right tool for the job

I picked up a post by Keith Harrison-Broninski on his IT Directions blog this morning. It talks about the use of collaborative tools and why they are fundamentally inadequate on the grounds of security and agility to change.

Although Keith makes some decent points along the way, especially as he is leading somewhere over a series of posts, I want to run with a counter-argument.


My argument is that Enterprise collaboration tools:

  • Secure your collaborative data well
  • Enable flexibility for collaborative working
  • Exercise users' minds, enabling them to make decisions in context
  • Should not be used in business processes or with data not requiring collaboration


(1) BPM or collaboration
If you are talking about a truly collaborative world, where innovation thrives and structured business process is not a good fit, then avoid choosing tools like BPM in the first place. They are designed to enforce process, and with it inflexibility, for several reasons:

  • Compliance can only be achieved if you can demonstrate that a strict process was followed in every case (otherwise the auditors will tear you apart)
  • Efficiency savings for high-volume tasks only come from the repeatability of the process
  • Many business processes that choose BPM do so because, after initial introduction and bedding-in, there is very little about the process to change - the decision making can change (see James Taylor's Decision Management blog for more information), but the process itself rarely needs to, since it often has a natural progression anyway.


(2) Collaboration around financial systems?
The ERP, billing system, etc follow strict rules for the reasons described in (1). These are rarely systems to collaborate around in the course of everyday business.

Changing your ERP or billing system may have a serious impact on revenue recognition, or one of the other key processes you have identified, documented, remediated, tested and audited year on year for Sarbanes-Oxley (SOX). Tampering with it often makes little sense and will cost your company hugely to rework the compliance side of things - probably more than the team of consultants required to make the IT changes.

(3) Security of collaborative systems
Security of collaborative systems should indeed be a concern - you are presenting corporate knowledge out to the web.

Enterprise collaborative systems typically provide the type of security that you would expect to see to keep data secure when presented on the Internet. This obviously includes security at a technical level for authentication, secure communication, isolation of data from attack (SSL, SSO, DMZ etc).

At a user level, collaboration tools provide role- and group-based security, enabling the user administrators of your own workgroup to point and click to set different access rights, available functionality, etc, for different groups and individuals, where these privileges must vary from a default.
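As a rough illustration of that layered model (all group names, users and rights here are invented, and real products implement this in their own ways), the resolution of a user's effective rights amounts to defaults, overlaid with group overrides, overlaid with individual overrides:

```python
# Hypothetical sketch of role/group-based security: permissions start from
# a workgroup-wide default unless an administrator has overridden them for
# a group or for an individual user. Data below is purely illustrative.

DEFAULT_RIGHTS = {"read": True, "edit": False, "share": False}

GROUP_RIGHTS = {                      # overrides set per group
    "reviewers": {"edit": True},
    "guests": {"read": True, "edit": False},
}
USER_RIGHTS = {                       # overrides set per individual
    "alice": {"share": True},
}
USER_GROUPS = {
    "alice": ["reviewers"],
    "bob": ["guests"],
}

def effective_rights(user):
    """Layer defaults, then group overrides, then user overrides."""
    rights = dict(DEFAULT_RIGHTS)
    for group in USER_GROUPS.get(user, []):
        rights.update(GROUP_RIGHTS.get(group, {}))
    rights.update(USER_RIGHTS.get(user, {}))
    return rights
```

The point of the layering is that most users never need an explicit entry at all - they simply inherit the default.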

(4) SOA
SOA is a tough one. If badly thought out it can break every business process you have that your business depends on to survive, since there are many processes hardcoded into legacy systems. There has been discussion about this in many of the SOA and related blogs (e.g. Sandy Kemsley's discussion of SOA anti-patterns on Column 2) .

Making your backend systems available as services can free access to them, so that authorized users can reach them through a single UI (a portal) rather than a set of discrete applications. This adds flexibility and an improved experience for users performing free-flowing tasks. SOA also enables new, more flexible processes to be put in place, typically based on BPM tools, rather than collaboration.

Summary
In environments where collaboration is necessary, it is essential to give users the tools they need, otherwise they will invent their own approaches that lead to a completely uncontrolled workplace (see my post about this). This is not innovative, just dangerous. Word and Excel on users' desktops are not collaborative systems - they are individual editing tools based around a file on a hard-drive.

In a truly collaborative environment there are few indicators to the system to enable it to make automated decisions. Therefore to enable security, or to select a business process, you have to accept that interaction will be required from users at all levels - from the user administrator setting privileges for document access, through to the reviewer of a document selecting who is best to read it next.

Without fundamentally changing the tools available to users, for example removing MS Word from the desktop and replacing it with something that does not yet exist, collaborative systems will be highly dependent on the judgement of the users.

In structured business processes, the business would collapse if you tried to artificially inject a collaborative model, so BPM is a great tool. Again, I believe that there is a place for collaborative systems at points within structured business processes. ECM vendors such as Vignette call this Case Management - it provides the structure of BPM, mixed with the acceptance that there are tasks in a business process where users must make decisions based on the context of work in front of them. I believe that we could go a step further than this, but that discussion is waiting for a quiet moment for me to write about it.

Final words: don't force collaboration, BPM or SOA into environments and processes where they are not suited. The use of these technologies is not a dogmatic decision. Use the best tool for the job.


Wednesday, June 28, 2006

Migrating from paper to electronic application forms

Yesterday (I don't know how I missed this!) Cardiff announced a new release of their Teleform product. This is good news for them, since there had been fears that Autonomy, the parent company, would not really understand the importance of this piece of technology that it picked up with the acquisition of Verity.

The value of the Teleform product (and similar products from other vendors) comes from its ability to automatically read handwritten and machine-printed text from paper forms using ICR/OCR technology. The data that is read is matched with a recognized form type and written out to a database or workflow, and the original form is stored in a document management repository. Automatic reading of paper presents a large ROI for many traditionally paper-driven business processes (like insurance claims), reducing the amount of manual data entry required and the time to execute processes.

Add to this that Teleform is integrated with the LiquidOffice suite, and Cardiff can truly claim consistent handling of paper and online electronic forms, which is essential for the slow but eventual migration of customer communications away from paper.

Relationship of forms migration to New Account Opening

The migration of paper to electronic application forms is a key requirement of New Account Opening, as most organizations could not switch off their supply of paper forms tomorrow and assume that a large potential customer base would switch immediately to online applications. After all, many of the target audience for high-value financial products like annuities are still in the age-group that distrusts technology, especially for transactions that involve large percentages of their life savings.

For an industry that, in general, does not have a cohesive strategy for customer identity and electronic signatures, the need for paper forms remains.

Forms capture within the overall process

For a mixed paper and electronic scenario, LiquidOffice provides forms design tools and presents these forms online, enabling the majority of an application to be completed electronically (directly by the customer, or by a sales agent). At the point that a signature is required, LiquidOffice exports the completed form as a PDF, so that the customer can download it for printing, or be sent a completed paper form, for a final wet signature.


This outgoing form is typically captured into a document management repository, and a BPM workflow case is created to manage the remainder of the often complex application process. When the customer returns the signed paper form by snail mail, Teleform scans it, reads identifying information such as an application number, and matches the paper form to a current workflow case.

Had the whole application been on paper, maybe on a form a customer picked up in a branch or printed online, Teleform would have scanned the paper, read its content and automatically entered a PDF image of the completed form into the document management repository to create a new BPM workflow case. No manual data entry would be required from the paper form.
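As a rough sketch of that matching step (this is not Teleform's actual API - the case structure, function name and application numbers are invented for illustration), the routing logic amounts to a lookup on the application number read from the scanned form:

```python
# Illustrative sketch of routing a captured form: attach it to the existing
# workflow case matched on application number, or create a new case when
# the whole application arrived on paper. All names here are hypothetical.

workflow_cases = {}   # application number -> case record
next_case_id = [1]    # simple counter for newly created cases

def route_captured_form(app_number, pdf_ref):
    """Attach the scanned form's PDF to its case, creating the case if needed."""
    case = workflow_cases.get(app_number)
    if case is None:
        # No outgoing form was ever sent: a wholly paper application,
        # so create a new BPM workflow case from the scan.
        case = {"case_id": next_case_id[0], "documents": []}
        next_case_id[0] += 1
        workflow_cases[app_number] = case
    case["documents"].append(pdf_ref)
    return case
```

A returned signed form therefore lands in the same case as the outgoing PDF that started it, while a branch-collected paper form seeds a case of its own.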

Final words

In reality there are a handful of vendors offering very capable paper and electronic / intelligent forms capture systems. Cardiff's announcement just prompted me to talk about it using a name that is recognizable to many in the imaging and workflow industry.

The approach described should be the front end of many New Account Opening processes as financial services companies attempt to move from paper to electronic applications. The days of paper are numbered but the shredded tree is going to continue for quite a while yet.


Online Application Registry for Office 2.0

This post is in response to a post by Ismael Ghalimi on his IT|Redux blog, and seems to be a recurring theme for him with respect to Office 2.0.


For the uninitiated (a large proportion of the readers of this blog), Office 2.0 is, in its most basic form, productivity applications like the MS Office suite, completely hosted online and not requiring any local installation. Many people are familiar with the concept through Hotmail, Yahoo Mail and Gmail - fully featured email clients you can access anywhere in the world. Well, try Zoho Writer, which I wrote this on, for a lightweight alternative to Word available anywhere.

My aim is not to reiterate all of the great information and brainpower that many technologists and bloggers have contributed to promoting (and even writing code) for this concept. This is not my core area of expertise. But maybe I do have a little something to offer.

How does this relate to New Account Opening or any other enterprise process?

Most business processes are being presented on the intranet using web technology, enabling enterprises to avoid the costs of installing BPM client applications on hundreds or thousands of desktops. To enable the ad-hoc document production and collaborative requirements of these processes to be exposed online as well, an approach is needed for the BPM, productivity and business applications to work together effectively, without the need for proprietary interfaces or browser plug-ins.

(Note that over time I will discuss on this blog the specifics of collaborative tools in structured business processes in more depth.)

Passing files between online applications

In its raw form: Ismael, in assembling a bunch of decent online applications, has found that a basic capability of a Windows (or Linux or Mac) client does not easily translate to the online world. That is, the ability to produce a document of some form in one application and pass it into another application with a single or double click. An example is receiving an attachment in an email (in Hotmail, in my case) and, rather than having to save it to my desktop and upload it into my favorite online word editor (in this example, Zoho Writer) as is the case now, being able to just click it and have it open in Zoho (see Ismael's post).

In the thick client world this is done by registering which application you want to use to open a particular file type; then every time you open a file of that type (online or from your desktop) the OS launches the right application for you. Most of the time that is so invisible that you don't notice it. In the online world of Office 2.0 this doesn't happen yet.

One approach

An approach Ismael and his team have proposed (and pretty much rejected) has been to utilize a form of generic browser plug-in to pass a file from one online app to another. This depends on a near impossibility: persuading all browser creators to accept this as a standard approach, and to implement it to the letter.

The other approach the team suggested is that the email application directly pipes the information to the word editing application behind the scenes. This requires each application to know how to communicate with every other (possibly competing) application.

My proposed approach for the transfer of files from Client Application to Editor Application

My suggestion is to take the successful implementation of the thick client world and mix it with the technologies of the thin client. This is going to take a few components to work. This is the first time I have seen this approach described in completeness for online user applications, Office 2.0 or Rich Internet Applications from different vendors.

Online Application Registry maps applications to file types

The key to the approach is the use of an Online Application Registry (OAR) service. This, in its simplest form, allows a user to create their own personal account recording their Editor Application (e.g. word processor, spreadsheet, etc) preferences. On signing up they go through each file type they might want to open in the future and select an Editor Application to open it with (from a directory of appropriate online applications). Extend this slightly so that the user can select from a list of online service providers for each application. And finally, give them the option of just pointing the mapping at a service URL, for example for services they host themselves, or that are not recognized by the directory.
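A minimal sketch of what such a registry could look like - the directory entries, service URLs and class shape are all invented assumptions, since no such service exists yet:

```python
# Hypothetical sketch of an Online Application Registry (OAR): per-user
# mappings of file type -> preferred Editor Application URL, with a shared
# directory of known services as the fallback. URLs are illustrative only.

DIRECTORY = {
    "doc": ["https://writer.example.com/open"],
    "xls": ["https://sheets.example.com/open"],
}

class OnlineApplicationRegistry:
    def __init__(self):
        self.preferences = {}  # user -> {file type -> editor URL}

    def register(self, user, file_type, editor_url):
        """Record a user's preferred Editor Application for a file type.
        The URL may come from the directory, or be a custom self-hosted service."""
        self.preferences.setdefault(user, {})[file_type] = editor_url

    def lookup(self, user, file_type):
        """Return the user's preference, falling back to the directory default,
        or None if the file type is unknown to both."""
        prefs = self.preferences.get(user, {})
        if file_type in prefs:
            return prefs[file_type]
        defaults = DIRECTORY.get(file_type)
        return defaults[0] if defaults else None
```

In a real deployment the `lookup` call would of course be exposed over HTTP or a web service, as described below for the standard interfaces.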

Online Client and Editor Applications talk the talk

Editor Applications that can open documents (like a spreadsheet) need to provide a standard approach to consuming documents passed from a third party and opening in the browser on request.

This could take any number of forms, from being passed the URL of the document on a secure third-party site, to being passed the document through a web service call. The result is that the document becomes available to the Editor Application. The Editor Application generates a key, which it passes back to the Client Application so that the Client Application can refer to this file transfer in the next step.

Much like feed-readers accept RSS feeds and blog update pings, editor applications could eventually converge on an accepted standard or two to achieve this.

Now Client Applications (e.g. an email client, or a word processor with an embedded spreadsheet) can be sure that they don't need to write a pipe connector to every other application their customers may ever want to use. And by adhering to the standard, Editor Applications can be sure that they actually get used at all.

Piping data in

When a user clicks to open an Open Office format file (for example) they have received in an email in Yahoo Mail, the email client has to know what to do with that file. Now we have made it easy for the Client Application - it just has to look at the user's profile and pull out the URL of their preferred Online Application Registry, once and once only for all file types. The Client Application then queries the Online Application Registry for the user's Editor Application preference based on the file type to open. Again this would be done through a standard interface (HTTP, Web Services, etc).

Now, through the magic of the standard interface described in the previous section, the Client Application requests that the preferred Editor Application, at its registered URL, consume the file using the appropriate interface. The key produced to reference the 'transferred' file is then used by the Client Application to open the Editor Application's URL in the browser with the required file.
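Putting the pieces together, here is an end-to-end sketch of the handoff. The registry lookup signature, the editor's 'consume' interface, the key scheme and the URL shape are all placeholder assumptions standing in for whatever standard the applications would eventually agree on:

```python
import uuid

# Hypothetical end-to-end flow: Client Application looks up the preferred
# Editor Application, hands the document over, receives a key, and builds
# the browser URL. Every interface here is an invented placeholder.

EDITOR_STAGING = {}  # documents the Editor Application is holding, by key

def editor_consume(document):
    """Editor Application side: accept the document, return a reference key."""
    key = str(uuid.uuid4())
    EDITOR_STAGING[key] = document
    return key

def open_in_editor(registry_lookup, user, file_type, document):
    """Client Application side: find the user's preferred editor via the
    registry, transfer the file, and return the URL to open in the browser."""
    editor_url = registry_lookup(user, file_type)
    if editor_url is None:
        return None  # fall back, e.g. save the file to the thick-client desktop
    key = editor_consume(document)
    return f"{editor_url}?key={key}"
```

The browser then simply navigates to the returned URL, and the Editor Application uses the key to retrieve the staged document.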

Custody and In-Place-Editing

If the Client Application passed the file to the Editor Application as a URL, there could be further advantages. First, the Client Application could retain 'custody' of the file, and the Editor Application would not have to deal with the random timing issues associated with clean-up. Second, if the WebDAV standard were used to open the file, in-place editing could be provided. OK, so I may have abused another term here, but what I mean is the ability to open and save files at a URL through the HTTP-based WebDAV standard, without having to explicitly copy them locally and then copy them back. WebDAV also provides the requisite file-locking protocol.

And even better, this would open the interface up to a range of off-the-shelf, currently available document repositories as well. A simple HTML page could make a form-based request to an Editor Application with a URL to open and, hey presto, it would open, be editable, and be saveable, without having to go through the standard Editor Application interface.

Extending the Online Application Registry

What happens if the Client Application requests the preferred application for a file type the user has not registered? Many approaches could work: enabling the user to select a new service, falling back to a default service provided by the Client Application service provider or the Online Application Registry provider, or something else. Either way, the result should be that a user can open ANY type of document in an online service. The last resort is to allow the file to be saved to the user's thick-client desktop and opened in a locally installed application.

Single Sign-On

Identity of the user and single sign-on are an issue, as with any multi-party services. The smart identity guys can answer this better than me, so I won't even pretend to know what the answer is.

Final Comments

I believe that this proposal represents a decent approach to solving the Client Application to Editor Application document transfer problem. It is obviously based on a range of concepts including Office 2.0, Web Services, UDDI, client application registration and Software as a Service (SaaS), and I do not claim to be an expert in any of them.

I believe that the Online Application Registry (OAR) is a service that could viably be provided by independent service providers (compare with how Feedburner sits as a feed intermediary), Client Application providers, and Editor Application 'suite' providers. As long as they play by the rules, enable user preferences to be set for file-to-application mappings, and accept requests to query this database from anywhere, all will be good.

By publishing this, the idea is now in the public domain, so I hold no rights to its proprietary usage and neither should anyone else. But if you like the idea then please at least mention my name to the next guy you tell. Especially if it is the CTO of a very forward-thinking, progressive enterprise!


Tuesday, June 27, 2006

Is a graphical process designer really necessary? - Part 2

My post yesterday exposed a closet reader of this blog. A colleague and friend of mine asked me how I could really talk about a 'workflow engine without a graphical process designer' being a BPMS?!

I'm not sure how to classify the tool I was talking about, but it certainly has integrated enterprise level workflow engine and document management system as key components of it, alongside records management, that make it fit a "EDRMS+workflow" tag.

The tool is ideally suited to certain classes of processes, which made me remember a recent discussion on the IT|Redux blog.

To my mind there are two (or more) classes of processes that may or may not be a good fit for this EDRMS+workflow tool, or even a classic BPMS. From the IT|Redux discussion, I use these two examples as references to describe the classes of processes I think exist:
1) a credit card company managing a dispute of statement items
2) a car company managing the process of taking a new car from concept to production

Here is my experience:

(Class 1)
I don’t think I have experienced a BPM-implemented process providing true business benefit that exceeds 30 human steps. A finer level of granularity than that is, in my opinion, unrealistic for this class of process.

Firstly, it would involve a rip-and-replace of any legacy systems, since they typically handle the process granularity in code (COBOL probably!).

Secondly, it does not allow knowledge workers who interact with customers to make meaningful decisions based on the context of a case they are working on. In fact the person running steps at this granularity is probably not someone you want actually having contact with your customers at all.

[...]

(Class 2)
If I was looking at implementing a 100 step and up system spanning significant amounts of time, I would be questioning if a BPMS was the right tool. Why?

Well, in this example process, real organizations view these things as projects. It's the sort of thing they love to track with a Gantt chart, managing the process based on dependencies rather than a flow chart. I know the two can be interchanged, but skilled project managers are used to modeling these lengthy processes in MS Project (despite its inadequacies), and unskilled department managers quite capably model their processes by representing tasks as colored blobs in Excel.

So I would suggest that unless there are workflow/BPM tools that consume Gantt-defined project plans and coordinate them through the process (I have seen one; I just want to see if the owner claims it!), while allowing some of the intelligence a professional project manager affords, few people would claim to beat the challenge on this second class of process. I would not attempt to build a car, house or software project based on a BPM-modeled process.
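As a thought experiment, "consuming a Gantt-defined plan" amounts to releasing tasks for execution only once their dependencies complete. A minimal sketch, with invented task names, standing in for whatever a real BPM tool would do:

```python
# Dependency-driven task release, sketched with Python's stdlib
# topological sorter (Python 3.9+). The plan maps each task to the
# tasks that must finish before it can start.

from graphlib import TopologicalSorter

plan = {
    "design": [],
    "prototype": ["design"],
    "tooling": ["design"],
    "production": ["prototype", "tooling"],
}

ts = TopologicalSorter(plan)
ts.prepare()
order = []
while ts.is_active():
    ready = list(ts.get_ready())   # tasks whose dependencies are all done
    order.extend(sorted(ready))    # "prototype" and "tooling" could run in parallel
    ts.done(*ready)

assert order == ["design", "prototype", "tooling", "production"]
```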

I know that the EDRMS+workflow tool, despite the lack of a graphical designer, can handle the execution AND definition of the Class 1 process. I also know that a BPMS (as long as it's integrated with document management) can do it.

As I say, I doubt that Class 2 is a good fit for either system in its raw state. So forgetting names, what are the other classes of business processes that really need a BPMS with graphical process designer?


Monday, June 26, 2006

Is a graphical process designer really necessary?

In my role of 'sales support' I regularly get asked by customers to justify why a particular workflow product that is part of an overall solution being offered does not include a graphical process / workflow designer. After all, BPM products have been offering that type of pretty functionality for years.

The workflow in question is targeted at automating document driven processes, for example HR job application processing. With the workflow, documents are managed with end to end processing from the point of scanning paper documents in a mailroom, through automated delivery to appropriate users, follow-up correspondence creation, automatically matching additional documents to current cases, reviews, approvals and case archiving. Conditional routing, user forms and web services integration are all configurable out of the box.

Unlike many standalone BPM engines, this workflow provides seamless integration with the built in document management system and image capture subsystems. This leads to a system that requires zero integration effort, enabling configuration to focus on the business process and security requirements as defined by the business analysts.

What it does not have is a graphical process designer - there are no attractive flow charts here. Instead, configuration is through a pure web-based tool that enables process steps and functionality to be defined alongside the data model, security and routing rules of the document management system.
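To make the idea concrete, here is a rough sketch of capturing a process as plain configuration rather than a flowchart. The step names, capability flags and layout are hypothetical, not the actual product's schema:

```python
# A process defined as data: each step lists where documents route next
# and which built-in capabilities are toggled on. All names are invented.

process = [
    {"step": "Scan & Index", "route_to": "Review",
     "capabilities": {"auto_match": True, "approval": False}},
    {"step": "Review", "route_to": "Approve",
     "capabilities": {"auto_match": False, "approval": False}},
    {"step": "Approve", "route_to": "Archive",
     "capabilities": {"auto_match": False, "approval": True}},
]

def next_step(process, current):
    """Find where a document goes after `current` completes."""
    for step in process:
        if step["step"] == current:
            return step["route_to"]
    raise KeyError(current)

assert next_step(process, "Review") == "Approve"
```

The point is that a list like this is editable with check-boxes and text fields; no diagram is needed to express it.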

So, how do I justify the lack of what seems to be an essential component of a workflow? And more importantly, do you think I'm right?

Look at it this way. Most business analysts have a tool of choice when analyzing and designing processes. Let's imagine that in a fairly typical environment this is an MS tool like Visio, or my tool of choice, PowerPoint (OK, I know!). Here is how my actual HR job application process looks to a business analyst:



Despite non-standard flowchart shapes, this process is understandable, and captures the key requirements of the process: route employee documents through process steps based on specific conditions being met.

With traditional graphical process design tools the actual business process becomes visually confused when the additional requirements of documents, security, roles, etc are added to the diagram. For example:



Here, the intention of the business process as originally depicted is less obvious. Now imagine how this would look if we added the steps and annotations required for implementing a specific workflow engine. OK, I tried to draw it, but it was a mess. In a real graphical process designer, a lot of the detail gets hidden in pop-up dialog boxes that hide implementation details, but also often hide the actual operation of the process.

Now imagine that the workflow system that I am justifying to a customer has all of these capabilities available at every step in the process:



In a traditional designer tool, each of these capabilities and routings adds to the complexity of the diagram and draws attention away from the overall business process. Let's guess at how this looks when there is a complex business process: the business analyst can barely see his or her design behind the clutter of implementation. In fact, in my experience, this leads to about 5 process 'shapes' on screen for every one of the steps in the original process design.

So in my case, I have the first diagram pretty fully describe the process. The second diagram doesn't provide much more value in ensuring that the key requirements of the process are met. A diagram that also includes implementation details of the specific workflow engine is useless to the business analyst, but perhaps holds some value to an implementation guy. Add in standard capabilities that are available at every step and the intention of the process diagram probably becomes unreadable.

So, like me, the business analyst ends up doing their design work in Visio and transferring it at the end to the workflow tool (see a discussion in the IT|Redux blog on how simulation may encourage this two-step approach even more).

In the tool I try and justify to customers, the configuration is tuned to this fact - that the business process design has been completed in a design tool most appropriate to the task. The process steps are entered rapidly, with the capabilities listed in the final diagram enabled and disabled with check-boxes, all alongside the data-model, security and conditional routing.

There are several things that matter:
  1. That the business analyst can accurately and effectively configure the process they have designed
  2. The implementation of the process is rapid, without integrations and code being required for standard capabilities
  3. The process runs as designed with reasonable hardware requirements
  4. The users can effectively and efficiently use the final process
The system I talk about with customers achieves all of these. When the process configuration is demonstrated to the customer, and a complete workflow process is up and running in less than 15 minutes, many are surprised and impressed. But the FUD is still there - if there is no graphical designer, how can this be a true workflow tool?

My question to them (and you) is this: if you can meet and deploy the key requirements of the business process in a tool like this, faster than any other, why is a graphical process designer necessary?



(UPDATE: apologies for posting twice - I noticed straight after publishing that Zoho had messed up a bit of formatting on transferring to Blogger. I'll check more carefully next time. Oops)

Sunday, June 25, 2006

Scrubs up well

The blog for Improving New Account Opening (and related business processes) has a new look, to get away from the Blogger default. Love it or hate it, I'd appreciate your feedback.

Cheers!
Phil

Friday, June 23, 2006

AML Framework - a backdoor introduction

I wasn't going to post anything today, until I decided that I wanted to catch the flow of James Taylor's EDM blog and his item on Using business rules for Anti-Money Laundering, and a previous item he links to: Anti-Money-Laundering (AML) and EDM.

The latter one really keys in with the broad set of issues around AML, incorporating New Account Opening, and gives me an opportunity to introduce an AML framework that I have had lying around for a while.

You can see James' post and my rather lengthy comment on his blog.

I'm looking forward to providing more insight and opinions over time.


Banking on technology innovation

According to a MS FinServ blog post last week, banks will be restarting their innovation using new technology.

It's great to see that some banks are going to start focusing on the use of technology again. New account opening and currency exchange, mentioned in the FinServ post, are two areas that all financial institutions should have been concentrating on for Anti-Money Laundering (AML) purposes over the last few years anyway, and technology should have been core to this. But in many cases the customer has not seen any benefit.

Why are some banks looking at it now? Probably because they are seeing far greater immigration into the US of students and skilled professionals from Asia, and if they can attract a large number of them it would greatly improve their market share.

These new customers will most likely be attracted by the ease of transferring money to or from family overseas, opening a new account and getting credit.

I know from my own experience moving to Boston from the UK that:

  • transferring money was not hard but extremely expensive, partly due to the paperwork involved (AML CTR requirements)
  • getting a bank account was time consuming as I had only a hotel address at the time (AML KYC requirements)
  • getting credit was nigh on impossible, despite Equifax having huge amounts of data about me in the UK, it was impossible to reference it in the US
So it's really no great surprise that retail banks with strong overseas connections would start to concentrate on streamlining the multitude of AML, identification and other risk and credit ranking requirements.

The question is, will this innovation really transfer to the domestic US banks, for everyone to benefit? If not, BoA et al. could see HSBC, RBS/Citizens, Citi and others attracting new foreign customers far more easily than they could poach them from the internal market. And domestic customers may just get wind of how easy it is to move their business to another bank.

UPDATE: Sorry to anyone that has seen this appear twice in their feed - an error in Blogger meant I had to delete the original and re-submit this one.


Thursday, June 22, 2006

Building BPM: simulate or integrate?

I have been tracking Ismael Ghalimi's IT|Redux blog more closely, as the BPM 2.0 discussions are quite insightful. Anyway, he has a post that talks about the need for process simulation within BPM tools and projects. Although I agree in principle, I had this haunting feeling - I was involved in a couple of painful projects where this would not have worked.

Here is the comment I fired at Ismael.

Ismael, as ever there are many words of wisdom in your post. Here are my thoughts based on some front-line experience in different organizations.

Over the last 10 years I have worked with a variety of mainstream workflow/BPM tools - plus some product-specific ones that were still reasonably strong. Having simulation out of the box would have made a few of the projects easier, but for others it is unlikely it would have made us aware of some of the rat-holes we were going down early on.

Several years on from a painful project, I have realized that simulation is NOT needed for many processes (i.e. fewer than 30 human interaction steps), since the issue we had in many cases was not the logic or sequence of the process, but in fact the backend system integrations (and the UI).

In reality we should have worked backwards, proving the integrations first against a dummy process, ensuring that we could get at and alter the data in the backend ‘legacy’ systems and understanding the impact this would have. This would have avoided the need for many of the simulation requirements, since when it came to the real process we could run a test instance of the real workflow engine. More importantly it would have demonstrated up front that the legacy system internally controlled large portions of the process, which if we were not careful we would only duplicate (possibly incorrectly) within the BPMS.

So I would propose this methodology for BPM where legacy systems are involved:

  1. prototype the integration first
  2. test with an abstract process
  3. ensure that the legacy systems can handle the proposed integration
  4. identify constraints based on internal processes enforced by the legacy system
  5. refine the process given these limitations
  6. finally start testing the process with some real world and simulated data
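Steps 1-3 above could be sketched as a tiny integration prototype: a stand-in legacy client plus an abstract process that only exercises reading and writing the backend. The client class and its methods are placeholders, not any real system's API:

```python
# Integration-first prototyping, sketched. Prove you can get at and
# alter the backend data before modeling the real process.

class FakeLegacyClient:
    """Stand-in for the real backend during the integration prototype."""
    def __init__(self):
        self.records = {"ACCT-1": {"status": "open"}}

    def read(self, key):
        return dict(self.records[key])

    def update(self, key, **fields):
        self.records[key].update(fields)

def dummy_process(client, account_id):
    # An abstract two-step process that only exercises the integration:
    # can we read the data, and does the backend accept our change?
    before = client.read(account_id)
    client.update(account_id, status="under_review")
    after = client.read(account_id)
    return before["status"], after["status"]

assert dummy_process(FakeLegacyClient(), "ACCT-1") == ("open", "under_review")
```

If the real legacy system rejects or silently overrides the update, that constraint surfaces here, before any process has been modeled around it.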


I know that this seems like a brutal approach for some projects, so I’m interested in your thoughts based on your experience!


Hopefully you will be able to see a good discussion on Ismael's original post

Also, check out a (newly) related discussion around a challenge Ismael threw out to the BPM world.



Broken brokerage supervisory control

Jim Davies, Principal Research Analyst at Gartner recently wrote about his experience with a brokerage sending him a 'confirmation of a change of address' to his old address in this post.

He hit the nail on the head with his comment

OK, this process error isn't critical, and that's the reason why it slipped through the net, but it has impacted my perception of the firm's brand and professionalism.

The thing is, if this turns out to be a regular occurrence the NASD regulators will be down on the brokerage firm like a ton of bricks. This example demonstrates failure of a key supervisory control, in this case informing a customer of a change of details. Put in place to prevent fraudulent account transactions occurring out of sight of the account owner, these controls should be absolutely effective in the eyes of the regulator, and can lead to fines if they find them not to be. See this example of an NASD action.

But more than the one-off cash outlay of the fine, or the suspension of a broker, publicity around these types of failures risks damage to the brand, and any good financial institution fears this more than anything.

A simple workflow or CRM implementation should be able to handle this. And if the firm needs a hand fixing this problem, I know a few good people that would be willing to put in place effective controls for much less than the cost of a damaged brand.

Wednesday, June 21, 2006

Collaborate in structured business processes

Interesting feedback from Ross Mayfield at the Collaborative Technologies Conference in Boston. I wish I had a chance to be there, unfortunately my schedule didn't quite live up to the collaborative world this conference espouses.

Unlike some other types of business, the assembly-lines of financial services companies are essential to 80% of work, like signing up a new customer. I'm currently working on the best approaches for fitting collaborative technology into these structured business processes to handle the other 20%.

For example, when Amex wanted to sign me up for a corporate card, their usual highly organized process could not cope - I was a Brit with no credit history in Boston. Amex processes had to revert to significant amounts of paperwork, faxing, and phone calls to complete my account opening.

The 20% that has to be handled in this way represents a huge cost for the company. My hope is that collaborative technologies can reduce that cost and make the whole process more pleasant for the customer. I will be writing more about this as I formulate some real-world New Account Opening use cases.

Online processes are more than a web-based form

Some time last century the dream of lean, efficient, profitable businesses everywhere was to become paperless. In bulk, paper documents are expensive to process and store, and present a risk to a business through loss, whether that is misfiling or a natural disaster. So it seems smart to eradicate documents altogether and move to an online way of doing business, where you can put a web-based form online and automate your key processes.

Financial services organizations probably had the most to gain from the online world. Unfortunately this dream is elusive. For example, the Bank Secrecy Act (BSA) and the Patriot Act require organizations to implement effective anti-money laundering (AML) client identification programs. Since this requires physical documents to be presented as identification and contractual papers to be signed while comparing information against a range of business systems, this one process makes the efficient processing of applications using online technologies complex.

Since there are many other examples of why a ‘paperless’ organization with fully automated, straight-through processing, may remain a dream, a more pragmatic approach is required that enables documents to become part of efficient, integrated business processes.




UPDATE: I need to note that this post is based on a piece I have had floating around on my desktop for a while. In the meantime I meant to reference a great item by James Taylor, The paperless office and decisioning, which puts some of this discussion into the context of Enterprise Decision Management.

Tuesday, June 20, 2006

Clever user interfaces break business rules and processes

After a great vacation I caught up with James Taylor’s blog on ebizQ. In this post, he writes about business rules, and the importance of making them agile, to match the objectives of a real business. Given the importance of rules and process to New Account Opening, I wanted to add a few of my own observations and experience.

I agree with James’ description of agility in business rules. Experience has taught me that if you do not pay attention to these from the initial definition of any business process system, you will end up with a high chance of failure.

The problem as I have seen it extends beyond the pure implementation of rules and processes. Typically organizations want a nice usable/pretty application user interface (UI) to sit over the top of their new workflow process, to appeal to vocal users and executive sponsors.

The problem with this is that unless carefully designed and implemented, this UI becomes a major factor in limiting the agility of a process: rules may be unexpectedly implemented internally by coders; user level decisions may be enforced by UI design rather than workflow; highly specific requirements for UI operation for different user groups may lead to agility limiting constraints.

As an example, I learnt a lot from being on a team implementing a large insurance company’s workflow system that almost failed after 9 months of system design and implementation effort. The requirements up front demanded fairly simple process management and rules. The problem was that the user-facing application was initially designed to display each step in the business process in a form too specific to the role of an end user to enable the process or rules to be changed by business analysts. Every change required significant effort from the programmers. After significant redesign, the application was simplified to use a more generic user interface, reusable across process steps without change, enabling rules and process to be enforced and changed within the workflow/rules engine. The resulting application was released within 4 months and the brains of the end users were re-engaged, rather than placed into a production-line mentality.
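The generic-UI approach that saved that project can be sketched as a single renderer driven by per-step metadata supplied by the workflow engine, so a step's fields can change with no UI code change. The step-description format here is an assumption for illustration:

```python
# One renderer for every process step: the workflow engine owns the
# step definition, the UI merely renders whatever it is handed.
# Field definitions below are invented.

def render_step(step_meta):
    """Produce a simple text form from the engine's step description."""
    lines = [f"== {step_meta['name']} =="]
    for field in step_meta["fields"]:
        required = "*" if field.get("required") else " "
        lines.append(f"{required} {field['label']}: ________")
    return "\n".join(lines)

underwriting = {
    "name": "Underwriting Review",
    "fields": [
        {"label": "Policy number", "required": True},
        {"label": "Reviewer notes"},
    ],
}

form = render_step(underwriting)
assert "Underwriting Review" in form
assert form.count("________") == 2
```

Changing the rules or adding a field to a step now means editing the step metadata in the engine, not recoding screens.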

Best practice says ‘start simple’. For business processes, rules and the applications around them this is essential.

Adobe to sue Microsoft - who cares?

I only just picked up this news from the Gartner High Performance Workplace blog. Adobe is threatening to sue Microsoft for including "Save as PDF" functionality in Office 2007. Normally, I wouldn't care much about corporate legal waving clenched fists. In this case, the outcome could be extremely important to financial services firms offering long-life products.

My comments on this blog were:

MS has no option but to offer "Save as PDF" functionality. Without it, the inclusion of records management capabilities in Vista would be heavily diluted.

MS Office products only save proprietary formats right now, and these do not make good formal records. So for organizations needing long term access to documents they produce, they absolutely must have PDF saving capabilities in the Office products.

Archiving is why Adobe offered PDF/A as an open standard and ANSI adopted it. If MS worked solely with the open versions of PDF (and adhered to the standards), I would be surprised if Adobe could have a say in its use.


For life insurance, annuities or even mortgage companies, ad-hoc customer correspondence and documents are often authored in Word. These usually constitute records that must be retained in the customer file and be available beyond the end of the life of the account or policy, which for life products could be well over 100 years. My bet is that the proprietary MS Office 2007 format won't be readable identically as published by any software in 2107.

Fortunately PDF/A will be, if NARA and other governmental records agencies have anything to do with it. To simplify financial services IT desktops, it is essential that MS Office 2007 includes "Save as PDF" functionality, even if the firms do not adopt Microsoft's records management in Vista.

Let's hope Microsoft finds an acceptable way to use the open PDF format in its open standard form, and does not try and abuse it or add proprietary features as they did with Java. Adobe will hopefully support this so we avoid another cycle of proprietary MS Office formats making it into our records archives.

Friday, June 02, 2006

Vista in 2007 v. improved New Account Opening now

How much has the news that Microsoft intends to include records management and workflow in the next release of Windows (Vista) shaken companies’ plans to implement new account opening systems?

As Microsoft released several press releases of their plans to include key enterprise capabilities with the new operating system, many Enterprise Content Management (ECM) analysts commented on how this may have some CIOs rattled (see Dan Elam’s comments in an AIIM interview), and how.

And most CIOs won’t be paying all this attention because they believe that the Microsoft capabilities will be particularly good, or solve their business issues, especially not in the first release. But they will question whether the low price-point will allow them to use a bunch of services to make the product work in their environment (see my views on the way IBM works customers like this in a comment on the MS FinServ blog).

Alternatively, the CIOs believe that the downward price pressure will reduce the cost of real enterprise RM and workflow players. Until MS really releases something that represents a real business solution, rather than a toolkit, this won’t happen.

So it is down to the CIO to get some smart feedback from the operational and legal departments on what the business case is for doing something now against waiting another year. I think that the massive regulatory fines and legal actions that some companies are experiencing right now will convince them to do something sooner, rather than later.

Suitability determination in a vacuum

In a previous post I talked about using business rules software to enforce suitability determination for annuities – here.

Enforcing the rules in a vacuum is not enough. You need to be able to demonstrate that for a particular decision the rules were enforced as part of the broader business process, within the context of all of the customer information and documentation. This requires more than an audit trail provided by a rules engine: it requires real human- or machine-readable proof of what the rules were, how the decision was made, at what point in the process, and based on what criteria. If you end up in court contesting a suitability claim, it’s going to be far more convincing if you don’t need a software forensics expert to demonstrate that the decision made did actually follow the rules.
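One way to picture such proof is a self-describing decision record: the rule-set version, the inputs evaluated, the outcome and the process step, bundled together at decision time. The field names and rule-set identifier below are invented; a real system would draw them from its rules engine's metadata:

```python
# A human- and machine-readable decision record, retained alongside
# the customer file. All field names here are illustrative assumptions.

import json
from datetime import datetime, timezone

def decision_record(rule_set_version, inputs, outcome, process_step):
    """Bundle everything needed to re-read the decision without forensics."""
    return json.dumps({
        "rule_set_version": rule_set_version,
        "inputs": inputs,               # the customer data evaluated
        "outcome": outcome,             # suitable / not suitable + reason
        "process_step": process_step,   # where in the workflow this ran
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }, indent=2)

record = decision_record(
    "annuity-suitability-v12",
    {"age": 67, "liquid_net_worth": 250000, "horizon_years": 5},
    {"suitable": False, "reason": "surrender period exceeds horizon"},
    "Pre-sale review",
)
assert '"annuity-suitability-v12"' in record
```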

To ensure legal weight, the suitability selection and decision proof should be captured and retained for the life of the associated customer documents. The customer documents (the customer file) should be in an electronic records management system. So this is where the suitability selection and decision proof should go. The problem is that the majority of organizations do not have a records management system.

Many vendors offer records management systems. Few target them effectively at financial services organizations, so the CIO has to work out whether a vendor’s interpretation of DoD 5015.2 certification really matters to them. More important than a specification written for defense organizations, and the de-facto standard for federal government, is the integration with the organization’s document management and business process management systems. This is where all the information is coming from, including suitability decisions, so having seamless integration with these core systems is essential to ensure the integrity of the stored information.

Unfortunately many of the big software vendors have only loosely integrated the records management software they acquired with their document management systems, and even less with their process management. IBM and Documentum are examples of this, and are often accused by analysts of the complexity of the end solution. Stellent is better, having packaged their products up well. Vignette has records, document and process management as a single seamless product, meaning that organizations can depend on all the components working as expected.

Enforcing suitability (or any business rules) in a vacuum is not good enough. Capture the results in the context of the business process alongside customer documents and an organization is more likely to stay out of court. If it comes to court, ensure that all the proof is retained throughout its life in a well integrated records management system and the software forensics experts will not be able to pick apart a defense on the stand.

Rules are (not) made to be broken

The old adage that ‘rules are made to be broken’ cannot be applied to suitability determination for annuities. The financial penalties far outweigh the potential gains. So how do you enforce suitability rules?

Suitability determination is difficult to enforce. But it has to be enforced to ensure that the customer owns a product that meets his or her profile and the distributors of the product are not open to legal action further down the line.

Just hoping that the producers selling annuities follow the rules is not enough. The number of product combinations that are viable before applying suitability restrictions is enormous, so it’s tricky for producers to really be sure that what they offer follows the rules unless they list a very limited portfolio, which in itself may not be in the best interest of the customer.

There are software services such as Finetre AnnuityNet that enable the suitability rules to be defined by Broker/Dealers to meet their required thresholds and risk profiles, limiting which products a particular producer can see. Many NAVA members work with this service.
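The kind of filtering such a service performs can be sketched simply: only products passing the Broker/Dealer's defined thresholds are ever shown to a producer. The products and threshold fields below are invented for illustration:

```python
# Suitability-driven product filtering, sketched. Thresholds and
# products are hypothetical, not any real Broker/Dealer rule set.

products = [
    {"name": "FixedAnnuity A", "surrender_years": 3, "min_age": 0},
    {"name": "VariableAnnuity B", "surrender_years": 10, "min_age": 0},
    {"name": "EquityIndexed C", "surrender_years": 7, "min_age": 50},
]

def offerable(products, customer_age, investment_horizon_years):
    """Only show products that pass the Broker/Dealer's suitability rules."""
    return [p["name"] for p in products
            if p["surrender_years"] <= investment_horizon_years
            and customer_age >= p["min_age"]]

# A 67-year-old with a 5-year horizon never sees the 10-year product:
assert offerable(products, 67, 5) == ["FixedAnnuity A"]
```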

For organizations that want to bring suitability rules processing in-house, rather than depending on an ASP service, it is worth understanding a little about the underlying technology of ‘rules’. James Taylor writes a great blog Enterprise Decision Management that covers these types of issues. As an example, he cites the improvements that Auto Club Group saw using business rules to help reduce the go to market effort for new product features, while radically increasing the amount of business they could underwrite. An introduction is here.

My feeling is that implementing business rules in their own right is not enough. As I gather my thoughts I’ll post some more.

Thursday, June 01, 2006

Standards scope creep

I belatedly caught a short post on the MS finserv blog, just after I came off a lengthy conference call that made me question the scope of API and data standards - specifically ACORD standards, which should help organizations with the interchange of data about entities such as customer, policy, account, etc.

ACORD is doing this by defining metamodels that describe the entities in strict, and therefore interchangeable, ways. And for the communication of information between organizations and their partners this makes absolute sense. It's what XML was invented for!

My call was discussing frameworks for annuities organizations, which have the problem of many moving pieces, especially the main constituents: customer, producer, broker/dealer and carrier. These guys absolutely need a common standard metamodel for data intercommunication, and can build off of the ACORD metamodels to achieve this.
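As a toy illustration of why a shared metamodel matters, here is a fragment that only mimics the flavor of an ACORD-style message; the element names are loosely borrowed for illustration, not the actual ACORD schema:

```python
# Building an ACORD-flavored party message with the stdlib. Element
# names are illustrative only; a real implementation would conform to
# the published ACORD schemas.

import xml.etree.ElementTree as ET

def annuity_party(role, first, last):
    party = ET.Element("Party", {"role": role})
    name = ET.SubElement(party, "PersonName")
    ET.SubElement(name, "GivenName").text = first
    ET.SubElement(name, "Surname").text = last
    return party

msg = ET.Element("TXLifeRequest")   # name borrowed loosely from ACORD TXLife
msg.append(annuity_party("Owner", "Jane", "Doe"))
msg.append(annuity_party("Producer", "John", "Smith"))

xml_text = ET.tostring(msg, encoding="unicode")
assert xml_text.count("<Party") == 2
assert "<Surname>Doe</Surname>" in xml_text
```

Because every participant parses the same element structure, the carrier, Broker/Dealer and producer systems can exchange this without per-pair translation logic.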

Here's the point where standards scope started to creep. The framework discussion tried to pull standards inside the datacenter of an individual organization, to encompass the interoperability of individual tools or components by defining standard APIs. I believe this is a task that could lead to significant expense and wasted time, both for the standards committee and the organizations buying software.

Why? Because the main tools (workflow, document management, messaging, document and forms capture) from different enterprise vendors have been working together for many years, with proprietary, but accessible APIs. It is ultimately the responsibility of the financial institution to ensure all of the components within their four walls work together as expected, according to their specific environment and requirements. Standard APIs will not remove this responsibility, and still require software glue to be developed between systems for them to operate. So why waste everybody’s time?!