Saturday, December 23, 2006


I got tagged - Todd Biske got me. Which means I have to expose 5 things about me that are not commonly known (at least by the people that read this blog), then tag some other unsuspecting bloggers. So here goes:
  1. I spent 12 months traveling in South America. I knew that I was reasonably proficient in the language when I could argue with taxi drivers that chose to drive the 'turista' the expensive route.
  2. I used to play tenor sax. I was probably terrible but would love to have learnt how to play the instrument with some subtlety.
  3. I think I must be the only person in Massachusetts that doesn't own a car.
  4. I'm a pretty good skier. I'd love to be in Colorado right now up to my knees in snow. Unfortunately I'm writing this because I'm sitting in a ski house in the North East US watching the rain out of the window. Global warming makes me unhappy in so many ways.
  5. I enjoy playing football (soccer) despite being pretty awful when the ball is at my feet, but I'm fast and have good stamina so I make a decent defender. I miss the intra-office games that we played when I first started working, as there is nothing like running out some work frustrations by kicking at your boss's ankles in a competitive setting.

Sorry guys, it's the holiday season, so I'm having a little fun - you're tagged: Bruce Silver (a great source of thought-provoking BPMS discussion) and David Rudolph (my newfound colleague blogger). Happy Holidays/Happy Christmas!


Improving applications upsets users?

As many bloggers go quiet over the holiday season, it gives me a good chance to catch up on some reading from blogs I have missed over the last few weeks while working through the recent product release at Global 360.

A nice series of posts on Tyner Blain caught my eye: Going Agile, 10 Mistakes. It's interesting for me since, unknowingly, the R&D groups I work with are so experienced and have so much background working together that agile techniques are implicit in much of what they do. They are ideally placed to head away from the 18-month major product release cycles they have been tied to in the past and towards a more incremental enhancement program.

The interesting challenge is how customers will react to products that constantly deliver more benefits and features. On the web, products constantly update and expect their users to keep up. In the corporate world, users gain much of their effectiveness and efficiency from complete familiarity with the software applications they use.

I wonder if anyone has experience of this. Can the constantly improving world of the web translate to corporate applications? Do small incremental improvements help users adapt gradually to changes, or do they constantly unsettle and upset?


Thursday, December 21, 2006

Exposed as a blogger

And this is why any blogger should follow the essential rule, "would I cringe if my mother read it?": yesterday I had the pleasure of having this blog passed by email to a group of colleagues. I have been exposed as a blogger amongst my peers!

The whistle-blower? A blogging colleague, David Rudolph on Good Karma for Enterprise Software. And his is a nice blog too, talking about a range of enterprise software issues.

So, thanks David! Now I have to worry less about making my mother cringe and more about a Global 360 salesman asking me whether our products have all the things I talk about in this blog. Of course, the standard small print still applies: "None of the views expressed here represent those of my employer; they are mine alone." In short, unless I state it directly, my colleagues should not infer the existence of anything in Global 360 products from my blog. That's why we have sales collateral, and very soon new website content that I'm writing between sips of coffee!


Tuesday, December 12, 2006

Preparing to release

I haven't disappeared off the face of the planet... Blogging will resume once I have shepherded the upcoming major release of Global 360's Case Manager BPMS product through to GA.

It's an exciting product, and this release really sets it up to be a leading human-centric, content-enabled and service-oriented enterprise BPMS. I can take no credit for anything in this release of the product, beyond the occasional PowerPoint presentation and some typical product management administration. So, to the amazingly experienced set of development and QA engineers here in New Hampshire - keep it up, and thanks!


Wednesday, December 06, 2006

Process Analytics - separate is better

When it comes to sales scenarios, customers typically ask BPMS vendors for evidence of their built-in process reporting and analytics capabilities. For many process deployments, having built-in tools is desirable, since they typically come with canned reports that show the generic information a process manager might need. What I'm recognizing is that there is considerable value in having analytics tools that are completely separated from the BPM engine and able to provide information across a range of systems and processes.

My previous post included a discussion about why process analytics is more than just analyzing the process. Even in a BPMS that has a 'full-bodied' view of process data that includes descriptive business metadata, built-in analytics tools will probably struggle to present more than basic business and process information. This is partly because built-in tools tend to work directly off live data via standard SQL queries, and partly because they are still limited to the data within the system, however broadly defined that may be.

It seems that separating out the business process analytics tool from the BPMS has some advantages:

  1. Analytics is complex - separate tools are more specialized and capable
  2. Analytics should look at more than the process - separate tools are designed to access integrated BPMS and third-party information systems
  3. Analytics is processor intensive - avoid overloading the production system
  4. Analytics should incorporate your SOA strategy - analytics can provide a more complete performance view if it can provide and use SOA-based information

1, 2 and 3 are fairly obvious. Specialized business process analytics tools can provide analytical information that is broader, better and faster. I didn't include 'cheaper', as there is a cost attached, and it will be greater if a tool is selected that requires specific custom integration (or a lot of configuration) with the BPMS. Pre-integrated BPMS and analytics tools, although separate packages, will almost always be more manageable in the long run than a custom integration.
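The separation can be sketched as an external consumer aggregating a stream of process events offline, rather than running reports against the live BPMS database. This is a minimal illustration only; the event fields (process, activity, duration_secs) are my own assumptions, not any vendor's actual schema:

```python
from collections import defaultdict

# Hypothetical events as a BPMS might emit them to an external analytics tool.
events = [
    {"process": "loan_app", "activity": "assess", "duration_secs": 420},
    {"process": "loan_app", "activity": "assess", "duration_secs": 380},
    {"process": "loan_app", "activity": "approve", "duration_secs": 60},
    {"process": "claims", "activity": "triage", "duration_secs": 90},
]

def summarize(events):
    """Aggregate offline, away from the production system."""
    totals = defaultdict(lambda: {"count": 0, "total_secs": 0})
    for e in events:
        key = (e["process"], e["activity"])
        totals[key]["count"] += 1
        totals[key]["total_secs"] += e["duration_secs"]
    return {k: {"count": v["count"], "avg_secs": v["total_secs"] / v["count"]}
            for k, v in totals.items()}

summary = summarize(events)
```

Because the consumer only sees events, the same pipeline could in principle aggregate events from several engines and third-party systems side by side, which is exactly where a separate tool earns its keep.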

So what about the SOA strategy? Here I will sidestep the SOA vs. BPM debate about where they intersect. However you look at it, SOA presents several challenges and opportunities for process analytics. First, how do you make use of the data and process that overlap from human BPM into the SOA-orchestrated process, and second, how do you design an SOA that enables access for analytics tools?

The first point acknowledges that there is a large overlap between SOA and BPM. Imagine a human-centric business process that also consumes business services and orchestrates integrations - for example, a loan application process that receives electronic applications from a business partner, and coordinates a process involving human assessments, automated decisioning and updates of the main banking system of record and CRM system. Embedded BPMS analytics tools can easily provide a range of analytical information within the bounds of the human-interactive process. The problem is how to gain value from the systems interactions, in terms of:

  • business data - for the business
  • technical performance - for IT
  • cost - for B2B interactions
Using a BPMS that can seamlessly orchestrate service invocation and integration alongside human-interaction will greatly assist in providing the appropriate information for process analytics, since much of the data will be made available in one place. If this is not the case, the analytics tool must be able to look inside service requests and integrated systems and be able to reconcile the information it finds. If the analytics tool is open and extensible then this will be an option.

In many cases it will make sense for the analytics to focus on purely systems-based business services in the SOA. An example is an online loan customer inquiry. In a call center environment, process analytics would typically focus on attributes like time to answer, abandon rate, loan value, and type of request, all divided by class of customer. In an online world, similar information should be provided to the business to assess the effectiveness of their website, marketing and backend application processing, but the website analytics may not be the most manageable place to gather it, especially since much of the required information would not be exposed at this level.

By tying process analytics into the system that orchestrates real-time request services, valuable analytical information can be extracted from business information. This again is made simple if the BPMS that handles the human-centric backend processes also orchestrates and publishes the services used by the website. Even if the human and systems orchestrations are handled in separate systems, using a process analytics tool that is not trapped inside either engine will simplify access to the required information.

I'm really starting to understand the value of having true process analytics existing outside the process systems, enabling it to look across any information it requires. In the converging BPM/SOA space this makes more and more sense to me, though I'm sure that as I start to dig into this further with real tools I'll find out that I need many refinements to my thinking.


Wednesday, November 29, 2006

Analyzing [what is in] the process

In my post yesterday Process analytics is more than a pretty graph, I talked about the use of Process Analytics tools to do more than just monitor workload, rate of processing and so on. This fitted into the Execute & Analyze phase of a business process optimization lifecycle. James Taylor commented that:
Process analytics is also more than analyzing the process!

In my post I implied this a little by touching on the importance of strong analysis tools to provide information against business KPIs and objectives, rather than just process metrics. Though as James says, there is far more to it than that.
For instance, if I can predict that an account is at risk of going into collections I can route it differently. This is improving my PROCESS with ANALYTICS but it is not about analyzing the process.

This is the transition into the next state in the lifecycle: Manage & Improve. For example, goal management could fit in here, driving the automated routing of many process items based on business KPIs. Now accompany this with business rules, and complex decision analysis at an individual level, as discussed by James in his post. Use the right tools and work can be automatically routed both in bulk and individually to meet the complex requirements of enforcement, processing, performance and business goals.

Much of the benefit of process analytics beyond looking at workload requires a fairly 'full-bodied' view of process. For analytics to work well, managing the business goals, enforcement, and so on, I don't believe process can be viewed just as abstract work items bouncing around a workflow touching people and systems. Process analytics needs to work alongside a formal business process that manages fully laden process instances:
  • Containing complete, descriptive business metadata
  • Linking to entities in other systems and providing access to their data
  • Managing and referencing content, documents, discussions and tasks
  • Enforcing the delivery of work to appropriate people, systems and services
  • Making available specific data that is required for your analysis and management aims

With tools that support this level of meaningful business process, implicitly 'analyzing the process' becomes 'analyzing what is in the process' - real work cases, customers and accounts. Having access to the valuable business information directly enables process analytics to positively drive the process based on this data. Sounds easy? I'm sure that there is a lot that I need to focus on in this area to get a full picture of how this actually works in practice!
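The list above can be sketched as a data structure. This is purely illustrative - every field name below is my own assumption, not any product's actual data model - but it shows the difference between an abstract work item and a fully laden process instance that analytics can actually use:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProcessInstance:
    """A 'full-bodied' process instance (field names are hypothetical)."""
    case_id: str
    business_metadata: Dict[str, str]      # complete, descriptive business data
    external_refs: Dict[str, str]          # links to entities in other systems
    content_refs: List[str] = field(default_factory=list)  # documents, discussions, tasks
    analysis_data: Dict[str, float] = field(default_factory=dict)  # data exposed for analytics

claim = ProcessInstance(
    case_id="CASE-1001",
    business_metadata={"customer_tier": "gold", "product": "life_insurance"},
    external_refs={"crm": "CUST-778", "policy_system": "POL-4412"},
    content_refs=["doc://application.pdf"],
    analysis_data={"policy_value": 250000.0},
)
```

With instances like this, 'analyzing the process' really is analyzing what is in the process - the analytics tool can route or report on the gold-tier, high-value case differently from the rest.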


Monday, November 27, 2006

Process analytics is more than a pretty graph

One of the technology areas I've been enjoying getting my head around in my new role is Business Process Optimization as it relates to BPM. As I'm starting to understand it, the optimization of business processes can be represented as a lifecycle, stepping through three main phases:

  1. Model & Simulate
  2. Execute & Analyze
  3. Manage & Improve
In the dim and distant past, long before I ever considered joining Global 360, and perhaps when my knowledge of the BPMS market was limited by the constraints of the products I worked with, I had a pretty good discussion about Model & Simulate. I'll probably return to this discussion in the future, since I think the thinking around simulation still has legs where integration (or SOA) is involved.

As for Manage & Improve, in my past I naturally assumed that this was just a function of making sure that the process was well designed and roles flexibly assigned so that fluctuations in workload were well balanced across the available workforce. That is only half the story, but the need to Execute & Analyze effectively is still the foundation for effective processes.

In simple workflow environments a quick report and a simple graph can provide all that is necessary in terms of 'analytics'. It can show a manager at a glance where work is building up in a process. But in high-volume, complex business process environments that have constraints applied through contractual Quality of Service agreements, or a need to provide exceptional customer service, the analytical capabilities of a system need to be a lot greater. Examples could be credit card dispute resolution, call center customer services, life insurance application processing, or brokerage account opening.

In these complex environments, managers need up-to-date data that can represent work sliced and diced across many dimensions. This enables them to see not only that there is a large mass of work collecting in one activity in the process, but whether that places their highest value clients or service contracts at risk. True process analytics tools can understand the structure and 'flow' of work in business processes, enabling them to produce OLAP cubes for complex analysis. And since they do this by capturing an event stream representing work being processed and routed, taking the data offline avoids the huge processing impact that complex database queries would have on the live system.
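The 'slice and dice' idea can be shown with a toy two-dimensional cube. Real OLAP tooling does far more, and the dimension names here (activity, client_tier) are made up for illustration, but the principle - counting work items across crossed dimensions so a manager can ask targeted questions - is the same:

```python
from collections import Counter

# Hypothetical work-item events with two analysis dimensions.
events = [
    {"activity": "dispute_review", "client_tier": "platinum"},
    {"activity": "dispute_review", "client_tier": "standard"},
    {"activity": "dispute_review", "client_tier": "platinum"},
    {"activity": "final_approval", "client_tier": "platinum"},
]

# A tiny two-dimensional 'cube': counts by (activity, client_tier).
cube = Counter((e["activity"], e["client_tier"]) for e in events)

# Slice it: is the backlog in dispute_review hitting high-value clients?
platinum_at_risk = cube[("dispute_review", "platinum")]
```

A plain workload graph would only show three items queued in dispute_review; the sliced view shows that two of them belong to platinum clients, which is the question the manager actually cares about.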

Now that managers can see and respond to predefined analytics, as well as having the tools that enable them to simply visualize the data sliced according to their own local requirements, the job of analytics is done, right? Not really. Process analytics enables slices of data to be visualized over time, enabling trends to be spotted or the impact of specific conditions (for example, a spike in volume of high-value work) to be assessed.

Being able to understand how a business process responds under real conditions seems like the ultimate proof of performance. In a crazy day, where everyone is working flat out, a manager may not be able to work out from 'gut feel' alone how well his process is responding to a big spike in demand. Given data and easy-to-drive analysis tools after the fact, he or she can quantitatively understand what was different to other days, what went well and where improvements could be made.

I'm really just a beginner in this business process optimization world, but I understand that business process execution can be run in several ways: just get work through and out of my sight, or get work done that really benefits the business. With experience, Key Performance Indicators (KPIs) can be developed that provide the manager with 'at a glance' metrics showing if the process is running to plan. The aim of the business is not necessarily to hammer out 10,000 cases an hour, but to beat the true goals of the business that his teams should be bonused on - be that profitability, customer satisfaction, value of new business, etc.

With a business process that has been optimized based on quantitative experience applied to real metrics, and with analytics that have been built to be meaningful in the heat of the moment, a manager can really work to exceed true business objectives.
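An 'at a glance' KPI is, at its simplest, a threshold check against the plan. The sketch below is a minimal illustration; the green/amber/red bands and the 5% tolerance are assumptions of mine, not a standard:

```python
def kpi_status(actual, target, tolerance=0.05):
    """Green if on/above target, amber within tolerance below it, red otherwise."""
    if actual >= target:
        return "green"
    if actual >= target * (1 - tolerance):
        return "amber"
    return "red"

# e.g. a customer-satisfaction score vs. plan, rather than raw cases-per-hour
assert kpi_status(0.97, 0.95) == "green"
assert kpi_status(0.92, 0.95) == "amber"
assert kpi_status(0.80, 0.95) == "red"
```

The point is that the input is a business measure (satisfaction, profitability, value of new business) drawn from the analytics, not just a throughput count.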


Friday, November 24, 2006

Feature ticklist trap

A great post on Tyner Blain by Scott Sehlhorst comments on the Fifteen Ways to Shut Down (Windows Vista). This enormous set of ways to shut down a PC led Scott to talk about how software that has every feature under the sun to satisfy every requirement actually makes everyone unhappy. Quoting Joel on Software:

Inevitably, you are going to think of a long list of intelligent, defensible reasons why each of these options is absolutely, positively essential. Don’t bother. I know. Each additional choice makes complete sense until you find yourself explaining to your uncle that he has to choose between 15 different ways to turn off a laptop.

As I work with a well established product that has many customers, in a company that prides itself on customer satisfaction, the desire to provide functionality for every customer requirement and request is a challenge. There is little option in this environment but to implement the functionality, so it is fortunate that the development team have a huge amount of experience and handle this balancing act with the appearance of ease.

Even for a skilled team like this, the holistic approach that Scott talks about can be valuable to help ensure excessive complexity is not introduced to the software. He highlights ways to handle structured requirements that focus on the interaction by personas (categories of users), rather than directly trying to satisfy each requirement in turn:
Multiple requirements can lead to multiple, locally-optimized solutions or features. A holistic view needs to be taken to assure that we aren’t introducing too much complexity with these variations of similar features. Interaction design gives us a big-picture perspective to make sure we aren’t making our software too hard for our target users to use.

This persona-based design is valuable not just for making sense of a vast array of requirements, but in general so that business analysts and software developers can understand what it is they are proposing and developing, beyond the individual features of the standalone application, business process or composite solution. A while back I attended a Pragmatic Marketing course on Requirements That Work that introduced this really well, and was a really enjoyable, interactive classroom day. I'd recommend it to anyone who has to work with software requirements.

The advantage that many enterprise software applications have is that they are supported, configured and customized to meet the needs of their own user base. This enables IT departments to hide the unnecessary features and options from their users. Windows Vista, Office and Outlook don't have this luxury. They need to meet the requirements of a disparate, varied audience, with a range of skills and needs.

It seems that Vista needs to learn from the potential failings (and subsequent enlightenment) of the Linux world - that if a user doesn't understand an option or function they will ignore it, maybe select the default, then avoid using the software in future. The Debian OS gives skilled users everything, complete with the associated complexity; Ubuntu, which dubs itself 'Linux for human beings', gives 80% of users the defaults they would have selected anyway, and hides the rest. Debian is not well adopted by average desktop users. Ubuntu seems to be doing better in that arena.

Microsoft needs Windows Vista to be a success, and it will only be that if it is seen by users as a worthwhile step forward. But it needs to fight its way back out of the feature ticklist trap if it wants to ensure user adoption. I hope that there are not 15 ways to buy and upgrade the OS as well. If there are, I may never get to the point of selecting one of 15 ways to shut it down.


Monday, November 20, 2006

SOA for viewable documents

As customers extend their SOA strategies more and more, a question seems to be arising: is SOA a good fit for documents (like PDFs, TIFFs and Word docs) and other binary content? Of course an SOA powered by an Enterprise Service Bus (ESB), or some other mechanism for composing services, can handle binary data; it's just a question of whether it really makes sense to push all of this data through it.

My background is in document imaging. Imaging and high volume document management systems typically have built up extremely functional image and document viewing capabilities over the course of their often lengthy existences. These viewing capabilities were built with the following design criteria:

  • Responsiveness - how fast can a user be presented the specific information they are requesting so they can continue working.
  • Server performance - do not request more data from the server than is really required by the user to view the document. Don't send unnecessary resolution, color or pages, dependent on predefined user requirements.
  • Network performance - in the days before even 100Mb networks were commonplace, managing network usage was important. In large scale, or distributed implementations it still is.
  • Seamless presentation of multiple types - documents come in many different types, and for easy processing it does not make sense for the user to have to navigate different native applications, let alone deal with the load time for some of them.
  • Onion-skin annotation and redaction - the ability to mark up any type of document you can view is essential in some environments, without damaging the original document.
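The server- and network-performance criteria above boil down to: ask only for the page, resolution and color depth the user actually needs. A minimal sketch of that idea, assuming a hypothetical render endpoint and parameter names (no real product's API):

```python
from urllib.parse import urlencode

def page_request_url(base, doc_id, page, dpi=150, color="gray"):
    """Build a request for one page at reduced fidelity, not the whole document."""
    query = urlencode({"doc": doc_id, "page": page, "dpi": dpi, "color": color})
    return f"{base}/render?{query}"

# The viewer fetches page 3 of a scanned invoice at screen resolution,
# rather than pulling the full-resolution multi-page file across the network.
url = page_request_url("https://imaging.example.com", "INV-2006-118", page=3)
```

Inserting an ESB between the viewer and its repository has to preserve exactly this kind of fine-grained negotiation, which is why a dogmatic 'everything through the bus' rule sits so awkwardly here.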

Viewer technology was largely based on a thick-client paradigm to allow it to meet most of these requirements. Stellent (to be acquired by Oracle) offers a range of image viewer technology as its Outside In product line, which has been the backbone of many thick-client image viewer apps. Spicer offers a range of viewers, especially focusing on complex CAD formats. There are others as well, but their number is limited.

Even now, there are very few applications that can present thin-client views of documents and meet the previous design criteria. Daeja is one third-party Java applet that can be integrated to meet this type of requirement for image and PDF documents. Global 360 has powerful thin-client image viewing, annotation and capture to support its BPMS and Content products.

The thing with all of these options is that they have traditionally been designed to plug directly into their imaging repository, either through proprietary TCP/IP file transfers or, for the thin-client versions, as standard HTTP GET requests. And that is just for the image viewing; the upload of annotations was specific to the application. This does not fit SOA well, where organizations take the approach to extremes and insist that the ESB sits between all end-user applications and their servers, using pure SOAP web services.

Many of the advantages of these viewers come from the smart ways they access their servers to get the best performance. It seems like a poor use of resources to build an SOA layer between a viewer application and its related repository server just to enforce a dogmatic approach to SOA - unless it is really justifiable to be able to reuse any viewer technology (for which, as I say, there is limited choice), or to allow a single viewer to access any repository (a complex proposition to do well).

Even if ESBs can handle these types of binary files effectively and efficiently, I am struggling to see whether this is really a pragmatic approach to SOA. There must be some value to doing this that I have missed, probably based on my outdated background in this area.


Saturday, November 18, 2006

Customer relationship management from Bankwatch

Colin on Bankwatch has written a nice summary of CRM and how it relates to banks: Customer relationship management | Wikipedia

As he says:
The vision for CRM is entirely relevant for Banks, but is way ahead of either capacity to afford, or certainly capacity to implement due to disparate systems.
As Colin talks about in Its not about transactions, its about relationships!, CRM can provide better relationships between banks and customers, and this can lead to strong upselling of services over time.

This is the key point as I see it for CRM. For any vendor to be able to help customers feel comfortable across the many different interaction points they have, and the many different call center personnel they may speak to, the vendor needs to provide enough knowledge of the customer's profile for the interaction to go smoothly.

CRM can provide this, as well as ensuring that information across the many disparate systems that contain customer data is easily accessible and presented usefully to the vendor on demand.

Integration of systems is essential, and being able to make a good guess, in advance, of what is needed from which system will make any call center worker's job easier. BPM, SOA and CRM together can be very valuable if the technology helps the vendor's user, and doesn't just add more complexity to their job.

Thursday, November 16, 2006

Integration of BPMS and Portals

A recurring question is the integration of BPMS and Portals - how would organizations benefit from the interaction of these two technologies?

'Integration' is a word often avoided by Portal vendors. Vignette preferred the word 'surfacing', since it reflects more closely what is happening when an application's data is presented in the Portal: the chosen data or predefined portlet (a visual application component) is pulled through the Portal and 'surfaced' on the screen.

With a strong Enterprise Portal, formal integration is not required, just the configuration of what is displayed where. That said, when implemented as a precursor to a broader SOA platform, the portal often becomes the central meeting point for many system integration activities that provide composite applications through 'integration on the glass'.

Outside of this, Enterprise Portal value is sold on:

  • Providing end-users with a single, consistent point to interact with systems and find information
  • Enabling personalization of appearance and applications, to help users feel comfortable with the system and more rapidly find information relevant to them
  • Implementing a consistent technical and visual framework for integrating disparate systems and information services
  • Simplifying, delegating and enforcing administration of Intranet and Web Sites, taking the burden off IT and web designers
The value associated with Portals can be related to many systems deployed in organizations, including the BPMS. A BPMS can incorporate some of the features of a Portal into the native BPMS application, such as consistent presentation, delegated administration and componentized user interfaces - though the latter often demands web-page development.

A BPMS may benefit from being surfaced in a broader Enterprise Portal system, providing consistent access to BPM and the other resident information systems, potentially showing information from both on the same page. For example, this could provide the end-user with contextual information from a knowledge base, automatically displayed based on the type of process or activity he or she is working on. In flexible, browser-based BPMS applications there is of course no reason why this type of functionality could not be integrated by IT, but the value of the Portal is that once each system is integrated, it can be reused in other parts of the application without web developers getting involved.

The previous example implies that the most valuable way of surfacing a BPMS in a Portal is for the presentation of end-user work lists, individual work items and associated data. For users who work with BPM occasionally, for example for expense claim approvals, pulling the BPMS into a broader Employee Intranet may make it easier to find and use the system. At the other extreme, for heads-down users processing large volumes of work, or customer-service focused users, a demand for Portal presentation needs to be carefully assessed to ensure that it truly offers value and does not in fact hinder fast, effective and efficient interaction with the BPMS. Composite display and interaction may slow the system and detract from the user experience.

From another point of view, supervisors or administrators of BPM may benefit from the flexibility and user-driven configuration of applications available through Portals. Being able to simply configure the most important process analytics charts for his department may help a supervisor monitor the performance of his staff quickly and effectively and spot potential red flags. Being able to display this information alongside information from other systems enables a better contextual view of what is happening, and why.

Portal 'integration' can be approached in several ways:

  • Surface main web page components directly in the Portal
  • Use the Portal's application / portlet designer
  • Develop highly customized portlets using JSP
For example, Global 360's Java BPMS provides browser-based presentation components flexible enough to be surfaced directly in an Enterprise Portal. The portal's own portlet/application design tools can use the full BPMS Web Services or Java Client API to provide highly configurable displays of data from the BPMS in ways not envisaged by the original user interface. Finally, web developers can easily use familiar JSP page design to create standards-compliant (JSR-168) portlets for highly customized presentation.
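The middle option amounts to: pull work-list data out of the BPMS API and render it as a fragment the portal can place anywhere. As a language-neutral sketch (the data here is hard-coded, standing in for whatever the BPMS Web Services call would return; the field names and CSS class are invented):

```python
from html import escape

def work_list_fragment(items):
    """Render a work list as a small HTML fragment a portlet could surface."""
    rows = "".join(
        f"<li>{escape(i['case'])}: {escape(i['activity'])}</li>" for i in items
    )
    return f"<ul class='bpm-worklist'>{rows}</ul>"

fragment = work_list_fragment([
    {"case": "CASE-1001", "activity": "Assess application"},
    {"case": "CASE-1002", "activity": "Approve payment"},
])
```

Once a fragment like this exists, the portal framework handles where it appears and who sees it - which is exactly the 'integrate once, reuse everywhere' value argued above.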

True Enterprise Portals can provide a lot of value to an organization, and for the right scenarios a BPMS should be surfaced within a portal. The right scenario is typically not one where the BPMS is used for very high-volume human-centric processing or data entry, and the Portal is just being used to provide a consistent user interface and slightly simplified page layout capabilities. If a BPMS vendor insists on the importance of a portal for a high-volume processing application, treat that as a red flag: the BPMS that salesman is pushing is probably not really designed for your high-volume requirements, and is more likely designed to look pretty for managers approving occasional expense claims.

Technorati tags: BPMS Portal

Monday, November 13, 2006

Virtualization - more bits to break?

Virtualization, or Virtual Machines (VMs) hosting operating systems in production environments... My reaction to this is: why would you do it? Why run what is effectively an operating system emulator inside an otherwise decent operating system?

I understand the benefits of VMWare (and Microsoft's new Virtualization) type environments for demonstration, development and maybe even QA environments. And the VMWare player for 'safe' Internet browsing is a clever application of the technology. The main advantage here is that a VM image is a self-contained environment that can be copied as just a bunch of standard files from one machine to another, enabling easy backup and snapshotting of an environment at a point in time. This is great for QA testers and sales engineers everywhere. But in a production environment, where you want to squeeze every little bit of juice out of that expensive piece of hardware, I'm struggling with the idea.

I have heard IT shops suggest that VMware can be an incredible boon for them. Now they have the opportunity to really use large, multiprocessor servers, running many systems, each within its own self-contained 'virtual machine'. This provides some advantages:

  • Limit the resources an individual system uses, for both performance and licensing
  • Enable the deployment of otherwise conflicting applications on a single box
  • Provide a level of manageability of servers, being able to start and stop VMs independently

Each of these reasons carries some weight. I'm just not convinced that the overhead a VM-hosted operating system carries really helps organizations get the best out of their servers. My experience has been that a VM represents another layer of software to fail, either in the hosted operating system or in the VM environment itself - more moving parts means more bits to break. And the hosted VM/OS consumes resources on top of the host OS; unless this really is an insignificant amount, it seems hardware intensive. Of course, IBM has been doing this successfully for quite a while with dynamic logical partitioning (LPARs), which is tried and trusted (but also supported directly by the hardware).

So as customers ask me whether Global 360 products will be officially supported on VMware, I still ponder the question. Sure, we support our products on a range of operating systems, including IBM AIX LPARs. If you want to run the products on a supported OS, that is fine - we don't typically state what the underlying hardware should be, as long as it meets a minimum spec that any off-the-shelf hardware is likely to meet (of course, you need a bigger box to support 10,000 BPMS users!). The problem is that specifically asking whether we support VMware implies that we need to QA that environment as well, which is an expensive proposition - any software vendor understands the costs of adding additional platform support.

This is how I am thinking of addressing the VMware question. For systems that do not require specific hardware attachments (like scanners or optical jukeboxes) I'm starting to believe that it is safe to treat VMware as just another hardware platform. But I'm really struggling to find out, behind the murmur of requests: how many real production systems of a decent scale are out in IT-land running on VMware? Do they run well, all of the time? How well are they supported by the application vendors? Oh, so many questions!

Friday, November 10, 2006

Master Data Management - just another load of inaccurate data?

As I continue my offline delve into all things SOA and BPM, the concept of Master Data Management (MDM) keeps coming up. In the past I just assumed that MDM was about data warehousing and maybe even a grand vision for CRM - really just a way of collecting all of the information you have about your customers that is spread across a bunch of disparate legacy, COTS and home grown systems, and having it available whenever you need it. When you bring SOA and BPM into the MDM picture, things seem to get complicated.

Pre-SOA, the problem with disparate systems was that applications typically required swivel-chair integration, re-keying portions of data from one system to another, or at best occasional batch loads of information. A single accurate view of a customer's information rarely existed, and in any organization that has not replaced all its systems you will see how widely data is spread around. For a different example, when you start a new job, look at how many times you enter the same personal information onto different forms. Each form exists to simplify the entry of your data into a separate system, and the duplicated information on each is an insight into this swivel-chair IT world.

When BPM was brought into the picture, traditional workflow systems typically added to the issue by copying data into a process instance at the start and never sending updates back. Where workflow did use live customer data, it tended to extract the data from a point system, disregarding the reliability of that data and focusing instead on the simplicity of accessing that system, given the lack of other good integration mechanisms.

Customer data reflects our current knowledge of our customers, and should be affected by everything we do with them: every transaction that is made, every interaction that we have and any background tasks that are going on. If every system that records customer data for itself is not effectively synchronized with the others, even SOA is going to struggle to pull disparate systems into meaningful and accurate business services. To me this seems like a fundamentally unreliable piece of SOA: each service has to rely not only on the actions backend systems perform, but must also understand the data each system uses and how that may be inconsistent with another system used by the same service.

MDM provides a way to synchronize and pull data together from the underlying systems into a central place, and this consistent and current layer of master data does appear to have some value. I can also see that it is useful to be able to build new business logic on top of reliable master data, abstracted once from all the underlying sources of unreliable and disjointed data. This makes data reusable and new business logic easier to build and more reliable.
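
As a toy illustration of that synchronizing layer, the sketch below merges customer records from several source systems into one master view, field by field, preferring the most recently updated value. The system names, field names and the last-write-wins merge policy are all invented for illustration; real MDM products offer far richer survivorship rules.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MasterDataSketch {

    // One system's view of a customer: its fields plus a last-updated tick.
    public record SourceRecord(String system, Map<String, String> fields, long updatedAt) {}

    // Build the master record: for each field, keep the value from the most
    // recently updated source that carries it (a simple last-write-wins policy).
    public static Map<String, String> merge(List<SourceRecord> sources) {
        Map<String, String> master = new HashMap<>();
        Map<String, Long> freshness = new HashMap<>();
        for (SourceRecord r : sources) {
            for (Map.Entry<String, String> f : r.fields().entrySet()) {
                Long seen = freshness.get(f.getKey());
                if (seen == null || r.updatedAt() > seen) {
                    master.put(f.getKey(), f.getValue());
                    freshness.put(f.getKey(), r.updatedAt());
                }
            }
        }
        return master;
    }

    public static void main(String[] args) {
        var crm = new SourceRecord("crm", Map.of("address", "12 Old St"), 1);
        var billing = new SourceRecord("billing", Map.of("address", "9 New Rd"), 2);
        // The fresher billing address survives into the master record.
        System.out.println(merge(List.of(crm, billing)));
    }
}
```

Even this tiny example shows why the master layer is only as good as the feedback loop: if the CRM is never told about the newer billing address, the two systems continue to disagree underneath the master view.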

The problem is that I don't see how MDM helps SOA. SOA needs to work with disparate backend systems largely intact, benefiting from the logic they already provide. It should not be trying to replicate or rebuild the business logic in underlying systems, since if you are going to do that you might as well rip and replace those systems, not duplicate the logic in the integration layer.

To round it all out, SOA in combination with BPM needs to be aware of data inconsistency when using disparate backend systems. To be effective, processes and services should feed up-to-date data back into the backend systems that own it as a result of every process and service call. For BPM this requires strong data modeling and integration (with SOA interoperability) to prevent process data duplication - something we have not seen in traditional systems, but which I'm seeing more of now.

Maybe MDM can be useful as a component of SOA/BPM, but right now I'm struggling with how it doesn't just become another layer of data that disagrees with everything else in the enterprise.

Wednesday, November 01, 2006

"You've Got Work"

When put on the sharp end of a sales meeting showing some process management, it has been typical for customers to ask me whether the workflow engine notifies users through email that they have been assigned a new work item.

Although this shouldn't have been a tough question to answer, it always seemed to be an indicator of some deeper issue: notifying an individual user through email that he or she has been delivered a work item actually implies, to me (and perhaps other seasoned workflow professionals), that the user rarely gets new work items. In which case, is the BPMS I am proposing really the right solution for a customer who is hinting (with the email question) that he doesn't really need the capabilities of a low-latency, highly scalable, heads-down, production workflow product?

What was my answer (beyond 'yes, we do that')? To me, in structured business processes that are worked by teams of people, email as a notification mechanism is all wrong. In high-volume workflows, process instances (work items) should be delivered to groups of users that work in a pool, not addressed or delivered to individuals as email notification suggests. Pool working is essential to handle peaks in load, prevents the need to reassign work because an individual is on vacation for a day, and reflects the common skillset of the workers. For this model, email is the wrong approach.
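
The pooled-delivery model can be sketched in a few lines: work items are queued per skill group, and whichever worker in the pool is free claims the next item, so nothing is ever addressed to a named individual. Group and work item names are invented for illustration; a real BPMS adds prioritization, skills matching and audit on top of this.

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

public class WorkPoolSketch {
    // One FIFO queue of work item ids per skill group.
    private final Map<String, Queue<String>> queues = new HashMap<>();

    // Deliver a work item to a group, never to a person.
    public void deliver(String group, String workItemId) {
        queues.computeIfAbsent(group, g -> new ArrayDeque<>()).add(workItemId);
    }

    // Any available worker in the group claims the next item (null if none).
    public String claimNext(String group) {
        Queue<String> q = queues.get(group);
        return (q == null) ? null : q.poll();
    }

    public static void main(String[] args) {
        WorkPoolSketch pool = new WorkPoolSketch();
        pool.deliver("claims-coders", "WI-1001");
        pool.deliver("claims-coders", "WI-1002");
        // Two different workers drain the same pool; a colleague's vacation
        // day never strands a work item in a personal inbox.
        System.out.println(pool.claimNext("claims-coders"));
        System.out.println(pool.claimNext("claims-coders"));
    }
}
```

Contrast this with email notification, which hard-wires each item to one mailbox and so forfeits exactly the load balancing the pool provides.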

Of course, my background was in the high-volume and heads-down workflows of health insurance coding (data entry and payment decisions), financial services new account applications, and call center dispute and enquiry handling. In these environments the only way to process work effectively and efficiently is to use a BPMS that is designed to keep the work flowing through the system.

My background, until I took a short spell with a corporate "compliance and governance" focus, was not based on deploying systems to ensure delivery of my travel expense claim to my manager, then accounts payable. That requires a very different structure of workflow delivery. As an employee submitting a claim, my expense work item didn't follow a different logical path from anyone else's - it went from me, to my boss, to AP. But only my boss could do the initial acceptance of the claim, not another manager at his level. My boss was not a heads-down worker for administrative tasks. In fact, he would have been happier if he could have accepted my expense claim on his BlackBerry (and given how long it sometimes took him to get to a real PC, so would I!). There are many business processes that could benefit from workflow-automated delivery - not really for efficiency, but more for enforcement of policies and processes - that fall under this 'person to person' enforced delivery model.

Given this, is an enterprise, high-volume BPMS the wrong tool for processes like expense claims, SEC reporting and HR recruitment? In some cases the answer is yes - some BPMSs cannot effectively manage processes that must direct work to individuals rather than groups. They can do it, but it is a stretch for them to maintain the relationship between the originator of a work item and the next person to deliver it to: my expense claim must go to my boss, not any old manager. Customization is not really a good answer for what should be out-of-the-box processes.

Then there are some BPMSs that can do a great job of these personal delivery workflows despite their enterprise heritage - I use Global 360's Case Manager product for my expense claims. The weak link there is ensuring that my boss picks out his email notification from the mass of other stuff in his mailbox.

An email notification requirement can be an indicator that the workflow does not require an enterprise BPMS. But selecting the right enterprise BPMS brings the other positive features that come with a product of that class - IT essentials like scalability, high availability, failover, enterprise management, auditability, rapid deployment, flexible platform support, and integration with third-party systems. These are things that years of hammering in call centers and financial services mailrooms have taught enterprise BPMSs (and that it will take document review workflow products years to learn). Why should your HR or AP group be allowed to select a less dependable solution than an enterprise BPMS?

Saturday, October 28, 2006

Rules and Business Processes

It has been raining like crazy in Boston today, so I've had an excuse to sit and catch up on some blog reading that has been a little sidelined since I started my new job at Global 360.

So I was really pleased to run across Bruce Silver's analysis of Where Rules Engines and BPM meet. Business Rules Management Systems (BRMS) are something that I am getting more contact with now, and Bruce really summarizes them well:
In fact, the heart of a business rule management system is in the rule repository, which supports rule discovery, governance, versioning, traceability and reuse across the enterprise. The business rule engine executes business rules defined in a structured rule language supported by the BRMS rule design tool.

Corticon is the system I have been looking at, with its integrations with the Global 360 enterprise BPMS products. Before moving seriously into this space I believed that most of the value of a BRMS came from the rules engine and a user's ability to define powerful rules in a fairly understandable manner. Now I'm starting to understand the potential of the modeling, the central rules repository and rules 'lifecycle management'. This really allows customers to define their policies and rules for decision making and guiding processes in one place, enabling their reuse and consistent application across multiple processes.
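
The repository idea can be shown with a toy model: named decision rules are published centrally with versions, and every calling process evaluates the latest published version, so a policy change in one place takes effect everywhere. This is purely illustrative and is not the Corticon or Global 360 API; the rule name, versions and thresholds are made up.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;
import java.util.function.Predicate;

public class RuleRepositorySketch {
    // rule name -> (version -> predicate over a claim amount)
    private final Map<String, TreeMap<Integer, Predicate<Double>>> rules = new HashMap<>();

    // Publish a new version of a named rule into the central repository.
    public void publish(String name, int version, Predicate<Double> rule) {
        rules.computeIfAbsent(name, n -> new TreeMap<>()).put(version, rule);
    }

    // Every calling process evaluates the latest published version,
    // so a policy change here is applied consistently everywhere.
    public boolean evaluateLatest(String name, double amount) {
        return rules.get(name).lastEntry().getValue().test(amount);
    }

    public static void main(String[] args) {
        RuleRepositorySketch repo = new RuleRepositorySketch();
        repo.publish("auto-approve", 1, amt -> amt < 100.0);
        repo.publish("auto-approve", 2, amt -> amt < 250.0); // policy relaxed
        // Both a claims process and an expenses process now share version 2.
        System.out.println(repo.evaluateLatest("auto-approve", 200.0));
    }
}
```

The interesting part a real BRMS adds on top is everything around this map: discovery, governance, traceability and the tooling that lets business users author the predicates.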

A BPMS can really benefit from the rules engine and centralized rules repository during execution of processes. Pulling the remaining rules management into place alongside process management is more of a challenge than just integrating the process and rules engines, but it is essential to ensure a seamless enterprise view of the way IT systems manage how the business runs. Pulling systems together for seamless modeling is always hard - especially when much of their value can also come from working independently. This is something I need to spend more time concentrating on.

Thursday, October 26, 2006

Consider policy over process

It's not often I will directly 'advertise' vendor-specific events or information, but in this case I found that this webinar, pointed out by Sandy Kemsley on her Column 2 blog, was a great reinforcement of why I made the move to Global 360.

Stories that relate how large brand names like Nike do things better using a company's technology are OK. When they also communicate the best practices employed and lessons learned, the story becomes valuable to a wider audience. In this case, as quoted by Sandy:
Their lessons learned:

  • Consider policy first, then process, then people, then tools
  • Get senior management buy-in
  • "Eat the elephant one bite at a time" -- this is so key, and something that I've written about many times before: do something small as quickly as possible, then add functionality incrementally
  • Rent experts -- how can I disagree with this? :)
  • Leave the rocket scientists at home -- in other words, it's not as complicated as you think it is; keep it simple
  • Build a team that you trust and have confidence in -- provide direction and support, listen to what they need, and stay out of their way

I agree with the points, despite having to think through the first one a few times before it really sunk in. After a moment, though, I realized the obviousness of it. This really is just a drill-down from high-level to low-level consideration:

  • Policy: defines the goals of the business and the top level constraints / approach for how to get there

  • Process: starting at a high level and working deeper, this reflects how to model, track, manage, enforce and run your business to meet your policies

  • People: now that the process is defined, the right people can be assigned to handle the tasks needed to run the process well. Modeling a process around the people and what they do now is 'paving the cowpath', and just automates bad processes that do not help achieve the business' policies.

  • Tools: as ever the best technical decisions are made last - pick the technology and vendor because they help achieve your goals, not just because they carry the currently fashionable tags.

It all sounds obvious, but often organizations miss that first stage of setting policies, goals, objectives etc. and jump straight to 'process'. That gives no frame of reference, no view of what 'success' is, and no view of how much to actually spend on solving the problem. With the right policies in place, the next lesson learned, "senior management buy-in", has probably already happened.

Sunday, October 22, 2006

Next Generation BPMS - does it need CRM?

The Bankwatch blog has been churning out a lot of content that is really interesting to me at the moment, thanks to Colin (where do you get the time for all this stuff!?). Today's post, Forrester Research: CRM Market Size And Forecast, 2006 To 2010, uses the quoted research as an entree into a discussion of why CRM is unlikely to really improve customer satisfaction in call centers and, presumably, other customer-facing environments:
Bottom line, today the text based notes stored within CRM systems are relatively useless. The time taken to read them during each interaction nullifies much or all of the benefit, depending on the speed reading capabilities of the agent/CSR. I am starting to see the next generation of thinking on this, and its going to be controversial, because of the enormous spend on CRM to date.

Colin's comments come along just as I'm looking at the "Next Generation BPMS" - not BPM 2.0, which as described by Ismael Ghalimi has a central theme of standards-based modeling, simulation, analytics and of course process execution. I'm thinking of an all-encompassing business suite that incorporates not just process and all of its supporting capabilities, but also content, integration, collaboration, security, identity, a metadata repository and a host of supporting admin and configuration features. Should CRM be a component of a Next Gen BPMS?

James, who writes the Enterprise Decision Management (EDM) blog, often touches on CRM for customer service. A recent post talks about the use of EDM or BPM for call routing in a call center. Both EDM and BPM can make use of customer data or profiles to help decide how to route a call most appropriately. In this environment the customer data would probably be held in the CRM, but it is only the customer data that is used for automated decisions; many facets of CRM, including the text-based history of customer interactions, are not required. Maybe a relational database schema would have been sufficient to capture this data.
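
A quick sketch makes the point that only a few customer data fields are needed for the routing decision, with no CRM interaction history involved. The profile fields, issue types and queue names are all invented for illustration; the decision logic is the kind of thing an EDM or BPM engine might host.

```java
public class CallRoutingSketch {

    // The handful of customer data fields the routing decision needs.
    public record CustomerProfile(String tier, boolean openComplaint) {}

    // Pick a queue from profile plus issue type; no free-text notes required.
    public static String routeCall(CustomerProfile profile, String issueType) {
        if (profile.openComplaint()) return "retention-specialists";
        if ("gold".equals(profile.tier())) return "priority-desk";
        if ("billing".equals(issueType)) return "billing-team";
        return "general-pool";
    }

    public static void main(String[] args) {
        System.out.println(routeCall(new CustomerProfile("gold", false), "billing"));
    }
}
```

Everything this decision consumes would fit comfortably in a couple of relational tables, which is exactly the question being raised about how much of CRM is really needed here.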

I have looked at BPM vendors that have addressed this question in the past. Pega obviously shapes its process offerings around its CRM background. As I understand it, they have a strong (although not always rapid to define) object model that represents the customer data, supported by a process engine. The analysis and definition performed is typically approached from this standpoint, rather than the process standpoint typically attacked by other BPM vendors. Activities and display are driven from the data, so this is important, and it makes customer data central to their focus.

Way back, pre-TIBCO, Staffware acquired a CRM vendor. When I saw it (pre-Vignette, when Tower Technology was a leading Staffware partner in the UK), the CRM component was not really integrated with the workflow and it was hard to visualize how the two were going to work together effectively. Looking at TIBCO's website it is hard to see if the CRM technology exists in any of their offerings. Perhaps some of it was fully embedded into the iProcess engine. If that is the case, it is more the relational data model that is important, not the full-blown CRM capability.

At this stage in my Next Gen BPMS definition, CRM is not a component. Instead I'm going with a BPMS that provides flexible case folders supported by definition of and access to relational data, internal or external to the system. This BPMS allows me to represent any data I want within a standard relational DB or external system, using the data directly for process execution and displaying it seamlessly within a configurable user interface.

I'm open to offers on whether there is any more of CRM than a relational data model that I need. Anyone?

Friday, October 20, 2006

Integration alongside BPM - a comment

Colin who writes the great Bankwatch blog commented on my post yesterday about Integration alongside BPM:

RE: "Once processes or integration requirements become complex the initial integrations may become a larger effort"

Not being the technologist, but isn't that where web services, and middleware are supposed to layer the problem into manageable, similar processes.

Colin, I've read your stuff - you get technology better than most! Sorry for reusing your comment in a post; I just thought it was important to follow up for the masses (well, I can dream!). Rereading my post I can see why Colin makes his comment: I seem to imply that there is no solution to the problem. So I'll try to clarify...

BPM, document management/archiving and collaboration are integral to most real-world business processes. Many BPM/ECM vendors provide each of these components standalone, and even within their own suites offer only toolkit-based approaches to integrating them. From my experience, this routine, infrastructure-level integration is what absorbs a lot of project time when implementing business processes. That is why I would recommend using BPMS suites that seamlessly incorporate all the parts. Full disclosure: I have worked for two vendors (including my current employer) that provide this, and I am still impressed by how fast real processes can be deployed when you don't have to redo these basic integrations every time you deploy a system.

Colin is also right when he states that middleware can help break the integration down and make it maintainable and more reliable. To my mind, though, Web Services are just an enabler of this: they don't simplify the application integration at a business level, they just provide a standard mechanism for integration at each of the touchpoints. That is why I don't associate SOA directly with Web Services. SOA is what can really break integration down into manageable and maintainable chunks.

The best that most BPMSs can do now is to talk to the SOA 'middleware' or 'enterprise service bus' (ESB), or maybe straight to the exposed Web Service endpoints provided by an SOA project. This final approach is fast, but again feels a little brittle when it comes to changes. And the middleware or ESB requires integration with the BPM process - another layer of integration is born.

I'd like to see this scenario improved, with BPMS and SOA working side by side without the integration of yet more infrastructure pieces! Maybe it's out there. I think that most of what is there is marketing fluff...

Thursday, October 19, 2006

Integration alongside BPM

Integration of third-party business applications and data into Business Process Management (BPM) applications is essential, since very few real world business processes run standalone. Users need to apply their knowledge to information in their line of business system, making decisions and entering data that is reflected in the process. Automated transfer of data between BPM and business systems aids automated processing.

Most BPM Suites offer some level of integration capability, usually being able to talk SQL to a database or configure a call to a Web Service within a process activity. For UI integration of business applications - avoiding swivel-chair integration by users while presenting a seamless user interface for the best productivity and experience - developers are called in and API bonding commences.

For a few points of integration at a handful of points in a process, this 'in-process' and 'in-UI' integration is bearable and maintainable. Once processes or integration requirements become complex, the initial integrations may become a larger effort than deploying the process itself, and maintenance becomes difficult, since every change to any component of the system carries dozens of dependencies on the process, data models and hard-coded integrations of multiple other components.

With a pile of separate BPM, content, database and business components I don't know of any real way of avoiding this. If you know of great examples of disparate platforms that really do this well, let me know.

So in my view it makes most sense for IT teams to make things easy on themselves to start off with. Using vendor products that provide as many of the key process, collaboration and content features within a seamless product can avoid the most common and repeatable requirements for integration. Why do unnecessary integration?

Then make use of Web Services or messaging components like JMS and MQSeries to decouple process from application wherever possible. Finally, ensure that the BPMS UI provides components that can be built into custom applications, JSPs or portals, and work hard to develop a flexible integration, not a pile of brittle spaghetti code.
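
The decoupling step can be illustrated with a minimal sketch, using a `BlockingQueue` as an in-memory stand-in for a JMS or MQSeries queue (the message content and queue role are invented for illustration): the process engine side fires its message and moves on, while the application side consumes it on its own schedule, with no compile-time dependency between the two.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class DecoupledIntegrationSketch {
    // Stand-in for a JMS/MQSeries queue between process and application.
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // Process engine side: enqueue the message and carry on with the process.
    public void send(String message) {
        queue.add(message);
    }

    // Application side: consume when ready (blocks until a message arrives).
    public String receive() throws InterruptedException {
        return queue.take();
    }

    public static void main(String[] args) throws InterruptedException {
        DecoupledIntegrationSketch channel = new DecoupledIntegrationSketch();

        // The consuming application runs on its own thread, decoupled in time
        // from the process step that produced the message.
        Thread application = new Thread(() -> {
            try {
                System.out.println("application handled: " + channel.receive());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        application.start();

        channel.send("updateAccount:4711"); // the process step completes immediately
        application.join();
    }
}
```

Swapping the in-memory queue for a real JMS destination changes none of the producer or consumer logic, which is exactly the flexibility the messaging layer buys you.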

It is possible, and I've seen it done. It just takes a strong foundation and pragmatic steps. If you have great examples, case studies or supporting integration approaches, please let me know.

Saturday, October 14, 2006

How to web 2.0 your bank - use technology well

Colin at Bankwatch posted a great (and, even he admits, broad) discussion of How to web 2.0 your bank. At first I thought that maybe this could be true for any organization with an online presence, but thinking again I believe that banks are one of the few categories of business that have provided really pervasive online interaction with their customers. As useful as it could be, I rarely interact online with other organizations that I have a long-term customer relationship with.

To really allow banks to run a successful online service - especially a web 2.0 one that provides more interactive, real-time, self-service applications - the infrastructure not only has to support access to the appropriate backend systems, but needs to pull the bank's employees into the interaction in a manageable and auditable manner. SOA, BPM and rules, often deployed in the back office, need to be pulled closer to the main point of interaction with the customer. This would enable more automation of decisions, more sharing of customer information between employees in separate customer service centers when an issue can't be handled in one place, and more feedback to the customer on the status of processes that can't be completed immediately.

By using technology more effectively, the community interactions that organizations like banks are afraid of in the web 2.0 world could actually be more valuable and less damaging than online groups complaining about poor customer service and unintelligible call center operators.

Friday, October 13, 2006

Customer focused product management

As with any new job, the first few days are exciting as well as 'brain crunching'. I'm suffering the usual thing that everyone probably experiences, where I'm trying to get my head around a dozen new products and modules, a hundred new faces and names, and thousands of TLAs (three letter acronyms).

Fortunately I'm able to cope with the product side of things better than the first time I did this, 8 years ago, when joining Tower Technology. Back then imaging and workflow were a mystery to me, and even now I look back and wonder why on earth it took me so long to understand what it was all about. I think I was introduced to the concepts from a pure technology background, so the actual reasoning behind the whole imaging and workflow space made little sense. This time around I have the 8 years of Tower and Vignette to help me get my head around the products faster.

I'll be product managing Global 360's G360 EX and G360 Case Manager product lines, working closely with the other product managers, product marketing managers, engineering teams and so on. My first impression is that product engineering is really well run by the engineering directors, hopefully giving me some time to look at the more strategic side of the role while juggling the numerous customer-related requests that are going to come through the door. Certainly, the customer piece is going to be especially important at Global 360, since the company is organized to provide a very high level of personal customer service. Quickly understanding, distilling and communicating all of the requests, issues and questions will be a key part of my role. Bring it on - these are exciting times!

Hopefully I'll be able to blog a little about some of the BPM and case management issues that strike me as important over the next few days and weeks. With hundreds of financial services customers, New Account Opening will likely get covered, but you can also expect government, healthcare and industrial discussions as well.

Tuesday, October 10, 2006

Reflecting on ECM

Russ Stalters at BetterECM was good enough to write me an email wishing me good luck in my new role at Global 360, and asking for some reflection on how ECM vendors like Vignette might survive the Microsoft Office SharePoint Server (MOSS) 2007 opportunity/threat.

Here are the thoughts I shared with Russ; with a little editing they seemed good enough to make it as a blog post. None of these thoughts represents an insider view of what Vignette is doing - they just represent what seems like good practice to me.

I truly believe that Vignette has some great technology. Unfortunately, like any independent software vendor that has acquired a lot of technology they probably have too many products in the portfolio to keep them all ahead of the game - the investment required to retain leading technology in every product through R&D is beyond what Wall Street wants to see from independent software companies. I would suggest that ECM vendors may need to rationalize what they offer by repositioning some products into a support role for the primary products (leaders) in the portfolio, thus not needing to compete directly with the competition on feature/functionality on every product.

Open Text, for example, may struggle to keep alive the multiple products offering the same capabilities, let alone the second tier of unique products they have. Leading with every product into a vast set of sales opportunities does not seem smart to me. But having that huge range of capabilities in supporting products in the portfolio will allow deals to be won where a wide range of boxes have to be ticked.

Selling infrastructure is not going to work long term, although I think an Enterprise play is still something Microsoft will struggle to convince CIOs of in the short term. I wrote a post a while back (in response to one of Russ's) that addressed this: ECM in a post Microsoft World. That was before the FileNet acquisition, and what I believe is IBM's play to capture the whole enterprise infrastructure market: IBM platform for massive systems reengineering.

I think that smart independent ECM companies with large portfolios need to look at what they do best and focus on that. Too much time is often spent trying to chase every deal. The approach I would choose is to find a market segment where they already have specific experience and expertise, and can offer solutions that are a good fit for the business requirements. Hyland is still playing well, with a strong and effective focus on healthcare, and Stellent seems to be doing well in compliance and governance. The sales of these solutions probably make them more money than trying to sell technology alone.

Finding ways of embedding ECM software into organizations where others have failed to penetrate is likely to be the next wave of ECM success. This will only be done (I think) if vendors can find a way to make it happen, and I believe a strong business requirements focus (as opposed to a technology focus) is essential; in the past the technology sale has not got ECM much further than the website, plus focused operational installations when led by IT.

I'm looking forward to watching the ECM vendors as more of an outsider. My view is that BPM is core to really addressing business problems in many organizations and has great potential to become more pervasive. Content and BPM technologies are complementary, so I'm sure that I'm going to enjoy my future in Global 360.

Sunday, October 08, 2006

Welcome to a new era

Things have been a little quiet for me on the blog scene as I've been trying to get myself together. I resigned my post at Vignette as a Solutions Architect last Wednesday. I had been part of the company since March 2004 when Vignette completed its acquisition of Tower Technology. I met some great people and worked on some interesting projects while I was there. As much as I was enjoying the challenge of working in the Industry Solutions group in the company, the innovation and excitement of the key solutions I had focused on had passed. It was time for a new challenge and my final day was Friday. The company has some great technology and talent and I wish everyone luck in the future.

Fortunately, I have been talking about an opportunity with a company outside Vignette since early July. After a long haul with visa applications, and a huge amount of patience demonstrated by the company (especially Ben Cody and Steve McDonald), I will be starting as Director of Product Management for Global 360 on Wednesday. I'm extremely excited to get myself fully embedded in pureplay BPM (something Vignette was lacking) while being able to leverage my experience in ECM and case management.

This is going to be a great new challenge for me, and so far Global 360 has lived up to its reputation and desire to be a great company to work for, even before I get going. I know that they have seen a lot of change over the last few years, so I'm looking forward to driving further change and growth.


Tuesday, October 03, 2006

Microsoft and EMC - Documentum is just an archiving hub?

By now, most people interested in the ECM space have seen the press release that EMC Documentum and Microsoft announced an alliance. Guy Creese on Pattern Finder suggests that EMC (like other vendors) is getting out of the way of the Sharepoint train:
Given the viral adoption of SharePoint 2003, and the new architecture of SharePoint 2007, the major ECM vendors are realizing that they can no longer dismiss Microsoft as a company that just generates documents.

As OpenText, Vignette and EMC Documentum look at Sharepoint 2007 and how it is effectively bundled with Office 2007, they have all probably realized that they need to evolve to either embrace Microsoft or find a new way to package their products. The huge marketing budget that MS is bringing to its ECM offering is an allure, as well as a threat, I'm sure.

Russ Stalters on BetterECM suggests that EMC Documentum should extend the integration to workflow-enable Sharepoint documents, which sounds reasonable given the need for strong workflow/BPM to support Sharepoint, although this may be a little shortsighted as MS enhances its document workflow capabilities. It obviously plays to Documentum's strengths in its raw form, though.

I'm not surprised that EMC have stepped in here, but I don't believe that they are really trying to sell more Documentum software on its own merits. Documentum provides a single point of integration to hook directly into a bunch of heavyweight storage boxes, which is probably the reason EMC is most interested in the partnership. Basically, Documentum becomes little more than a 'hub' for mediating storage to a range of EMC devices. But then, as the press release states:
As part of this strategic alliance, EMC will introduce a set of new content and archiving products that enable tighter integration between the industry-leading EMC Documentum ECM platform and Microsoft solutions and platform technologies.

"Archiving" is a typical word used when the value of the content is minimal and a company is just looking to store huge amounts of it. If this is how EMC really sees Documentum + Microsoft, it is a perfect play for them in their role of tin merchant.


Adoption v Selection of BPMS

My little series of posts last week on document classification and tagging seemed to get in the way of a lot of other relevant blogging I could have been doing. As ever, some of the most interesting blogs I read were not really that time sensitive, but instead did a good job of revealing an individual blogger's experience at that moment in time.

The Workflow Blog talked about one of these experiences in "the look of workflow":
The interesting thing about this project was not the hours spent creating working workflow applications, but the weeks making things look pretty. Or in some cases not that pretty but to customer specifications. As a techie I am still struck by how most users would much rather have something semi-functional that looks pretty as opposed to something that actually works. I think in the end adopting a workflow system is not a rational decision based on return on investment but rather still an emotional decision.

When I first read this I thought that the comment on ROI surely couldn't be right. After all, every vendor gets pushed for the ROI they can offer, to help the internal project team further justify a hugely expensive IT project.

Thinking more deeply, it seems that the ROI between one deployed and running BPMS and another surely cannot be that different. The thing is that much of the cost of a BPM project comes from analysis, design, and deployment upfront, followed by a bunch of similar running costs. The TCO of mature systems can't differ enough to make a huge dent in an ROI-based justification, so emotional decisions start to play.

Re-reading the blog quote above, neither the adoption nor the selection of a BPMS is actually based on ROI. It is users that adopt systems, choosing through their own cooperation to work with a new system and work around its minor issues, or choosing to be inflexible and make the project and the adoption of the system fail. As the Workflow Blog goes on to suggest, it is the emotional piece that vendors typically influence through "prettiness", and this affects both adoption and selection:
However as more and more companies start competing head to head it will not be the feature set that customers will look at in making decisions. It will not be support. It will just be, is it pretty to look at and easy to use. I am not sure any analysis by Garter et al really comes close to capturing this information.
Adoption is an essential thing to bear in mind when selecting a system. Every BPMS client application can be made pretty and functional though, so beware of being influenced purely by skin-deep beauty. The real ROI that will make a BPMS a success can only be judged by how it affects the bottom line, and how quickly. The differentiation between systems comes from the day-to-day improvements to an organization's operations, how adaptable the system is to changing processes, and even more importantly whether it can react and optimize the processes under different circumstances, like sudden shock loads and extended excessive demand. If you can manage user adoption by meeting the usability requirements, the deeper capabilities are what provide true business improvement.


Monday, October 02, 2006

Business process mashup - effective but maybe brittle

On the IT|Redux blog, Ismael blogs about Enabling Complex Workflows with Office 2.0. Here he presents his experience of connecting a series of standalone online applications, to provide a useful end to end business process.

The aim of Ismael's process was to automate the "lead to cash" process for conference sponsorships through the use of purely online services. As Ismael says himself, this isn't the most complex workflow he has seen, but it has enough complexity to make it a useful illustration of what many people would just try to do manually because of the effort of setting it up with a traditional BPMS.

I'm impressed with what Ismael has working. For this level of complexity, what he has done is probably more effective and appropriate than modeling and enforcing the process in Excel, as would be done by most organizations. At the same time, plugging components together end to end without a central point of orchestration or tracking makes the implementation feel brittle. This is not a reflection of the individual tools; it is more that I prefer to see automated business processes coordinated from a central point. In any mashup where loosely coupled components pass messages from one to the next, if one component breaks or fails to pass a message on successfully, the process will not complete, is unlikely to be recoverable, and the failure will not be noticed unless a human goes back and checks everything end to end.
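To make the brittleness point concrete, here is a minimal sketch (in Python, with invented step names; this is not any product's API) of what a central coordinator adds over fire-and-forget chaining: each step's outcome is recorded in one place, so a failed hand-off is visible and the process can be resumed rather than silently lost.

```python
class ProcessInstance:
    """Toy central coordinator: runs steps in order and records status."""

    def __init__(self, steps):
        self.steps = steps   # ordered list of (name, callable) pairs
        self.status = {}     # step name -> "done" | "failed"

    def run(self):
        for name, step in self.steps:
            if self.status.get(name) == "done":
                continue     # already completed on a previous run; resume
            try:
                step()
                self.status[name] = "done"
            except Exception:
                self.status[name] = "failed"
                return False  # halt; the status map shows where it stopped
        return True

    def failed_steps(self):
        return [n for n, s in self.status.items() if s == "failed"]
```

In a chained mashup, the equivalent of `self.status` does not exist anywhere; here, calling `run()` again after fixing the broken service simply retries the failed step and continues.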

I strongly believe in the importance of approaches that enable non-business users to implement business processes, and with the added complexity of integrating multiple components this is a challenge. Selecting the right tool for the job is important. In this case I might have chosen a system that incorporated more of the components that Ismael needed in a single package, to reduce the need for mashing up so much stuff. By using a system that already had seamless Forms, Process Management and Email, you could produce a less brittle system, hopefully with less effort. There are no Office 2.0 tools out there to do this at the moment, but there are vendors that offer Case Management tools, targeted at rapidly implementing just this type of business process.

It sounds like Ismael put this together partly to prove a point, and in doing so he has demonstrated that online Office 2.0 tools offer a decent approach to automating processes that others would have just run in Excel. That is a great step forward!


Friday, September 29, 2006

Converging classification schemes of documents and records - Part 5 - Tagging and pulling it all together

All this week I have been exploring some of the approaches that enable the classification of documents and records in Electronic Document and Records Management Systems (EDRMS), with the aim of being able to see how the convergence of all of these schemes may lead to a more usable classification mechanism, as we have seen emerge with tagging on the Web. I have covered:

Combined classification schemes

During the discussions, I have continuously reiterated that each of the classification schemes described is not complete on its own. Structured indexes are great for business-process-driven document management, like my insurance claims example; full text indexing is great for unstructured, work-in-progress documents and documents that are fully described by their content; fileplans, the staple of traditional records management, are great for organizing documents into big buckets representing the business activities around them; thesaurus-enforced keywords provide taxonomic classification for items related to a specific domain.

All four approaches are in fact used together in EDRMSs, to provide a fairly complete classification of the documents for recordkeeping purposes. The structured index metadata captures a lot of information about the description, authorship, ownership and status of a record. A component of this, a single item of metadata, may capture multiple keywords driven from a controlled thesaurus enabling consistency in domain classification of the records. In traditional recordkeeping environments, the fileplan will provide a further big-bucket classification and may drive high-level security and retention. The fileplan may also drive the records management for physical documents and assets within the same environment. By far, most of the information is added by records managers at the point of declaring documents as official records (the definition of what makes a document a record was in the first post of this series).

This is a lot of metadata and classification to capture, based on the original business use, future retrieval requirements and the content of the document. If documents have been electronic throughout their life, or at least prior to becoming records, often there will be collaborative tools and document management systems that have also captured their own metadata along the way. Much of this remains valuable to the business users who retrieve documents to do their jobs, even after documents have been passed over for records management. Using this business information without having to duplicate it or the documents is a challenge. Vignette has some great approaches to this business document/metadata and record/metadata integration problem. They exceed the standard mechanism of a separate records management system referencing or copying documents in document repositories or filesystems, which risks duplication, damaged data and broken custody.

Business users can't be trusted to classify records

Part of the reason for this week-long discussion of records management has been to get to the point of understanding why business users can't be trusted to classify records. Partly, this is because many business users can't even be trusted to store documents in a secure place without some carrot and stick persuasion. The feedback from users has been that even if they store documents, classifying them is a time consuming process that seems complex - being faced with a long form of data to fill in feels like completing your tax return.

Even if business users were inclined to store records, their business is not to understand the details of records management to ensure effective, consistent classification. So is there a halfway house?

Tagging for user friendly classification

Many Web 2.0 sites use tagging to help random Internet users find documents, blog posts, URL bookmarks, photos, videos, music and whatever else may be put out there by contributors. Contributors have a vested interest in tagging their content effectively, since decent tagging is likely to lead to more viewers.

Although the information contributor attaches may not meet the Dublin Core standard element set, the information he provides is likely to provide a fast and concise classification of the content, enabling it to be found by users when a full text Google search may not help.

Tags are just keywords that describe the content, but they typically don't come from a thesaurus, instead being added by the contributor to meet his current requirements. Taking a look at my Technorati tag cloud, you can see how there is a mass of tags that perhaps have only been used once, and may never be used again. From a classification point of view within the context of this site, these one-off tags probably reduce the value of the cloud for users trying to find relevant posts. From an external viewpoint, an anonymous user searching Technorati for interesting blog posts may find the pointedness of this classification valuable. If the user searches explicitly for tags like taxonomy+indexing, he or she gets back 2 blog posts that are identified as being relevant to those tags. Searching blog content with taxonomy+indexing leads to 452 posts, many of which may just mention each word once in the course of the text.
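The difference between the two searches comes down to a simple set operation. This toy sketch (the post data is invented) shows why tag search is so much more pointed than full-text search: a post matches only if the contributor deliberately attached every query term as a tag, rather than merely mentioning each word once somewhere in the text.

```python
# Invented example data: post id -> the set of tags its author applied.
posts = {
    "post-1": {"taxonomy", "indexing", "records"},
    "post-2": {"bpm", "workflow"},
    "post-3": {"taxonomy", "indexing"},
}

def tag_search(query_tags):
    """Return post ids whose tag set contains ALL of the query tags."""
    wanted = set(query_tags)
    return sorted(pid for pid, tags in posts.items() if wanted <= tags)
```

A full-text engine would instead match any document containing the words anywhere, which is why the same query yields hundreds of mostly incidental hits.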

The Wordpress blogging tool goes a little further, providing predefined categories for tagging posts. This enables some consistency to the blog categorization, useful for more focused blogs, like my (personal) pet project, Bruncher. Here, restaurants and diners that serve brunch are categorized with a predefined set of tags based on their location (e.g. MA, Boston), style (bar, diner), and most importantly how good a Bloody Mary they serve (0 - No Bloody Mary in sight, up to 4 - The best!). Wordpress allows custom tags to be applied to any post, but within this fixed domain of interest the fixed categories seem to work well and further tags may not provide any value.

Tagging for business users

In the business world, users are unlikely to be working on documents that require user-friendly tagging while also having a very narrow domain or fixed set of tags, otherwise this whole discussion would be a non-issue; classification would be fast and easy. Therefore it seems that the freeform tagging technique may be a good start for providing some contextual information about the documents they have produced. The accuracy of tagging is not enforceable, since it is completely freeform, but maybe it provides enough information to enhance document search and retrieval beyond pure full text indexing prior to a document becoming a record.

How can this be used to enhance record classification? A records manager could choose to use some of these freeform tags as keywords that they further refine at the point of filing. Alternatively, the records managers could collaborate with business users to provide a very limited set of fixed keywords for typical documents that translate directly to the records management environment, supplemented by freeform tagging to actually represent the context of the document.
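The halfway house described above can be sketched in a few lines. This is purely illustrative (the vocabulary and synonym map are invented, and in practice a records manager would maintain and confirm them): freeform tags are kept as-is for search, while any that match the controlled vocabulary are promoted to candidate formal keywords.

```python
# Hypothetical controlled vocabulary and a records-manager-maintained
# synonym map; both are invented for illustration.
CONTROLLED = {"invoice", "claim", "contract", "policy"}
SYNONYMS = {"bill": "invoice", "agreement": "contract"}

def refine_tags(freeform_tags):
    """Split user tags into candidate formal keywords and leftover tags."""
    keywords, leftovers = set(), []
    for tag in (t.strip().lower() for t in freeform_tags):
        tag = SYNONYMS.get(tag, tag)      # normalize known synonyms
        if tag in CONTROLLED:
            keywords.add(tag)             # candidate for formal classification
        else:
            leftovers.append(tag)         # retained as contextual tags only
    return sorted(keywords), leftovers
```

The point is that the business user never sees the controlled vocabulary; they tag freely, and the mapping happens at (or before) the point of filing.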


Records management is a highly involved discipline, requiring specialist records managers to ensure the consistency of record classification, so that records are retrievable and are retained according to corporate policies and legal mandates. At the same time, there is a huge amount of electronic information being produced by the business that should be kept as records. Reusing as much of the business metadata as possible is essential to keep records management resources efficient and scalable.

Tagging is a form of classification that seems to be acceptable enough to Web users that it could be applicable to non-threatening corporate document classification by business users as well. A combination of predefined categorization and freeform tagging may not only help users searching for documents find them prior to record declaration, but also assist records managers (and maybe automated systems) in the formal classification.

This is the end of this series of records management classification posts. I hope that some of the reference information will be useful to people over time, and maybe some of the newer ideas will trigger some discussion. I have many thoughts on corporate document tagging, none of which, to my knowledge, have been proven in practice in a corporate setting. I'd love to hear of any examples.

Thursday, September 28, 2006

Converging classification schemes of documents and records - Part 4 - Keywords, thesaurus and tagging

For the last few days I have been posting about some of the approaches that enable the classification of documents and records in Electronic Document and Records Management Systems (EDRMS), with the aim of being able to see how the convergence of all of these schemes may lead to a more usable classification mechanism, as we have seen emerge with tagging on the Web. So far I have covered:

In this post I want to look at the thesaurus as an approach to help users in accurately applying metadata, since I believe that this is getting very close to the Web tagging paradigm.

Thesaurus / Controlled Vocabulary

In records management, a thesaurus or controlled vocabulary is used to help records managers apply metadata to records that is more consistent and falls within a recognized taxonomy.

In yesterday's post I talked about taxonomies as being a way of classifying documents according to a predefined scheme. This has the advantage of guiding users to pick from available and recognized items when identifying their documents. A fileplan is a specialized form of taxonomy that provides a representation of the business or filing structure that documents relate to, and also provides some extra notation to assist in efficient filing and retrieval (the filecode).

A thesaurus is a specialized way of representing a taxonomy. It is used to add identification metadata to documents along a specific classification dimension - it is limited to a specific domain or topic, not intending to fully define the record.

A language thesaurus organizes the English vocabulary and defines relationships between 'literary' words within it. A records management thesaurus focuses on a specific domain or type of activity (rather than the whole language), laying out a set of acceptable words or keywords that make up the vocabulary and defining the relationships between them. A typical way of doing this is to provide a tree of words, starting with the most general or broader terms within the topic and working towards the most tightly defined or narrower terms. The aim of the thesaurus is to ensure consistency of use of the keywords, so additional descriptions and scope notes are provided to elaborate and reduce the chance of different people interpreting words differently. Within this hierarchy, there can also be relationships that cut across branches to show related terms.
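The structure just described maps naturally onto a small tree data structure. Here is a hedged sketch (the term names are invented, and real thesauri like Keywords AAA are far richer): broader and narrower terms form the tree, scope notes disambiguate usage, and related terms cut across branches.

```python
class Term:
    """One node in a records-management thesaurus (illustrative only)."""

    def __init__(self, label, scope_note=""):
        self.label = label
        self.scope_note = scope_note  # guidance to keep usage consistent
        self.broader = None           # parent (more general) term
        self.narrower = []            # child (more specific) terms
        self.related = []             # cross-branch relationships

    def add_narrower(self, term):
        term.broader = self
        self.narrower.append(term)
        return term

    def path(self):
        """Broadest-to-narrowest path, useful for display or filing."""
        node, labels = self, []
        while node is not None:
            labels.append(node.label)
            node = node.broader
        return list(reversed(labels))
```

Walking `path()` from a narrow term back to the root gives exactly the general-to-specific chain a records manager follows when picking the most tightly defined keyword available.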

The Keywords AAA thesaurus is a well recognized example from Australia, used in New South Wales government record keeping. The NAICS is a scheme that defines standard industries for commercial or employment classification, which many people will be familiar with from classifying themselves. Both of these schemes provide identifications for things within their specific domain and therefore do not usually fully describe the thing they are attached to.

In an EDRMS, a thesaurus is a tool to help users pick the correct keywords to apply to an item of metadata for a record. Depending on the definition of the metadata attribute, one or multiple keywords may be selected to fully identify the meaning. Thesaurus keywords may be used alongside any other metadata to classify records, so metadata from multiple thesauri may represent the classification of a single record in the multiple domains of its use. Alternatively, a thesaurus can be used alongside more straightforward index metadata. The thesaurus really just provides a tool to guide records managers to the most consistent and exact information for classifying a record in a single item of metadata.


When used in records classification, a thesaurus enables a specific item of metadata to be set with the most tightly defined term or set of terms available within the set. It can provide a tight definition of the record within the constraints of the recognized terms - a specific domain's taxonomy.

Typically a thesaurus is not used alone and, much like a fileplan, is just another way of more accurately identifying records for storage and effective retrieval. It is a tool to help users pick the correct keywords to apply to an item of metadata for a record. This enables the document to be 'tagged' with keywords from a well defined vocabulary. This is similar to the category tags used in Wordpress and other blogs, so I feel I'm getting close to my original aim. In the next post I will round up all of the classification schemes I have described, and try to relate them to the use of tagging on the Web.