Wednesday, April 28, 2010

Making it harder to steal your data - but will the auditors trust it?

Customers want their data to be secure, firms don't want to be liable for the financial and brand damage of lost client data, and governments are introducing new regulations to tighten privacy and security policies in businesses. For once, the standards bodies seem to be keeping up, more or less, with the needs of the industry, helping to clarify once and for all the use of more advanced security technologies and their legal aspects. For example, if a firm encrypts data, making it hard for unauthorized people to get at, how are the auditors, who are used to unrestricted access to everything, going to respond?

The new version of the Payment Card Industry Data Security Standard (PCI DSS) is expected to be released in October 2010 by the PCI Security Standards Council. A Thales press release discusses some related research on the impact of the changes, based on a survey the company sponsored.
The Ponemon Institute, an information-management think tank, designed the survey to identify the trends, recommendations and preferences of QSAs involved in PCI DSS compliance. Specifically, the questions focused on the QSAs' background and experience, their client observations, expected changes in PCI DSS, preferences on how to achieve compliance, and typical client recommendations. The results are available in a newly released report, sponsored by Thales, entitled PCI DSS Trends 2010: QSA Business Report. The report can be downloaded at www.thalesgroup.com/iss
"Our research continues to validate that 60 percent of QSAs believe encryption to be the most effective means to protect card data end-to-end, and 41 percent of QSAs said that controlling access to encryption keys is the most difficult key management task faced by clients using encryption. It remains clear that QSAs consider encryption to be one the best techniques merchants can use to keep information safe and comply with PCI requirements. The current version of the standard, however, is ambiguous about how exactly encrypted data should be treated in audits, so QSAs seem to be confident that the October 2010 update to PCI DSS will provide clarity," says Dr. Larry Ponemon, chairman and founder of The Ponemon Institute.
Encryption and electronic signatures are technologies that are well within the reach of companies handling client data, payment transactions and sensitive information, or needing to prove non-repudiation of agreements and contracts. The approaches to implementing the technology still need a little wiring to get them to work, though we are seeing solutions from companies like Thales, and from the vendors we expect to see at Finovate on May 11th, that are making this easier.

Now it just needs clear, unambiguous judgments on the use of these technologies at the level of compliance, audit and legal to say once and for all that a well implemented system of a certain design, with appropriate management, will satisfy the courts and the regulators the way paper, a signature and a locked safe once would. The new PCI standards help, but there is still a long way to go.

A post from the Improving It blog


Wednesday, April 21, 2010

Your browser preference won't keep you safe

The Zeus Trojan, a nasty piece of malware, has been rewritten to target users of the Firefox browser. With an estimated 30% market share, Firefox was the natural next target for the trojan's writers after getting such 'great' results attacking Internet Explorer. The trojan's fraudulent financial success has come from its ability to perform online banking transactions directly within the user's browser, through the real website, rather than having to compromise banking security in some other way.

So one of the options for protecting yourself from this nasty piece of software has been removed. Previously, your easiest option was to avoid using Internet Explorer for online banking, instead using one of the major competitors out there: Firefox, Chrome, Safari or Opera (yes, I know there are other browsers, though most regular computer users are probably not going to use them).

As I discussed in Protecting online bank accounts - is there an app for that?, another option may be the use of a browser that resets itself to its initial installation after each use. A common implementation of this is a virtual machine appliance that saves no data or changes, running a whole operating system and browser within your PC to simulate a fresh install every time. For a while VMware was pushing an appliance to do this, although I don't see it now. The advantage is obvious: even if a trojan does make it into your browser VM, the machine will have reset itself to its 'pre-trojan' state the next time you run it, so your risk of anything bad happening is much reduced.
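On a similar principle, but with a much weaker guarantee, a throwaway browser profile discards all browser state after each banking session. Here is a minimal sketch in Python, assuming Firefox is installed and on the PATH (the bank URL is a placeholder); unlike the VM appliance, this does nothing about malware already sitting on the host operating system:

```python
import shutil
import subprocess
import tempfile

def run_throwaway_browser(url="https://bank.example.com"):
    # Create a fresh, empty profile directory for this session only.
    profile_dir = tempfile.mkdtemp(prefix="clean-browser-")
    try:
        # -no-remote keeps this instance separate from any running Firefox;
        # -profile points it at the empty, throwaway profile.
        subprocess.run(["firefox", "-no-remote", "-profile", profile_dir, url])
    finally:
        # Discard everything the session wrote: cookies, cache, extensions.
        shutil.rmtree(profile_dir, ignore_errors=True)

if __name__ == "__main__":
    run_throwaway_browser()
```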

What other options are there? As some have suggested, a completely separate, secure hardware device may be an option: a cut-down, netbook-style PC limited to just your online banking, though I see this being less cost effective than, and only equivalent in security to, a virtual machine.

Another alternative is an application written for the native operating system, making it easier to lock down and harder to subvert than a browser, which by design has to be open to developer 'hacks'. Or we could all jump into bed with Apple and its highly restrictive software practices, and use only online banking applications on the iPhone or iPad.

Wherever we go with online banking security, the current approach of blaming Microsoft for our woes is not going to keep working. The writers of trojans that perform fraudulent transactions for criminal gain do not care which businesses or organizations they damage along the way, so open source and Microsoft are equally fair game if the opportunities are there to extract cash from unsuspecting consumers.



A post from the Improving It blog


Tuesday, April 20, 2010

FinovateSpring 2010


I'm starting to get excited about the Finovate Spring 2010 event in San Francisco. As the organizers put it:

On Tuesday May 11 in San Francisco, FinovateSpring 2010 will again showcase the most cutting-edge financial and banking technology innovations to Silicon Valley and the world.

With Finovate's signature mix of short, fast-paced onstage demos (no slides allowed) from handpicked companies and intimate networking time with their executives, this conference packs a ton of unique value into a single day.

The format of these types of events generates great energy. Great presenters, overstuffed on Red Bull and product hype, put on a show that tries to appeal to the crowd and their specific target audience. And the audience gets a great opportunity to see what is hot in the financial services technology space.

The companies presenting are already putting a lot of time and effort into building the most compelling demonstrations, tuning their messages, and making sure that this investment in time converts to getting investors, customers and the press interested in their products.

For me, the range of products is what makes this event most interesting. There are a bunch of electronic payment solutions, consumer-focused sites for 'financial planning', and even some of the back-office systems providers. I am hoping to get the chance to chat with many of the presenters while I attempt to blog live from the event (only if my brain and fingers decide to cooperate). So if you would like to catch up with me while I'm there, drop me a note.

If you are interested, the companies presenting are:




A post from the Improving It blog


Friday, April 16, 2010

I am not a number - unless I trade $20M a day

The SEC is proposing a new trading system that requires individuals or firms to register and receive a trader identification number if they plan to trade more than $20 million or 2 million shares per day. The aim is to allow better tracking of the activities of individuals and firms. In doing so, the SEC is reinforcing the concept that a client's activities are reflected by their overall actions, not just the transactions in a single account. This is not always the way securities firms view, or have viewed, their clients.

Of course, at a fundamental level the transactions within an account are what most interest a firm. The account defines what a client can do, what their limits are, and what stock they hold. But from a compliance and risk standpoint, surely it is the health and overall activity of a client that makes more sense. If a client is close to bankruptcy, it may only be the aggregate of all accounts that indicates this, rather than any one item. If there is a risk of insider dealing, perhaps it is only the spread of transactions across accounts that would give it away. Clients with seriously nefarious intentions are unlikely to be dumb enough not to try to obscure what they are doing, and spreading activity across accounts at firms that don't pay close attention may allow them to do exactly that.
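To make the aggregation point concrete, here is a minimal sketch in Python of the kind of cross-account roll-up the proposal implies. The trade records, client identifiers and the way the thresholds are checked are illustrative placeholders, not the SEC's actual reporting mechanics:

```python
from collections import defaultdict

# Hypothetical records for one trading day: (client, account, shares, dollars)
trades = [
    ("client-1", "acct-A", 1_500_000, 12_000_000),
    ("client-1", "acct-B",   800_000,  9_500_000),
    ("client-2", "acct-C",    50_000,    400_000),
]

SHARE_THRESHOLD = 2_000_000     # 2 million shares per day
DOLLAR_THRESHOLD = 20_000_000   # $20 million per day

def flag_large_traders(day_trades):
    totals = defaultdict(lambda: {"shares": 0, "dollars": 0})
    for client, _account, shares, dollars in day_trades:
        totals[client]["shares"] += shares
        totals[client]["dollars"] += dollars
    # The threshold is only crossed when activity is summed across all of a
    # client's accounts; no single account here would trigger it on its own.
    return [c for c, t in totals.items()
            if t["shares"] >= SHARE_THRESHOLD or t["dollars"] >= DOLLAR_THRESHOLD]

print(flag_large_traders(trades))  # ['client-1']
```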

Even at a day to day level, having a single view of a client is better than a per-account view. It allows simplified compliance around proof of identity, it allows better opportunities to advise the client well and holistically, and it shows that you really know your customer, not just the stove-piped accounts they hold.

Will the SEC's proposal really help it identify dodgy dealing? Maybe, or maybe not. But if the SEC can have a single view of a trader, perhaps it is time for the firms executing those traders' instructions to do the same.

A post from the Improving It blog

Thursday, April 15, 2010

Reporting processes - scary stuff

My excuse for my recent blogging hiatus is that I was kicking off a project (or three) with a new client - a securities firm in Vancouver. Rather than spend time expelling ideas, it was time for me to sit, listen and learn. It seems that the deluge of incoming information in one direction stopped too much coming back out in the form of this blog. So now that we are all refreshed, it's time for me to restart the outpourings - with a thought or two on 'reports'. Hmm, interesting stuff, all those numbers printed on pages of paper - I can't wait to hear (you might be saying to yourself with a forced smile). Hold on though...

In previous lives I have worked directly with electronic reports management tools, either selling, implementing or defining a strategy for the product. Although essential in an ECM vendor's stack, the tools rarely got much attention despite the obvious benefits they could offer customers:
  1. Reduce the costs of printing and storing paper reports
  2. Simplify the distribution of reports to consumers of the information
  3. Secure notes and reviews of the reports, with a far more effective audit trail
These are just three of the many advantages of diverting the flow of reports from real printers to a virtual printing environment that generates an on-screen representation of the original data. There are many other clever features of electronic reports management that allow the reports to be split up automatically based on text on certain pages, and indexed so that a quick database search can pull a specific piece of information (such as a customer statement) from the thousands upon thousands of pages collected each print run over the years (you've seen this with the pretty PDFs of your credit card statement online). But what happens when the report is just that - a report?
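Before getting to that question, here is a minimal sketch in Python of the split-and-index step just described. The 'Customer:' page marker, the form-feed page breaks and the SQLite table are illustrative assumptions, not any particular vendor's product:

```python
import re
import sqlite3

def split_and_index(report_text, db_path="report_index.db"):
    """Split a plain-text print stream into per-customer statements, keyed on
    a 'Customer:' header line, and index them for quick retrieval later."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS statements (customer TEXT, body TEXT)")
    pages = report_text.split("\f")          # form feed marks a page break
    current_customer, buffer = None, []
    for page in pages:
        match = re.search(r"^Customer:\s*(\S+)", page, re.MULTILINE)
        if match:
            if current_customer:              # flush the previous statement
                conn.execute("INSERT INTO statements VALUES (?, ?)",
                             (current_customer, "\f".join(buffer)))
            current_customer, buffer = match.group(1), []
        buffer.append(page)
    if current_customer:
        conn.execute("INSERT INTO statements VALUES (?, ?)",
                     (current_customer, "\f".join(buffer)))
    conn.commit()
    return conn

# A later lookup pulls one statement out of thousands of pages, for example:
# conn.execute("SELECT body FROM statements WHERE customer = ?", ("ACME-123",))
```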

As I have been seeing on my travels, the reports generated for back-office operations are often not a print stream from a mainframe representing documents, such as statements, going to customers. They actually live up to their name and are just what they claim to be - reports: a point-in-time snapshot of data. Examples are the commissions owed to your agents, the trades made in the previous 24 hours, or the status of credit on customer accounts for the last week. From an audit or regulatory standpoint, these snapshots show you are paying attention to your business in a way that a dynamic query of your database cannot (things change in databases, so generating a realistic point-in-time view after the fact is often hard, ineffective or inaccurate).

Storing these reports is easy enough. Replicating what some of them are used for in the business is harder. As I have seen, it takes a real technical accountant to review a securities trading report, ensuring that there are no exceptions or compliance issues. The markups these professionals make in the review process are hieroglyphics to the untrained eye, and essential evidence to the regulators. This is something that traditional reports management software would struggle to handle - allowing someone to rapidly and naturally mark up a report over five hours in the course of a day.

Why do I tell you this? Because the obvious technology approaches do not fit every requirement once you really take a look at the issues. Reports management tools will not solve all the issues with real reports. I don't have an answer today. One proposal is to implement more automated solutions for spotting exceptions, helping the technical accountants spend their time on those exceptions rather than on the 99.9% of things that aren't issues. But that's a challenge for somebody else. For the next week or two, those reports are going to get printed, marked up and stored - paper cuts included.
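As a rough illustration of the automated exception-spotting idea above, here is a minimal sketch in Python of a rules-based exception filter. The trade rows and the three rules are made-up placeholders; a real implementation would encode the checks the technical accountants actually apply:

```python
# Hypothetical rows from a daily trading report.
trades = [
    {"trade_id": "T-1001", "quantity": 500, "price": 42.10, "commission": 12.50},
    {"trade_id": "T-1002", "quantity":   0, "price": 42.10, "commission": 12.50},
    {"trade_id": "T-1003", "quantity": 250, "price": -3.00, "commission":  0.00},
]

# Each rule returns a reason string when a row looks wrong, otherwise None.
RULES = [
    lambda t: "zero quantity" if t["quantity"] == 0 else None,
    lambda t: "negative price" if t["price"] < 0 else None,
    lambda t: "commission on an empty trade" if t["quantity"] == 0 and t["commission"] > 0 else None,
]

def exceptions(rows):
    """Yield only the rows a reviewer needs to look at, with the reasons why."""
    for row in rows:
        reasons = [reason for rule in RULES if (reason := rule(row))]
        if reasons:
            yield row["trade_id"], reasons

for trade_id, reasons in exceptions(trades):
    print(trade_id, reasons)
# T-1002 ['zero quantity', 'commission on an empty trade']
# T-1003 ['negative price']
```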

My excuse for a blogging hiatus is over. The one-way valve is broken, and information is ready to flow both ways again. Watch out!

A post from the Improving It blog


Tuesday, April 06, 2010

Improving processes as companies merge

Mergers and acquisitions can lead to chaos at the best of times, with inconsistent processes and un-integrated teams. Can business process management (BPM) or general workflow automation help in M&A related organizational integrations, or is it just money being thrown into an inferno of change?

BPM marketing often uses words like 'agility' when describing how your processes can be adapted to respond to change. Agility implies an ability to dodge new requirements and keep ahead of chasing threats, all requiring constant attention from management to work. I prefer a process improvement solution that is 'flexible', one that can absorb the impact of unexpected actions and requirements, automatically taking on the new shape required by a rapidly changing organization.

In a chaotic organization, there is the temptation to avoid process improvement, since it is too much to handle and things are changing too fast. Despite this, bringing structure to the processes that enable a company to operate is essential, whether there is chaos surrounding them or not. With a strong starting point, a department can not only run its own operations better, it can start to influence the organizations with which it interacts. In an M&A scenario, having a solid base of process can make it more appealing to retain that backbone of operations in the new organization, rather than defaulting to the typical political decisions for what stays and what goes.

The problem is that traditional BPM takes a significant time and investment to implement, which is rarely a possibility when approaching M&A. A better approach may be instead to consider a 'quick and dirty' fix to the key processes that will be interacting with the new organization, in the knowledge that everything will change anyway. To be successful, the 'quick and dirty' approach needs to also be flexible (that word again), to adapt to the constant change around it, rather than just collapsing by being too brittle.

Picking the right technical products to build your solution is one part of it. Collaborative technologies, like wikis, are great for sharing information across a loose team with minimal effort, but they don't help reinforce processes that require repeatable operations and decisions. Business rules management may be better for processes that rely on complex decisions that must evolve over time, where the end-to-end process is limited in length. BPM is fine where rigidity and constant repeatability are required and you have time to put it together. Software as a Service (SaaS) or prebuilt solutions running in the cloud can help get things up and running faster.
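To illustrate the 'quick and dirty but flexible' idea, here is a minimal sketch in Python of a process defined as data rather than hard-wired code, so steps can be added, removed or reassigned as the merged organization changes. The account-migration steps and owners are hypothetical:

```python
# Hypothetical account-migration process for a merging firm, held as data so
# it can be edited without rewriting any code.
PROCESS = [
    {"step": "verify identity",   "owner": "compliance",   "required": True},
    {"step": "map account codes", "owner": "operations",   "required": True},
    {"step": "notify client",     "owner": "front office", "required": False},
]

def next_blocker(process_definition, case):
    """Return the first required step not yet completed for this case."""
    for step in process_definition:
        if step["required"] and not case.get(step["step"], False):
            return f"Blocked at '{step['step']}' (owner: {step['owner']})"
    return "Complete"

# A case simply records which steps have been finished for one client account.
case = {"verify identity": True}
print(next_blocker(PROCESS, case))  # Blocked at 'map account codes' (owner: operations)
```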

I think that companies need a combination of all of these, even when they are not in the chaos of M&A. And rapidly built solutions using flexible blended applications may not be a bad approach to follow.

A post from the Improving It blog

Let us help you improve your business today. Visit www.consected.com

Thursday, April 01, 2010

Electronic Health Records - pick your regulator

How can I grow my sphere of influence, and therefore my budget? Is that what some government agencies are thinking when it comes to electronic health records (sometimes referred to as EHR) in the US? I agree that it is important that there is oversight to ensure that individual health information is protected -- without it, we could end up in a free-for-all of litigation and hurried legislation. Though, at the risk of over-simplifying, it's not as if the storage of a person's health data is not already electronic in major hospitals, or that they can't share information between them, electronically or otherwise. The e-health records debate seems to assume it is blazing new ground, when really there is a large foundation to start with.


Anthony Guerra on InformationWeek Healthcare discusses some of the issues and the challenges around bringing in an agency such as the FDA. I should avoid the politics, and instead focus on the key issues that need to be covered:

  • Setting standards for privacy and security of health information
  • Providing a control framework around the different transactions and business processes related to health data
  • Setting standards for availability of data (including disaster recovery, business continuity and even escrow scenarios), to ensure 100,000 health records don't just go 'offline'
  • Identifying and enforcing the quality of information, and highlighting substandard information for future clean-up, or at least careful scrutiny at the time of use
  • Providing a way to identify the experience and expertise of doctors adding information to a file, to enable the validity of data to be easily identified
  • Developing a framework for better sharing of only relevant information with different consumers
  • Enabling anonymous data mining by researchers (see the de-identification sketch after this list)
  • Setting boundaries for insurer access to information, especially in organizations where insurance and delivery of health services fall under a single corporate umbrella
  • Managing the certification of systems
  • Handling issues and failures in controls and security
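Several of those points, anonymous data mining in particular, come down to de-identifying records before they leave the provider. Here is a minimal sketch in Python with a made-up record and field names, deliberately far short of a full regulatory de-identification standard:

```python
import hashlib

# Hypothetical raw record as it might sit in a provider's system.
record = {
    "patient_id": "MRN-884213",
    "name": "Jane Doe",
    "date_of_birth": "1972-06-14",
    "zip": "02139",
    "diagnosis_code": "E11.9",
    "lab_result": 7.2,
}

DIRECT_IDENTIFIERS = {"patient_id", "name", "date_of_birth"}

def de_identify(rec, salt="study-release-2010"):
    """Drop direct identifiers, coarsen the ZIP code, and replace the patient
    key with a salted hash so records can still be linked within one study."""
    out = {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}
    out["zip"] = rec["zip"][:3] + "**"
    out["subject"] = hashlib.sha256((salt + rec["patient_id"]).encode()).hexdigest()[:12]
    return out

print(de_identify(record))
```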

The reality of the situation is that there is much more to electronic health records than just privacy and security. Once records are available, the many transactions and processes associated with them, some old (create a new patient, add an event to the history) and some new (handle a request for access to information, transfer custody of records to another organization), will start to fall under the same banner. It is at times like this that the FDA, with its history of managing the processes and controls of drug approvals, can offer some good background. At other times, organizations more used to working in the actual environment of health care providers will be better placed.


There are challenges around formalizing the new 21st-century approach to health records, but I wonder how long we will see the Office of the National Coordinator for Health IT (ONC), the FDA, the Centers for Medicare and Medicaid Services (CMS) and others getting dragged in and out of the frame as the potential 'regulator'. What is needed is someone who can coordinate the formulation of best practices from inside and outside the industry into a meaningful, less-than-600-page definition that can be followed by software vendors, systems integrators and health information consumers alike. Then we can decide who should police it.

A post from the Improving It blog

Let us help you improve your business today. Visit www.consected.com