Friday, December 02, 2011
A combination of new technologies, systematic methodologies and sheer necessity is driving firms to finally address their legacy issues, writes David Bannister.
For as long as anyone can remember, the financial technology world has had to get to grips with the problems caused by the plethora of legacy systems in use in most organisations.
Banks have always been heavy users of technology, but they are also strong believers in the old maxim, "if it ain't broke, don't fix it". The result is that most organisations have a raft of systems held together by sealing wax and string.
In the recent Banking Technology Awards, the trophy for Best Use of IT in Retail Banking was picked up by Metro Bank, the new kids on the block in the UK. While one member of the judging panel declared that he was impressed by the "simplicity and rigour" of Metro's approach, others were less sure: "They've just gone out and bought a load of off-the-shelf software," said another.
Exactly - Metro has pretty much gone down the packaged software route 100%, and why wouldn't it? Anyone setting up a similar organisation would surely do the same.
While many words have been written about the trials and problems of living with legacy systems, the tide seems to be turning, with a number of new approaches and technologies bringing hope that the ancient thickets can at last be cleared out.
The first is cloud computing, which makes it possible for developers to prototype new systems quickly in the cloud while the existing systems carry on running. Such a sandbox development approach removes one of the major risks in legacy replacement - there are so many complex dependencies in modern systems that taking out one part of a system will almost certainly have unexpected consequences somewhere down the road.
This is not a panacea, cautions Sanat Rao, vice president at Finacle: "In our assessment, legacy migration would move towards replacement of legacy modules by those from new-age software vendors. Typically large banks, especially Tier 1 or 2 banks, have a plethora of systems based on legacy architecture that was written decades ago. These systems are interconnected, and moving any part of them to the cloud will be fraught with risks. Banks would first like to do a modular replacement. However, large banks might take steps to move their systems onto private clouds to optimise their hardware costs, or they may move a few non-mission-critical applications onto the cloud."
Rao says that the cloud works particularly well for smaller banks. "We've seen that movement to the cloud is more driven by Tier 3 or 4 banks - what we could call small or community banks - which have a much less complex landscape of internal systems, follow standard processes and are very cost conscious. These are the banks which are likely to use community or public clouds where banking applications would be hosted."
A second shift is that people no longer talk aggressively about Legacy Replacement - over the last year or so, white papers from large IT vendors like IBM, HP and Oracle have started to talk instead about Application Modernisation. It may just be a change of words, but somehow it sounds more achievable.
At the same time, the economics of banking operations continue to be grim - recent forecasts show that the total number of people working in banking will shrink by "at least 25% over the next 10 years", according to the analyst Ralph Silva of Silva Research.
In the past few months HSBC has announced global job cuts of 30,000, as has Bank of America Merrill Lynch, while Lloyds Banking Group has cut 15,000. UBS, Barclays and Credit Suisse have announced 3,500 each, JP Morgan, Goldman Sachs and Nomura another 1,000 each, and there are similar numbers at ABN Amro, RBS, Société Générale and Deutsche Bank.
Yet there are still large areas of the industry where automation has not resulted in headcount cuts, says Elizabeth Gooch, the eponymous founder of EG Solutions, which specialises in operational improvements, particularly in the back-office.
"Despite waves of investments in technology, we still have an awful lot of people in back offices - there are three times more in the back than in the front in some parts of the financial services industry," she says, adding that this is less true in investment banking than in areas of retail, such as in mortgage processing, which is very labour intensive.
"The economic factors are the main drivers in pushing businesses to consider how they can address this messiness in the back office and take control to have customer-led demand so that they can optimise the number of people they have to employ. For optimise, read minimise, because no one wants to employ more people than they have to," she says.
As well as legacy systems, there are the effects of legacy business decisions, some of which have turned out to be blind alleys, she says. "I don't think that outsourcing of processes has necessarily helped - technology can be pretty nimble, but outsourcing of processes can make it pretty turgid: we have one client where it takes them 10 weeks to make a simple change to a database because they've outsourced the process."
Colin Rickard, business development director at DataFlux, says that managing data is a central part of systems migration. "Banks are updating core operational platforms to free themselves from the costly constraints of legacy systems and to improve flexibility," he says. "In addition to selecting the right systems to move to, ensuring adequate and well-planned data management is paramount. There will need to be extensive data migrations, often from multiple legacy line-of-business systems to new integrated risk systems. This necessitates a process that captures all data from across the organisation. It also necessitates data quality routines that identify existing errors in data before that legacy data is moved to the new system, where it is much harder to remove duplicates, coordinate interrelationships between data elements (i.e. customers and their products) and fix basic formatting errors."
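Rickard's point about catching errors before the move, rather than after, is easiest to see in miniature. The sketch below is a hypothetical pre-migration quality pass in Python - the file name, column names and validation rules are all invented for illustration, and a real migration would use a dedicated data management tool rather than a script like this.

    # Illustrative pre-migration data quality pass. Schema and rules are
    # hypothetical; the aim is to find duplicates and formatting errors
    # while the data still sits in the legacy extract.
    import csv
    import re
    from collections import defaultdict

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    def profile_legacy_extract(path):
        """Flag duplicate customer/product records and basic format errors."""
        seen = defaultdict(list)   # (customer_id, product_code) -> row numbers
        errors = []
        with open(path, newline="") as f:
            for lineno, row in enumerate(csv.DictReader(f), start=2):
                key = (row["customer_id"].strip(), row["product_code"].strip())
                seen[key].append(lineno)
                if not key[0]:
                    errors.append((lineno, "missing customer_id"))
                if row.get("email") and not EMAIL_RE.match(row["email"]):
                    errors.append((lineno, "malformed email"))
        duplicates = {k: rows for k, rows in seen.items() if len(rows) > 1}
        return duplicates, errors

    duplicates, errors = profile_legacy_extract("legacy_customers.csv")
    print(len(duplicates), "duplicate keys;", len(errors), "format errors")

The useful property is that every problem is reported against its source row, so it can be fixed in the legacy system - where, as Rickard notes, it is far cheaper to fix - before the load takes place.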
Might it not be better to leave well alone and apply the "ain't broke" rule? No, says Rodney Nelsestuen, senior research director for financial services strategies and IT investments at TowerGroup.
Application modernisation can address three major drivers of change, he says: the need for increased business agility; the need to reduce IT costs across the board; and the increasing requirement for compliance with new regulations that are rolling across the industry.
In a report published towards the end of last year, Nelsestuen argues that business agility is a competitive advantage. "The ability to anticipate and respond to changing business conditions has become the new currency of value in the financial services industry," he wrote.
His argument is that most organisations need certain business functionality delivered more quickly than a traditional IT department can manage.
A rip-and-replace strategy would have its own risks, so he recommends that financial institutions (FIs) modernise incrementally - as discussed in regard to cloud computing above - "without taking the time, making the sizable investment, or facing the operational risks associated with a total systems replacement project".
Within that there are a number of things that FIs - which he describes as "essentially tech companies" - can do on a tactical level to get immediate day-to-day improvements: keep development and testing off production platforms; rewrite apps in newer languages; and move them onto more modern hardware. Alone or in combination, these will increase performance and lower the ongoing costs of IT operations, he says.
This is not terribly different to the issues faced by some vendors with large numbers of installed users on ageing systems. ACI Worldwide, for instance, is migrating users of its integrated payments system from the older BASE24 software to the newer BASE24-eps.
According to Andy Brown, director of product marketing, there are a number of drivers that can make it the right time to switch. One of these is regulation - the introduction of EMV cards effectively pushed its user base to upgrade, but the introduction of the Single Euro Payments Area has not yet, largely because volumes have been so low that users haven't felt the move justified.
Another driver can be consolidation - BASE24 had five connections to Swift, for instance - but whatever the reason, it is not to be undertaken lightly, he says. "Payment systems are so integrated into the systems of the bank that it is not a trivial task."
For a vendor, there is also a balance to strike between easing customers onto a new platform so that both sides can reap the benefits, and twisting their arms to force them into an unwanted change. "We won't be discontinuing support, but we will be actively encouraging people to move across over time," he said.
ACI has produced its own guide to the issues, Replacing Legacy Payments Systems, downloadable from its website.
As well as giving an excellent overview of the history and reasons for the complexity of existing systems, the guide lists some of the cost implications:
"It is estimated that managing and maintaining legacy systems still consumes up to 90% of North American and European institutions' IT budgets, leaving a little more than 10% for innovation and program development. Even at Asian financial institutions, where legacy infrastructures are typically less complex, maintenance consumes 70% of the total IT budget. When compliance management is factored in, the amount remaining for advancing new and original projects can drop even further. In such a highly competitive market, leaving such small sums for innovation and the support of new product development is dangerous."
These costs come from a number of areas:
■ Redundancy: When platforms, functions and data are locked away in individual stovepipes, expensive and error-prone duplication is inevitable.
■ Maintenance: Multiple systems, with multiple vendors and multiple architectures, simply require more time and effort to maintain than rationalised, streamlined IT estates. A larger support team is required and there is no opportunity to secure preferential rates with vendors.
■ Resourcing: Software developers and system administrators who specialise in non-standard programming languages attract a premium. Applications written in common languages provide organisations with a wider pool of professionals from which to recruit their staff.
■ Upgrades and customisations: Enhancing business processes such as authorisation, switching and routing decisions can be drawn out and can require extensive testing across each payment silo that is affected.
■ Regulation: Compliance and auditing requirements are becoming more stringent. Financial institutions with archaic legacy systems must implement multiple instances of their compliance processes or face heavy financial penalties for breaches.
At TowerGroup, Nelsestuen lists similar sources of costs, looking across a number of channels, but he also places them against some of the harsher budget realities of today's world.
It is no longer a question of a CIO writing out a shopping list and then asking for the budget to spend on it. "Instead, the budget process must be based in value creation and both quantitative and qualitative return on investment," he says. "The IT department is now viewed by FIs as a business unit that faces the same demands as other lines of business - to add wealth, not to consume it."
Fortunately, he reckons, application modernisation, rather than a rip-and-replace core systems overhaul, "can increase efficiency dramatically at lower costs".
Modernisation has similar benefits when it comes to systems rationalisation - traditionally this involves getting rid of whole applications, but that doesn't really improve things if the underlying systems are not also changed.
Overall, it is not just about cost savings - modernisation affords the ambitious CIO the opportunity to be a hero to the business through cunning redeployment of the savings effected. "A CIO can create an internal IT cash flow from which the allocated resources can be spent on higher-level activities, including new business functionality," writes Nelsestuen. "Funding a value-added service by reducing IT maintenance cost turns information into a strategic asset."
Case study: the Co-op transforms itself
As part of its core Financial Transformation Programme, the Co-operative Banking Group is installing a centralised Teradata data warehouse platform to re-engineer the bank's accounting processes.
Teradata won the contract in partnership with Microgen, which develops specialist software that will be used to feed financial data into the data warehouse, which integrates credit risk and accounting data into a single environment. Predicted benefits include increased usability of data, reduced management overheads and improved time efficiency.
The bank's Financial Transformation Programme began in 2008 and involves the replacement of its core banking systems and complete overhaul of its payments systems with the installation of a Clear2Pay Payment Hub.
The bank wanted to augment this new operational environment, which has already been delivered in its corporate banking business, with a business intelligence environment. For this it chose Teradata and its Financial Services Data Model as a solid foundation for credit risk and accounting-based business intelligence; the model is expandable to the entire enterprise so that other functional business intelligence can be added in the future.
Within the programme, the Microgen Accounting Hub and Microgen Aptitude are used to feed the general ledger and the Teradata data warehouse with finance information.
"The Microgen Accounting Hub supports customer account level journal information as well as summary general ledger data. It is this level of detail that enables the bank to align risk and finance data back to the general ledger and group financial statements using the transparent and open Accounting Hub architecture on Teradata," said Elizabeth Sipiere, divisional managing director for EMEA and Asia Pacific at Microgen.
The use of Accounting Hub also lets the bank de-risk the implementation of other strategic systems by using both it and Microgen Aptitude for account mapping and accounting rules creation, replacing older, less open interfacing into the general ledger.
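For a flavour of what "account mapping and accounting rules creation" involves, the sketch below shows a generic rule-driven roll-up of account-level journal lines into summary general ledger postings. The rule format, field names and account codes are invented for illustration and bear no relation to Microgen's actual rule language.

    # Generic sketch of rule-based account mapping: account-level journal
    # lines are mapped to general ledger codes and rolled up into summary
    # postings, keeping a trace back to the source accounts for audit.
    from collections import defaultdict

    # Hypothetical rules: (product type, debit/credit flag) -> GL code.
    MAPPING_RULES = {
        ("MORTGAGE", "DR"): "1200-LOANS",
        ("MORTGAGE", "CR"): "2100-INTEREST-INCOME",
        ("CURRENT",  "DR"): "1100-CUSTOMER-BALANCES",
        ("CURRENT",  "CR"): "1100-CUSTOMER-BALANCES",
    }

    def summarise(journal_lines):
        totals = defaultdict(float)
        trace = defaultdict(list)
        for line in journal_lines:
            # Unmapped lines land in a suspense account for investigation.
            gl_code = MAPPING_RULES.get((line["product"], line["dr_cr"]),
                                        "9999-SUSPENSE")
            totals[gl_code] += line["amount"]
            trace[gl_code].append(line["account_id"])
        return totals, trace

    lines = [
        {"account_id": "A1", "product": "MORTGAGE", "dr_cr": "DR", "amount": 950.0},
        {"account_id": "A2", "product": "MORTGAGE", "dr_cr": "CR", "amount": 50.0},
    ]
    totals, trace = summarise(lines)
    print(dict(totals))   # summary GL postings, traceable back to accounts

Keeping the account-level trace alongside the summary totals is what allows risk and finance data to be aligned back to the general ledger and group financial statements, as Sipiere describes.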
Over the last 18 months Microgen has continued to optimise its software to make effective use of Teradata's highly scalable MPP architecture. The Co-op project is the first retail bank implementation that optimally integrates Teradata and Microgen solutions and offers a new scalable accounting solution for banks.
"We are delighted to have signed a contract with Teradata and Microgen. Together, they offered us a cost effective, scalable solution that offers new insight for our accounting and credit risk business community," said Ian Wade, change manager at Co-operative Banking Group.
New Capco & Commerzbank model for measuring & mastering IT complexity
"What if complexity was a discreet, measurable metric rather than a discretionary, ambiguous term? Could a complexity metric reshape the decisions and activities of a CIO?"
That's the question that Capco and Commerzbank set out to answer in developing a model for measuring IT complexity in the financial services industry.
The problem had been preoccupying Peter Leukert, chief information officer at Commerzbank, for some time, and a chance conversation with Capco a few years ago revealed that the consultancy was also thinking of creating some sort of metric that could be used to master the issue of complexity.
The two have now completed the model and methodology, outlined in a White Paper entitled IT complexity: model, measure and master.
The model consists of several complexity indicators covering the relevant dimensions of application complexity. Capco and the bank have statistically validated these complexity indicators through quantitative research using Commerzbank data on approximately 1,000 applications over three years.
The complexity indicators of the model cover four dimensions (a sketch of how such indicators might be computed follows this list):
■ Functional: One driver of functional complexity is the functional scope of an application cluster. The sum of weighted use cases serves as the figure to measure this. In each case, a weight factor is assigned, from "one" for simple use cases, such as changing one data field, like address, to "four" for cases with involved logic. Use cases have been classified by expert judgment. Other indicators measure the functional redundancy and standard conformity of the solution architecture.
■ Interface: Interface complexity is determined through the sum of weighted interfaces. The weights are calculated by type of interface, such as API, file exchange, database view and broker Web Service. This indicator shows the interface intensity. The ratio of internal to external interfaces is another relevant indicator.
■ Data: Data complexity is measured by the number of database objects in an application cluster.
■ Technology: Technology complexity is monitored through a number of drivers, including business criticality and prescribed time for recovery of applications under consideration. Another technology indicator is the variety of operating systems employed in the application cluster.
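To make the construction concrete, here is a minimal sketch of how the four indicators might be computed for one application cluster. Only the one-to-four use-case weighting comes from the white paper; the interface weights and the data layout are assumptions made for illustration.

    # Minimal sketch of the four complexity indicators for one application
    # cluster. Interface weights are assumed; the paper's validated weights
    # and indicator definitions are richer than this.
    INTERFACE_WEIGHTS = {"api": 1, "file": 2, "db_view": 2, "broker_ws": 3}

    def functional_complexity(use_case_weights):
        # Weights run from 1 (simple field change) to 4 (involved logic).
        return sum(use_case_weights)

    def interface_complexity(interfaces):
        return sum(INTERFACE_WEIGHTS[kind] for kind in interfaces)

    def data_complexity(db_objects):
        return len(db_objects)

    def technology_complexity(operating_systems):
        # One driver only: variety of operating systems in the cluster.
        return len(set(operating_systems))

    cluster = {
        "use_cases": [1, 1, 2, 4],
        "interfaces": ["api", "file", "broker_ws"],
        "db_objects": ["CUSTOMER", "ACCOUNT", "TRANSACTION"],
        "operating_systems": ["z/OS", "Linux", "Linux"],
    }
    print(functional_complexity(cluster["use_cases"]),
          interface_complexity(cluster["interfaces"]),
          data_complexity(cluster["db_objects"]),
          technology_complexity(cluster["operating_systems"]))

Keeping each dimension as a separate score, rather than collapsing them into one number, is what lets the model show where an application cluster's complexity actually comes from.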
Currently the model only contains data from Commerzbank but the plan is to make it available to other financial institutions, increasing the credibility of the metric based on comparative data points from across the financial services industry.
Ultimately, by sharing and anonymising data, institutions will be able to benchmark their complexity against peers, but the primary application will be in using the metrics to identify where things can be simplified - or where the effort would not be justified by the result - and to be able to predict the impact of change programmes.