Thursday 3 March 2011

Guest blog: Basel III – The compliance conundrum

by Linh Ho, director of product marketing at OpTier
 
Critics have not been kind to Basel III. Whilst few will argue that championing greater visibility and tighter regulation of liquidity controls is a bad thing, it is widely believed that the legislation lacks teeth and is, in truth, a fairly weak ‘knee-jerk’ reaction to the economic crisis. Regardless of sideline criticism, implementation of the legislation in some form will be vital to the future of the banking system. In practice, it is potential confusion around the crossover between Basel II and III that will be the real sticking point.
The fact is, even if analysts and banks do come round to the ideals of the new legislation, they are likely to have trouble implementing the processes effectively within their current IT environments. As Alison Ebbage noted in her recent article for FST, one of the key challenges banks face in their drive towards Basel III is fragmented and siloed IT infrastructure. The article noted that this can obscure a holistic view of events happening across trading platforms, which poses a significant operational risk to banks.
In terms of mitigating unforeseen risks, the onus is still very much on the banks to ensure that their internal processes and failures don’t let them down. In particular, the Accord specifically cites business disruption, data loss and security breaches arising from system failure as events that banks need to protect themselves against. As these processes are very much enabled by technology, IT needs to ensure it has got its own back.
If banks are to comply with the latest legislation, simplifying these complicated IT landscapes will be the key to success, but it’s certainly a tricky business. For years banks have invested in sprawling systems, adding layer upon layer as needed. In this situation, identifying how, why and where an IT problem has occurred is arduous, time-consuming and expensive. With a complicated mishmash of systems and pressure to implement new regulations, I’d bet my bottom dollar that most IT managers wish they could clear out their IT cupboard and start again. But in this day and age, a rip-and-replace strategy isn’t viable. So, with operational risk a much overlooked - yet pivotal - part of Basel III, what’s a bank to do?
Flipping IT management on its head is a good starting point. Rather than thinking about individual applications and how they’re performing, banks need to generate an end-to-end view of all business transactions in real time. Traditionally this hasn’t been possible because IT management has been just as siloed as the systems it monitors, creating blind spots that make it harder for IT to quickly find and fix problems, and very difficult to avoid downtime or application slowdown. Steering clear of these threats is crucial to reducing risk, which in turn makes compliance processes easier to monitor. If comprehensive records of IT performance are the norm, potential areas of risk can be readily identified and acted upon. By ensuring these records are in place and constantly updated, the humble IT department will come to be recognised as a vital, reliable and valuable business unit.
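To make the idea of an end-to-end transaction view concrete, here is a minimal, hypothetical sketch in Python. It assumes each siloed system logs events tagged with a shared transaction ID; the system names, fields and SLA threshold are all invented for illustration, not a description of any particular product.

```python
from collections import defaultdict

# Hypothetical event records from three siloed systems, each tagged with a
# shared transaction ID. In practice these would come from separate logs.
events = [
    {"txn_id": "T1", "system": "web", "start": 0.00, "end": 0.12},
    {"txn_id": "T1", "system": "middleware", "start": 0.12, "end": 0.30},
    {"txn_id": "T1", "system": "core_banking", "start": 0.30, "end": 2.80},
    {"txn_id": "T2", "system": "web", "start": 0.05, "end": 0.15},
    {"txn_id": "T2", "system": "middleware", "start": 0.15, "end": 0.40},
]

SLA_SECONDS = 1.0  # illustrative response-time threshold

def end_to_end_view(events):
    """Stitch per-system events into one timeline per business transaction."""
    txns = defaultdict(list)
    for e in events:
        txns[e["txn_id"]].append(e)
    report = {}
    for txn_id, hops in txns.items():
        hops.sort(key=lambda h: h["start"])
        total = hops[-1]["end"] - hops[0]["start"]
        slowest = max(hops, key=lambda h: h["end"] - h["start"])
        report[txn_id] = {
            "total_seconds": round(total, 2),
            "slowest_system": slowest["system"],
            "breached_sla": total > SLA_SECONDS,
        }
    return report

print(end_to_end_view(events))
```

Grouping by transaction rather than by application is the whole point: the per-application view would show each silo performing normally, while the end-to-end view immediately flags T1 as breaching its SLA and names the system responsible.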
