Risk and regulatory reporting: Centralisation is not the answer

In this article, Mike O’Hara, publisher of The Trading Mesh, talks to Tony King of Violin Memory, KPMG’s Steven Hall and Rick Hawkins, Stream Financial’s Gavin Slater and Ash Gawthorp of The Test People about some of the major challenges that banks face around intensified post-crisis risk and regulatory reporting requirements. Why does it make sense to take a federated rather than a centralised approach, and how can new technology, particularly in the area of flash storage, help banks achieve new standards of performance in risk reporting?

 

Introduction

The post-crisis environment poses many challenges to C-suite executives at major banks, particularly in the areas of risk and compliance. Regulators want information that is detailed, comprehensive, consistent and, above all, accurate. And they want it quickly. But meeting new risk and reporting requirements is a challenge for the often-siloed data management infrastructures of banks with diverse business lines across multiple geographies. While efforts to centralise are often thwarted by budgetary, organisational and legislative barriers, technological advances such as enterprise-class flash memory are offering new ways to persist data at previously unachievable levels of performance. Consolidated and aggregated information can be delivered much more rapidly for better risk control, while senior executives gain user-friendly insight into the performance of the bank’s business lines.

New regulatory landscape

Banking is currently undergoing significant regulatory reform. Some products, markets and organisational structures have been effectively outlawed. Others have been separated out and restructured for the sake of transparency and removal of conflicts of interest, while still others are subject to new capital charges or reporting requirements on account of their inherent risk.

 

“Regulators, compliance and risk officers are demanding more granularity, based on regular high level reporting, and often at increased frequency.”

Tony King, EMEA Financial Services Sales Manager, Violin Memory

 

The pace and scale of regulatory change is both bewildering and burdensome, but there are common themes and requirements that banks should take account of when reappraising their internal data management capabilities. “Regulators, compliance and risk officers are demanding more granularity, based on regular high level reporting, and often at increased frequency,” says Tony King, EMEA Financial Services Sales Manager, Violin Memory. “This means that large and complex data sets need to be drawn from multiple sources, consolidated, aggregated and reported on. Banks that try to take a central data warehouse approach are finding that not only can this be an incredibly complex and expensive undertaking, but that meaningful results are very difficult to achieve in a timely manner, particularly if that data warehouse is disk-based”.

In 2014, major European banks conducted different types of stress tests at the behest of the European Central Bank and their national regulators. Many took a tactical and labour intensive approach that is unlikely to be sustainable in the longer term. “Regulators have made it clear that approaches to Basel III and stress-testing reports should be seamless, run off the same data and systems as other reports. Many banks’ stress testing processes have been manual and exceptional. But it is beginning to hit home that both sets of reports will be subject to the same level of microscopic examination and must be based on the same data,” says Steven Hall, Partner, KPMG.

Whether watchdogs are concerned about the impact of a 30% swing in a major currency or a cyber-attack, banks must respond quickly to unexpected demands for risk assessments based on detailed and consistent data for all their different units and business lines. “There is an expectation that banks will have to deliver on much shorter time frames, perhaps working to a 48-hour turnaround rather than 28 days,” adds Hall.

 

“Many banks’ stress testing processes have been manual and exceptional.”

Steven Hall, Partner, KPMG

 

For senior management, this means establishing not only a coherent, responsive data management infrastructure for the whole organisation, but also an enterprise-wide approach to data governance.

“The increased need for timeliness demands a greater focus on data governance, ensuring that the calculations and assumptions on which the reports are based are appropriate and consistently applied,” says Rick Hawkins, Director, KPMG. “Banks must show the regulators that they are using the data in a consistent manner – and demonstrate their understanding of that data.”

 

“The increased need for timeliness demands a greater focus on data governance.”

Rick Hawkins, Director, KPMG

 

Limits to centralisation

A quick assessment of banks’ existing approach to data management highlights many internal and external challenges to meeting today’s reporting requirements, let alone tomorrow’s. There are many reasons for an inconsistent approach to data management at a bank that competes in multiple geographies, legal jurisdictions and product lines. Whether organic or by acquisition, growth brings complexity and expediency, as well as myriad different regulatory and reporting requirements.

But efforts to centralise and harmonise data management processes and systems are often fraught with organisational and technical barriers. Even when power struggles – both with the central functions and between departments – can be resolved to achieve a common aim, such as compliance with a particular regulation, enterprise-wide data warehouses are often of limited value because they are designed to meet a distinct purpose.

“As soon as you try to centralise something in a very large organisation it becomes monolithic, and unable to adapt to a very rapidly changing environment, because if something changes in a particular business, you’ve got to change the entire centralised system just to cater for one small change in one little sub-business,” explains Gavin Slater, Co-Founder, Stream Financial.

The autonomy granted to branches or divisions across the world can also lead to major gaps in databases if those divisions have historically only collected and reported data required by their local regulators. And when it comes to centralising data management, local jurisdictions can pose barriers too. The need for certain types of data to stay in country for regulatory reasons has historically been addressed by banks through the copying of data between local branches and centralised units, but that brings its own problems.

“In theory, data could be aggregated by regular copying of operational data to a single reporting system – much like a data warehouse based on systems optimised for complex analytical processes. But updates to this centralised system are often conducted daily, with poor data integrity until all source systems have supplied periodic data, and no ability to deliver near real-time actionable data,” says Violin’s King.

 

New approaches required

Advances in technology – driven by increases in processing speeds and bandwidth availability – are offering banks and other large, complex organisations an alternative to prevailing centralised or highly manual approaches to risk reporting compliance. By harnessing such advances across hardware, software and networking capabilities, it is becoming increasingly viable for firms to implement a standardised approach to data management and governance which spreads the workload and the responsibility across the organisation. Such approaches ensure that inputs are appropriately structured and configured at the front end, and that outputs – for example, the aggregated, detailed risk reports required by regulators – can be delivered quickly, accurately and consistently on an enterprise-wide basis. From an operational infrastructure perspective, this allows data to stay within its original jurisdiction and avoids both ‘turf wars’ and the implementation challenges of imposing centralisation from above. The increased flexibility and agility offered by this ‘federated’ approach can also provide senior executives with easy-to-manipulate and up-to-the-minute information for commercial and strategic planning purposes.

“These systems provide a centralised query capability that federates queries to those operational systems in real time, and aggregates the data to provide real-time compliance and risk data. This means no copying of data, and no lengthy data integrity exposure windows. But it assumes that complex and ad hoc queries can be made on operational systems at any time,” explains King.
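
To make the pattern King describes a little more concrete, the short Python sketch below fans one aggregation query out to two hypothetical regional stores and combines only the summarised results. The in-memory SQLite databases, table layout and figures are invented purely for illustration; a real deployment would federate across the bank’s actual operational systems and their native query interfaces.

```python
"""Illustrative sketch only: a central function fans one aggregation query
out to hypothetical regional operational stores and combines the partial
results, instead of copying the underlying data into a warehouse."""

import sqlite3
from concurrent.futures import ThreadPoolExecutor


def make_regional_store(rows):
    """Stand-in for a regional operational system (invented data)."""
    conn = sqlite3.connect(":memory:", check_same_thread=False)
    conn.execute(
        "CREATE TABLE exposures (counterparty TEXT, risk_type TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO exposures VALUES (?, ?, ?)", rows)
    return conn


# Hypothetical regional data sets -- figures are illustrative only
REGIONS = {
    "EMEA": make_regional_store([("Acme Corp", "credit", 120.0),
                                 ("Beta Ltd", "market", 45.0)]),
    "APAC": make_regional_store([("Acme Corp", "credit", 80.0),
                                 ("Gamma PLC", "market", 30.0)]),
}


def query_region(name, conn):
    """Run the aggregation locally in each region; raw records stay put."""
    rows = conn.execute(
        "SELECT counterparty, risk_type, SUM(amount) "
        "FROM exposures GROUP BY counterparty, risk_type"
    ).fetchall()
    return [(name, *row) for row in rows]


def federated_exposure_report():
    """Fan the query out in parallel, then aggregate the summaries centrally."""
    totals = {}
    with ThreadPoolExecutor() as pool:
        for partial in pool.map(lambda item: query_region(*item), REGIONS.items()):
            for _, counterparty, risk_type, amount in partial:
                key = (counterparty, risk_type)
                totals[key] = totals.get(key, 0.0) + amount
    return totals


if __name__ == "__main__":
    for (counterparty, risk_type), amount in sorted(federated_exposure_report().items()):
        print(f"{counterparty:10s} {risk_type:8s} {amount:8.1f}")
```

Because each region returns only aggregated results, the detailed records never leave their home systems, which is what allows data to remain in its original jurisdiction.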

In terms of underlying technology, consolidation of operational databases on flash memory, a non-volatile and reprogrammable storage medium, is the key to a federated, real-time reporting and query capability for compliance and regulation. “It’s a classic case of the entire stack working optimally, from analytics application to dataset to storage medium. Until all operational systems are based on a memory-based storage solution, compliance and risk systems will either be out of date, or operate at the performance of the slowest operational system being queried. This delay is becoming simply unacceptable for most financial institutions,” King adds. This can of course be mitigated by using flash storage as a caching layer, or by choosing flash as the persistent memory store for those core applications. The trend of moving active datasets from disk to flash is well underway, as flash provides memory-like performance with the persistent safety of enterprise storage. Using flash at the ‘edge’ is part of an end-to-end strategy to support real-time risk applications.

 

“The hard part is mapping the terms used in the many disparate local data sources to the central schema. It requires a mindset shift.”

Gavin Slater, Co-Founder, Stream Financial

 

Among the most important prerequisites for a federated solution is that data be sliced, diced and standardised in a manner appropriate for the queries subsequently expected, in order to optimise speed of retrieval. As such, a critical step is to devolve responsibility for standardisation of database inputs to the appropriate level, leaving local front-office staff to decide how best to adapt centrally agreed principles to local realities.

“Standardisation doesn’t have to mean centralisation,” says Slater at Stream Financial. “The data schema can and should be developed centrally, but the hard part is mapping the terms used in the many disparate local data sources to the central schema. It requires a mind-set shift at the centre and locally, but the original data source owners in the front office must be told: ‘you have a responsibility not only to provide data to your customers, but also to give data to the support functions who need to access your data in some standardised schema’.”

It is also important that banks’ data input standardisation efforts are conducted with due awareness for industry-wide standardisation initiatives such as the use of legal entity identifiers (LEIs), and message standards frameworks such as ISO 20022. “All the efforts at standardisation at an industry level and the use of standards such as LEIs in incoming regulation are very important, but there is no getting around the hard work needed to map the global schema with the data that actually sits in individual systems and business units,” adds Slater.
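
As a minimal sketch of the mapping exercise Slater describes, the example below renames records from two hypothetical local sources into a centrally agreed schema. Every source name, field name and value is invented for illustration; only the presence of an LEI field reflects the standards discussed above.

```python
"""Illustrative sketch only: renaming fields from two hypothetical local
sources into a centrally agreed schema. All source names, field names and
values are invented for illustration."""

# Central schema agreed by the group-level data governance function (assumed)
CENTRAL_SCHEMA = ["lei", "counterparty_name", "notional", "currency"]

# Per-source mappings from local terms to central names (hypothetical)
SOURCE_MAPPINGS = {
    "london_rates_desk": {
        "cpty_lei": "lei",
        "cpty": "counterparty_name",
        "notional_gbp": "notional",
        # note: this source has never captured a currency field
    },
    "singapore_fx_desk": {
        "legal_entity_id": "lei",
        "client_name": "counterparty_name",
        "amount": "notional",
        "ccy": "currency",
    },
}


def to_central_schema(source, record):
    """Rename local fields to central names; unmapped central fields stay None."""
    mapping = SOURCE_MAPPINGS[source]
    central = {field: None for field in CENTRAL_SCHEMA}
    for local_name, value in record.items():
        if local_name in mapping:
            central[mapping[local_name]] = value
    return central


# The same kind of record from two sources arrives in one common shape
print(to_central_schema("london_rates_desk",
                        {"cpty_lei": "PLACEHOLDERLEI000001",  # placeholder, not a real LEI
                         "cpty": "Acme Corp", "notional_gbp": 1_000_000}))
print(to_central_schema("singapore_fx_desk",
                        {"legal_entity_id": "PLACEHOLDERLEI000002",  # placeholder
                         "client_name": "Beta Ltd", "amount": 500_000, "ccy": "SGD"}))
```

Fields a local source has never captured simply surface as gaps in the central view – precisely the kind of gap described earlier when divisions only collect what their local regulators require.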

 

Implementation and fine-tuning

Although a federated approach to data management has a number of advantages, it is no ‘plug-and-play’ quick fix and requires senior executives to secure significant levels of buy-in across the organisation. At the outset, the technology teams that support the platforms on which the data management structure relies must be actively involved, and the senior executives who signed off on its implementation must set out clear requirements.

“Without a clear requirements spec and ongoing direction from senior executives, a lot of time can be wasted building unnecessary functionality,” says Ash Gawthorp, Technical Director at The Test People, a performance, engineering and testing consultancy.

 

“Even the fastest technology relies on data being organised in a fashion that allows it to be processed and interrogated efficiently.”

Ash Gawthorp, Technical Director, The Test People

 

A number of important decisions must be taken up front to ensure a federated data management infrastructure will perform its tasks to the standard and speed required. For example, the maximum eventual size of the database(s) should be estimated to gain a sense of the number of records that will need to be interrogated over the coming years. Although the regulations governing how long data must be kept available for live interrogation before being archived may change, such planning gives the best chance of optimal performance.

“Even the fastest technology relies on data being organised in a fashion that allows it to be processed and interrogated efficiently,” says Gawthorp. “Another question that should be resolved early on is acceptable recovery times from a failover. In-memory databases do not get back up and running immediately.”

Stream Financial’s Slater points out that ‘caching’ strategies can overcome some concerns over access to and control of data. In traditional data warehouses, fears that a major query could drag down the performance of the system have reinforced the tendency to centralise. But it is possible to put a protective layer around the system so that a query hits the cache rather than the underlying database. This allows users to access the full history and eliminates the need to keep local copies, as many previously did rather than jump through the hoops required to access the central database.
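
As a rough illustration of that protective layer, the sketch below wraps a hypothetical run_query function in a time-limited cache so that repeated analytical requests are answered from the cache rather than hitting the operational database. The function, the five-minute freshness window and the in-process dictionary are all simplifying assumptions.

```python
"""Illustrative sketch only: a protective cache in front of an operational
database, so repeated analytical queries do not hit the source system.
run_query is a hypothetical stand-in for the real data access layer."""

import time

CACHE_TTL_SECONDS = 300      # assumed freshness window for cached results
_cache = {}                  # query text -> (timestamp, result)


def run_query(sql):
    """Hypothetical stand-in for the underlying operational database call."""
    time.sleep(0.5)          # simulate an expensive analytical query
    return {"sql": sql, "rows": []}


def cached_query(sql):
    """Serve from cache while fresh; otherwise make one controlled call to the source."""
    now = time.monotonic()
    hit = _cache.get(sql)
    if hit is not None and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]        # cache hit: the operational system is untouched
    result = run_query(sql)  # cache miss: single pass-through to the source
    _cache[sql] = (now, result)
    return result


if __name__ == "__main__":
    cached_query("SELECT ... full history ...")   # slow: hits the source once
    cached_query("SELECT ... full history ...")   # fast: served from the cache
```

In practice such a cache might sit in its own service, be keyed on the normalised query and its parameters, and be invalidated whenever the underlying data is refreshed.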

Even with many factors agreed and understood in advance, a federated data management infrastructure will require a programme of detailed testing and tweaking to the specific needs of the individual organisation at every tier to reduce query response times from hours to minutes.

“It’s important to start testing as soon as possible and be willing to accept the need to reconfigure. The sooner you begin performance testing, the sooner you will know whether the system can achieve the task set,” explains Gawthorp.
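
In that spirit, even an early build can be timed against a target. The sketch below repeatedly runs a hypothetical report_query function and compares its 95th-percentile response time with an assumed target measured in minutes rather than hours; both the function and the threshold are placeholders for illustration.

```python
"""Illustrative sketch only: time repeated runs of a hypothetical
report_query function against an assumed response-time target."""

import statistics
import time

TARGET_SECONDS = 60.0        # assumed target: minutes rather than hours


def report_query():
    """Hypothetical stand-in for the federated risk report under test."""
    time.sleep(0.1)          # simulate work done by the real query


def measure(runs=20):
    """Collect timings and compare the 95th percentile against the target."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        report_query()
        timings.append(time.perf_counter() - start)
    p95 = statistics.quantiles(timings, n=20)[18]   # 95th percentile
    print(f"median {statistics.median(timings):.3f}s, "
          f"p95 {p95:.3f}s, target {TARGET_SECONDS:.0f}s")
    return p95 <= TARGET_SECONDS


if __name__ == "__main__":
    if not measure():
        print("Target missed: reconfigure and retest")
```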

With regulators demanding greater responsiveness from banks, Stream Financial’s Slater believes performance will become ever more important. Fast feedback loops, he suggests, lead to more accurate, reliable reporting and more reassured regulators. “If your system can provide the lowest level of granularity, aggregated across many dimensions, within a day, you will have a faster feedback loop, better information and be better able to react to change. By using aggregated data to inform and change positions and the overall risk profile, banks will be able to respond to unexpected market events more effectively,” says Slater.

 

Conclusion

Today’s data management capabilities have largely failed to meet the expectations of banks’ senior executives or their regulators. The chances of coping with tomorrow’s demands using the current approach are slim to none. Banks are already braced for the requirements of regulators to intensify and accelerate.

“The changes in the regulatory environment mean banks must be able to aggregate risks within business units and across the group, and aggregate exposures across risk types and be able to drill down within that,” says KPMG’s Hall.

As such, it is no surprise that many senior executives are exploiting technology to help them handle the onslaught of requests. The holistic interrogation of operational data to give a complete picture of risk and compliance positions in very short time frames and in granular detail requires a very responsive storage platform. “From a risk and regulatory viewpoint, banks must service ad hoc requests with immediate effect. Flash memory offers exactly this – a nearly limitless ability to service data requests at very low latency,” says Tony King at Violin.

Importantly, banks are also using these capabilities to better understand their own businesses. The combination of faster database interrogation and aggregation with user-friendly front ends is increasing the utility of federated data management structures beyond regulatory reporting. By adopting such solutions, banks gain extremely high performance access to relevant subsets of distributed data sources, allowing them to query operational systems directly without impacting the performance of those systems.

This means that timely business decisions can be made on data that is spread across many geographical locations and many systems. The ability to rapidly combine and aggregate the results of queries from a central point improves not only the speed of decision-making but also its accuracy, as decisions are based on high-quality source data.

Stream Financial and Violin Memory have demonstrated that enterprise flash-based solutions – combining the speed of in-memory technology with the scalability and persistent storage capabilities of disk, and allowing additional flash-based storage to be added and scaled up very efficiently – are an excellent fit for the challenges faced by today’s financial services industry.


For more information on the companies mentioned in this article visit:

 
