It’s a classic dilemma where companies find themselves with one foot in the past and one foot in the future. Financial services firms are confronted with the challenge of managing reference data in legacy systems and simultaneously introducing new technology that can better identify and manage risk, improve operations and support new strategic priorities.
The technology for delivering and handling reference data has advanced significantly – slow, clunky batch files are no longer the only way for firms to keep their reference data accurate and up to date. Yet many companies are stuck with old embedded systems that can lead to costly data problems as well as missed business opportunities. So can firms take advantage of new technological solutions without implementing massive change management programmes?
Check out our podcast episode: Leapfrog your Legacy Systems on Spotify
Reference data is generally not thought of as one of the more exciting aspects of a company’s trading operations. Mark Woolfenden of Euromoney TRADEDATA has jokingly referred to it as “the poor cousin to market data” – but, in his words, it is also “the glue”.
That glue has business impacts in a variety of ways. In terms of cost, problems with reference data can cause broken trades, force companies to devote resources to identifying and fixing the underlying issues, or impose slower workflows than the business would like. In terms of revenues, how a company deals with reference data can make the difference between launching a new service in a timely fashion and not being able to launch one at all.
Many financial companies do recognise they have an issue when it comes to reference data. One recent survey of senior executives who oversee data operations across the sell-side showed that more than half of all Tier 2 and Tier 3 banks (53%) reported that their current set-up was inefficient. Tier 1 firms were in a better position as they have the scale and resources to devote more time and investment into tailoring their systems to their needs. But even some of them faced issues, with 14% reporting inefficient operations.
At the same time, it can be difficult for companies to articulate the value of reference data systems. To make that case, firms need to understand the full dimensions of the problems that old-fashioned reference data processes cause. They need to consider the growing demands that are being put on their systems. And they need to understand that addressing these issues ultimately is not only about data, but also about workflow.
Are there ways that companies can address their data pain points and create business-centric workflows without completely overhauling their systems or hiring in new skillsets? This is where technology providers such as ipushpull and Symphony Technology Group and data suppliers like Euromoney TRADEDATA are seeking to build new operational solutions.
Some companies have system configurations that go back decades. At the same time, they are constantly introducing new components into their systems to ensure their processes are up to date without undertaking major overhauls.
“A lot of that may have been also forced by the pandemic, and the need to really work remotely. The division has been reinforced by what has happened in the past 12 to 18 months,” Woolfenden said. “Various companies, such as ourselves, who represent key parts of the trading, clearing, settlement lifecycle trading operation, are having to gear up but also to think holistically how it all works, and how our solutions complement each other.”
In fact, financial firms need not think about a major system overhaul to modify their reference data operations and workflow. “The agility of the technology has enabled firms to move incrementally now, and to patch around big legacy systems to get the functionality that if you were designing it from scratch, you would get with a lot fewer miles of pipes in the works,” says Woolfenden.
This is typically done via APIs and distributed data models. API-based solutions are not just about increasing efficiency. Matthew Cheung of ipushpull noted that when firms build licensed reference data into products, API technology can automate the flow of specific data sets into the downstream systems that generate those products. “All this is interoperable. That’s the way the industry is moving. We’ve seen more momentum in that in the last 12 months than years before.”
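As a rough illustration of that pattern – licensed reference data flowing automatically into a downstream system – here is a minimal Python sketch. The record fields, the `fetch_updates` stand-in, and the downstream store are all hypothetical, standing in for whatever API a vendor actually exposes.

```python
# Hypothetical sketch: pulling reference-data updates and merging them
# into a downstream product feed. Names are illustrative, not a real SDK.

from dataclasses import dataclass

@dataclass
class InstrumentRecord:
    isin: str
    exchange_code: str
    contract_size: int

def fetch_updates():
    """Stand-in for an API call; a real client would hit an HTTPS endpoint."""
    return [
        InstrumentRecord(isin="GB00B03MLX29", exchange_code="XLON", contract_size=1000),
        InstrumentRecord(isin="US0378331005", exchange_code="XNAS", contract_size=100),
    ]

def feed_downstream(records, downstream):
    """Merge each update into the downstream system, keyed by ISIN."""
    for rec in records:
        downstream[rec.isin] = rec
    return downstream

downstream_store = {}
feed_downstream(fetch_updates(), downstream_store)
```

In practice the fetch would be scheduled or event-driven rather than called by hand, but the point is the absence of batch files: updates land in the downstream system as soon as the vendor publishes them.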
Data security is also enhanced. The data itself may sit in the cloud, where it is encrypted, and all access is permissioned and can be audited. A good example of where reference data and workflow come together is in the resolution of broken trades. For instance, firms could build a workflow that lets secure chats be created to address broken trades without context switching, with an audit trail built in.
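The permissioned-and-audited access model can be sketched in a few lines of Python. The permission table, record shapes and log format below are assumptions for illustration only, not a description of any vendor’s implementation.

```python
# Minimal sketch of permissioned reference-data access with an audit trail.
# Every read attempt is logged, whether or not it is granted.

from datetime import datetime, timezone

PERMISSIONS = {"alice": {"futures_ref"}, "bob": set()}  # user -> allowed data sets
AUDIT_LOG = []

def read_reference_data(user, dataset, store):
    """Return the data set if the user is permissioned; log every attempt."""
    allowed = dataset in PERMISSIONS.get(user, set())
    AUDIT_LOG.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "dataset": dataset,
        "granted": allowed,
    })
    if not allowed:
        raise PermissionError(f"{user} may not read {dataset}")
    return store[dataset]

store = {"futures_ref": {"ESZ5": {"tick_size": 0.25}}}
data = read_reference_data("alice", "futures_ref", store)
```

Because the log records denied attempts as well as successful ones, compliance can reconstruct exactly who touched which data set and when.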
Giving the end user direct access to the data, with whatever controls the firm wants to establish, means the user has a richer set of information to work with. App-based technology means that firms can build workflows around the data so that, for example, changes in reference data could set off alerts that appear in chat windows.
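The alerting pattern just described can be sketched as a simple diff between two snapshots of reference data, with each change turned into a chat-style message. The field names and message format are illustrative assumptions.

```python
# Sketch: diff two snapshots of reference data and emit one alert per
# changed or newly added field, ready to push into a chat window.

def diff_alerts(old, new):
    """Return one alert string per field that changed or appeared."""
    alerts = []
    for key, fields in new.items():
        for field, value in fields.items():
            if old.get(key, {}).get(field) != value:
                alerts.append(f"ALERT {key}: {field} -> {value}")
    return alerts

yesterday = {"VOD.L": {"lot_size": 1000, "currency": "GBX"}}
today = {"VOD.L": {"lot_size": 500, "currency": "GBX"}}
alerts = diff_alerts(yesterday, today)
```

In a live deployment the diff would run on each incoming update and the alert strings would be posted to the relevant chat channel, so the trader sees the lot-size change without ever leaving the conversation.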
What is key, however, is that firms recognise that app-based delivery and consumption of data, hosted in the cloud, is the way of the future. Nowhere is this truer than for the Tier 2 and Tier 3 firms, where the pain points are most acute.
“The second and third tier firms get squeezed because the overhead on certain costs of participating in the market is fixed. There are some big overheads, in setting out your reporting processes and compliance processes, that a bigger firm can amortise and spread the cost more easily,” Woolfenden said. “You’re carrying a disproportionate overhead relative to your order book, and that’s a classic margin squeeze. Therefore, they have to be really clear in creating and sustaining a lowest cost operating model.”
At the same time, second and third tier firms are less wedded to legacy technology and tend to think in more agile terms. There will still be a cost to change, but not at the level where they need to justify a massive switch.
The direction of travel is all about eliminating wastage in the marketplace. For many firms, that means making a break with the past and embracing data on demand. For Woolfenden, it’s actually about the power of omission: “What don’t we need to use from the historical way that the industry provided data services? That’s what occupies most of our focus”.