Get real: The changing face of trade reporting and data monitoring

In this article, Mike O’Hara and Adam Cox of The Realization Group investigate the issues and implications of monitoring data and reporting trades in real time, a technological and operational challenge that has significant business ramifications for market participants. As regulators call for more information faster, trading firms are scrambling to work out how to adapt their systems so that transactions are captured as quickly as possible without interfering with the basic business of making profitable trades. But real-time trade reporting is easier said than done. Legacy systems, organisational issues, costs and strategic factors can all play a part in the solution a firm adopts, whether buy-side or sell-side. The challenge itself is not in dispute, analysts and technology experts say; the question is whether modern technology can enable firms to fulfil both their regulatory requirements and their business objectives at the same time. Mike and Adam hear from Tony King and Allan Fenwick at Violin Memory, Paul Spencer of Velocimetrics, Simon Appleton at Duff & Phelps’ Kinetic Partners division, David Grocott and Matt Dangerfield of Financial Technology Advisers, and Rebecca Healey of Incisus Group.

 


Introduction

It’s been more than half a dozen years since the world’s political leaders set out to revolutionise the way financial markets operate by calling for a massive increase in the scope of trade reporting. The 2009 G20 summit in Pittsburgh marked the start of a regulatory campaign that has led to a lot of hard work and head-scratching at both buy-side and sell-side firms. Since that political kick-off moment, Dodd-Frank in the US and EMIR in Europe have both made their marks by setting out detailed reporting requirements for various types of transactions. MiFID II is now fast approaching and it promises to up the ante by aiming for trade-related information to be made available as close to instantaneously as is technically possible. Yet post-crisis regulation is hardly the only driver behind the real-time data capture movement. Internal compliance, risk management issues, the competitive landscape, costs and technological advances are all factors that are encouraging firms to overhaul their systems. The good news is that cost-effective solutions are feasible, and a number of companies are already leading the way by showing what is possible once you toss aside old methods for capturing, storing, retrieving and transporting your data. The decreasing cost of flash storage technology and the advent of new hardware and networking capabilities have offered firms a pathway to efficient real-time reporting. That means that companies have the potential to improve not only their bottom lines through cost savings, but also their top lines through increased revenues.

 

New millennium, new headaches

When many established firms set out to build their current financial trading systems, they had no idea what was about to be asked of them. Up until 2008, trading volumes had been on a steep and steady incline and the onus was always on enabling the front office to trade as much, and as quickly, as possible. Regulation, meanwhile, was decidedly more ‘light touch’.

The environment has changed completely, but in many cases the technology has not. “What it comes down to is that current infrastructures were designed for the regulatory environment of the 20th century,” says Tony King, UK Manager at Violin Memory.

“And clearly, the complexity and volume of data that people are storing now – added to which you’ve got the pressures from the regulator and compliance within the business – means that those traditional infrastructures just cannot cope with the scale of the data that needs to be retained, the length of time it needs to be retained for, and also, the complexity of analysis you need to do on that data.”

 

Current infrastructures were designed for the regulatory environment of the 20th century. The complexity and volume of data that people are storing now – added to which you’ve got the pressures from the regulator and compliance within the business – means that those traditional infrastructures just cannot cope.
Tony King, UK Manager, Violin Memory

 

King’s juxtaposition of 20th-century requirements with 21st-century reality offers a clear illustration of the scale of the challenge that market participants face. On top of the regulatory demands, the explosion in trading volumes over the years, the introduction of new financial instruments, the proliferation of trading venues and the hyper-acceleration of trading speed have radically altered the landscape for firms looking to capture and store data effectively and efficiently.

Paul Spencer, COO of solutions provider Velocimetrics, which specialises in business flow performance monitoring and analysis, has extensive experience wrestling with just those issues.

“What we have seen in a number of projects is that all of this detailed data, at very high volume, is squirreled away to a massive database with a view that, at some point in the future, it will all be analysed,” Spencer says. “But it becomes overwhelming. The volumes are too great and the data isn’t being stored in a way that enables the cause and effect relationships between different pieces of trading data to be easily identified. This makes it very difficult to perform meaningful analytics in a timeframe that will give you information that’s of any use to your business or to your client’s business,” he adds.

What is more, Spencer notes that regulators have become much more precise in terms of what they mean by real-time.

 

Without the real-time capture and analysis of trading data, firms risk only becoming aware of an issue once trading losses have already been incurred. In today’s ultra high speed trading environment, this approach can prove very expensive.
Paul Spencer, COO, Velocimetrics

 

“It’s actually being spelt out in terms of the requirements for clock synchronisation, accuracy and resolution of timestamps, etc. This was always left to the individual participants’ common sense previously,” he says. “Now it’s quite clear and, for many organisations, that means that their existing way of doing things won’t comply and they’ll need to upgrade. Even excluding the regulatory requirements, if you look at things from a profitability and operational management and control perspective, without the real-time capture and analysis of trading data, firms risk only becoming aware of an issue once trading losses have already been incurred. In today’s ultra high speed trading environment, this approach can prove very expensive.”
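To make the timestamping point concrete, the short sketch below (in Python, with purely illustrative names – it is not drawn from any firm mentioned here) shows the kind of high-resolution, UTC-based event tagging that granular timestamp requirements imply. Note that the resolution of the recorded value is only as useful as the accuracy of the clock behind it.

```python
# Minimal sketch, assuming a Python capture layer: tag each trade event
# with a high-resolution, UTC-based timestamp at the point of capture.
# All names are illustrative, not any vendor's API.
import time
from dataclasses import dataclass

@dataclass
class TradeEvent:
    order_id: str
    event_type: str          # e.g. "NEW", "ACK", "FILL"
    ts_utc_ns: int           # nanoseconds since the Unix epoch, UTC

def capture(order_id: str, event_type: str) -> TradeEvent:
    # time.time_ns() gives nanosecond *resolution*; the *accuracy* still
    # depends on how well the host clock is synchronised (e.g. via NTP
    # or PTP) to a common reference such as UTC.
    return TradeEvent(order_id, event_type, time.time_ns())

event = capture("ORD-123", "ACK")
print(event)
```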

But there are solutions. A number of experts say that the decline in the cost of flash storage technology coupled with hardware and software innovations offer the possibility of meeting regulatory challenges, addressing internal risk and compliance issues and increasing business opportunities all at the same time.

To begin to unlock that potential, firms first need to appreciate the complexity of what they’re dealing with, especially if they want to prevent systemic issues from slowing down their trading or impacting their business models.

 

Compliance may be using a different toolset than trade support or infrastructure. Since they are all essentially using the same data, they need to make copies of it to avoid one system disrupting another. But nobody wants any of that interfering with the main objective, operating the trading platform.
David Grocott, Co-founder, Financial Technology Advisers

 

A world of complexity

For any sizeable firm operating in multiple markets, one of the first issues that arise when considering the need to monitor and capture trading data is how fragmented a company’s operations are.

“You’ve got different departments responsible for different things, coming at it from different angles,” says David Grocott, Co-founder of consultancy firm Financial Technology Advisers.

“Compliance may be using a different toolset than, for instance, trade support or infrastructure. And since they are all essentially using the same data, they need to make copies of it to avoid one system disrupting another. At the end of the day, nobody really wants any of that interfering with the main objective, operating the trading platform,” he says.

All of this was less of an issue in the past, when a trade reporting platform, probably built on the compliance software, would capture data at slower intervals – perhaps every five to ten minutes, or even as infrequently as once every half-hour. Older infrastructures did not need monitoring systems to sit so close to real-time platforms.

 

If a firm deploys agents on a trading or market data platform, the tasks those agents conduct can affect the platform’s operations. The more intrusive the agent, the more load it puts on the system.
Matt Dangerfield, Co-founder, Financial Technology Advisers

 

Grocott’s fellow co-founder Matt Dangerfield says the problem with most monitoring systems, regardless of who makes them or what they do with them, is that they need to capture a number of points of information from the platform they’re monitoring and much of the time that can impact the underlying system. For instance, if a firm deploys agents on a trading or market data platform, the tasks those agents conduct can affect the platform’s operations. The more intrusive the agent, the more load it puts on the system. IT specialists have become better over the years at minimising that load, but it still ends up having a latency impact.

Beyond the question of how agents collect data, there are a host of other issues that complicate matters. The need for a global view of a firm’s positions, the prevalence of algorithms, and the need to accommodate data collection for a variety of asset classes all pose challenges for IT engineers.

Grocott notes that in today’s environment, senior executives are now expected to be on top of their firms’ aggregate positions. “You more or less instantaneously need that sort of global view of what your current positions are within the bank at any given moment in time.” But that presents a particular challenge once the various asset classes are considered.

Simon Appleton, Director of Regulatory Consulting at Duff & Phelps’ Kinetic Partners division, says that at many firms, different asset classes are held in different systems with separate data repositories. “So it can be quite difficult to reproduce everything centrally,” he says.  One idea would be to have a central hub of all transactions, but that can be a tall order, with firms having to sign off on multi-million-pound projects.

“You have to store data for five years now, and you do need to be able to reproduce everything, including things like the alerts that have been generated, the commentary that you’ve put against these alerts, your trade surveillance alerts, and so on. Good record-keeping is essential,” says Appleton, who has previously held senior positions for the British financial regulator and the Swiss stock exchange.

He notes for instance that MiFID II has quite detailed requirements around record keeping for HFT firms, algo firms and trading venues. In fact, sell-side firms representing clients on various venues have an obligation to know exactly what is happening with those clients on any given venue at any point in time. In other words, what was once considered best practice is now a firm requirement.

 

You have to store data for five years now, and you do need to be able to reproduce everything, including the alerts that have been generated, the commentary that you’ve put against these alerts, your trade surveillance alerts, and so on. Good record-keeping is essential.
Simon Appleton, Director of Regulatory Consulting, Duff & Phelps’ Kinetic Partners division

 

Yet another complication arises for those firms that rely heavily on algorithms – today, a significant portion of market participants.

“If you’re a large sell-side firm and you’re running your own algos, then clearly you need staff there that are monitoring those algos on a real-time basis. Maybe you’re giving direct electronic access to other participants. So you must be monitoring their activity on a real-time basis, because you need to have a monitoring system and control framework in place to prevent a disorderly market from occurring,” says Appleton. Given all of these complexities, an immediate question is: what’s the solution?

 

Going straight to the source

Traditional methods rely on extracting information from the trading platforms themselves, using log files as a means of getting information about what has been happening. But that only provides a single point snapshot view of what’s going on in a particular system. It doesn’t provide an end-to-end view of the complete transaction flow from input to exchange gateway and beyond.

“Currently, to piece things together, the owners of those systems would have to go and look at multiple different systems, multiple sources of data, and manually tie together the story of what actually happens,” says Spencer. “That really isn’t good enough for the kind of proposals that we’re discussing now. They need to have a more instant, end-to-end view of the trade’s complete lifecycle, so the factors that influenced decision making at the various points, regardless of how complicated the trade may have been, can be quickly identified in a way that doesn’t require significant manual intervention to reconstruct the story.”
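As a rough illustration of what “reconstructing the story” involves, the sketch below correlates events from several systems on a shared order identifier so that a trade’s lifecycle can be pieced together automatically rather than by hand. The event format and field names are assumptions made for illustration only, not a description of Velocimetrics’ product.

```python
# Minimal sketch: group events captured from different systems by a
# common key (here an order ID) and order them in time, so the full
# lifecycle of each trade can be read end to end.
from collections import defaultdict

# Events as they might be parsed out of different systems' logs
# (invented data for illustration).
events = [
    {"order_id": "ORD-123", "source": "OMS",     "ts": 1, "event": "NEW"},
    {"order_id": "ORD-123", "source": "SOR",     "ts": 2, "event": "ROUTE"},
    {"order_id": "ORD-123", "source": "GATEWAY", "ts": 3, "event": "ACK"},
    {"order_id": "ORD-123", "source": "GATEWAY", "ts": 5, "event": "FILL"},
]

def build_lifecycles(events):
    """Group events by order ID and sort each group by timestamp."""
    lifecycles = defaultdict(list)
    for ev in events:
        lifecycles[ev["order_id"]].append(ev)
    for chain in lifecycles.values():
        chain.sort(key=lambda ev: ev["ts"])
    return lifecycles

for order_id, chain in build_lifecycles(events).items():
    print(order_id, [(ev["source"], ev["event"]) for ev in chain])
```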

To achieve real-time data capture, the key is to go straight to the source: the trading platform itself. While Dangerfield of Financial Technology Advisers has noted the problem with placing data capture agents on the platform, there is another route – and it relies on persistent flash technology.

Flash memory has actually been around for more than a quarter of a century: Intel introduced a flash chip in 1988, based on an invention by a Japanese scientist at Toshiba. It is capable of much faster data capture than traditional disk-based methods, although historically it has also been much more expensive. As flash technology has spread throughout the commercial market and improved, costs have come down to the point where large amounts of flash memory are now available at much more affordable prices.

Dangerfield says that flash memory’s ability to do so many millions of writes and reads per second lets IT architects consider engineering in a different way.

For instance, rather than having agents capture at source, an adaptor could be written inside the system itself; so long as the data is moved away from the critical path – trade flow or market data dissemination – it can go directly into a single point of storage, where it can be replicated in the event of a fail-over.

“In essence, a central hive of data is then streamed directly to this memory, via this flash unit,” Dangerfield says. A schema sitting on top of that can then allow multiple users to interrogate that data, in real time and in a variety of ways.
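A minimal sketch of that pattern, assuming a Python capture layer with invented names (it is not FTA’s or Violin Memory’s design), might look like the following: the trading path only enqueues events, while a background writer drains the queue into a persistent store standing in for the flash-backed “central hive”.

```python
# Minimal sketch: keep persistence off the critical path. The trading
# thread only enqueues; a background writer drains the queue into a
# store (a simple list here, standing in for flash-backed storage).
import queue
import threading

capture_queue: "queue.Queue[dict]" = queue.Queue()

def on_trade_event(event: dict) -> None:
    # Called from the critical path: a non-blocking enqueue only.
    capture_queue.put_nowait(event)

def writer(store: list) -> None:
    # Runs off the critical path, persisting events as they arrive.
    while True:
        event = capture_queue.get()
        if event is None:          # shutdown sentinel
            break
        store.append(event)        # stand-in for a write to flash storage

store: list = []
t = threading.Thread(target=writer, args=(store,), daemon=True)
t.start()

on_trade_event({"order_id": "ORD-123", "event": "FILL", "qty": 100})
capture_queue.put(None)            # signal shutdown
t.join()
print(store)
```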

 

Firms are really going to have to start asking some hard, fundamental questions about the approach they are going to take, because this is not just buying the next widget; it’s actually fundamentally understanding how you are going to achieve your business model, or if that’s even possible now.
Rebecca Healey, Managing Partner, Incisus Group

 

Dangerfield says there are a number of ways data can be captured, including using a remote agent or a hardware acceleration card. Once trades are captured, a company can scrape all the log files from its processes and stream them directly off to remote flash storage. That lets users look at the data from a number of different points, which in turn allows the company to investigate it not only from a technical perspective but also in terms of business and regulatory objectives.

Such a system, he says, has multiple benefits. It can give the business far more insight into its trading, effectively giving the ability to generate instant transaction cost analysis. It also can help improve the testing of algorithms. “All of a sudden, you’ve got one system that can do three, four, five different things. Through one actual CapEx or OpEx cost, you get all these benefits.”

From a risk management perspective, one of the most significant benefits is that real-time data monitoring can help firms detect rogue algorithmic activity. “If it’s human-based trading, an experienced human trader would spot when something’s wrong. But an algo system probably wouldn’t, and because it’s working in such a high-speed timeframe, a lot of damage can be done before it gets spotted,” Spencer says.
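One simple form such a check might take – purely as an illustration, with an invented threshold and window rather than anything prescribed by the firms quoted here – is a sliding-window order-rate monitor that flags an algorithm sending far more orders than expected:

```python
# Minimal sketch: flag an algo whose order rate over a short sliding
# window exceeds a threshold. Window and limit are illustrative only.
from collections import deque

WINDOW_SECONDS = 1.0
MAX_ORDERS_PER_WINDOW = 500

class OrderRateMonitor:
    def __init__(self):
        self._times = deque()

    def on_order(self, ts: float) -> bool:
        """Record an order timestamp; return True if the rate looks rogue."""
        self._times.append(ts)
        # Drop timestamps that have fallen out of the window.
        while self._times and ts - self._times[0] > WINDOW_SECONDS:
            self._times.popleft()
        return len(self._times) > MAX_ORDERS_PER_WINDOW

monitor = OrderRateMonitor()
print(monitor.on_order(0.0))  # False for the first order
# In practice on_order would be fed from the real-time capture stream;
# a True result might trigger an alert or a kill switch.
```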

It can also help firms in their relationships with their clients.

“Having a real-time view, and having analytics that are at least as good as those of the clients you’re servicing, is really very important because it will transform the relationship that you have with those clients,” Spencer adds. If a firm can quickly spot problems, whether the causes are internal or external, it is in a better position to address them proactively and to inform clients of the situation, so that clients can take effective decisions to limit the impact on their trading activities. It changes the entire working relationship between provider and client.

“The pitfall for providers who don’t respond to these challenges is that they’re going to lose business. It’s as simple as that,” Spencer says.

Rebecca Healey, Managing Partner of Incisus Group, points to another benefit: the ability to achieve – and to prove – best execution. With real-time trade capture, a firm can better understand why it took a given trading decision. “Best execution is a big focus for buy-side firms at the moment. Not just from a regulatory standpoint; this is also a fiduciary responsibility,” she says.
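As a worked example of the sort of evidence Healey has in mind, the sketch below computes implementation shortfall (slippage) of fills against the arrival price, one common building block of best-execution analysis. The prices and quantities are invented for illustration.

```python
# Minimal sketch: average fill price versus arrival price, in basis
# points, signed so that a positive number means worse-than-arrival.
def slippage_bps(arrival_price: float, fills, side: str) -> float:
    """fills is a list of (price, quantity) pairs for one order."""
    total_qty = sum(qty for _, qty in fills)
    avg_price = sum(price * qty for price, qty in fills) / total_qty
    signed = (avg_price - arrival_price) if side == "buy" else (arrival_price - avg_price)
    return 10_000 * signed / arrival_price

# A buy order that arrived at 100.00 and filled slightly worse:
print(slippage_bps(100.00, [(100.02, 600), (100.05, 400)], "buy"))  # ~3.2 bps
```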

 

Taking the plunge

While both the requirements and the benefits from real-time monitoring and trade reporting are clear, there are still many firms that are cautious about overhauling their systems.

Healey for instance says that a huge number of firms are still playing catch-up. “The question for them now is what do they do? Do they make that level of investment, or do they partner, or do they outsource? Firms are really going to have to start asking some hard, fundamental questions about the approach they are going to take, because this is not just buying the next widget; it’s actually fundamentally understanding how you are going to achieve your business model, or if that’s even possible now.”

Healey adds that the firms that have already taken the necessary steps to prepare for the new requirements will have a competitive advantage over those that have either ignored the issue or left it sitting on the to-do list.

Appleton of Duff & Phelps’ Kinetic Partners division says sophisticated monitoring is important for firms whose algo-based systems route orders to multiple venues, including dark pools. Ultimately, real-time monitoring can have an impact on the effectiveness of a firm’s pre-trade controls. He says capabilities vary considerably across the range of firms he works with.

Furthermore, management at some firms may not even be aware of what infrastructure is supporting the solutions they have. “The status quo has always been to use what was available,” says Tony King of Violin Memory. The business, or the application teams, simply buy a certain amount of storage capability or compute capability but don’t actually question what “sits under the hood”.

But with regulators moving decidedly towards real-time reporting, this kind of stance is unlikely to be viable in the future. “You need to be able to respond at the drop of a hat to the regulator on any number of things that you may or may not be able to predict. But you need to make sure you have all that data,” says King.

Even though some firms may have been slow to embrace the challenge that real-time monitoring and trade reporting represents, there is still a real opportunity to consider the system changes they will need to make within a wider business context.

Whether it’s achieving best execution, gaining insight into trading activity, preventing rogue trading or improving the relationship with clients, the move to real-time offers firms the chance to enhance parts of their business. In fact, some might even argue that the regulators are doing the financial sector a favour by forcing them to make changes that will eventually have significant business benefits.

And it’s that marriage of regulatory requirements with attractive business opportunities that makes the move to real time such a different story from so many previous post-crisis regulatory episodes. King of Violin Memory says that one day firms will start to see that the system changes they are required to make could end up creating far better services and client solutions.

Not just some flash in the pan: how memory technology makes a difference

The Realization Group talks to Allan Fenwick, EMEA Sales Engineering Director of Violin Memory, to learn why flash technology is so critical when engineering real-time capture systems.

TRG: Let’s start by talking about the fundamental difference between flash and traditional methods.

Allan: For many years, disk-based technology has relied on a mechanical medium – there is actually a physical moving part inside the system storing the information. And this has been the status quo for roughly 50 years.

Now, as people have adopted newer, faster technologies such as smartphones and iPads, they will have noticed the difference in performance and speed between the two. Disks take up a significant amount of footprint, power and cooling in your infrastructure. They also cannot cope with the speed of information flow that flash can. Flash, for argument’s sake, can actually cope with at least 20 times more transactions than a disk-based system.

So in layman’s terms, disk is very good for storing information that you never really access. Flash, however, is a very different medium in terms of speed, performance and also density: it takes up a much smaller footprint than a disk drive.

TRG: So how does flash technology fit into the question of real-time capture?

Allan: Clearly, the investment in memory and what I would classify as non-persistent cache is quite prominent in the first part of the data path in a trading system. This is the instant response, which could even be via a bespoke FPGA-style environment initially, or some form of in-memory or in-processor handling – a very, very fast response to acknowledge that particular part of a trade, for instance.

The challenge is that once you’ve performed that action, there’s a workflow after that where you would need to either archive or acknowledge, or keep information, potentially in real time, that’s queryable. So it’s not just about acknowledging the fact that a trade is completed, it’s also about being able to analyse that trade and provide regulatory information post that process.

Now, that requires some form of persistent storage mechanism to keep the data. If you do things in DRAM or you do this in FPGA, there’s no actual persistent mechanism for keeping that acknowledgement. You need to rely upon some other form or system to track what’s happened in that period of time.

TRG: Are companies recognising all of this?

Allan: What we’ve noticed with the systems and conditions out there is that many of them are based on traditional disk or hybrid solutions on the back end. And there are numerous applications in common use across most trading environments.

Now, in many of the examples where we’ve replaced those legacy environments with flash, we are able to cope with and acknowledge volumes of data orders of magnitude faster and more efficiently than any disk-based system. And this allows users to perform more advanced analytics on that post-process data without affecting any of the other users.

In essence, we give that system much more stability over time, much more capacity to ingest the volume of data required, and also the opportunity to get much quicker responses to the ad hoc queries that users run.

TRG: The usage of flash and the feats it can perform have grown in leaps and bounds. Does flash essentially follow Moore’s Law?

Allan: It’s actually accelerated faster than Moore’s Law. And there are several iterations of flash that are in the pipeline at the moment that mean that this is going to get even more aggressive in terms of the capacities, the cost and so on.

