In this article, Mike O’Hara, publisher of The Trading Mesh, talks to Matt Barrett of Adaptive Consulting, and to Eddie McDaid and Tony Foreman of Software AG, about how sales productivity at global investment banks can be significantly improved by bringing together four “pillars” – management information systems, straight-through processing, control and pro-active outreach – and how this can be achieved through a combination of business intelligence and technology innovation.
Since the global financial crisis – and particularly as markets have moved towards greater electronification – many investment banks have reduced headcounts on their sales desks and directed their technology investments towards low-touch electronic trading rather than high-touch, voice-based trading.
However, despite this trend, many markets – particularly rates and fixed income – are still predominantly voice-traded and are likely to remain so in the short to medium term. This raises a question: how can banks leverage the technology investments they may have made in the electronic realm to improve sales productivity in these voice-traded, high-touch environments?
According to Matt Barrett, Director of Adaptive Consulting, a software development and integration firm that works with a number of global tier 1 and tier 2 investment banks, the answer lies in bringing greater automation to four key areas – what he refers to as the “four pillars of sales productivity”: MIS/analytics; straight-through processing (STP); control; and the monitoring and reporting that enables pro-active outreach to clients.
“It is important to capture every negotiation, because that’s what will build up the real intelligence.”
Matt Barrett, Director at Adaptive Consulting
One of the problems with voice-based markets is that workflow tends to be very ad-hoc, so capturing data in an electronic format to facilitate any kind of MIS is not easy. And, as Matt Barrett explains, in order to achieve meaningful analytics, not only does data need to be captured as early as possible, it also needs to include all client interactions, not just those that have resulted in a trade.
“It is important to capture every negotiation, because that’s what will build up the real intelligence”, he says.
“Today, if a client calls and does a trade, obviously you will manually book that trade onto the system. What generally isn’t captured is where a client calls, you give him a price and he doesn’t trade, or trades away, and the various reasons for that – for example, if you were in competition with two other dealers and your price wasn’t good enough. Clients sometimes give you this kind of useful information, so if you want rich MIS and analytics, you need to be able to capture it somehow.”
Sales people in voice-traded markets are notoriously reluctant to enter more than the bare minimum of information onto trading systems, so one of the challenges here is how to ensure such data gets captured. The key is in user interfaces that enable sales people to immediately see the value, according to Barrett.
“If the stick is management telling sales people that they need to enter this data, the carrot on the other side is the sales people seeing the results in rich analytics in real-time, which can really add value”, he says.
“Ideally you want a user interface where, if a negotiation results in a miss, the next time the client calls, the sales person sees that previous miss, even if it’s just 30 seconds or a minute later. Not only that, but they would see that client’s interactions with the bank across the entire desk or across an entire asset class, so that trends can be visualized, understood and acted upon”.
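Barrett’s point about capturing every negotiation, not just completed trades, can be sketched in a few lines. The following is an illustrative sketch, not a real product API – the `Negotiation` record, its outcome labels and the `NegotiationLog` store are all hypothetical names – showing how logging misses and trade-aways alongside done trades makes a client’s full history (and hit ratio) instantly queryable when they next call.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical record of a single client interaction. The outcome is
# "done", "missed" (no trade) or "traded_away" (traded with a competitor).
@dataclass
class Negotiation:
    client_id: str
    instrument: str
    quoted_price: float
    outcome: str
    note: str = ""                       # e.g. "lost to a tighter price"
    timestamp: datetime = field(default_factory=datetime.now)

class NegotiationLog:
    """In-memory store of every client interaction, traded or not."""

    def __init__(self):
        self._by_client: dict[str, list[Negotiation]] = {}

    def capture(self, n: Negotiation) -> None:
        self._by_client.setdefault(n.client_id, []).append(n)

    def history(self, client_id: str) -> list[Negotiation]:
        # What the sales UI could surface the moment the client calls back.
        return self._by_client.get(client_id, [])

    def hit_ratio(self, client_id: str) -> float:
        hist = self.history(client_id)
        if not hist:
            return 0.0
        return sum(1 for n in hist if n.outcome == "done") / len(hist)
```

Capturing a miss then costs the sales person one extra entry, but `history()` and `hit_ratio()` become the raw material for the real-time, cross-desk analytics described above.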
What technology components are necessary for this?
“You need things like real-time multi-protocol messaging and a strong CEP (complex event processing) engine, which bring different benefits but are both necessary. The CEP engine is able to pull streams of data from a number of heterogeneous sources, and monitor all those flows for specific conditions that have been set. All of this analysis takes place in real time, which is incredibly powerful. You need push-based messaging so that you can make your real-time analysis shine. Being able to send an alert to an individual based on a relevant confluence of events, originating from different sources and occurring in a specific window in time, is an incredibly powerful tool, and we find clients get addicted to the functionality once they are exposed to it. The benefit, of course, is that these tools arm you with the necessary business intelligence to be able to increase your hit ratios, if used correctly”, says Barrett.
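A toy illustration of the windowed-confluence idea Barrett describes – a rule that fires only when events from different sources coincide within a time window – might look like this. The class and event-type names are hypothetical; a production CEP engine would offer far richer pattern languages than this sketch:

```python
import time
from collections import deque

class WindowedRule:
    """Toy CEP-style rule: fire when events of all required types
    (possibly from different source systems) have been seen within a
    sliding time window."""

    def __init__(self, required_types, window_seconds):
        self.required = set(required_types)
        self.window = window_seconds
        self._events = deque()            # (timestamp, event_type)

    def on_event(self, event_type, now=None):
        now = time.time() if now is None else now
        self._events.append((now, event_type))
        # Evict anything that has fallen out of the window.
        while self._events and now - self._events[0][0] > self.window:
            self._events.popleft()
        seen = {etype for _, etype in self._events}
        return self.required.issubset(seen)   # True => push an alert
```

For example, `WindowedRule({"rfq_missed", "price_update"}, 60)` would fire only when a missed RFQ and a relevant price move occur within the same minute – one “relevant confluence of events” that could be pushed straight to the sales person’s screen.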
“STP is all about enriching trade data with specific information so it can flow through the various systems automatically.”
Matt Barrett, Director at Adaptive Consulting
If MIS is all about how to enhance hit ratios and therefore increase revenue and profitability, STP is more about lessening the number of manual interventions on trade flow in order to reduce costs and lower operational risk, factors which can also help improve sales productivity.
But how realistic is true STP in a voice-traded environment? What are the challenges around having trade information flow seamlessly through the various systems that need to read, enrich and report on that data? And how can those challenges be addressed?
“The traditional problems here are around the impedance mismatch between the bank’s different messaging and data store platforms, some of which may not be exposed electronically”, explains Barrett.
“If you think of all the sources this information needs to be enriched from, many of them within different organizational units built on different technology platforms, when you build one of these message flows, you’re forced into a vast array of different integration exercises”.
An alternative implementation is to flip the traditional data-lookup pattern on its head and have the different data sources contribute the required data to the CEP engine. The engine then keeps an up-to-date cache of all the slowly changing data needed for STP enrichment, meaning it can be accessed quickly and easily when required.
“STP is all about enriching trade data with specific information so it can flow through the various systems automatically”, he says.
“Enriching the client ID with details of the book or the clearing instructions, for example. The steps may be different for interest rate swaps, or for FX, or for bonds, but at the end of the day a workflow needs to happen. With a powerful CEP engine and the right sources of data, you can implement a generic workflow, giving you one STP solution for a specific product that can then be duplicated for other products”.
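The “flipped” enrichment pattern described above – reference-data sources pushing updates into a cache, with a generic per-product step list driving the workflow – could be sketched as follows. All names here are illustrative assumptions, not an actual STP system’s API:

```python
class EnrichmentCache:
    """Flipped lookup pattern: reference-data sources *push* changes in
    here, so trade enrichment reads a local cache instead of querying
    each upstream system on the critical path."""

    def __init__(self):
        self._data = {}

    def contribute(self, source: str, key, value) -> None:
        # 'source' names the contributing system (CRM, clearing DB, ...);
        # a real implementation might retain it for audit purposes.
        self._data[key] = value

    def get(self, key):
        return self._data.get(key)

def enrich_trade(trade: dict, cache: EnrichmentCache, steps) -> dict:
    """Generic workflow engine: each step names the field to fill and a
    function deriving the cache key from the raw trade. Swapping in a
    different step list reuses the same engine for swaps, FX or bonds."""
    enriched = dict(trade)
    for field_name, key_fn in steps:
        enriched[field_name] = cache.get(key_fn(trade))
    return enriched
```

The per-product difference Barrett mentions lives entirely in the step list, so one engine serves every asset class.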
While complex event processing and universal messaging technologies can help facilitate the first two pillars, an additional technology – in-memory data management – comes into play in the area of Control, the third pillar of sales productivity, as Tony Foreman, Financial Services Sales at Software AG, explains.
“In-memory data management coupled with complex event processing can enable systems that give the sales person a real-time view of all their customers’ activity across all asset classes, highlighting anything that seems abnormal.”
Tony Foreman, Financial Services Sales at Software AG
“From a control perspective, in a typical investment bank, there is more and more data that needs to be stored and retrieved extremely quickly”, he says.
“Looking at market surveillance for example, there is an evolution going on and there has been a growing requirement beyond standard deterministic alerting toward the spotting of the abnormal. There are more data requirements than simply factoring in market and trade data and standard out-of-the-box alerting to things like front running, spoofing, insider dealing, etc. Identifying and responding to abnormal behaviour has many aspects, particularly as the concept of normality can be quite fluid in this environment. For example a trader or a client might be suddenly making – or losing – money in a particular asset class and the bank needs to be alerted to that. Whereas if it’s a gradual change it might be considered normal”.
Analyzing these factors in real time can take an incredible amount of memory, according to Foreman, which is where fast in-memory data management comes in.
“In-memory data management coupled with complex event processing can enable systems that give the sales person a real-time view of all their customers’ activity across all asset classes, highlighting anything that seems abnormal”, he says.
“The two technologies are complementary, because a lot of the stuff coming through the CEP engine has to be persisted in memory in order for it to be retrieved, checked and analyzed quickly”.
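The sudden-versus-gradual distinction Foreman draws – a sharp P&L move should alert, while slow drift becomes the new normal – can be illustrated with a simple in-memory detector. This is a toy sketch using an exponentially weighted baseline, not a description of any vendor’s surveillance logic:

```python
class DriftDetector:
    """Flag sudden moves while letting gradual change become the new
    'normal': a slowly adapting baseline (EWMA) tracks the mean and a
    typical deviation; an observation far outside that band alerts."""

    def __init__(self, alpha=0.05, threshold=4.0):
        self.alpha = alpha          # small alpha => gradual drift is absorbed
        self.threshold = threshold  # alert band, in units of typical deviation
        self.mean = None
        self.dev = None

    def observe(self, pnl):
        if self.mean is None:       # first observation seeds the baseline
            self.mean, self.dev = pnl, abs(pnl) or 1.0
            return False
        abnormal = abs(pnl - self.mean) > self.threshold * self.dev
        # Update the baseline *after* the check, so a spike cannot
        # mask its own detection.
        self.mean += self.alpha * (pnl - self.mean)
        self.dev += self.alpha * (abs(pnl - self.mean) - self.dev)
        return abnormal
```

A trader whose P&L creeps up over weeks stays inside the adapting band; one whose P&L jumps a hundredfold in a day trips the alert immediately.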
Adaptive’s Matt Barrett adds his take on this.
“There’s another aspect of control that can ease a regulatory problem, where the risk to the bank is no longer easily calculated just by the size of the position; it also has to take into account the specific assets and equity that need to be held based on where the client is planning to trade”, he says.
“If a client is doing trades out of Japan versus trades out of North America, there’s a difference there in the amount of risk that the bank is exposed to because of the underlying regulatory conditions in each location. This all comes back to the need for the sales people and the traders to have an expert system on their desk that collects information from lots of different places to give them an accurate picture of what needs to happen, both before and after they make the trade”.
According to Barrett, the challenge around the regulatory and control aspect is how to source, integrate and present the data in a constantly changing environment.
“It’s not solely a technological problem”, he says.
“We know how to build, deploy and run these systems. The problem is in sourcing the data, some of which may not exist in an electronic form in those banks that think of regulation as a static thing. They talk to their regulation departments once every six months to figure out if how they’re operating today is okay. But with regulatory changes being introduced monthly or weekly, you need to electronify the ability to change and respond”.
Bringing all of this together, where the value can really be delivered to increase sales productivity is in the area of ‘pro-active outreach’, where real-time business intelligence is incorporated into the sales people’s workflow.
Eddie McDaid, Head of Product Management for Streaming Analytics and Big Data at Software AG, discusses the idea.
“In the world of capital markets, what’s interesting is the concept of analyzing data in flight, this idea of bringing together the streaming analytics with in-built distribution orchestration”, he says.
“The combination of CEP, fast messaging and in-memory data storage enables banks to be smarter about how they distribute prices, who they distribute prices to, what margins they apply to those prices and what kind of heuristics they use to make those decisions. In the old days they might have asked how much brokerage they had taken from a particular customer today, last week, last year, whatever. But now you’re talking about a much more rich, much more granular set of data. So not just how much business a client might be doing, but also perhaps some predictive analytics modeling, gauging whether or not someone is going to trade. Whether they need that little nudge over the line, or whether they were going to trade anyway”.
“In the world of capital markets, what’s interesting is the concept of analyzing data in flight, this idea of bringing together the streaming analytics with in-built distribution orchestration.”
Eddie McDaid, Head of Product Management for Streaming Analytics & Big Data at Software AG
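The kind of predictive, margin-aware pricing heuristics McDaid alludes to could be caricatured as below. The weights and feature choices are invented purely for illustration – a real model would be fitted to the captured negotiation history rather than hand-written:

```python
import math

def trade_likelihood(hit_ratio, quotes_today, minutes_since_last_quote):
    """Toy predictive score: a logistic combination of a client's
    historical hit ratio, activity today and recency of their last
    request. The weights are invented for illustration, not fitted."""
    z = (3.0 * hit_ratio
         + 0.2 * quotes_today
         - 0.01 * minutes_since_last_quote
         - 1.5)
    return 1.0 / (1.0 + math.exp(-z))

def suggested_margin(base_margin_bps, likelihood, nudge_discount_bps=2.0):
    """If the client looks undecided, tighten the price to give them
    'that little nudge over the line'; if they were going to trade
    anyway (or almost certainly won't), keep the standard margin."""
    if 0.4 <= likelihood < 0.7:
        return base_margin_bps - nudge_discount_bps
    return base_margin_bps
```

The interesting design point is exactly the one McDaid raises: the inputs are no longer just “how much brokerage did we take from this client”, but the full, granular record of how and when they interact.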
Predictive, heuristic analysis that pre-empts the client’s likely moves and informs the sales person what the client is likely to do next – based upon multiple input streams, including previous and current patterns of behaviour – can obviously provide a competitive edge. But is this achievable, or is it a holy grail?
“It’s about democratizing the flow of data, aggregating different data sources via the CEP engine and then setting up rules to warn when specific events happen, so that appropriate action can be taken”, says Adaptive’s Matt Barrett.
“The real-time nature of today’s markets means that sales people should be able to look at all previous negotiations that have occurred with a client and feed that data back into their pricing models to drive more successful hit ratios”.
One thing that Barrett is keen to point out is that introducing the four pillars of sales productivity does not have to be an all-or-nothing exercise; they can be introduced on a step-by-step basis. The technology that facilitates the four pillars – CEP engines, universal messaging and in-memory data management – can be introduced incrementally and layered over existing systems.
“You could go away for two years, build something, put it into production and then discover that things have moved on so you’re not getting any real value from it”, he says.
“Whereas if you drip-feed these things in – in a more tactical way – you’re likely to see much better take-up and usage. In reality, you’re never at your strategic, target architecture. But as you bring in all these different components of the four pillars that we’ve talked about, you will see improvements in your bottom line – or in whatever key indicator you were looking at when you kicked off the project”.
The key takeaway seems to be that these four pillars of sales productivity can certainly be achieved by making best use of the resources and assets that already exist in the bank: high-touch sales people; existing technology components that may have been introduced for low-touch electronic trading; and – importantly – interactions with clients that are not currently being captured or analyzed.