Future-proofing for FinTech with Network Innovation

In this article, Mike O’Hara and Adam Cox of The Realization Group explore the world of network infrastructure and connectivity as banks and other financial services providers find themselves in the midst of an explosion of Financial Technology (FinTech) activity, one which is driving more and more firms to higher levels of digitisation. There are a variety of reasons why companies are going down this route. It may be part of an effort to expand their services through new apps; it may be due to efficiency programmes as firms look to reduce costs; or it may be because of regulatory requirements. Whatever the reason, there is no question that banks and services providers are finding they need more network capacity, productivity, flexibility and security.

Mike and Adam hear from Matthew Lempriere of Telstra, Michael Ritter of ADVA Optical Networking, Vinay Rathore of Infinera and Paul Jones of Citihub. Together, they describe a dynamic environment that offers enterprising firms the chance to not only improve their offerings and their bottom lines, but also to provide a degree of future-proofing.

 



Introduction

Make no mistake, the trend of ever-rising data traffic in the financial arena is showing no sign of letting up. The trend is fuelled by factors such as increased competition, innovative services, regulatory requirements and new technology. Thanks in part to a vibrant FinTech sector that has seen start-ups rush to compete with established players, the demand for better, cheaper, more secure and more flexible data networks has never been higher.

But for firms faced with long-term decisions to make, what are the best approaches? Should they look to squeeze as much as possible out of legacy networks or take the plunge and upgrade? Is there a middle ground? Should they lock in long-term deals or opt for arrangements that allow for data capacity to rise and fall as needed? There are pros and cons whatever the answers to these questions. But by striking the right partnerships and taking a holistic, common sense approach, experts say firms can turn their networking decisions into business opportunities. Put another way, when making decisions about data, firms need more than ever to be data-driven.

 

Data on the move

The world’s biggest financial firms are not always the quickest to respond to new innovations. But when they do, they do so in force.

“If you talk to insurers, retailers or global banks, they are all digitising,” says Matthew Lempriere, Global Head of Financial Services Market Segment at Telstra, a leading telecommunications company. “Everyone I speak to is either going through the process or has plans to go through the process. This trend looks set to continue growing at an exponential rate.”

A big part of that is due to a changing technology scene. Whether it’s a revolutionary development such as Blockchain or more specialist technology such as SDN1 (software-defined networking) or OTN2 (optical transport network), the financial sector is undergoing profound shifts. “I’ve been working in this industry selling technology solutions since 1991, and I’ve never seen it changing so fast,” Lempriere adds.

The evolving technology is both a cause and a symptom of what’s happening in financial services. It’s a cause in that each new FinTech development creates opportunities that result in rising demand for network capacity. But it’s also a symptom because that technology is reflective of new markets that are being pursued and new competition in those markets. Take cloud technology, for instance. “I think you wouldn’t find a bank that’s not considering cloud at the moment,” says Paul Jones, Associate Partner at IT advisory firm Citihub Consulting.

 

“You wouldn’t find a bank that’s not considering cloud at the moment” 
Paul Jones, Citihub

 

Increased usage of the cloud can have a direct impact on network infrastructure decision-making. Vinay Rathore, Senior Director of Field and Segment Marketing at Infinera, notes that there is a push to drive more cloud applications into the financial sector.

“In this case, you need to make sure you can get to other cloud services. It could be provided by a third party company. Maybe it’s providing financial news, or real time data. It’s coming through the cloud and then it’s being aggregated and provided to you,” Rathore says.

Why exactly does all that matter for the network?

“The point here is that the role of the cloud is driving huge amounts of data into the network. And unless the public facility, the private facility and the third party facilities are all located in the same place, you’ve got to have a network that connects them together,” he adds.

Jones notes that while start-ups can take a more relaxed attitude towards adopting new technology and making quick decisions about suppliers, established banks generally don’t have that luxury. Apart from anything else, they require multiple providers because they can’t afford to concentrate risk in one supplier.

But while banks may be more risk-averse than those nimble start-ups, they’re keenly aware that they need to roll out new applications. And they’ve been doing so aggressively.

“Some of these are for productivity reasons,” says Lempriere. “They’re trying to get new customers, they’re trying to access new markets, they’re trying to do more with less and streamline processes.”

A key market driver is the need to roll out services that meet regulatory requirements. Many of those are data-intensive, involving voice recording and storage, KYC (know your customer), and data analytics. Jones, for example, points out that in Europe, the latest Markets in Financial Instruments Directive (MiFID II) is the biggest project that financial firms will be working on during the next year.

 

Changing needs

The upshot is greater demand for capacity, low latency and encryption across the industry.

Michael Ritter is the Vice President of Technical Marketing and Analyst Relations at ADVA Optical Networking, a firm that builds optical networks designed to handle rising data traffic around the world. Ritter says big banks are no longer thinking just in terms of the traffic from one data centre to another.

“They are now looking at global connectivity,” he says. Previously, much of the heavy-duty, high-capacity infrastructure was focused on data centre interconnect. But now there’s increasing attention on connecting all the various sites, including local offices.

Ritter says the old model of a main data centre and back-up facility, with everything mirrored, has been augmented. “FinTech very often is based on a more distributed model,” he says, noting that means communication between different sites as well as low latency have now become critical components.

How much data are we talking about?

 


“It used to be that 10-gigabit, 40-gigabit or 100-gigabit was really pushing the envelope for some of these banks. You find the biggest banks might need half a dozen 100-gigabit connections. Now, 100-gigabit is the currency that is being traded in the network world for all the banks,” says Rathore.

“Firms want to make sure they’re not being limited by the bandwidth in the network,” Rathore says.

He adds that this kind of demand is not only coming from the biggest banks but also from the tiers below. Some of this is due to a desire to be able to grow on demand, while some is due to the relentless aim of driving down latency.

In these cases the OTN is a good fit. “Optical gear is well known for its low latency. Latency is a critical thing, not only because of high frequency trading, but also for synchronised mirroring,” Ritter says.

And there is more to OTNs than speed.

 

“Since this is sensitive data, since nobody wants this data to be disclosed or compromised, encryption is a critical component” 
Michael Ritter, ADVA Optical Networking

 

“Since this is sensitive data, since nobody wants this data to be disclosed or compromised, encryption is a critical component,” the ADVA executive adds. His firm has numerous installations where encryption is used on the optical layer, allowing for low latency and very high capacity. Ritter also points out that regulatory demands and individual company security policies are pushing firms towards more encrypted and secure connectivity. Some firms stipulate that all data centre interconnect must be encrypted, whilst others go even further.

“People in general are not ready to say, ‘I don’t need it’,” says Ritter. He says there are so-called safe harbour agreements, which set out terms for when data gets compromised and which make allowances for those firms that have encrypted their data. “The trend clearly is they want to be sure, they want to be safe.”

Rathore makes the point that encryption needs are more intensive than they used to be. For instance, in the past, putting encryption on higher-layer applications was sufficient. If a firm had an algorithmic trading app whose traffic was encrypted as it reached the Ethernet port, the company was generally satisfied. “The rest of the network could be unencrypted because the data running across it was already encrypted,” he says.

“Now they’re asking for encryption all along the way. They’re saying, ‘When it comes into the data centre it must be encrypted. When it leaves the data centre it must be encrypted.’ They want layer zero encryption3 at every data centre exit across the board.”
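The difference between the old per-application approach and encryption at the optical layer can be illustrated with a deliberately simplified model. The sketch below is a toy, not real cryptography; the frame fields and site names are hypothetical, and the point is only to show what each scheme leaves visible in transit.

```python
# Toy illustration of encryption scope, NOT real cryptography.
# It models why per-application encryption leaves metadata exposed
# that line-rate (layer zero/one) encryption would also cover.

def encrypt_application_layer(frame: dict) -> dict:
    """Only the payload is protected; headers travel in the clear."""
    return {**frame, "payload": "ciphertext"}

def encrypt_optical_layer(frame: dict) -> dict:
    """The whole frame is protected on the wire, headers included."""
    return {key: "ciphertext" for key in frame}

# Hypothetical frame between two data centres
frame = {"src": "bank-dc-1", "dst": "bank-dc-2", "payload": "order data"}

app_view = encrypt_application_layer(frame)
# With application-layer encryption, source and destination remain visible:
assert app_view["src"] == "bank-dc-1"

wire_view = encrypt_optical_layer(frame)
# On an encrypted optical link, nothing is readable in transit:
assert all(value == "ciphertext" for value in wire_view.values())
```

In practice, layer-one encryption is implemented in the transport hardware itself, which is why it can run at line rate without adding meaningful latency.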

What was once considered a “nice to have” has become a “must have”. As Rathore says, the finance sector has always been data heavy, whether it be trading data or regulation-related data. “What we’ve seen is that continues to grow,” he says, noting regulation, encryption and security are all important factors.

 

Finding the right approach

For some banks, little appears to have changed in recent years. A bank may have connections between one major centre and another, say between Hong Kong and Singapore or London and Frankfurt, and as soon as the pipe gets to 80% capacity the bank will look to upgrade it.
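The 80% rule described above amounts to a simple threshold check on link utilisation. The sketch below is illustrative only: the link names, figures and trigger point are assumptions based on the article, not any vendor’s monitoring API.

```python
# Illustrative sketch of the "upgrade at 80% capacity" rule.
# Link names and capacities are hypothetical examples, not real circuits.

UPGRADE_THRESHOLD = 0.80  # assumed trigger point from the article

def needs_upgrade(used_gbps: float, capacity_gbps: float,
                  threshold: float = UPGRADE_THRESHOLD) -> bool:
    """Return True once utilisation reaches the upgrade threshold."""
    return used_gbps / capacity_gbps >= threshold

# Hypothetical inter-city links: (name, current use in Gbps, capacity in Gbps)
links = [
    ("HK-SG", 85.0, 100.0),    # 85% utilised: flag for upgrade
    ("LDN-FRA", 60.0, 100.0),  # 60% utilised: fine for now
]

for name, used, capacity in links:
    if needs_upgrade(used, capacity):
        print(f"{name}: {used / capacity:.0%} utilised, start upgrade process")
```

The check itself is trivial; as the article notes, the pain lies in everything the flag triggers: quotes, contracts, orders and scheduled downtime.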

The problem is, that involves getting quotes from multiple suppliers, potentially even engaging in an e-auction, creating multiple contracts, raising orders and, most problematically, arranging for the downtime and all the project management that goes with the upgrade.

There are two main ways companies can deal with this situation. The first is to try to buy enough capacity so that the issue does not come up often.

 

“We’re now seeing a trend where banks are trying to put in networks that will last three to five years, without having to have all that process to keep them at the right size” 
Matthew Lempriere, Telstra

 

“What we’re seeing now is a trend where banks are trying to put in networks that will last three to five years, without having to have all that process to keep them at the right size,” Lempriere says. “So they’re implementing much bigger links now and benefiting from the new technologies that give access to much higher performance services, because prices are more reasonable.”

That in turn allows them to grow their digital business and build new applications, without having the costs and time delays associated with building out networks to keep up with demand. The Telstra executive says some firms are well on their way in this direction, while others have yet to embark on the journey.

But building a network with more than enough capacity is not the only way forward. An alternative is SDN technology, which allows a firm to flex its bandwidth, increasing or decreasing it depending on customer demand and business priorities. That flexibility could be based on different applications that might run on different parts of the network at different times. Or it could be based on decisions to temporarily expand or shrink a firm’s presence in a geographical location.
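The flexible-bandwidth idea can be sketched as a controller that grants and reclaims capacity on a shared pipe as demand shifts through the day. Everything below (class name, applications, figures) is a hypothetical toy model, not real SDN controller code.

```python
# Toy model of SDN-style flexible bandwidth allocation.
# All names and numbers are hypothetical illustrations.

class BandwidthController:
    def __init__(self, total_gbps: float):
        self.total = total_gbps
        self.allocations = {}  # application name -> allocated Gbps

    def available(self) -> float:
        return self.total - sum(self.allocations.values())

    def request(self, app: str, gbps: float) -> bool:
        """Grant bandwidth if spare capacity exists, else refuse."""
        if gbps <= self.available():
            self.allocations[app] = self.allocations.get(app, 0) + gbps
            return True
        return False

    def release(self, app: str) -> None:
        """Hand back an application's bandwidth when demand falls."""
        self.allocations.pop(app, None)

ctrl = BandwidthController(total_gbps=100)
ctrl.request("market-data", 40)        # daytime: market data feed
ctrl.request("trading", 50)            # daytime: trading traffic
assert not ctrl.request("backup", 20)  # refused: only 10 Gbps spare
ctrl.release("trading")                # overnight: trading idle
ctrl.request("backup", 50)             # same pipe reused for replication
```

The appeal over fixed overprovisioning is that the same physical capacity serves different workloads at different times, which suits firms whose demand is uneven or uncertain.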

“It all depends on the company’s business aspirations as to what’s the best approach for them,” Lempriere says. Large global banks may want to lock in large amounts of bandwidth. Regional banks, particularly those looking to dip their toes into Asia, may opt for software-defined networks because of the flexibility they offer.

 

Operational models

Apart from decisions about whether to lock in network capacity or go for a flexible approach, financial firms also need to consider whether they want to be hands-on with their network or outsource the work. Ritter says smaller institutions probably do not have the capabilities to run the network and data facilities on their own.

“This depends quite a bit on the organisation,” he says, referring to build versus buy decisions. “We have customers which are using both approaches. So part of their connectivity is operated on their own; the other part is leased, simply for security and availability reasons. Then we have others who don’t have departments operating networks, so they obviously lease.”

If a firm has access to fibre, it may be more of an open discussion. In some countries – for instance in parts of Europe – it may be difficult to own fibre networks, so leasing is the only option. At the same time, there are plenty of firms that do not want the hassle of running – much less building – a network.

Rathore describes one hedge fund that simply wanted a plug-and-play solution. “Their view was, ‘I don’t want to relearn how to do networks. I just want to plug this in and turn it on. I should be able to have one of my technicians just turn up and configure it.’ That piece of the puzzle has become more and more important,” Rathore says.

He says his firm has seen a shift towards appliance-like boxes with plug-and-play functionality. The idea is to make optical networks foolproof. Rathore says that all of the “complicated optical stuff” is now smaller, uses less power, is super reliable and easily operated by software.

It is proving popular for a simple reason: financial firms are always trying to do more, so anything that lets them focus on their business and not on the network side is a bonus. “They want to be able to have simple, easy to use systems, not complicated optical gear.”

 

Conclusion

Lempriere believes that firms should aim to be more strategic and holistic, looking at which organisations they can work with over the next five years to actually deliver the change that they need to grow their business, to comply with regulations and to be successful.

Lempriere says it all comes down to picking the right partners – firms that will work closely with a customer and go on a journey. “It’s such a fast moving area at the moment, I think if they (financial firms) look at suppliers and partners individually and tactically, I think they will struggle,” he says.

In the increasingly Darwinian world of FinTech, those firms that fail to see this bigger picture are likely to find themselves pitted against nimbler rivals. And for those companies that see network infrastructure as an opportunity rather than a cost, the picture will be very different. For them, the battle may have already been half won.


1. SDN (software-defined networking) allows network services to be managed by software rather than at the static physical level. By decoupling the control plane, which decides where traffic is routed, from the data plane, which forwards it, administrators can make networks much more dynamic and responsive to changing business priorities. The main aim of a software-defined network is to bring agility and flexibility to the network configuration.

2. The ITU Telecommunication Standardization Sector defines an OTN as a set of Optical Network Elements connected by optical fibre links in order to provide functionality of transport, multiplexing, switching, management, supervision and survivability of optical channels carrying client signals. IT specialists say that for many firms, current network infrastructure isn’t built to scale and technology such as OTN can help them handle future bandwidth or technology challenges.

OTN provides feature-rich technology, such as the ability to detect faults early and address them proactively. It also allows for secure, transparent encrypted network transport at layer one. Breaches or anomalies in the network can be detected, giving operators visibility that legacy technology typically cannot provide.

3. The physical layer, or Layer 1, of the Open Systems Interconnection (OSI) model does not include the transmission media. Transmission media sits outside the scope of the physical layer and is sometimes referred to as Layer 0 of the OSI model.


For more information on the companies mentioned in this article, visit:
