By Michel Finzi
This is the first in a series of blog posts from Celoxica on various topics pertaining to accelerated market data, risk management and trading technology infrastructure. We hope you find these pieces insightful and we welcome your comments and feedback.
Over the past 24 months we have seen more and more financial institutions – both buy-side and sell-side – evaluating whether accelerated market data technologies are the appropriate solutions for their next-generation trading and investing infrastructure. Whether this is motivated by firms’ needs to upgrade their infrastructure (for example, to remain competitive, to trade new markets or to keep up with anticipated processing capacity demands), or by the maturation of an industry that now enables a wider range of market participants to adopt once-“niche” technologies in a much more cost-effective way, the trend is observable and undeniable.
Combine that with the fact that the torrential pace of technology evolution continues to run its uninterruptible course, and it serves as a clarion call for many technology vendors serving this market to proclaim that their particular solution set contains the best – if not the only – tool for the job. And so we have started to see a resurgence of the debate about which approach is best suited to tackle tasks such as capturing, managing and processing accelerated market data, handling pre-trade risk and processing order flow: FPGA or non-FPGA? Hardware or Software?
To readers, we offer a cautionary statement by way of the old saying, “To a hammer, all the world’s a nail”. Simply stated, beware a one-size-fits-all solution, because the chances are that one ‘size’ will not fit you as well as you think, if at all. We therefore encourage anyone considering moving to new technologies or evaluating new trading infrastructure to ask a number of questions. And we hope that through these discussion pieces, we will equip you with some insights to help you through your evaluations.
The debate revisited:
In deconstructing the debate of hardware vs. software, it is abundantly clear that each approach has its own advantages. Used in the right context, each tool can be suitable for a part of the job. And when applied in the right combination, a truly powerful, scalable, cost-effective and more easily supported solution can be achieved. But the first thing any prospective client should do with a particular technology provider is clearly define what the “job” is, i.e. what are the current and prospective use cases?
This extends far beyond the most commonly discussed topics of performance and technology platform preference. For the majority of use cases, a much wider range of issues needs to be considered. How is the solution to be integrated with other trading applications? What are the required message formats and data mappings? How might the solution need to be expanded into other asset classes or regions? What kind of footprint will be required in the data centre? How will it be supported and maintained on an ongoing basis? What are the change management disciplines and what is the product roadmap? And last but not least, what are the upfront and ongoing costs that need to be factored into the TCO of the solution?
Each underlying technology has its own strengths and weaknesses, so we encourage anyone considering which way to go to ask the difficult questions and dig into the detail early. In our experience this is the only way to determine which technology providers should be on a given short list for further consideration.
Where do we stand?
Having served many latency-sensitive clients as well as large global enterprises since our inception, Celoxica has years of experience in dealing with the hardware vs. software debate. And our position on which solution is most appropriate is clear and unequivocal: “it depends”. Where reduced data centre footprint, seamless integration with other applications, scalability, deterministic performance and ultra-low latency are key considerations, we always advocate either the use of a hardware-based (FPGA) approach or a combined hardware and software deployment via a “Hybrid” model. The ability of an FPGA to manage data packets and accomplish massive parallel processing with unrivalled deterministic throughput capacity, combined, when necessary, with software’s ability to perform a wide range of other relevant functions – which may be tailored to a particular firm and which can be run on increasingly powerful microprocessors and spread across multiple cores – offers clients a true best-of-breed solution. Why not take advantage of both if that is what the situation calls for?
It really is about using the best tools for the job. Clients should not have to worry about the hardware/software argument when working with a vendor who is expert in both.
This hybrid approach is one that is not often discussed, but with recent advances in technology, we expect that this will change. So in this series, we will delve deeper into the relative merits of hardware and software in an accelerated trading infrastructure, looking at specific topics such as why determinism and predictability are important; what the issues around scale are; what the true TCO of each approach is; exploding some myths around the “difficulties” of working with FPGAs; and much more.
Please check back soon for the next instalment.
Read the original post on The Trading Mesh.