Financial services institutions (FSIs) are under mounting pressure to ensure they have appropriate infrastructures in place to handle the throughput of rising market data volumes. This is according to a new report by independent market analyst Datamonitor (DTM.L), “The Growing Significance of Market Data within Financial Markets”. The report investigates the technologies being demanded by the financial markets industry and the strategies firms are employing, or will begin to employ, to cope with new market developments, and reveals that market data volumes in this space are doubling annually. Spend by European and US hedge fund and fund management firms (buy side firms) on front office market data infrastructure is set to reach US$484 million by 2009, while spend by investment banks (sell side firms) will peak at US$1.9 billion. The report points out that with Reg NMS and MiFID to be fully enacted in October and November 2007 respectively, even more data will need to be processed to remain compliant.
“Market data has always been a fundamental element within capital and financial markets,” says Amit Shah, financial services technology analyst with Datamonitor and author of the study. “However, upcoming regulations such as Regulation National Market System (Reg NMS) and the Markets in Financial Instruments Directive (MiFID) will place immense pressure on firms for accuracy and transparency. This has raised the significance of market data and therefore brought the issue back onto the strategic agenda.”
Investment in market data infrastructures will continue to grow
As the global economy has expanded, market data requirements have continued to grow. According to Datamonitor, total European and US financial services industry front office market data infrastructure IT spend currently stands at US$2.1 billion.
The upcoming increase in market data will have profound impacts on market data infrastructures: firms will need more storage, and they will need analytics capable of handling the anticipated rise in volumes. Forthcoming regulations require firms to store data for five years; with volumes doubling annually, the pressure on systems to accommodate that data will be immense. Given this storage pressure, Datamonitor expects IT spend on market data storage to peak in 2008.
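To see why a five-year retention rule combined with annual doubling is so punishing, a back-of-the-envelope calculation helps. The sketch below is illustrative only: the 10 TB starting volume is an invented figure, not one taken from the report.

```python
# Back-of-the-envelope sketch: cumulative storage required under a
# five-year retention rule when annual data volumes double each year.
# The starting volume (10 TB in year 1) is purely illustrative.

RETENTION_YEARS = 5
INITIAL_VOLUME_TB = 10.0  # hypothetical year-1 intake

# Intake for each year, doubling annually.
volumes = [INITIAL_VOLUME_TB * 2 ** year for year in range(8)]

for year, intake in enumerate(volumes, start=1):
    # Only the most recent five years must be kept on hand.
    window = volumes[max(0, year - RETENTION_YEARS):year]
    print(f"Year {year}: intake {intake:,.0f} TB, "
          f"retained {sum(window):,.0f} TB")
```

Because each year's intake doubles, the most recent year alone accounts for roughly half of the retained total, so capacity planning is dominated by the newest data rather than the archive.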
Maintaining data quality is a major hurdle throughout the trading lifecycle
A key element sometimes overlooked is data quality, as many FSIs are absorbed in the search for low latency. A Datamonitor respondent encapsulated this sentiment: “everyone is looking for low latency, but the faster you go, the more opportunities for mistakes there are. If you are taking in as much raw data as possible you have to make sure it is good data”. This means firms should introduce data cleansing solutions, even though cleansing itself adds latency.
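As a concrete illustration of that trade-off, the sketch below shows the kind of sanity checks a cleansing layer might apply to an incoming tick. The field names, the 10% jump threshold and the sample data are all invented for the example, not drawn from the report or any particular vendor's product.

```python
from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float
    size: int

def is_clean(tick: Tick, last_price: float) -> bool:
    """Illustrative sanity checks a cleansing layer might apply.
    Each check costs CPU time on the hot path, which is the
    cleansing-versus-latency tension described above."""
    if tick.price <= 0 or tick.size <= 0:
        return False
    # Reject prices that jump more than 10% from the last good print;
    # the threshold is purely illustrative.
    if last_price and abs(tick.price - last_price) / last_price > 0.10:
        return False
    return True

# Usage: filter a raw feed before it reaches trading models.
last = 100.0
for raw in [Tick("XYZ", 100.5, 200), Tick("XYZ", -1.0, 50), Tick("XYZ", 150.0, 10)]:
    if is_clean(raw, last):
        last = raw.price
        print("accepted", raw)
    else:
        print("rejected", raw)
```

Every check executed on the hot path costs time, which is exactly the tension the respondent describes: the more rigorously the data is validated, the more latency the validation itself introduces.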
Additionally, as low latency solutions become increasingly commoditized, firms will have to find new and innovative ways to mine data to stay ahead of the competition. The quality of data could therefore ultimately become more important than its speed. The key to gaining a competitive advantage will be to optimize speed while leveraging the quality of the data.
Ultimately, to ensure data quality, FSIs need a storage and analytics capability in which consistent market data, reference data and analytics are linked to applications throughout the trading lifecycle.
Market data is the fuel for algorithmic trading models
Algorithmic trading, the use of advanced mathematical models to make transaction decisions in the financial markets, has been on a steady rise over the last few years, more so in the United States than in other regions. Fragmentation of the US markets has contributed to the development of the algorithmic technology and techniques needed to navigate them, and regulatory initiatives such as Regulation NMS in the US and MiFID in Europe will certainly accelerate adoption further.
In line with this steady growth, market data continues to be the 'fuel' for algorithmic models: it is embedded in every aspect of the trade decision process, from pre-trade through to post-trade analysis.
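As a toy illustration of market data 'fuelling' a model, the sketch below feeds a price stream into a simple moving-average crossover signal. The prices, window lengths and signal logic are invented for the example; production algorithmic models are vastly more sophisticated.

```python
# Toy illustration of market data driving an algorithmic model:
# a simple moving-average crossover signal. All figures are invented
# for the example and do not come from the report.

def crossover_signal(prices, fast=3, slow=5):
    """Return 'buy', 'sell', or 'hold' based on the latest
    fast/slow moving-average relationship."""
    if len(prices) < slow:
        return "hold"  # not enough data yet
    fast_ma = sum(prices[-fast:]) / fast
    slow_ma = sum(prices[-slow:]) / slow
    if fast_ma > slow_ma:
        return "buy"
    if fast_ma < slow_ma:
        return "sell"
    return "hold"

# Usage: every incoming tick updates the history and re-evaluates
# the signal, showing how the model is only as good as its data feed.
ticks = [100.0, 100.2, 100.1, 100.4, 100.6, 100.3, 99.9]
history = []
for price in ticks:
    history.append(price)
    print(f"price {price:6.2f} -> {crossover_signal(history)}")
```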
Shah concludes: “As automated trading becomes increasingly cross-asset, so too must the storage and analytics platforms evolve to support cross-asset, next generation trading. Although some within the industry would argue this is straightforward, it is a complex process to provide these high performance tools and meet the challenges of integrating the data”.