A tsunami of transactions

Date: 14/07/2006

Peter Bennett

A tsunami of transactions is on the horizon that threatens to deluge securities markets and sweep away fragile infrastructures.

The causes of the disruption are as follows:

  • Lower transaction costs enabled by increased liquidity provider automation
  • Increased competition amongst liquidity providers, multiplication of trading venues, and fragmentation of liquidity pools
  • Direct electronic market access
  • Explosive growth in hedge funds
  • Rapid growth in black box and algorithmic trading

Already the Tokyo Stock Exchange has suffered an embarrassing breakdown of its systems, occasioned by transaction overload. It will not be alone.

This article reviews the dynamics and scale of the impending deluge of transactions and points to solutions for coping with it.

Some definitions

Liquidity provider
Any service provider, whether an exchange, ECN, MTF, market maker, specialist, systematic internaliser, etc., that provides order matching services in any instrument class.

Black box trading system
An automated system that applies statistical techniques to market data to divine trading opportunities expressed as buy and sell signals.

Algorithmic trading system
An automated trading system that takes buy and sell orders and seeks to provide optimal execution across the multiple trading venues operated by liquidity providers.

The dynamics of the situation

Buy and sell side black box trading systems consume high-frequency market data and subject it to statistical analysis to find arbitrage or other trading opportunities, often measured in fractions of a basis point (the basis spread). The systems emit buy and sell signals that need to be executed with minimal market impact, and as quickly as possible to lock in what may be an ethereal trading opportunity. The trading positions sought are large and leveraged to maximise the opportunity for profit.
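As a toy illustration of the kind of signal generation described above (the instruments, window length and threshold are illustrative assumptions, not drawn from the article), a black box might watch the spread between two related instruments and fire when it deviates sharply from its rolling mean:

```python
# Toy "black box" signal: track the spread between two related
# instruments and emit a signal when it strays more than a threshold
# (in standard deviations) from its rolling mean. All parameters are
# illustrative assumptions.
from collections import deque
from statistics import mean, stdev

WINDOW = 60        # ticks of spread history to keep
THRESHOLD = 2.0    # z-score at which a signal fires

history = deque(maxlen=WINDOW)

def on_tick(price_a: float, price_b: float):
    """Consume one market-data tick; return 'BUY', 'SELL' or None."""
    spread = price_a - price_b
    signal = None
    if len(history) == WINDOW:
        mu, sigma = mean(history), stdev(history)
        z = (spread - mu) / sigma if sigma else 0.0
        if z > THRESHOLD:
            signal = "SELL"   # spread rich: sell A, buy B
        elif z < -THRESHOLD:
            signal = "BUY"    # spread cheap: buy A, sell B
    history.append(spread)
    return signal
```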

The buy and sell orders are entered into algorithmic trading systems designed to achieve optimal execution. To reduce market impact costs, large orders are typically broken down into small parcels and sprayed to all available liquidity pools. Orders that are not immediately executed may be cancelled or replaced.
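A minimal sketch of this slicing behaviour follows; the venue list and parcel size are hypothetical:

```python
# Toy sketch of algorithmic order slicing: a large parent order is cut
# into small parcels and sprayed round-robin across the available
# liquidity pools. Venue names and parcel size are illustrative.
VENUES = ["NYSE", "Nasdaq", "ECN-A", "ECN-B"]   # hypothetical pool list
PARCEL_SIZE = 200                                # shares per child order

def slice_order(side: str, symbol: str, quantity: int):
    """Split a parent order into child orders spread across venues."""
    children = []
    venue_idx = 0
    remaining = quantity
    while remaining > 0:
        qty = min(PARCEL_SIZE, remaining)
        children.append({"side": side, "symbol": symbol, "qty": qty,
                         "venue": VENUES[venue_idx % len(VENUES)]})
        remaining -= qty
        venue_idx += 1
    return children

# A 100,000-share parent order becomes 500 child orders of 200 shares,
# each of which may later be cancelled or replaced if it does not fill -
# hence the rising order-to-trade ratio discussed below.
print(len(slice_order("BUY", "XYZ", 100_000)))   # 500
```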

Black box trading has the effect of increasing overall trading volume as smaller trading opportunities are exploited. Liquidity providers compete to reduce market friction by improving liquidity and lowering transaction costs. This in turn increases the trading opportunities available to black box traders in a virtuous circle - or a vicious circle if the liquidity provider has trading capacity limitations!

Algorithmic trading increases the transaction rates that need to be handled by liquidity providers. Two drivers are involved: algorithmic trading drives down shares per trade, and drives up the order-to-trade ratio.

This dual effect can dramatically increase the transaction rates to be handled by liquidity provider trading engines, and also increases market data rates. Increased market data rates, in turn, put pressure on distribution infrastructure, and require increased analytics performance within black box and algorithmic trading systems. As the choke-points in the infrastructure chain are progressively relieved, the whole system speeds up and is effectively regulated by its lowest-performance link.

There is also positive feedback in the system. As liquidity providers lower transaction costs and increase liquidity, there are more opportunities for black box trading. Whenever there is positive feedback in a system, expect disruptive non-linear growth.

A successful liquidity provider therefore needs to ensure that its trading engines have the capacity and scalability to meet the new, disruptive demands.

[Figure: Overview of the automated trading process]

Some metrics

So what is the scale of the problem?

Currently NYSE and Nasdaq average approximately 2bn shares per day, and the trend is upwards. In terms of shares per trade, the trend over the last four years is sharply downwards; the average currently stands at 300 to 400 shares per trade.

In terms of the order-to-trade ratio, there is not a lot of published information on this important parameter. It was, however, reported recently that the order-to-trade ratio on the Toronto Stock Exchange is 10 and trending sharply upwards.1 Other exchanges have confirmed the trend.

Let us calculate an order-of-magnitude peak transaction rate for a large liquidity provider such as NYSE, say in 2010. Here we are focusing on the orders and trades, expressed as peak transactions per second, that need to be handled by the central trading engine. We will need to make some intelligent guesses about key parameters such as the order-to-trade ratio.

Increase in algorithmic trading

A first assumption is that black box and algorithmic trading will grow unabated, as predicted by most informed sources. For example, according to a study from Celent Communications, algorithm-based equity trading is expected to increase from around 14% of overall trade volumes to nearly 25% by 2008. Although traditional buy-side firms have been slow to embrace advanced execution strategies, they are viewed as the largest growth sector by many analysts, including Harrell Smith, head of Celent's securities practice in New York, who authored 'Algorithmic Trading Update 2005: Advanced Execution Goes Mainstream' (May 2005).

Celent predicts that over the next four years, the buy-side's use of algorithms is set to increase at a compound annual growth rate of 28% - against 11% for hedge funds and 7% for sell-side firms. Meanwhile, a report from agency broker ITG Inc. released this March suggested a 144% surge in the use of algorithmic trading by the close of 2006.

Given the above, let us assume an order-to-trade ratio of 50 by 2010, and average shares per trade falling to around 200. Let us also assume that the liquidity provider concerned enjoys a daily share volume of 2bn shares, a trading day of eight hours and an average-to-peak transaction multiplier of five (the industry norm). We are now in a position to calculate the peak transaction rate per second required of the central trading engine as follows:

Daily share volume (shares per day): 2,000,000,000
Shares per trade: 200
Orders per trade: 50
Hours in trading day: 8
Average-to-peak transaction multiplier: 5
Peak transaction rate (transactions per sec): 88,542
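The arithmetic can be checked in a few lines of Python. The 88,542 figure is reproduced if each trade is counted at the engine alongside its 50 orders, i.e. a factor of 51; that reading is an inference from the table, not stated explicitly in it:

```python
# Reproducing the peak-rate arithmetic from the table above.
daily_shares     = 2_000_000_000   # shares per day
shares_per_trade = 200
orders_per_trade = 50
trading_hours    = 8
peak_multiplier  = 5               # average-to-peak multiplier

trades_per_day = daily_shares / shares_per_trade           # 10,000,000
# Each trade reaching the engine carries itself plus ~50 orders.
transactions_per_day = trades_per_day * (orders_per_trade + 1)
average_rate = transactions_per_day / (trading_hours * 3600)
peak_rate = average_rate * peak_multiplier
print(round(peak_rate))   # 88542
```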

We have focused on US equities market statistics, but it should not be assumed that the issues under discussion are limited to equities, or to the US market, alone. Expect the growth trends discussed to occur in all asset classes and on a global basis.

[Charts omitted. Source: Tower Group]

Current solutions

Most large exchanges use 'heavy iron' solutions such as HP NonStop (originally Tandem) to support their central trading systems. About 100 of the top exchanges use these machines because of their fault-tolerant design and, importantly, their ability to scale horizontally. Many applications are programmed in the arcane Transaction Application Language (TAL) to get the most out of what are very expensive machines.

An idea of the transaction performance can be gained from figures published by SIAC, the data processing utility jointly owned by NYSE and the American Stock Exchange, which provides transaction processing for the two exchanges. SIAC's peak capacity for 2004/5 was on the order of 10,000 messages per second, and 24 fully configured HP NonStop machines were deployed to underpin this. This indicates a performance of approximately 500 transactions per second for each HP NonStop machine.

Now consider our peak transaction rate for 2010, calculated above as approximately 88,000 transactions per second. If the existing platforms were to be scaled to meet this requirement, we are talking about some 180 HP NonStop machines!
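A quick back-of-the-envelope check on these figures (note that 10,000/24 is nearer 417 than 500 per machine, so the true count could be higher still):

```python
# Back-of-the-envelope check on the scaling figures quoted above.
siac_peak = 10_000                  # messages per second across the estate
machines = 24
per_machine = siac_peak / machines  # ~417 tps; the article rounds to ~500

target = 88_000                     # projected 2010 peak (transactions/sec)
print(round(target / 500))          # 176 -> "some 180" machines at ~500 tps
print(round(target / per_machine))  # ~211 machines at the stricter ~417 tps
```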

Even taking into account Moore's Law, an alternative solution is clearly required.

Next generation solutions

To address this issue, the London Stock Exchange, for example, with the help of Accenture, is porting its current SETS (Stock Exchange Electronic Trading Service) platform from HP NonStop/COBOL to a platform based on Microsoft .NET/Wintel.

Some would question the use of the .NET framework and Wintel servers for what is effectively a high-performance, enterprise-class OLTP platform.

Accenture is also deploying a tuned version of the Deutsche Börse (HP Alpha-based) platform at the rapidly growing Shanghai Stock Exchange.

OMX is demonstrating a next generation higher performance version of the trading system it originally developed for the Swedish Stock Exchange.

Vendors such as Millennium Information Technologies, Financial Technologies and HCL Technologies have interesting products.

Others are quietly developing next generation platforms using disruptive technologies.

One such firm is Kabira. This small California-based company recently featured in the Gartner survey of application servers as a visionary. The Kabira platform provides very fast, ACID-compliant transaction performance by running transactions to completion in main memory using a highly tuned framework developed by ex-Sybase engineers. Transactions also run in an HA framework that provides synchronous or asynchronous updating of, for example, a shadow order book in a trading engine application, to achieve rapid and seamless failover in the event of hardware failure.
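The primary/shadow pattern can be sketched generically; the following is an illustration of the idea only, not Kabira's actual interface:

```python
# Generic sketch of synchronous shadow replication for an in-memory
# order book: the primary applies each transaction to completion in
# main memory and, before acknowledging, applies it to a shadow copy
# (in practice, on a second host) so failover can resume from an
# identical book. This is not Kabira's actual API.

class OrderBook:
    def __init__(self):
        self.orders = {}                    # order_id -> (side, qty, price)

    def apply(self, txn):
        op, order_id, payload = txn
        if op == "ADD":
            self.orders[order_id] = payload
        elif op == "CANCEL":
            self.orders.pop(order_id, None)

class ReplicatedEngine:
    def __init__(self):
        self.primary = OrderBook()
        self.shadow = OrderBook()           # stand-in for a second host

    def execute(self, txn):
        self.primary.apply(txn)
        self.shadow.apply(txn)              # synchronous update: the shadow
        return "ACK"                        # is current before we acknowledge

engine = ReplicatedEngine()
engine.execute(("ADD", 1, ("BUY", 200, 25.10)))
engine.execute(("CANCEL", 1, None))
```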

Proven in the mobile telecom space for applications like pay-as-you-go billing, the Kabira server provides for application development using platform-independent UML models and a highly abstracted scripting language called Action Language. Code generation to a target machine is automatic, making for very rapid and robust development.

Kabira can demonstrate a trading engine with entry-level performance of 100,000 transactions per second, with linear scaling and full HA, running on low-cost platforms such as Sun Microsystems' recently launched CoolThreads servers.

Another issue facing liquidity providers is that activity during a trading period can be concentrated in one or a few instruments in the wake of news that impacts a particular company, a sector or a new issue. This requires that trading in a particular instrument or group of instruments be dynamically allocated to appropriate processing resources. This plays to a grid of commodity processors with appropriate virtualisation software as a preferred platform. In the limit, one instrument may be allocated the sole use of a processor from a small pool of very high-performance processors reserved for this purpose.
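A minimal sketch of such dynamic allocation follows, with hypothetical thresholds and pool sizes: symbols are hashed across the commodity grid by default, and a 'hot' symbol is pinned to a dedicated high-performance processor:

```python
# Sketch of dynamically allocating instruments to processing resources.
# Symbols normally hash onto a grid of commodity workers; a symbol whose
# order rate crosses a threshold (e.g. after a news event) is pinned to
# a dedicated fast processor. All values are illustrative assumptions.
import zlib

GRID_WORKERS = 16           # commodity processor pool
HOT_THRESHOLD = 50_000      # orders/sec above which a symbol is "hot"
fast_pool = ["fast-0", "fast-1", "fast-2"]   # reserved fast processors
hot_assignments = {}        # symbol -> dedicated fast processor id

def route(symbol: str, observed_rate: float) -> str:
    """Return the processing resource responsible for this symbol."""
    if symbol in hot_assignments:
        return hot_assignments[symbol]
    if observed_rate > HOT_THRESHOLD and fast_pool:
        hot_assignments[symbol] = fast_pool.pop()   # pin to a fast CPU
        return hot_assignments[symbol]
    # Default: deterministic hash onto the commodity grid.
    return f"grid-{zlib.crc32(symbol.encode()) % GRID_WORKERS}"

print(route("ABC", 1_000))    # routed to a grid worker
print(route("XYZ", 80_000))   # promoted to a dedicated processor
```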

Another solution to the trading system capacity issue may be to revert to the call market, or single-priced auction, rather than the continuous auction, to deal more effectively with peak demands. The call auction could be run every ten seconds, for example, with a staggered auction time to counter gaming.
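A single-priced call auction clears at the price that maximises executable volume; a minimal sketch, with illustrative orders:

```python
# Minimal sketch of single-priced call auction clearing: collect limit
# orders over the call interval, then clear at the price maximising
# executable volume. Orders below are illustrative.

def clearing_price(buys, sells):
    """buys/sells: lists of (limit_price, qty). Returns (price, volume)."""
    prices = sorted({p for p, _ in buys + sells})
    best = (None, 0)
    for p in prices:
        demand = sum(q for limit, q in buys if limit >= p)   # willing buyers
        supply = sum(q for limit, q in sells if limit <= p)  # willing sellers
        volume = min(demand, supply)
        if volume > best[1]:
            best = (p, volume)
    return best

buys  = [(25.12, 500), (25.10, 300), (25.08, 200)]
sells = [(25.07, 400), (25.09, 300), (25.11, 300)]
print(clearing_price(buys, sells))   # (25.09, 700)
```

Because the auction batches, the engine processes one clearing computation per interval per instrument rather than one matching cycle per order, which is what blunts the peak-rate problem described above.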

Conclusion

The dynamics of growth of automated trading indicate a disruptive change in transaction volumes and market data. This will require new solutions for provisioning of central trading engines as well as market data distribution and consolidation infrastructures.

Fortunately there are several initiatives under way to meet the new requirements. The liquidity providers who are quickest to exploit the new technologies on offer stand to gain first-mover advantage in the coming maelstrom.

Reference

1. Tom Atkinson, RS Market Regulation Services Inc., Algorithmic Trading Conference, 22 February 2006.