Mondo Visione Worldwide Financial Markets Intelligence

Algorithmic trading's journey, where performance is the destination

Date 14/07/2006

Mary Lou Von Kaenel and Bob Brenman
Jordan & Jordan Industry Consultants

"Algos are the vehicle. Market data is the fuel. Technology built the highway. Is regulation the next driver?"

Algorithmic trading is now firmly entrenched as a key tool for the financial sector. Reacting to performance pressure, buy-side firms are leveraging direct market access and the use of algorithmic trading strategies to decrease transaction costs and increase investment return. Sell-side firms are now competing for clients based on the performance of their algorithmic trading strategies and direct access execution services.

For the buy-side, algorithmic trading, or 'algos', combined with other methods of direct market access (DMA), provides a means to boost performance with reduced transaction costs while increasing control over trade executions. For the sell-side, providing clients with direct market access and algo models has become necessary to attract and maintain a high profile institutional client base. For both, DMA and algo strategies may also become the most efficient ways to fulfil fiduciary responsibilities to achieve 'best execution' and to meet regulatory requirements.

This article describes the basics of algorithmic trading: the challenges, current trends and potential impact of regulatory changes on trading practices. Enabled by technology and vendor tools, algos have evolved in recent years and will become more complex to meet heightened expectations for performance.

Background

Algo trading has been on a steady rise over the last two to three years, more so in the United States than in other regions. Current estimates put rules-based trading at 65-75% among the largest buy-side and hedge fund participants in the US, while Europe and Asia lag considerably. Fragmentation of US markets has contributed to the development of the technology and techniques (e.g. algos) needed to navigate the markets, and regulatory initiatives such as Regulation NMS in the US and MiFID in Europe will certainly accelerate adoption.

To some extent, the lack of take-up has been blamed on the technology needed to support algorithmic trading. In the US, broker dealers have encouraged buy-side use of their algorithms by providing access through multiple channels. One approach has been to place a proprietary front-end on buy-side desks to facilitate order flow; but most brokers have worked with vendors who provide order management systems (OMSs) and other applications to give clients access to their proprietary trading strategies.

Only the very largest investment managers, 'quant' shops and hedge funds have designed their own algo models; although they must still rely on sell-side broker dealers to 'sponsor' direct access to the market centres for execution. These buy-side firms and hedge funds believe they have a competitive advantage with in-house expertise in sophisticated analytics and modeling, and access to vast historical research databases, despite the large investment in technology and infrastructure required to manage real-time market data and trading platforms.

Perhaps it is the changing nature of relationships between the buy-side and sell-side broker-dealers that has given the buy-side the confidence to pursue algo trading and direct market access. Buy- and sell-side alike are driven by the need to reduce costs and increase productivity. Brokers have responded with unbundled services and tiered commission rates and the buy-side is seizing the opportunities.

Particularly when trading well-known, highly liquid securities, buy-side traders have become comfortable utilising certain algo strategies. Automated trading methods not only slash commission rates (commissions may be as little as ½ to 1 cent per share) but also free buy-side traders to focus on larger or more difficult trades. For orders where execution is only slightly more difficult, traders may consider direct market access as another low-cost option; the most difficult trading scenarios, however, are typically reserved for brokers to handle, and of course higher commission charges apply (full-service transactions, where fees often range between five and six cents per share).

In addition to the explicit cost savings, the buy-side also sees intangible benefit in the form of reduced slippage or market impact. Seen as both a liquidity and a productivity tool, algo trading also supports better alignment between strategy and execution. Today, one of the first decisions a buy-side trader must make is to determine the best approach to order placement and execution, whether it be manual or electronic; and if the latter, the trader must select the most appropriate model.

From the sell-side perspective, the business dynamics have changed. In addition to accommodating their clients' demand for direct market access and self-directed orders, brokers have enjoyed efficiency increases with a no/low touch solution for the straightforward orders. This justifies reduced commission rates and also frees traders to focus on executing the more difficult trades.

Initially, only a handful of the bulge bracket firms utilised algo models; but today, a far greater number of firms make algo models available to their clients. Agency-only brokers view algos as value-add product offerings needed to attract institutional clients. Brokers are now developing rules-based models and algo strategies to target specific client segments, focusing on the needs of large buy-side firms looking to take a more directive approach to order flow, as well as smaller firms that may be concerned with trading in line with a benchmark.

Even as the demand for execution-only services has increased competition among sell-side providers, the barriers to entry have risen with the required analytics expertise and technology investment. Some find the investment beyond their reach and are rethinking their business models with a view toward outsourcing execution services. Others have recently entered the market with highly complex 'black box' product offerings. Users should be cautioned, however, against models that cannot be reasonably explained because of their complexity or proprietary nature; traders should insist on understanding the inner workings and testing the reliability of a model before accepting its results.

The past

Algorithms have evolved over the years, and no doubt will continue to advance as strategies are modeled to suit a wide variety of circumstances including different market venues, instrument types (e.g., small, mid and large cap stocks, futures, bonds), trading volumes, time of day and tolerance for risk. The earliest models, many of which are still in use today, are basically enhanced direct market access strategies such as smart order routing and time slicing. Their primary task is to break large orders into many pieces to be executed on multiple markets either simultaneously, or at predefined intervals throughout the day.
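As a rough illustration of the time-slicing approach described above, the following Python sketch splits a parent order into equal child slices submitted at fixed intervals across the trading day. It is purely illustrative (real implementations randomise slice size and timing to avoid detection), and the function and parameter names are not from any actual trading system.

```python
def time_slice(total_qty, interval_minutes, session_minutes=390):
    """Split a parent order into equal child slices submitted at fixed
    intervals across the session (a simple time-slicing strategy; real
    implementations randomise size and timing to avoid being gamed)."""
    n_slices = session_minutes // interval_minutes
    base = total_qty // n_slices
    remainder = total_qty - base * n_slices
    # Spread the leftover shares across the earliest slices
    return [base + (1 if i < remainder else 0) for i in range(n_slices)]

# 100,000 shares sliced every 30 minutes over a 6.5-hour session
slices = time_slice(100_000, interval_minutes=30)
```

Each slice would then be worked as a market or limit order in its interval, which is exactly the "predefined intervals" behaviour described above.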

Market fragmentation and the effects of decimalisation encouraged new development and furthered the use of algos to find liquidity and test market depth. But as many of the less sophisticated models have been 'reverse-engineered' by market players, orders that are placed in a simple, methodical way are easily detectable and subject to gaming, defeating the purpose of the algo strategy. The goal is to utilise algorithms that leave no footprints and therefore minimise market impact.

The present

While not highly sophisticated, the two most widely used models today are VWAP (volume-weighted average price) and IS, implementation shortfall (often referenced as 'arrival price' or 'decision price'). These require a small amount of quantitative input, and are designed to minimise risk by performing executions against benchmarks. Successful performance of an algorithmic model is typically measured in terms of the average variance, or 'slippage', against a given benchmark.

VWAP is the most commonly used algo, as it provides a fair representation of prices throughout the trading period; but it is inherently an 'at market' strategy. Other intra-day benchmarks that are often used in similar strategies include TWAP (time-weighted average price), and the average of open, high, low, and close.
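The two intraday benchmarks can be computed straightforwardly. The sketch below is illustrative rather than any vendor's implementation: it derives VWAP from a list of (price, volume) trade prints and TWAP from prices sampled at equal time intervals.

```python
def vwap(trades):
    """Volume-weighted average price over (price, volume) trade prints."""
    notional = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return notional / total_volume

def twap(prices):
    """Time-weighted average price over prices sampled at equal intervals."""
    return sum(prices) / len(prices)

trades = [(10.00, 500), (10.10, 1000), (10.05, 500)]
print(round(vwap(trades), 4))  # 10.0625
```

A VWAP algo attempts to schedule its own executions so that the order's average fill price tracks this benchmark over the trading period.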

Implementation shortfall is rapidly emerging as a preferred strategy because it targets a specific pre-trade price (the arrival price, defined as the mid-price at the time the order was received, or the decision price, the bid or ask price at the time the trade decision was made), and its performance can therefore be measured.
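A minimal sketch of how this might be measured: the shortfall of the average fill price against the decision (or arrival) price, expressed in basis points. This is a simplified convention for illustration only; commissions and the opportunity cost of unfilled shares are omitted.

```python
def implementation_shortfall(decision_price, fills, side='buy'):
    """Slippage of the average fill price versus the decision price, in
    basis points. Positive means the execution cost money relative to
    the benchmark. Simplified: ignores fees and unfilled shares."""
    notional = sum(price * qty for price, qty in fills)
    total_qty = sum(qty for _, qty in fills)
    avg_fill = notional / total_qty
    sign = 1 if side == 'buy' else -1
    return sign * (avg_fill - decision_price) / decision_price * 10_000

# A buy worked in two fills against a 25.00 decision price
fills = [(25.05, 2_000), (25.08, 3_000)]
print(round(implementation_shortfall(25.00, fills), 1))  # 27.2 bps paid
```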

Brokers offer a wide variety of algos including their own versions of 'stealth' or 'guerilla' algos designed to act in a random or unpredictable manner so the trader's intention remains undetected. In many types of algo trading strategies, the degree of urgency (patient, normal or aggressive) is a function that can be selected by the trader. In others, adjustments to trading behaviour can be automatic with changing market conditions. Depending on the complexity of the algorithm, a series of variables can be input by the trader to optimise the strategy and maintain tighter control of the execution result.

Standard instructions that can be specified with a typical order submission include market or limit order, display size, time increment for new submissions or revisions, etc. For many rules-based models, execution strategies are centered on a benchmark price, with parameters pre-set to react in real-time to the behaviour of the specific stock.

The conditional instructions might cover stock price movements beyond a specified range, spikes in trade volume or changes in volatility profiles, for example. To maximize results, traders most often include additional execution parameters such as: participation rates (target and maximum), price thresholds and limits, aggressive or passive settings, time intervals and start and end times.
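The trader-selectable parameters described above might be represented as a simple parameter object attached to each algo order. The field names below are purely illustrative, not any broker's actual order schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AlgoParams:
    """Hypothetical parameter set a trader might attach to an algo order;
    field names are illustrative, not a real broker schema."""
    urgency: str = 'normal'             # 'patient', 'normal' or 'aggressive'
    target_participation: float = 0.10  # target share of market volume
    max_participation: float = 0.25     # hard ceiling on participation
    limit_price: Optional[float] = None # None means no price limit
    start_time: str = '09:30'
    end_time: str = '16:00'
    display_size: int = 0               # 0 = fully hidden

order = AlgoParams(urgency='aggressive', target_participation=0.15)
```

The algo engine would read these settings at order arrival and adjust its child-order behaviour against them in real time, as the conditional instructions above describe.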

Algo strategies seek to deliver the best possible performance for a given level of risk. This balance determines the overall effectiveness of an algo. Some strategies are designed to be opportunistic while others function according to a pre-defined schedule. The concepts of time, size, spread and volatility are primary considerations in determining the risk inherent in most algo strategies being used today.

The future

With an increased focus on performance, the complexity of algos has evolved to support the need for greater control and more predictable results. A second generation of algo strategies and pre-trade analytics is making use of updated market information to define and adapt trading strategies in a dynamic environment, rather than being based solely on static data.

In addition, extensive modeling, research and back-testing are now underway, looking at broad sections of the market to predict trading patterns and analysing the behaviour of single stocks relative to their peers, across sectors, markets and instruments (e.g. futures, bonds), to better understand the impact of various influences on stock prices. This development is enabled by technology available to capture, store, analyse and manipulate massive amounts of data. Still in their early stages, these 'artificial intelligence' models will become the standard trading algorithms of the future.

Pre-trade analytics

Pre-trade analytics are a critical component of selecting an execution strategy appropriate to the circumstances. Not every trade is well suited for rules-based execution; there are numerous situations that require manual handling. In most cases, brokers provide pre-trade analytics packages to aid the buy-side in making trade strategy and execution decisions. Web-based tools are also available from third parties to support buy-side trading decisions with historical data and real-time market simulations.

Pre-trade analytics provide an opportunity to predict the behaviour of the particular security in the current environment using historical and current market data. Another feature often built into pre-trade analytics is the ability to adjust certain constraints to ensure that execution strategy meets the intended goals.

Pre-trade analytics could be considered the current frontier, as they are not as fully developed as the algorithms themselves. Recent trends are toward tighter integration of market data with decision-making tools in the buy-side environment, incorporating the execution decision process with real-time market information. Used in conjunction with real-time data, analytics provide insight into short-term price movement and market conditions. Trading patterns not only for single stocks can be identified, but also relationships between pairs and other combinations of stocks, which can be helpful in developing additional hedging strategies. While exponentially more complex, these concepts and their application to trading baskets and multi-instrument portfolios are likely subjects for the next wave of algorithm development.

Trade cost analysis

Trade cost analysis (TCA), or the ability to measure effectiveness, is integral to the value proposition and the basic premise of algorithmic strategies. An algo's performance is defined in terms of market impact or slippage from a benchmark. The proper selection of a model, the parameters and constraints applied, the timing, the market venue and many other implicit and exogenous factors contribute to an algo's performance and results. Providing a snapshot of trade-specific and market condition details, execution performance results are typically complete by end-of-trade date or T+1.

TCA reports are available from both the algo providers and independent third party services. Broker-provided information will detail results at the trader and algo level; while independent services, in addition to being unbiased, provide the added advantage of comparing execution quality across brokers. There are services that allow interactive analysis across multiple brokers, or can segment trades by broker, market sector, venue, trader or portfolio manager. Using specific trade-level details and related market data at the time of the trade (e.g. size, volume, volatility) for analysis, users can refine their selections of dealers and use of specific algorithms for future trades based on past results.
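A toy example of the kind of cross-broker comparison such services perform, assuming each trade record carries a benchmark price and an average fill price (the record layout and broker names are hypothetical):

```python
from collections import defaultdict

def slippage_bps(benchmark, avg_fill, side):
    """Signed slippage of the average fill versus a benchmark, in basis
    points; positive means the execution cost money."""
    sign = 1 if side == 'buy' else -1
    return sign * (avg_fill - benchmark) / benchmark * 10_000

def tca_by_broker(trades):
    """Average slippage per broker from (broker, side, benchmark, avg_fill)
    records, as an independent TCA service might aggregate them."""
    buckets = defaultdict(list)
    for broker, side, benchmark, avg_fill in trades:
        buckets[broker].append(slippage_bps(benchmark, avg_fill, side))
    return {broker: sum(v) / len(v) for broker, v in buckets.items()}

report = tca_by_broker([
    ('BrokerA', 'buy', 20.00, 20.02),   # paid 2 cents over benchmark
    ('BrokerA', 'buy', 20.00, 20.00),   # filled at benchmark
    ('BrokerB', 'sell', 20.00, 19.98),  # sold 2 cents under benchmark
])
```

The same aggregation could just as easily be keyed on venue, sector, trader or portfolio manager, which is how the segmented reports described above are built.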

Market data

Market data is 'fuel' for algorithmic models. Market data is embedded in every aspect of the trade decision process, pre-trade through post-trade analysis:

  • Historical, static data is the prerequisite for algorithm development, needed to create the model and to test its efficacy.
  • Market data, sometimes static, sometimes real-time, is critical in pre-trade analytics to provide proper guidance around strategy and parameter selection.
  • Real-time market data is intrinsic to execution, and needed to recalibrate strategies using market feedback during the implementation process.
  • Market data provides a context for TCA and post-trade performance reports.
  • Market data is so integral to the process, that some algo providers who consider low-latency market data to be a competitive advantage have 'co-located' their data servers and algo engines in close proximity to the original data source (e.g. exchange). This may shave milliseconds from the data distribution trip, but efficiency of internal APIs and processing software remains paramount, even with this approach.

Other firms view cleansed data to be the primary requirement for algos to function most reliably. These providers are willing to sacrifice speed for quality and perform data checks before applying it to the model. Thus, there are decisions and trade-offs to be made, but it is clear that market data is the fuel that makes algos run.

While algo models surely won't run out of gas, given the staggering growth in market data traffic and quote messages in recent years, the fuel lines of algo trading could most certainly become clogged. A new data compression standard, the FAST protocol (FIX Adapted for STreaming), launched at the end of 2005. Early adopters of the standard are the exchanges facing increasing demands from their market participants to improve latency and performance under peak conditions. FAST is an enabling technology that can be used in the infrastructure between market venues and market participants to achieve response times measured in milliseconds.

As algo models become increasingly reliant on real-time market data, both the latency of market data and the latency in order routing are continually stressed for improvement. Measurement and monitoring of latencies is becoming paramount, as the reasons for orders not being filled may be delays in the market data that supports the automated decision process, as well as latencies in getting order instructions submitted to the market venue. The infrastructure between matching engines and market participants is fast becoming an area for competitive improvements in performance, with measurements and expectations in milliseconds.

The Financial Information Exchange (FIX) protocol

FIX protocol provides a standard message format in the public domain that has long been used as a communication tool for order placement and execution reports between counterparties. Most OMS vendors and in-house order management systems in the large buy-side firms support the FIX protocol for exchanging information related to trade orders and executions.

While FIX has been leveraged to support algo trading, customisation is required with each implementation. This typically involves two types of modifications to existing order routing systems:

  • Modify FIX gateway to support custom tags, and
  • Modify OMS order entry front-end to incorporate the algorithmic parameters.

The current process that firms use to distribute their algorithms on third party platforms is to develop rules of engagement for using FIX, a document that typically describes the strategies, defines custom tags, and suggests user interface requirements. Unfortunately, every time providers want to add new strategies (or refine existing strategies) they need to invent new tag numbers, update their specifications and wait for the clients and their vendors to code them and release new versions of the interfaces.
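To make the custom-tag mechanism concrete, the sketch below assembles a FIX-style message body containing broker-defined algo tags. The standard tags (35, 55, 54, 38, 40) are real FIX fields; the 5000-range tag numbers are placeholders, since each broker's rules-of-engagement document defines its own custom tags.

```python
def build_fix_message(fields):
    """Assemble a FIX message body from (tag, value) pairs. FIX uses the
    SOH (0x01) byte as the field delimiter on the wire; '|' is used here
    for readability. Length and checksum framing are omitted."""
    SOH = '|'
    return SOH.join(f'{tag}={value}' for tag, value in fields) + SOH

# A new-order message carrying hypothetical broker-defined algo tags
order = build_fix_message([
    (35, 'D'),       # MsgType = NewOrderSingle
    (55, 'IBM'),     # Symbol
    (54, '1'),       # Side = Buy
    (38, 50000),     # OrderQty
    (40, '1'),       # OrdType = Market
    (5001, 'VWAP'),  # custom tag: strategy name (illustrative number)
    (5002, '0.15'),  # custom tag: max participation (illustrative number)
])
print(order)
```

Because tags like 5001 and 5002 are private to one broker, every counterparty's gateway and front-end must be taught to send and display them, which is exactly the integration burden the paragraph above describes.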

This has been a difficult process due to the resources required of both the broker and the vendor to implement algo strategies for each of their buy-side users. As the number of algorithmic providers has grown (currently more than 30), each with multiple strategies (usually 4-10), the implementation task is compounded by the many vendors in the space with a wide range of priorities and capabilities.

To expedite the process, there is a global FIX initiative underway that proposes algorithmic trading extensions to FIX messages to support unlimited algorithmic parameters in a standardised manner. The effort also addresses the vendors' and clients' needs to represent the new features on their front-ends. The Algorithmic Trading Working Group has been formed as part of the Global Technical Committee of FIX Protocol Limited (FPL), and is composed of representatives from the financial community including sell-side, buy-side and financial technology providers.

The group's proposed solutions will bring tremendous benefit to the community of algo providers and users as it will allow for more customisation of broker algorithms while shortening the time-to-market with minimal integration development work required by the OMS vendor or client. The FIX methodology will not only make real-time 'on-the-fly' integration of new/modified algorithms possible in any OMS, but it will also permit web-based deployment of new algorithms directly from brokers to their clients.1

Regulatory impact

In 2005, the US Securities and Exchange Commission passed Regulation NMS (the Rules)2, a comprehensive set of rules designed to strengthen the national market system for US equities. Scheduled for full implementation in mid-2006, some aspects of the regulation will impact firms' direct market access procedures and execution strategies in dealing with all US equity market centres.

Most notably, the Order Protection Rule requires that orders execute against the best bid or offer (i.e., the top-of-book) in the marketplace. Broker-dealers that want to make their own order routing decisions will need to sweep the markets to execute against top-of-book quotes at better prices. (See inset for Regulation NMS background.)
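The sweeping logic can be sketched as a toy allocation of an order across protected quotes, best price first, so that no better-priced quote is traded through. This is a simplification of intermarket-sweep routing under Rule 611; the venue names and sizes are made up.

```python
def sweep_routes(order_qty, protected_quotes):
    """Allocate a buy order across protected top-of-book offers, best
    price first, so no better-priced quote is traded through (a toy
    model of intermarket-sweep routing; ignores fees and latency)."""
    routes = []
    remaining = order_qty
    for venue, price, size in sorted(protected_quotes, key=lambda q: q[1]):
        if remaining <= 0:
            break
        take = min(remaining, size)
        routes.append((venue, price, take))
        remaining -= take
    return routes, remaining

quotes = [('NYSE', 20.01, 300), ('Nasdaq', 20.00, 500), ('ARCA', 20.02, 1000)]
routes, leftover = sweep_routes(1000, quotes)
# Fills 500 @ 20.00 (Nasdaq), then 300 @ 20.01 (NYSE), then 200 @ 20.02 (ARCA)
```

In practice the decision must also account for quote staleness and access fees, which is why low-latency data and latency monitoring matter for compliance.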

Firms that offer DMA and algo trading are readying their systems and services to meet some of the new order routing and compliance standards. For example, new order types just introduced by Nasdaq as a result of Reg NMS include: Intermarket Sweep Order, Price to Comply Order, and Price to Post Order.3

While current algo models and DMA programmes may need to be modified to incorporate new order types and compliant routing decisions, the rules-based nature of algorithmic trading is well-suited to meet Reg NMS compliance requirements. In fact, the regulations could easily advance the use of algos for this purpose, as market-data driven routing employed by algos can be leveraged to meet regulatory compliance obligations. The data that is used to support the trading decision process can be collected, stored and reviewed to demonstrate that a firm's policies and procedures are effective in preventing trade-throughs and achieving best execution.

In order to ensure that routing decisions are made on current market data, low-latency real-time market data and active latency monitoring need to be addressed as part of a firm's policies and procedures. The functionality to locate the best bid and offer in each market and act on it within one second assumes real-time access to the top-of-book BBO (best bid/offer) from the full complement of market centres. The requirement to maintain a low-latency market data environment, with the infrastructure to process data and react in the microsecond timeframe needed to ensure execution quality, is fully consistent with the algo approach. Many are preparing for anticipated increases in quote data and added message traffic per second as a result of automated listed trading and the potential for gaming the new Reg NMS rules as they relate to market data revenue.

While the sell-side has the compliance obligation for routing orders in accordance with these Rules, to the extent that buy-side firms choose to define their own execution strategies and venues, sponsoring sell-side firms need to ensure that their clients understand the Rules and the new obligations associated with US equity order routing and executions. Based on buy-side client trading strategies, their proprietary algo models and direct market access technology may require modifications.

Similar, albeit more expansive, rules must also be considered with the anticipated adoption of MiFID (Markets in Financial Instruments Directive) across Europe. Originally planned for April 2006, implementation has been delayed by the European Commission until November 2007.

MiFID requires firms to establish and implement an order execution policy to allow them to obtain best execution. Much like their US counterparts, European users of algorithmic strategies should be well positioned to comply with certain MiFID requirements.

There are also other aspects of MiFID that could increase market fragmentation and disperse availability of market data. An increase in the use of algos might ease the burden and provide much needed support for regulatory compliance.

Regulation NMS background

On April 6, 2005, the Commission adopted Regulation NMS, a series of initiatives designed to modernise and strengthen the national market system for equity securities.

Components of Reg NMS:

  • Order Protection Rule - provides intermarket price priority for displayed and accessible quotations (Rule 611)
  • Access Rule - addresses access to markets (Rule 610)
  • Sub Penny Rule - establishes minimum pricing increments (Rule 612)
  • Market Data Rules include:
    • Allocation amendment - institutes a new Market Data Revenue Allocation Formula
    • Governance amendment - creates advisory committees
    • Distribution and Display Rules - governing market data (Rules 600, 601 & 603)
Trading centres affected:
  • National securities exchanges: AMEX, BSE, CBOE, CHX, Nasdaq, NSX, NYSE, PHLX and PCX
  • One national securities association: NASD (ADF)
  • Approximately 600 broker-dealers including:
    • Approximately 585 broker-dealers that internalise order flow (based on the number of registered market makers and specialists at year end 2003).
    • ECNs that trade NMS stocks.
    • ATSs that trade NMS stocks.

The road ahead

While algo trading has picked up speed over the past few years, the current regulatory environment combined with demands for performance from buy-side traders should allow algo trading to shift into the fast lane in the coming years. Ever-improving technology as well as standardisation of FIX messages will also move implementation of the new strategies along more quickly.

Paired with advances to come in market data, the algorithmic trading journey has just begun.

References

1 This section has been adapted from FPL Algorithmic Working Group: Algorithmic Trading Integration Work Plan, version 1.0; Authors: John Goeller/Daniel Clayden; published 01/26/2005; www.fixprotocol.org.

2 The full text of the release is available at www.sec.gov/rules/final/34-51808.pdf.

3 For complete details, please refer to the Nasdaq filing available at www.nasdaq.com/about/SR-NASDAQ-2006-001_Rule_Filing.pdf.