The longest sneezing attack, according to official records, lasted 993 days. The same figure -- 993 -- is the number of millions of US dollars revenue that exchanges made in the year 2000 from selling their real-time data. A figure not to be sneezed at -- but is it enough?
Compare this with the revenues of data vendors such as Reuters, who take real-time data from the exchanges, repackage it and distribute it to the exchanges' customers. In 2000 Reuters generated revenues of about USD2,500m, over twice that made by the exchanges, from the same raw materials. Interestingly, 2,500 is the approximate number of four-letter words in the English language. Should executives at the exchanges be uttering expletives in the realisation of the value that they are giving away? Or should they listen to Bernie Weinstein, CEO of the data vendor ILX, who says that "the exchanges should grow the trees and the data vendors should make the furniture".
In their former mutually owned days, exchanges were less concerned about making money and more concerned about the quality of order execution and supporting services. Now, as they are increasingly owned by shareholders who are not active participants at the exchange, financial performance takes top priority. But the opportunities to improve financial performance are limited. The number one revenue stream for most exchanges -- transaction fees -- is being squeezed as competing venues fight for market share on price, and as their eventual customers -- the investing sector -- become more cost conscious.
What about the business that, for many exchanges, is the number two revenue stream: so called 'Information Products'? To date, exchanges have been extremely innovative in packaging and selling information products containing real-time or reference market data. But one apparently easy target remains. Exchanges could make a move for the large slice of market data revenue currently captured by the data vendors -- Reuters, Bloomberg, Thomson Financial and their peers -- by disintermediating these data vendors and selling their real-time data direct to their customers.
In this article we examine why the exchanges have not done this before, and why it may be possible today as a result of new technology. Finally, we look at whether the exchanges' customers -- the banks and brokerages that need exchange data -- would allow this to happen. Will it be milk and honey for the exchanges, and curtains for the data vendors? Read on and find out.
You Never Give Me Your Money
Data vendors capture most of the value from real-time exchange data
The world market for exchange-sourced information products is, at a conservative estimate, USD6bn. Of this revenue, only USD1bn is captured by the exchanges from where the data comes. The remaining value is captured by data vendors who clean, consolidate and repackage the data for their customers.
The USD1bn of exchange revenues take the form of exchange fees, charged by the exchanges and usually collected by the data vendors. An exchange fee is a personal license to use specific data from an exchange. A site fee may also be levied. Payment of an exchange fee will not in itself provide an end user with anything. The data vendors collect data from exchanges, mix in news and value added data (e.g. VWAP, intraday highs and lows) and deliver the data to end users, often via workstation products specifically designed for the display of financial information. The USD5bn of data vendor revenues represent the payment by end users for this service.
Usually, customers buy a market data package from a data vendor, and then pay exchange fees for the selection of exchanges that each individual user needs. Data from the remaining exchanges will usually be available delayed by 15 or 20 minutes, thereby avoiding the need to pay for exchange fees from those sources.
We can see how value is delivered from exchanges to end users by looking at two value chains, both showing the delivery of data to an individual user, rather than to a collection of users in a firm. The first value chain is for the delivery of real-time data to a trader on a single exchange. We assume that this user pays only one exchange fee -- in this example to the London Stock Exchange -- and receives the remaining data delayed, in order to avoid fees.
Figure 1: Domestic data value chain
It is difficult to generalise about exchange data value chains because of the variety of fee rates and business models used. The above model, based on London Stock Exchange's published tariffs for Level 1 data, is fairly representative of industry norms. In this example the exchange levies two charges: an end user exchange fee (USD50 per month per user) and a wholesale fee for the vendor (a total of USD34,000 per year, but negligible when split across thousands of subscribers). An individual end user would pay a total of about USD300 per month to a data vendor for access to the same data within an open systems environment (i.e. without any display application). This consists of the USD50 exchange fee that the vendor collects on behalf of the exchange, and USD250 for a data vendor 'domestic data only' package (excluding original editorial content, which would need to be purchased separately). The mark-up on the raw data is 500%: the end user pays six times the exchange fee.
We can also look at a more complex example. Below is the same value chain, but illustrated using the example of an international cash equities trader. This trader needs full depth of real-time data from Nasdaq, NYSE, LSE, Euronext and Deutsche Börse (Xetra).
Figure 2: International data value chain
In this example, the total fees for Level 2 data from the five exchanges are USD330 per month. The vendor would charge a total of USD780, consisting of about USD450 for an 'international' data package excluding news, plus the USD330 that is passed directly to the exchanges. The vendor mark-up in this example is much lower than in the domestic example: about 135%, or 2.4 times.
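The arithmetic behind both value chains can be sketched in a few lines. The figures are the published tariffs quoted above; the function name is, of course, just for illustration.

```python
def vendor_markup(exchange_fees, vendor_charge):
    """Return (mark-up on the raw data as a %, total paid as a multiple of fees)."""
    total = exchange_fees + vendor_charge
    markup_pct = 100 * vendor_charge / exchange_fees
    multiple = total / exchange_fees
    return markup_pct, multiple

# Domestic example: USD50/month exchange fee, USD250/month vendor package.
dom_markup, dom_multiple = vendor_markup(50, 250)     # 500%, 6.0 times

# International example: USD330/month in exchange fees, USD450 vendor package.
intl_markup, intl_multiple = vendor_markup(330, 450)  # roughly 135%, 2.4 times
```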
Not all exchange data is worth money. Exchanges that have attracted liquidity in major stocks can charge a premium for their prices, because these prices effectively define the market price for the instruments. Other exchanges, with less liquidity or with less popular stocks, are forced to give their data away for free, as a form of advertising for business. Over time, if they succeed in attracting liquidity away from their competitors, they may be able to charge for their data.
As an aside, it is interesting to note the use of market data fees by US stock markets as a weapon to attract business. In February 2002, Island defected from Nasdaq to the Cincinnati Stock Exchange, taking with it 20% of Nasdaq's trading volume. This move was triggered by the promise of a larger share of market data revenues -- 75% compared to the 60% that Nasdaq shared with its traders. Nasdaq responded by increasing its handout to 80% of market data revenues, but only for firms that report at least 95% of their trades through Nasdaq.
From Me To You
Data vendors play an essential role in the distribution of exchange data
Exchanges broadcast their data to the data vendors, who as we have seen will package it up with other data and charge a premium of between 150% and 500%. The data from exchanges will be in a proprietary format, so one of the jobs of the data vendor is to translate the format into a standard that will be understood by their customers. Many exchanges also offer their feed to end users. Customers that take this exchange feed will need to develop their own 'feed handler' to translate the feed contents into a meaningful format.
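The job of a feed handler can be sketched as follows. The pipe-delimited wire format here is invented for illustration -- real exchange feeds each use their own proprietary layout, which is precisely the problem -- but the shape of the translation is representative.

```python
def handle_message(raw):
    """Translate one (hypothetical) proprietary trade message into a
    normalised record that downstream applications understand."""
    # Invented wire format: MSGTYPE|SYMBOL|PRICE|SIZE|TIMESTAMP
    msg_type, symbol, price, size, timestamp = raw.split("|")
    if msg_type != "T":  # only trade messages in this sketch
        raise ValueError("unsupported message type: " + msg_type)
    return {
        "type": "trade",
        "symbol": symbol,
        "price": float(price),
        "size": int(size),
        "timestamp": timestamp,
    }

record = handle_message("T|VOD.L|102.5|10000|09:30:01.250")
```

A firm taking feeds from five exchanges today would need five such translators, one per proprietary format; a standard format would reduce this to one.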
Figure 3 shows the consolidation of data from multiple exchanges, and the delivery of two alternative offerings: a 'closed' solution for screen users, accessing data on a workstation only, and an 'open' solution feeding into a customer's market data platform such as Triarch. This platform can collect data from multiple sources (e.g. direct from exchanges, or from alternative vendors) and can distribute data to applications and workstations on the customer site.
Member firms at an exchange will usually have electronic access to the market through a trading interface. This may be an in-house development (written to the exchange's trading API) or may be provided by an independent software vendor (ISV) with connections to multiple exchanges, such as GL Trade. The trading interface will usually provide the same or better market data than the exchange's market data feed. However, many firms will take the market data feed too -- effectively paying for their own data -- because it is easier to integrate into their market data distribution platform.
Revolution
New technology could help exchanges to capture more value
If there is value for exchanges to capture from data vendors, why aren't they doing it already? In a sense they are; as explained earlier, many exchanges offer a raw feed as well as providing data to vendors to redistribute. However, few customers choose this option because of the technical difficulty and the cost. The technical difficulty arises because each exchange feed uses different messaging formats and telecommunications protocols. Customers would need to develop a feed handler for each exchange they need to access. The costs arise because customers would need to connect directly to each exchange -- usually an international, high capacity dedicated line -- rather than have a single connection to a local data centre providing all their data needs. So, despite the opportunity to avoid the data vendor mark up on exchange data, and to get their data a few milliseconds faster, few customers choose this option.
Three technological developments change the picture. The first is XML, or at least a market data specific variant such as the Market Data Definition Language (MDDL). MDDL -- or any other messaging format that the exchanges could agree on -- would solve one of the problems that data vendors currently fix: the need to convert a range of proprietary, exchange specific data formats into a standard format. The second development is the emergence of secure, reliable IP data networks such as Radianz, Savvis and (at the time of writing) Global Crossing. These networks would allow exchanges to connect to their customers without needing the help of a data vendor. In fact, LIFFE already uses a global network managed by Equant (who own half of Radianz) to provide worldwide access to their trading system, though not at present for their market data. The third development is the availability of a technology to send 'streaming' data -- for example unsolicited quote and trade messages -- over an IP network. This technology has been developed by the data vendors themselves, and by development houses such as Caplin Systems, who offer the Real Time Trading Protocol (RTTP).
In summary, the combination of a standard messaging format (e.g. MDDL) and a secure, reliable, worldwide data network (e.g. Radianz) with a data streaming capability (e.g. RTTP) would allow exchanges to publish their market data direct to their customers. If they did this, the distribution model would change from the one shown in Figure 3, with the data vendors collecting and distributing data, to the one shown in Figure 4.
Under the Direct Delivery model, data from multiple exchanges is consolidated on a single network and delivered to end user workstations (with a feed handler included in the workstation) or to market data platforms. The workstations or other end user applications would be responsible for bringing a common look and order to the data, perhaps applying market rules such as the calculation of highs, lows and end of day prices. Unlike today, when a feed handler is required for each exchange feed, a single feed handler would be needed to capture data onto a market data platform.
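As a sketch of the kind of 'market rule' a workstation or application would take on under Direct Delivery, here is a minimal intraday high/low tracker (the class and field names are illustrative):

```python
class IntradayStats:
    """Maintain intraday high, low and last price for one instrument --
    a 'market rule' the vendor applies today, moved to the end user."""

    def __init__(self):
        self.high = None
        self.low = None
        self.last = None

    def on_trade(self, price):
        self.last = price
        self.high = price if self.high is None else max(self.high, price)
        self.low = price if self.low is None else min(self.low, price)

stats = IntradayStats()
for tick in [102.5, 103.0, 101.8, 102.2]:
    stats.on_trade(tick)
# After these ticks: high 103.0, low 101.8, last 102.2
```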
The next question to address is this: assuming this delivery model could be created, how could exchanges use it to their commercial advantage? Without decisive action, exchanges would see no change to the fees they collect, whilst the banks and brokerages that use the data would see significant savings in their market data bills, even assuming that they pay separately for the data delivery network. Exchanges could reasonably expect a share of the rewards, but how could this be achieved? A possible pricing strategy would be to offer two feeds: the existing feed in an unchanged proprietary format, as used by the data vendors and a few end users, could continue to be offered at the same price as today. A new Direct Delivery feed over a global network, using XML or another standard protocol, would be offered at a premium. Differential pricing between data vendors and end users for the same feed, which is illegal in the US, would not be needed.
An alternative strategy would be to arrange revenue sharing with the provider of the distribution network. Exchange fees would remain unchanged in this scenario. End users would buy a combined package of exchange data and connectivity. Each exchange would need to negotiate their revenue share separately with the network provider. Data vendors could continue to access exchange data, either by direct connection as now or from the network provider.
Please Please Me
End users of real-time data are not happy
Perhaps the strongest argument for a move to the Direct Delivery model is that the banks and brokerages, which are the ultimate customers for all this data, want a change. This is not just a cost issue, although the IT and market data budgets for firms using exchange data are under intense pressure. User firms are also concerned about data vendors forcing costs on them every time the vendors choose to change their technology. According to one insider, the banks are looking to break away from single supplier architectures in order to avoid this in the future. They will do this by exploiting standards. The largest user firms also see a great advantage in taking data direct from exchanges. Firstly, it arrives quicker, because it avoids the complex processing that the data vendors apply. Secondly, it is also more 'faithful': it does not suffer from the application of market rules that the user firms feel are their responsibility, and not the data vendors'. After all, they are the experts in the markets.
Of course, the buyers of information products in the banks and brokerages must be conservative and cautious. They must protect existing investments in infrastructure and applications, and must absolutely avoid risk. Despite pressures to reduce costs, they are resistant to fixing things that 'ain't broke'. So a strong theoretical argument for a new data delivery model would not be enough to move user firms to it. However, a number of changes will be forced on them over the next few years -- notably the introduction of T+1 and STP, and the switch by Reuters from Triarch to RMDS -- and these changes will encourage user firms to consider their options. The timing may be right for a new delivery model.
Hello Goodbye
Data vendor revenues may get squeezed
So far we have established that exchanges need to increase their information product revenues, and that new technologies could allow them to do this by selling real-time data direct to end users. What would these changes mean for data vendors?
We do not believe that all user firms would switch to Direct Delivery. The largest firms, currently taking 'open' solutions (i.e. data feeds serving market data platforms) from the vendors, would be prepared to take the technology risk, and would be able to integrate the new direct feed into their systems. Smaller firms, which currently take closed workstation solutions from the data vendors, would continue to prefer these simpler services with lower IT overheads. Direct Delivery would in most cases not be appropriate for these firms. But even some of these smaller firms could be served by simpler Web based offerings from individual exchanges. For example, Deutsche Börse announced in early 2002 that it was developing a web-delivered information product targeted at 20,000 small, domestic savings banks that could not afford Reuters or Bloomberg. Of course, single exchange products are only appropriate for firms that are primarily interested in just one exchange.
Figure 5: The Vendor Squeeze
Given these changes, we would expect to see the role of data vendors squeezed into serving small firms with an interest in data from multiple exchanges. How serious would this be for the vendors? They would lose much of the USD5bn revenue that they get for exchange-sourced information products.
Let It Be
Exchanges are not grasping the opportunity
Given the opportunity, you would expect to see exchanges rushing to agree a standard data delivery format, and to form an alliance with a global network provider for distribution of this data. The reality is far less dramatic. A few exchanges have made small moves towards a Direct Delivery model, but as independent initiatives, so losing much of the power of the model. Deutsche Börse is one of the most advanced exchanges in its thinking on information products. In February 2002, it announced an intention to double information product revenues. Central to these plans is the Consolidated Exchange Feed (CEF), which integrates data from the Xetra cash trading platform and from the Eurex derivatives market. CEF is designed to allow the flexible integration of additional data sources and, most significantly, supports direct connections by end users and the administration of those connections. CEF will be the basis of the Web-delivered product mentioned earlier, which is designed to support 20,000 small, price sensitive customers directly from the exchange.
Other exchanges offer products direct to their customers over the Internet. Easdaqlive, a browser-based market data system now owned by Nasdaq Europe, uses Caplin's Real Time Trading Protocol to display data from Nasdaq Europe and from 15 other stock exchanges and other sources. NYSE, Nasdaq and the International Petroleum Exchange have similar Internet-based products. The CME described their Internet-delivered product, launched in February 2002, as a response to customers demanding to purchase real-time quotes directly from the exchange. They now offer real-time streaming quotes via the CME Web site, where users can register and purchase various quote packages.
But even those exchanges making small moves towards a Direct Delivery model claim they have no interest in supplanting the vendors in their role as primary distributors and administrators for information products. They see the data vendors as their partners in the business of distributing data. The exchanges are experts in running a market, whilst the vendors are experts in distribution and administration of information products.
Why aren't exchanges jumping at this opportunity? In preparing for this article, the largest exchanges across the world were polled for their views on selling their data direct. The following quotes are representative of what they said:
"We don't have a marketing department. Our market data department is just two guys." -- UK exchange.
"We're not a technology company. We cannot develop and support feeds." -- UK exchange.
"We are in the business of bulk data dissemination. We cannot administer thousands of individual customers." -- European exchange.
"We provide raw data. Quote vendors add value to the data." -- European exchange.
"Customers want multiple exchanges, not just us." -- European exchange.
"Clients don't want direct delivery." -- European exchange.
"Exchange demutulisation will make it hard for exchanges to agree a standard." -- UK exchange.
"We cannot provide data direct to most of our customers because the volume of data is too large. We need data vendors to filter the data." -- US exchange.
In the remaining sections we explore these objections, and the possible objections of the end user firms.
With A Little Help From My Friends
Exchanges must collaborate on data standards
For the Direct Delivery vision to become reality, exchanges must publish their data in a standard format. Although exchanges are in many cases direct competitors, this is not a reason to avoid working together on standards, because a standard format will help them all. The nature of the format matters less than the fact that it is a standard. With the fanfare surrounding XML, there is much more energy behind standards initiatives these days. Two initiatives in particular may offer a way forward.
The first is the Market Data Definition Language (MDDL). MDDL is a variant of XML, describing standard formats and definitions for market data. MDDL is being driven by the data vendors (Reuters, Bloomberg, Dow Jones, S&P Comstock and Sungard) and the major buy-side and sell-side user firms. Interestingly, only two exchanges -- NYSE and Nasdaq -- are actively involved in the organisation. The MDDL mission statement explains that, although MDDL has been designed initially for snapshot and time series applications, it can be extended to historical, streaming and interpretative and vendor-specific data models.
MDDL also addresses the integration of feeds. Quoting from the mission statement:
"From the user perspective, the goal of MDDL is to enable users to integrate data from multiple sources by standardising both the input feeds used for data warehousing (i.e. define what's being provided by vendors) and the output methods by which client applications request the data (i.e. ensure compatibility on how to get data in and out of applications)."
Although this does not mention integrating data from exchanges, it is no different in principle from integrating multiple vendor feeds.
The second initiative is the convergence of two mature and well-adopted messaging formats. FIX was originally designed for order routing and order execution. In 2000, it was extended to support market data, and has since been adopted by NYSE, Euronext and other exchanges for distributing data (admittedly for niche products, rather than their core real-time products). SWIFT supports a range of message formats, and a messaging network, for post-trade messages. In 2001, FIX and SWIFT agreed to converge their respective messaging protocols into ISO 15022 XML, which they anticipate will become the industry standard. Although this broad standard does not yet cover market data, it will clearly exert influence on associated standards. After all, post-trade information, pre-trade information and market data share a great deal of content. It doesn't make sense to define them multiple times.
Neither MDDL nor FIX is immediately suitable as a format for exchanges to deliver data direct to their clients. One of the reasons is the size of individual messages. The totality of real-time exchange data is measured in thousands of messages per second, so message compression is critical to keep the cost of telecommunications between user firms and the exchanges within reasonable bounds. Neither MDDL nor FIX is designed for telecommunications efficiency. The good news is that MDDL is very compressible -- down to 10% of its size, according to trials at one user firm.
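The compressibility claim is easy to believe: verbose tagged text is highly redundant. The sketch below compresses a batch of invented, MDDL-flavoured quote messages with zlib. The element names are made up, and a real feed with varying prices would compress less dramatically than this repetitive batch, but the principle holds.

```python
import zlib

# An invented, MDDL-flavoured message: verbose tagged text of the kind
# that compresses well. Real MDDL element names may differ.
message = (
    "<quote>"
    "<instrument>VOD.L</instrument>"
    "<bid>102.4</bid><ask>102.6</ask>"
    "<time>09:30:01.250</time>"
    "</quote>"
)

# A real feed carries thousands of such messages; compress a batch of them.
batch = (message * 1000).encode("ascii")
compressed = zlib.compress(batch, 9)

ratio = len(compressed) / len(batch)  # tiny for this repetitive batch
```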
It is entirely possible that an alternative, as yet unimagined format could be used for data distribution -- something along the lines of the highly compressed and reliable proprietary protocols used by the data vendors. This would not necessarily stop the use of MDDL within the user firms. Firms would need a mapping table to translate the contents of the Direct Delivery feed into MDDL for input into their applications. If exchanges published directly in MDDL, this translation would not be necessary.
In summary, the need to create and agree a standard format for exchange data should not prevent the creation of the Direct Delivery vision. Standards are coming anyway.
Across The Universe
A data distribution network is more than a wire
Bernie Weinstein, CEO of data vendor ILX, says that "those who believe that vendors can be disintermediated show an ignorance of the rules of business. There's a reason why data distribution evolved as it did. The exchanges grow trees, we're making furniture". When he talks about the rules of business, he is referring to two economies of scale that the vendors provide: network services and administration.
Quote vendors rightly protest that there is more to distributing data than just pushing it down a wire. Vendors provide reliability (by running various content checks on the data in real time), entitlement management (by ensuring that only authorised customers receive data), fast request and recovery (by providing intermediate caches), and, most importantly, speed of distribution, where milliseconds really matter. Most data vendors also offer a 'by interest' service, where the end user only receives data that has been explicitly requested, thereby minimising telecommunications costs. A vanilla IP network would offer none of these. These capabilities would need to be layered on top of the network -- so the IP network would provide connectivity, with someone else providing the value added services. Would the exchanges or the network suppliers develop these network services and pay for their deployment? Either way, they would be getting into a business they don't understand and probably cannot afford: the data vending business.
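Two of those value added services -- entitlement management and 'by interest' delivery -- can be sketched as follows. All the names here are illustrative; real vendor permissioning systems are far more elaborate.

```python
class Distributor:
    """Sketch of vendor-style services a raw IP network would not provide:
    entitlement checks, and 'by interest' delivery (users receive only the
    instruments they have asked for)."""

    def __init__(self):
        self.entitlements = {}   # user -> set of exchanges they have paid for
        self.subscriptions = {}  # user -> set of symbols they have requested

    def entitle(self, user, exchange):
        self.entitlements.setdefault(user, set()).add(exchange)

    def subscribe(self, user, symbol):
        self.subscriptions.setdefault(user, set()).add(symbol)

    def recipients(self, exchange, symbol):
        """Who should receive this update? Entitled AND interested users only."""
        return sorted(
            user for user, paid in self.entitlements.items()
            if exchange in paid and symbol in self.subscriptions.get(user, set())
        )

d = Distributor()
d.entitle("alice", "LSE")
d.subscribe("alice", "VOD.L")
d.entitle("bob", "LSE")  # entitled, but has not asked for VOD.L
```

Only alice receives a VOD.L update: bob has paid the LSE exchange fee but has not subscribed to the instrument, so 'by interest' delivery filters him out and saves his bandwidth.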
A further problem is the administration of thousands of end users. The most common reason given by exchanges for not wanting to sell data direct to end users is that they could not manage the administrative burden. They see the data vendors' main role as being administrative: entitling users, collecting fees and managing contracts.
Don't Let Me Down
End user firms are driven by aversion to risk
The benefits to end user firms of a Direct Delivery model can be summarised under three headings:
- Cost: data should be cheaper if the data vendor mark-up is avoided
- Speed: data should arrive faster if it avoids the vendor processing systems
- Faithfulness: data should be untouched by vendor systems.
The cost benefit to end user firms is partially offset by the additional cost of managing multiple feeds from exchanges, rather than a single consolidated feed from the vendors. This is not entirely a technical issue; non-technical costs include arranging contractual terms, payment of fees, and management of entitlements. In fact, all of the administrative concerns of the exchanges -- of having to maintain contractual and billing relationships with multiple parties -- are mirrored by the end user firms.
The largest user firms already take direct feeds from exchanges, and go to the trouble of managing the integration of feeds and managing the necessary paperwork. They do this to get fast, faithful data. These firms almost always take one or more vendor feeds too. This is because, for some users and applications, they need the additional data in the feeds: news, analytics, historic data, broker research, and value added fields such as intraday high and low prices. They also value the quality checks that the vendors apply, although they deliberately avoid these checks when choosing a direct feed. Even the exchanges themselves need vendor feeds. For example, 80% of the NYSE floor use ILX data rather than NYSE's own data.
Under the Direct Delivery model, a further management cost should not be forgotten: fault management. Data vendors own the collection, validation and distribution of data through to their clients' sites. They often own the workstation software too. Any perceived problems with the data can be reported to the vendor, who will take ownership of the problem even if its cause is outside its walls. The end user does not need to establish in which organisation the fault lies (e.g. in the original exchange data, in the collection and distribution systems, on the workstation software) before it can be reported. Contrast this with the Direct Delivery model, where the multiple parties involved in the data delivery chain could pass the buck between each other over a reported fault. This is reminiscent of a recent wonderful IBM advertisement, in which a hassled female IT director is chairing a meeting about a business-critical fault that has hit her company. She asks around the table what the problem is. The database supplier blames the hardware supplier, the hardware supplier blames the network supplier and so on. She then asks: "Who is responsible for making all these systems work together?". A long silence ensues, followed by one of the assembled techies suggesting: "That would be you". This is the situation that today's finance sector IT manager strives to avoid.
Perhaps the strongest argument of all against user firms adopting Direct Delivery is the issue of risk. Consider the music company EMI's thinking on the distribution of tapes and CDs to their outlets. Distribution is clearly not a core competence of EMI. They know about music, publishing, rights management and intellectual property. They are not experts in lorries and warehouses. However, after much consideration, EMI decided not to outsource their distribution operation. Why not? Because they would save 1% of costs but risk 100% of revenues. Devin Wenig of Reuters echoed these points in a recent article in the Financial Times. In describing the threat to Reuters from the Internet, he commented: "All applications are mission critical, and it is a very demanding client base. They can't afford to experiment with unproven technology -- it has to work, and it has to fit the business need."
The Long And Winding Road
Change may come slowly
Given these conflicting thoughts, what is actually likely to happen? The two tiers of delivery that we see today -- multiple proprietary feeds from the exchanges, and intermediated data from the vendors -- will both continue to be widely used. As exchanges adopt standards, the multiple proprietary feeds could merge into a single, standard Direct Delivery feed, and the largest user firms currently taking intermediated data could move to Direct Delivery. This feed could replace or complement their current vendor feed. Smaller, less IT-capable firms would not be able to take the Direct Delivery feed and would instead continue to take a vendor feed.
How will these changes actually happen? They could occur through the exchanges' strategic actions, working together on data standards and allying with network providers for delivery. They could occur as exchanges and their software suppliers consolidate, leading naturally to fewer and fewer different messaging formats and distribution networks. Or they could happen by accident: the problems that the exchanges must overcome to deliver their data direct may be addressed by the IT industry in another domain -- e.g. a network for supply chain automation -- and the exchanges may then discover that this network suits them perfectly. In the words of a strategist at one of the data vendors: "This is how things happen. It is a classic IT accident that could allow exchanges to change their business model. And when they can change, and the economic benefits are clear, they will change."
An alternative vision of the future is offered by the same strategist. Vendors are paid to do things that are perceived to be difficult. What if these things start to look easy, because they are in part provided by off-the-shelf packages or pre-packaged services? Even if competition does not bring prices down, end user firms will be less prepared to pay a premium for services that look easy. According to the strategist: "If the end user pays say USD900 pcm, and knows the exchange is getting USD50, he will wonder where the remaining USD850 is going. It's going on pure IT services that are valuable but not that difficult. It could be difficult for someone else to do them, so competition doesn't necessarily bring prices down, but this is more to do with the entrenched position of data vendors rather than their competency". So in this vision the combined pricing pressure of user firms will reduce data vendors' revenues, but not to the advantage of exchanges.
Either way, the disappointing conclusion is that the opportunities for revenue growth are not as promising as may first have appeared. Though some growth may be possible, it will not be achieved without addressing the challenges of providing a reliable and robust distribution network, and developing entitlement and administration systems suitable for managing thousands of direct customers. And even then, direct delivery will not suit all customers.
Is there any precedent for the owners of content to take control of distribution of their assets from an incumbent intermediary? The television sports rights industry offers an interesting parallel. Historically, the owners of sports rights, for example the UK's football Premier League, have sold their rights wholesale to intermediaries who manage distribution, entitlement, subscription and customer service. In the case of football in the UK, the intermediaries are satellite and cable TV networks. This year the Premier League is reported to be considering launching its own TV channel in a bid to shore up the value of the television rights. However, this arrangement would still require one of the pay TV networks to carry the channel, and to administer it. In the words of one observer, as reported by the Financial Times, "Content is king, but distribution is King Kong".
We started by asking whether exchange executives should be uttering expletives in response to the revelation that the data vendors are capturing most of the value from the exchanges' real-time data. Our conclusion is that data vendors largely deserve their cut for the services they provide. However, with changes to technology, the emergence of standards, the consolidation of exchanges and suppliers and the strategic actions of the individual exchanges, we may see this cut reduce over time. A realistic target for exchanges could be to increase information product revenues by 50%. By coincidence, 50% is the percentage of fathers who are too busy to spend quality time with their families.
I'm off to see mine. Goodnight.