It’s hard to believe that barely 15 years ago, the business of supplying real-time market data to the trading and investment community was dominated by just a handful of large companies. That handful has since shrunk to two, three, possibly four major players. But the degree of dominance those major vendors enjoyed 15 years ago is surely much diminished today. These firms are now being attacked on many fronts. And while poor market conditions may have had a hand in the changing landscape, new technologies, attitudes and business requirements are as much to blame for this, the beginning of the end of the age of the Big Vendors.
I grew up in those glory days of the market data vendor. The London market was dominated by Reuters, Telerate and the London Stock Exchange; the New York market by Telerate, Quotron and, to a lesser extent, Reuters. In 15 years, that landscape has changed to one (literally) dominated by Bloomberg, Reuters and, to a lesser extent, Thomson Financial. Telerate continues, snapping at the heels of the Big Three. But Quotron is gone, first outmanoeuvred by ADP (itself now also gone) and then subsumed into the mighty Reuters. The London Stock Exchange, too, bowed out of the business a decade ago, although it’s now back in the game through its acquisition of Proquote, a low-cost provider.
On the surface, all that seems to have happened is that one set of Big Vendors has been supplanted by another. But there’s more to the story than that. Back in the late 1980s, market data vendors’ businesses were far simpler than they are today. This was due to a number of factors. First, the then-current technology – pre-digital feeds – meant control over distribution of vendor services was relatively straightforward. Second, content originators were more willing to consider exclusive distribution arrangements with market data vendors. Third, the vendors themselves were more specialised than they are today: each had its strengths and – because the other two factors meant revenues were robust – each kept out of the others’ space.
During the 1990s, the landscape began to change. Digital feeds, trading room platforms, pricing engines and algorithmic trading models all created new demands for pricing services. New competitors put pressure on redistributor arrangements. Content originators questioned exclusive distribution deals, some even developing – with scant success – their own direct delivery systems. With the arrival of the Internet, many ‘difficult to get’ services became commoditised. Revenues and margins plummeted, and the market went into consolidation, a situation best exemplified by the spending spree and subsequent collapse of Bridge Information Systems.
This was the story for at least the bulk of the market data vendor marketplace. The exception to the rule, of course, was Bloomberg. Bloomberg had never conceded that it was a market data vendor. It always maintained that it was something else, removed from the crowd of suppliers whose very existence depended on their latest real-time update. Bloomberg’s one-size-fits-all pricing model and focus on quality – in terms of data breadth and depth – allowed it to differentiate itself from the rest of the pack. More importantly, though, Bloomberg cultivated the idea of community among its subscribers. The system became not only the place to check prices, rates and sports scores. It also provided the medium for communicating with contacts, clients and counterparties.
As a result of all these ongoing changes, Bloomberg emerged as market leader. Reuters floundered, and is only now starting to recover. The other two large players remain in catch-up mode.
But the sands are still shifting. Recent developments in the delivery technology area have meant that the major players are facing new challenges to their business. In essence, the market is fragmenting in terms of the ways the user community wishes to receive the financial information it needs from its suppliers. And this fragmentation may ultimately have a bearing on just who those suppliers are.
As the market starts what looks to be a long and slow recovery from the market crash of spring 2000 and the tragedy of September 11, a new landscape for market data delivery is emerging. Since those dark days, a series of negative factors has had varying degrees of impact on how financial institutions approach the challenge of delivering real-time data to their traders and their trading applications.
Of course, the events of 9/11 probably had the most dramatic impact: namely, to stop all innovation in this space for at least six months and possibly up to two years, depending on your view. But a number of micro factors also played a role in shaping the new landscape.
In the specific area of trading room data delivery, the chief factor was probably Reuters’ odd – though politically understandable – decision to merge its TIB/Rendezvous and Triarch trading room platforms. This in essence forced a decision about market data delivery infrastructure upon the majority of the big institutions, which had to comply with Reuters’ policy of obsolescence for its two market-leading ‘brands’ and accept (or reject) its new product offering, the Reuters Market Data System (RMDS).
Another factor in the shaping of the new landscape has been the knock-on effect of the recession and 9/11: a dramatic new pressure on cost control. The result here has been the emergence of low-cost, often browser-based and/or Internet-delivered market data display systems, nipping at the heels of the mid-tier offerings of the market leaders.
Lastly, transaction automation finally took hold, and higher rates of price updates emanated from data sources. The result has been demand-pull and supply-push for high-speed, low-latency delivery mechanisms for pricing data, to meet the requirement for high-speed access to electronic markets.
From the seemingly inexorable march towards total Reuters domination of the platform business, these pressures have conspired to create a new world order, as it were. And Reuters’ place in this new world order is by no means assured as, perhaps surprisingly, competitors are finally getting serious about their feed strategies.
This leaves the marketplace with a new, three-tier structure, with substantially different technologies and players. As these tiers begin to take hold, the Big Vendors’ grasp on their traditional delivery channels may start to weaken.
At the lower end of this new spectrum lies the gamut of low-cost delivery alternatives. These include offerings for streaming real-time data across IP networks (Caplin Systems, CSK’s Slingshot, I-Deal Data Systems and others), and browser-based display providers (Knowledge Technology Solutions, Proquote, Infotec, Techop, Mandarin and others). After attempting to strike at the heart of the trading room environment, these providers appear to have found a niche in the branch or off-trading-room areas of the business, including retail broking, investment banking, corporate finance and research.
At the top end of the market – reliable and high-speed delivery of real-time data, primarily to drive trading and pricing applications rather than display for consumption by the human eye – the landscape is entirely different. Here, direct connections to exchanges, which are increasingly under pressure to deliver ever-faster price updates, are becoming the norm, in a throwback to the late 1980s, when many large institutions built their own ticker plants to ensure mission-critical applications got the data they needed.
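For readers unfamiliar with the term, a ticker plant’s job in this tier is essentially to ingest raw, exchange-specific feeds, normalise them into a common record format and fan the updates out, with minimal latency, to the applications that subscribe to them. The minimal sketch below (in Python, with every class, field and symbol name invented purely for illustration; it does not represent any vendor’s actual software) shows the idea:

```python
# Hypothetical sketch of a ticker plant's core role: normalise raw,
# exchange-specific messages into a common tick format and fan them
# out to subscribing applications. All names are invented for illustration.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Tick:
    """Normalised price update, independent of the source exchange's wire format."""
    symbol: str
    bid: float
    ask: float
    last: float
    exchange: str


class TickerPlant:
    def __init__(self) -> None:
        # symbol -> list of callbacks registered by consuming applications
        self._subscribers: Dict[str, List[Callable[[Tick], None]]] = {}

    def subscribe(self, symbol: str, callback: Callable[[Tick], None]) -> None:
        """A pricing or trading application registers interest in a symbol."""
        self._subscribers.setdefault(symbol, []).append(callback)

    def on_raw_message(self, raw: dict, exchange: str) -> None:
        """Called by an exchange-specific feed handler for each inbound message."""
        tick = Tick(
            symbol=raw["sym"],
            bid=float(raw["bid"]),
            ask=float(raw["ask"]),
            last=float(raw["last"]),
            exchange=exchange,
        )
        # Fan out to every subscriber with as little intermediate processing
        # as possible -- latency, not presentation, is the priority here.
        for callback in self._subscribers.get(tick.symbol, []):
            callback(tick)


if __name__ == "__main__":
    plant = TickerPlant()
    plant.subscribe("VOD.L", lambda t: print(f"{t.symbol} {t.bid}/{t.ask} via {t.exchange}"))
    # Simulate one raw update arriving over a direct exchange connection.
    plant.on_raw_message({"sym": "VOD.L", "bid": 135.25, "ask": 135.50, "last": 135.25}, "LSE")
```

The point of the sketch is simply that the value at this end of the market lies in normalisation and low-latency fan-out rather than display, which is why the buyers here are pricing and trading applications rather than people at screens.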
Today, though, firms are looking to outside vendors to help support such applications, which include pricing, basket-trading and other forms of arbitrage-based electronic trading. While the new market is only now emerging, the two main players appear to be InfoDyne, a stalwart in helping firms with various aspects of data management and distribution, and Hyperfeed Technologies, which more recently switched its business model to focus on this area of the business.
To date, neither has scored a major victory at an institution seeking to implement this new model for the low-latency end of the market. Instead, they have implemented their ticker plant technologies – our term – at data originators and vendors: Telekurs USA in InfoDyne’s case, and Moneyline Telerate in Hyperfeed’s.
It’s surely only a matter of time, however, before a major institution adopts this approach – we know of several pilots – to support its high-end applications, and perhaps even middle- and lower-tier ones. Their solutions’ appeal is being strengthened by the availability of straightforward links to exchanges and ECNs via Securities Industry Automation Corp. (SIAC), reached in turn through connections from the likes of Savvis and Radianz.
That leaves the middle tier of the marketplace, the feed-and-platform combinations. Here, Reuters dominates, with Moneyline Telerate’s TRS attempting a reincarnation. This set-up – largely supported by feeds from Reuters, Moneyline and less than a handful of others – has characterised the trading room technology landscape for almost a decade. (Who remembers Micrognosis, Data Logic, Desisco, Teknekron, FD Consulting?)
But entrants young and old are making inroads. Moneyline Telerate has done substantial feed deals with Citibank and ICAP, and has signed a strategic alliance with Comstock, now part of Financial Times Interactive Data. Bloomberg has high hopes for its Fat Pipe Feed and has even re-designated the Open Bloomberg Server as its Thin Pipe Feed. A new feed supplier, Relegance, has sealed a distribution agreement with Radianz. Bloomberg is further rumoured to be talking to IBM about bringing to market an ‘independent’ data distribution platform to compete with RMDS, perhaps the platform we suggested the vendor would never build only a few months ago.
Elsewhere, Thomson Financial has finally hired someone to sort out its data delivery strategy. And Tibco – practically free of Reuters’ ownership – is again turning its attention to the financial services marketplace. Although precluded from direct involvement in the market data delivery space, its messaging capabilities may allow it to support others, at least peripherally.
Notwithstanding this hotbed of competition within the traditional market data platform marketplace, the two outer tiers will apply pressure on this middle band. From the lower end of the spectrum, continuing cost pressures will raise questions about the need to continue with high-end feed services and higher-cost local delivery infrastructures. Over time, this will provide increasing numbers of decision points for lower-end users that could benefit Caplin et al.
Meanwhile, Reuters will continue to struggle to convince its TIB/Rendezvous clients to upgrade to RMDS. The upgrade from Triarch is a painless affair: add the so-called point-to-point server and a Triarch implementation is basically transformed to RMDS at extremely low cost. Hence, the glut of announcements of migrations at big clients like UBS.
But it’s an altogether more complex affair for TIB users, not least because no two TIB/Rendezvous implementations are the same. Moreover, TIB – as opposed to Triarch – adopters during the 1990s were always the more technically adept or ambitious clients. These clients today are more likely to consider a change in overall market data delivery strategy, as opposed to a simple RMDS upgrade, than their Triarch peers. This may give the likes of InfoDyne and Hyperfeed the chance to offer a wholesale change in the way these institutions manage their real-time data systems.
Add to the above the notion that Reuters has its own delivery technology migration issues. Specifically, the new Reuters Data Network (RDN) will be reliant on technologies developed by the former Bridge Information Systems, acquired by Reuters a couple of years back. BridgeFeed’s (now RDN’s) capabilities for handling non-U.S. data sets have come under question. Furthermore, adoption of RDN will ultimately require adoption of Bridge’s securities identification nomenclature to supersede the Reuters Instrument Code (RIC) identifier.
It’s into this new three-tier delivery landscape that content originators now threaten to throw their hats. With such options open to them, these originators may not need the Big Vendors in quite the same way they did back in the glory days of 15 years ago.
Essentially, market data vendors are consolidators of other people’s data. There are two main models for the arrangements the vendors have with the originators. First, real-time or delayed pricing generated by liquidity vehicles like exchanges and interdealer brokers is often aggregated by the vendors in order to offer financial institutions a convenient way of receiving price information from numerous markets simultaneously. (It has, by the way, long been a source of some contention that financial institutions pay both to contribute to and to receive services from market data vendors.) Second, providers of other value-added services such as news, market commentary or research often share revenues with redistributors like Reuters and Bloomberg in return for delivery capability via those vendors’ networks.
With the emergence of two new delivery channel options, originators may begin to think twice about how they get their data in front of the final end-user. The early days of the Internet boom held much promise of such a scenario, but this never materialised. In part, this was because of the realisation – too late, for many – of the limitations of a wholly unregulated and unmaintained network resource. Today’s Internet- and browser-based data delivery suppliers understand those limitations and pitch their services at appropriate price points for the appropriate audience.
Elsewhere, the emergence of private networks like Savvis and Radianz, combined with the data management capabilities of the likes of InfoDyne and Hyperfeed, may offer real opportunities for efficiencies. Content originators may use such facilities to gain independent delivery and thereby increase revenues (with no requirement to share with the Big Vendors). Users may opt to receive only what they want, rather than what is packaged by the product managers of the Big Vendors.
But all may not be lost for the bigger market data vendors. Another twist in the market, relating to the recent obsession with reference data, may play directly into their hands.
Of course, the obsession with reference data isn’t really about reference data at all. It’s about risk, operational risk. Reference data has emerged in recent years as the antidote to trade losses caused by operational inefficiencies. These may involve erroneous valuations caused by bad pricing data. Or they could involve broken trades due to incorrect counterparty data. Whatever the case, the industry has jumped on the reference data bandwagon as a cure for all manner of operational risk-related ills.
And therein lies the opportunity for the Big Vendors. As things now stand, front-office and back-office operations are supported by separate suppliers of pricing and descriptive data. While Reuters and Bloomberg dominate the front office, the reference data marketplace is dominated by the likes of FT Interactive Data, Telekurs and others. Only now are Reuters and Bloomberg turning their attention to these back-office requirements.
It’s becoming clear that unifying data sources for both front- and back-office operations could reap further reductions in operational risk. It follows, then, that the market can expect to see front-office suppliers start offering back-office solutions and vice versa.
Certainly, Reuters has begun looking at how to leverage its front-office assets in the back office, and has created a small but successful startup business – DataScope – in this area. Bloomberg’s Data License is clearly aimed at doing the same thing. Moneyline Telerate last year entered the back-office data management space through the acquisition of Market Information Services. And Thomson Financial’s Datastream business complements the front-office data services offered through its Global Topic and ILX product lines.
The writing is on the wall in the back-office area, too. Telekurs has long offered its real-time feed to potential front-office clients and is now being taken seriously as an alternative to the main suppliers. FT Interactive Data, meanwhile, has bought Comstock from Standard & Poor’s. This represents a move from its back-office roots into the realm of the front office, where Comstock is also partnering with Moneyline Telerate.
The back office isn’t the only ray of hope for the Big Vendors, at least for two of them. Recent indications suggest that the big retail wars are about to start up again, having been largely on the backburner for several years. Reuters and Thomson Financial – read: Quotron and ILX – are slogging it out once again. A third player, SunGard Market Data – read: ADP – hasn’t yet made a major move, having lost its quote-terminal champion to Radianz last year.
Thomson Financial seems to be setting the pace for now. To much fanfare, last year it won the vast Merrill Lynch contract to supply ‘wealth management’ services to the firm’s 27,000 or so retail brokers. Now, it’s just won a $200 million deal to implement its Thomson ONE platform at Wachovia Corp. The deal is particularly galling for Reuters since it will involve replacing some 6,000 units at the former Prudential Securities, now owned by Wachovia and a long-time client of Reuters Plus, Reuters’ U.S. retail broker quote terminal product, which grew from the Quotron product line.
U.S. retail brokerage is interesting territory for the Big Vendors. This is mainly because the monetary sums involved are huge – Merrill is worth $300 million to Thomson – and the projects themselves need the credibility of a Big Vendor. While new technologies may suggest that smaller players have a more level playing field these days, the use of small, specialist providers was tried the last time around in the early 1990s. Then, players like American Real-Time Services and StockMate got gobbled up by the big boys.
Of course, the problem with the retail brokerage area is that it is by definition highly cyclical. The nature of the projects requires major commitments by both parties. The result is long contracts, often five or more years in length. As such, the market typically sees a flurry of activity as those contracts come due, and then nothing for a few years. There’s nothing wrong, clearly, with locking in a revenue stream over the next five years, but the opportunity for similar lock-ins is a narrow window.
Finally, as Bloomberg has shown, the integration of major liquidity vehicles with decision-support and financial information can be leveraged to create a one-stop shop for those institutions that don’t want to build it all themselves. Reuters has already begun to add transactional capability to its services, as has Moneyline Telerate. At the time of writing, Thomson Financial was in exclusive negotiations to acquire bond trading platform TradeWeb. Integration of electronic transaction systems will require major commitments, and plays right into the Big Vendors’ hands, offering broad distribution to those emerging electronic markets.
On balance, as the market fragments and new technologies present new opportunities for new players to enter the space, the prospects for the Big Vendors may appear bleak. Certainly, the likes of Reuters, Bloomberg, Thomson Financial and Moneyline Telerate will come under pressure in all areas of the business.
The past two years have been characterised by sometimes severe cost-cutting measures for these firms, and it’s likely that, while creative output may have been hurt in the short term, they will come out of the recent down market as leaner and meaner operations. If better times return – and they don’t need to be as good as those glory days of market data – margins will improve and the Big Vendors will find themselves better positioned for battle. For it’s important not to forget one old adage in this business: if you can’t beat ‘em, buy ‘em.