
Principia verbum: Entering the Age of Reason in data management

Date: 14/07/2006

Michael Atkin
Managing Director, Enterprise Data Management Council

The logic and value of achieving enterprise-wide control over data content as a strategic and operational asset are now firmly established in the minds of most financial institutions.

It began slowly with concerns over meeting the operational challenges of T+1, gained ground with the objectives of front-to-back operational efficiency in the face of changing economics, received a push with global concerns about credit and systemic risk, and now is being driven forward by the realities of a new business environment and a desire by firms to leverage their considerable data assets to support trading innovation, better serve clients, support cross-asset risk analysis and comply with regulatory obligations.

Enterprise data management, and the need to both understand and address complex data dependencies across functions, between applications and among multiple lines of business, is pushing data content to a level of equivalence with technology as part of a financial institution's core operational infrastructure. It's taken a while, but the value of data control in meeting the evolving requirements of doing business in complex financial environments has become clear. As such, the challenge has shifted from conceptual buy-in on the rationale for data management to the more tactical objectives associated with the realities of EDM implementation and the difficulties of achieving balance among competing business priorities.

My objective with this article is to put an unbiased stake in the ground on where the financial industry stands on enterprise data management and to examine the strategic, operational, and business management requirements associated with successful EDM practices.

Some definitions

The definition of the data types included in EDM seems rather straightforward: all content that a financial institution needs to access in order to conduct business, meet reporting and regulatory requirements, serve customers and manage risk. The real challenge is how it gets done, not what needs to occur.

However, from an industry management perspective, definitions are important primarily because they provide a common framework for discussion. So for the sake of simplicity, we offer the following for your consideration:

  • Enterprise data management - a concept: the ability of a financial institution to precisely define, easily integrate and efficiently retrieve data for both internal applications and external communications.
  • Data types - everything required to conduct business. The core challenge is that financial institutions model the data in different ways and use different words to describe its contents. Precision of terms, definitions and relationships forms the basis of data management strategy.
  • Applications - the translation of innovation and creativity within financial institutions. Applications matter to EDM only because data precision, transparency and consistency matter.
  • Data requirements - perhaps the most important area because it covers formats for processing, identification schemes for retrieval, transparency for quality assurance, and commercial terms and conditions associated with data usage.

Drivers of the EDM train

Most people are aware of the changes taking place throughout the financial industry. Business requirements have changed. Customers are more sophisticated. Regulators are paying attention to the notions of transparency, disclosure and reporting accuracy. Products and markets are more complex. And margins continue to be squeezed, along with processing cycles.

Equally obvious is that in order to operate in this environment, firms need accurate, consistent, transparent and precise data. They need it to support innovation, meet regulatory obligations, manage risks, automate processes, serve customers and operate efficiently. Gaining control over data assets is logical, mandatory to do business and ultimately inevitable.

So the challenge that firms face is not one of business logic but rather one of business process, corporate culture, and operational transformation. Below is an outline of the core drivers of data management for your consideration.

Regulators

Regulators are clearly the number one driver of change in the financial industry - and there is ample new regulation to deal with. But as you dig a little deeper on regulatory compliance, at least two things become apparent. The first is that products, clients, counterparties, issues, investment programmes and ownership hierarchies all need to be uniquely identified and clearly cross-referenced with transparent symbology and consistent data definitions. This is the underlying pre-requisite and still a missing ingredient. The second is that these challenges can't be isolated to an individual data warehouse unless the firm only has one. Just as with business silos, data silos need to be unraveled and interconnected to promote more complete views of data, relationships, objectives and customer requirements.
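
To make the cross-referencing requirement concrete, here is a minimal sketch in Python (with illustrative identifiers and field names - not a standard schema) of the kind of symbology map a firm needs so that a single instrument can be resolved from any of the identifiers under which it is known.

```python
# Minimal sketch of a symbology cross-reference: one internal golden record
# linking the external identifiers under which the same instrument is known.
# Identifiers and field names are illustrative, not a standard schema.

INSTRUMENT_XREF = {
    "INTERNAL-000123": {            # firm-wide unique key
        "isin": "US0378331005",
        "cusip": "037833100",
        "sedol": "2046251",
        "vendor_a": "AAPL.O",       # vendor-specific symbols differ
        "vendor_b": "AAPL US",
    },
}

def resolve(scheme, symbol):
    """Return the internal key for an external identifier, if known."""
    for internal_id, ids in INSTRUMENT_XREF.items():
        if ids.get(scheme) == symbol:
            return internal_id
    return None

print(resolve("cusip", "037833100"))    # INTERNAL-000123
print(resolve("sedol", "2046251"))      # INTERNAL-000123 - same instrument
```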

Client servicing

The demands and requirements of financial institutions' customers are becoming more sophisticated. Concerns about risk result in greater diversification of holdings. Customer interest now spans multiple business units, propelling firms to rethink the traditional product-oriented way they have been operating. This translates into the need to address data from multiple repositories across product lines. It means a requirement to know more about your customers' business. And the more you know about your customers, the better the opportunity for product innovation, up-selling, data asset leverage and new revenue streams. If the data structure is solid, it is easier to implement creativity and easier to manage operational risk.

Operational efficiency

The quest for operational efficiency to deal with shortened clearing and settlement cycles as well as changing economics is still alive and well. Market complexity translates into product complexity - all of which needs to be managed in the face of rising volumes and interdependent front-to-back office securities processes. These changes ultimately mean shorter timeframes for product introduction, compressed cycles for processing and the mandate for operational agility.

Transactions flow

Complex transactions consist of multiple internal and external steps, each with its own unique data requirements. In order to manage the transactions flow, data has to be accessible, accurate and precise at various levels of granularity, and passed along from one step in the chain to the next. As trading mechanisms, products and markets become more complex, there is an increasing need to link front, middle and back office operations. Clean, well-structured data makes communication easier, reduces errors and decreases the need for manual efforts.
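
The hand-off discipline this implies can be sketched in a few lines of code. The step names and data elements below are hypothetical; the point is the pattern of each step declaring its requirements and checking them at the boundary, so gaps surface at the hand-off rather than somewhere downstream.

```python
# Illustrative sketch (hypothetical steps and fields): each step in a
# transaction chain declares the data elements it needs, and the record
# is checked before being handed to the next step.

STEP_REQUIREMENTS = {
    "order_capture":  {"instrument_id", "quantity", "side"},
    "trade_matching": {"instrument_id", "quantity", "side", "counterparty"},
    "settlement":     {"instrument_id", "quantity", "counterparty",
                       "settlement_date", "account"},
}

def pass_along(record, chain=("order_capture", "trade_matching", "settlement")):
    """Walk a record through the chain, failing loudly at the first gap."""
    for step in chain:
        missing = STEP_REQUIREMENTS[step] - set(record)
        if missing:
            raise ValueError(f"{step}: missing data elements {sorted(missing)}")
    return record

ok = pass_along({"instrument_id": "INTERNAL-000123", "quantity": 100,
                 "side": "buy", "counterparty": "CPTY-9",
                 "settlement_date": "2006-07-17", "account": "ACC-1"})
```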

Internal frustration

On a practical level, one of the top drivers is simply internal client frustration with bad, wrong and late data. Frustration from the business units leads to internal evaluation of data and processes - which is often the catalyst behind the creation of internal data quality initiatives.

Cost

Bad data management is expensive. It includes the cost of trade repairs, mismatches and settlement delays; data duplication across multiple product lines; loss of leverage in data contract negotiations; duplicate systems and excessive maintenance costs; opportunity costs associated with diverting staff to research and repair; time-to-market issues; fines and penalties from trading errors; and artificially high transaction costs.

Add it all together: profit and loss risk (from bad prices, convoluted trading models and bad decisions), risk to reputation (due to avoidable errors, fines and bad press), regulatory risk (from missed reporting deadlines, proving best execution and providing sufficient audit trails) and business development opportunities all have clear data management relationships, and all are part of the logical business case for data management prioritisation.

Roots of the problem

There is certainly a broad spectrum of data challenges for financial institutions to deal with. Part of the problem is that data is generated in multiple business silos at the point of need, and that many one-time initiatives are launched for specific purposes. Part of the problem is technological, with legacy systems and data silos that prevent capital markets firms from moving forward with their data management objectives. Below is a brief summary of the core categories of challenges to address.

Multiple business cases

There is no single business case for better data management - there are multiple.

There is the overall logical case of the need to gain control of corporate data assets, and the requirement to have a functional distribution platform in place so that business units can access the data they need to do business in the existing and future industry environment. There is the monetary business case associated with more efficient operations. And there are specific cases based on function, data type and business objective, all of which have to be measured against where each institution is in its data management lifecycle, the magnitude (or perceived magnitude) of the problems, the risks of doing EDM poorly and the costs associated with funding and managing EDM projects.

The EDM Council recently pulled together the various components of the composite business case into the following chart as an outline of the drivers to consider.

Regulatory obligations
  • Best execution obligations
  • Price transparency
  • KYC/AML
  • Complete audit trails
  • Meet reporting deadlines
  • Full disclosure
  • Justify data decisions
  • Capital adequacy requirements
  • EU Transparency Directive
  • Prospectus Directive
  • Markets in Financial Instruments Directive (MiFID)
  • Sarbanes-Oxley
  • Patriot Act
  • Capital Requirements Directive (Basel II)
  • Giovannini I and II
  • G-30 recommendations
  • PD/UCITS

Risk mitigation
  • Bad prices
  • Convoluted trading models
  • Avoidable errors
  • Damage to reputation
  • Sanctions and censures
  • Calculating severities and default probabilities
  • Capital reserves and set-asides
  • Interdependent markets
  • Processing cycle delays
  • Fail to deliver
  • Fail to receive
  • Non-confirmed trades
  • Overdue transfers
  • Reconciliation
  • Buy-ins
  • Reclaims
  • Liquidity risk
  • Settlement defaults
  • Miscalculation of values

Operational efficiency
  • Acquisition integration
  • Symbology conflicts
  • Misidentified products
  • Trade repairs
  • Settlement delays
  • Multiple listings
  • Economies of scale on vendor negotiation
  • Compressed clearing and settlement cycles
  • Unproductive staff utilisation
  • Lack of identification precision
  • Front/back office breaks
  • Re-booking
  • Hedging on incorrect positions
  • Data duplication
  • Multiple data hubs
  • Vendor and product migration
  • Internal business unit frustration
  • Entitlement control and reporting
  • Time-to-market
  • Operational agility
  • Scalability
  • Composite view

Business development
  • Changing client requirements
  • Growing market and product complexity
  • Automated messaging to counterparties
  • Determining customer support ROI
  • Cross-selling
  • New product development
  • Understanding client business processes
  • Accurate benchmarking
  • Tracking customer revenue and profitability

Internal fragmentation

Much of the challenge is that the industry is not structured to think about data management on an enterprise level. Many firms are still structured along geographic or product lines, with their own local requirements, data repositories and budget control.

This is the classic definition of fragmentation, resulting in an organisation that can be tactically sound but strategically disjointed. Even financial institutions that have a clear data management mandate suffer from the politics of fragmentation. Everything we've learned reinforces that the lack of cohesion between business units is still the number one obstacle to achieving enterprise-wide data objectives.

Data management complexity

Unfortunately, there is no 'easy button' for data management. The data resources of financial institutions are extremely broad. Not only does EDM cover multiple master file types (e.g. product, pricing, client/counterparty, corporate actions, transactions, accounts, instructions), it also needs to be integrated into dozens of functional processes throughout the transactions chain, all managed by different business units and all requiring various degrees of data precision and granularity.

The key point is that the data requirements for efficient operations and business leverage are not the same for every process in the transactions chain. Problems occur when the data dependencies are not completely understood and when the data elements required for downstream applications are not captured and communicated. The pathway to success is not only about defining requirements; it's about ownership of data flow (particularly as transactions become more complex, with shorter timeframes for processing and with rising volumes). The lack of ownership often translates into a lack of business unit commitment to what is essentially an enterprise problem.

Other priorities

Financial institutions face a variety of business challenges, from complying with requirements from multiple regulatory regimes, to meeting the new requirements of customers, to platform re-architecture, to improving modeling capabilities, and the list goes on. Data content management is a component of, but not necessarily a driver of, all these activities.

So while the industry certainly understands the data management objective, some believe there is no present crisis to immediately resolve. The existing strategy, while fragmented and decentralised, is still viewed as functional. Data access and workflow inefficiencies are annoying, but the costs are dispersed.

Add to that the fact that firms usually measure the aggregate, rather than the specific, reasons behind trade and processing errors - because short-term workarounds are possible, because root cause analysis is usually the result of (not the impetus behind) data initiatives, because legacy systems are hard to reconcile, because there is little industry coordination and because growth priorities take precedence.

Implementation strategy

Gaining organisational alignment on the importance of data management is the easier half of the battle. Translating that commitment into an effective implementation strategy, pulling the trigger and avoiding project derailment is frequently the more difficult challenge. Below is a list of the core lessons we can learn from those riding the early crest of the EDM wave.

Not an IT problem

The most consistent message is that EDM is not an IT problem. There is a clear distribution and applications mapping dimension, hence a technology connection, but for the most part IT-driven initiatives have been expensive failures.

Data management is a business problem, ruled by organisational governance. Individual business units generally have small data management problems that become significant only in aggregate. As one data manager put it: 'world domination and IT purity is not always the right path to nirvana'.

Meaning of centralisation

Centralisation does not always mean one monolithic reference data warehouse. More often than not it means centralisation of the data entry point to ensure clean data, which can then feed multiple databases for applications.

It always means centralisation of the data model, centralisation of requirements management and centralisation of governance. Many firms are wisely choosing to implement a centralised hub within their existing legacy and political environments. Their goal is data access and integration until the business units decide to replace the existing infrastructure.
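
A minimal sketch of that pattern, with hypothetical names and a single illustrative validation rule, might look like the following: one vetted entry point fanning out the same clean record to the application databases that already exist.

```python
# Sketch of a centralised data entry point (names and checks hypothetical):
# records are validated once on the way in, then fanned out to the
# application databases that already exist.

class DataHub:
    def __init__(self, validators, subscribers):
        self.validators = validators    # callables: record -> error message or None
        self.subscribers = subscribers  # existing application stores

    def submit(self, record):
        # Single point of entry: every record is vetted here, so each
        # downstream database receives the same clean copy.
        for validate in self.validators:
            error = validate(record)
            if error:
                raise ValueError(error)
        for store in self.subscribers:
            store.append(record)        # fan out to legacy databases

risk_db, settlement_db = [], []
hub = DataHub(
    validators=[lambda r: None if r.get("isin") else "missing identifier"],
    subscribers=[risk_db, settlement_db],
)
hub.submit({"isin": "US0378331005", "price": 101.25})
assert risk_db == settlement_db     # both stores hold the same golden copy
```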

Rise of the data tsar

One of the challenges with EDM is that it is so broad that no one owns it. The difficulty is that when things go wrong, there is no one in place to answer for the problem. Without a strategic owner responsible for the full spectrum of data types, institutions wind up with multiple data contracts, data silos, business unit competition, difficulties linking up functions and an inability to leverage knowledge across the firm.

Remember, the key to EDM success is more organisational than data- or technology-related. If the resource and decision-making structure is in place, it is easier to shift from the challenges of organisational functionality to those associated with implementation capability.

Data management policies

There is a clear and unambiguous relationship between a firm's content assets, its data management and modeling approach, and all of the applications to which data is applied. Effective EDM strategies incorporate a complete understanding of the data requirements throughout the information chain, ownership over data flow, quality stewardship, and data model consistency throughout the organisation. Without these four pre-requisites in place, data management initiatives will be difficult to manage effectively.

Rising data quality tide lifts all ships

One of the biggest benefits of a data management orientation is the long-term impact on data quality:

  • Upgrading infrastructures and understanding data dependencies enables financial institutions to shift their focus from low value-added data scrubbing processes to high value-added data manufacturing processes.
  • The focus on data models, business rules and application requirements results in the creation of tools to measure capabilities.
  • Audit and monitoring tools promote comparison and quality benchmarking.
  • Comparison facilitates competition and encourages suppliers to pay attention to utilisation rates rather than just bulk delivery.
  • A usage orientation propels suppliers to better understand front-to-back data linkages.
  • Data integration requirements motivate all to embrace standards.
  • Standards reinforce competition on data quality, and the upward spiral helps sustain data manufacturing process improvements.
  • The rising tide lifts the entire industry. Investment in the data management infrastructure is good for selling data. Increased customer spending and higher expectations in terms of data quality push data vendors to strengthen QA processes, compete on commercial terms and raise the standard of the offer - which is the most desirable outcome for all participants.

Michael Atkin is the Managing Director of the EDM Council - a business forum for financial institutions to share information on the business strategies and practical realities of implementing effective solutions to manage data across the enterprise. Comments are welcome at +1 301 933 2945 or atkin@edmcouncil.org