Mondo Visione Worldwide Financial Markets Intelligence

FTSE Mondo Visione Exchanges Index:

The Enterprise Data Management Scorecard

Date: 20/08/2007

Michael Atkin
Managing Director, EDM Council

In 2006 I contributed an article to the Handbook of World Exchanges setting out the business justification for establishing enterprise-wide control over data content as an essential strategic and operational priority for financial institutions. Just over a year later, the concept of enterprise data management (EDM) has not only taken hold throughout the industry, it has quickly risen up the ranks to become a priority for senior management. The pace at which this activity is maturing is nothing short of amazing.

Let’s take a quick look at the timeline. Eight years ago the industry was facing a potential regulatory mandate to reduce settlement risk – remember the discussions surrounding T+1 clearing and settlement cycles and the goal of straight-through processing (STP)? And of course, STP required automation of business processes, and business process automation required precise data – hence the industry’s interest in data content management.

When the T+1 mandate went away, so did the motivation for data management. Data content (if it was recognised at all) was a poor stepchild of IT departments, completely misunderstood and near the bottom of the list of activities competing for budget and top management attention. Since then a lot has changed. Regulators are still concerned with reducing risk, but this time it takes the form of legislation to address specific objectives such as best execution, the requirement for adequate capital to be held in reserve to protect investors, full disclosure in terms of price transparency and trade reporting, and the mandate to be able to follow transaction flow for ‘know your customer’ and anti-money laundering purposes.

But that’s not the full story. In addition to regulation, business processes have changed. Investment strategy is increasingly contract-based and model-reliant. Customers are more sophisticated, necessitating holistic views of requirements and investment objectives. Products and markets are more complex, meaning more cross-asset risk modeling. And margins continue to be squeezed, along with processing cycles.

Add it all together … profit and loss concerns (from bad prices, convoluted trading models and bad business decisions) … risk to reputation (due to avoidable errors, fines and bad press) … regulatory risk (from missing reporting deadlines, proving best execution and providing sufficient audit trails) … and business development opportunities – all have a clear data management relationship. As a result, EDM has moved from being a passive response to the threat of regulation to being a proactive orientation essential for business operations. So the good news is that most financial institutions get the importance of data management from a logic and awareness perspective.

But even though we collectively ‘get it’ we still have a long way to go to make it a reality. We’re still searching for consistent business metrics. We’re still struggling our way through the implementation minefield. We’re still viewing data management as a component of IT. Financial institutions are still silo-based and balance sheet-driven. And we still don’t manage the information supply chain very well. Now, I don’t want to paint a negative picture – just the opposite. I’m surprised and encouraged by the level of progress. In fact, I’ll go one step further by maintaining that EDM has reached the stage of maturity where it is now viewed by top management as just another business challenge to manage.

With that as a backdrop, my objective with this article is to paint a picture of where we currently are on the EDM continuum and where we think we’re headed on our quest for data management nirvana.

EDM scorecard

Earlier this year John Bottega, a long-time colleague of mine and now Chief Data Officer at Citigroup, presented his view of the EDM scorecard. In Figure 1 I have taken his initial view, added to it the issues that the EDM Council has been tracking, and generated a consolidated view for your consideration. We offer this primarily as a common framework from which to talk about these issues – now and in the future.

Figure 1: An EDM scorecard

Category | Notes | Grade
Industry Awareness | Content management as distinct from data processing and as a core requirement for business operations. Data as operational infrastructure. | B
Business Drivers | Shift from cost -> risk -> data leverage. EDM just emerging as a c-suite priority. Elevation of EDM to ‘strategic business objective’. | C
Staff Capabilities | EDM as a multi-dimensional discipline requiring a broad range of subject matter expertise and a broad array of skills. | C-
Organisational Alignment and Governance | Business driven -> technology enabled -> operations supported. Solving the tragedy of the commons. Beyond brushfire mentality (demand management). | C
Business Case and Funding Model | Real business benefits are hard to quantify. ROI scorecard is still spreadsheet-based. Funding model is vertical. | D
Implementation / Proper Execution | Overcoming the four horsemen of the EDM apocalypse (ignorance, arrogance, obsolescence and power). | C-
Core Identification Standards | Foundation of effective EDM. Essential to get the foundation right. This can/should be standardised as a priority. | D
Supply Chain Management | Big opportunity to improve the data manufacturing process. Clear requirements specification needed. Markup at origination. Shift to next level of added value (risky and difficult). | C

The ideal outcome from this scorecard is consensus about the implications of these core areas and a shared definition of what we (both as individual companies and as a collective body) can, are, and should be doing to improve our grades. From the perspective of the EDM Council, it is one thing to define the issues that financial institutions seek to address; it is better still to determine what (if anything) we can do about them. Let’s take a brief look at each of these in turn.

Industry awareness

The primary point here is that managing data as content with its emphasis on precise meaning, rather than as bits and bytes to be captured and distributed, is a unique orientation and not something that financial institutions are used to managing. As an industry, we’ve done a fair job of positioning this issue and there are certainly enough conferences on reference data.

Business drivers

The initial driver of enterprise data management was cost containment. Content acquisition was done on a line of business basis, with lots of duplication. There were redundant systems in place for managing master files and the ‘bad data tax’ in the form of manual reconciliation to ensure content acceptability was expensive. But cost containment alone (while a perfectly valid objective) is not sufficient for EDM to become a sustainable area of focus for most financial institutions.

The current driver for EDM is risk mitigation with the focus shifting from sourcing strategy to data quality management. And risk is a real driver. It’s in line with regulatory requirements and business process objectives – and it is clearly data dependent. In my opinion, risk mitigation and compliance (both internal in the form of measurement against investment agreements and modeling against benchmarks, and external in the form of regulatory reporting) are as close to the ‘killer apps’ for EDM as any we have seen.

But risk and compliance are also relatively new participants in c-suite parties and are still among the most isolated areas of the financial institution. So we’ve made some progress, but we still have a long road to travel in terms of getting EDM elevated out of the technology department and into the realm of a ‘strategic business objective’.

Staff capabilities

Enterprise data (content) management is a multi-dimensional discipline requiring a broad range of subject matter expertise and a broad array of skills. Members of the data management team must understand financial products, pricing strategy, corporate actions and standards. Staff need to have a customer service mentality and relate to both transaction workflow and organisational governance. They need to understand technology and be familiar with the concepts of golden copy, operational hubs and front-to-back data distribution. They need to understand applications ranging from trading to compliance modeling to regulatory reporting to customer account management. And the data management team needs to be able to build and justify the business case, effectively communicate across functions and lines of business, and have excellent negotiation skills.

This is quite a bit more complex than what was previously expected of ‘back office and data entry clerks’ because the scope of responsibility has expanded dramatically. And while the demand for qualified data managers has never been higher, we are not doing a great job (either in schools or in continuing education) of developing the data management profession.

Governance

EDM is ruled by organisational governance. It is the prerequisite and will not become a reality without a strong executive sponsor and the alignment of the organisation behind the objectives. But the reality of changing the way organisations in the financial industry operate (i.e. moving firms from a silo-based and vertical orientation to an enterprise-wide and integrated approach) is very difficult. Firms are not structured that way. They do not compensate their management in that way. And they don’t operate that way.

That’s why the four horsemen of the EDM apocalypse (ignorance, arrogance, obsolescence and power) are so dangerous. Technology departments are powerful. Data content management is not sexy or well understood. Outdated technology processes (often referred to as legacy systems) are rampant. And organisational dynamics (particularly in our greedy industry) are quirky because adding new concepts into the organisational mix is threatening. The two most dangerous challenges are the ‘tragedy of the commons’ (i.e. everyone wants data quality, but no one owns it) and ‘brushfire mentality’ (i.e. everything is an immediate priority and attention is given to those that yell the loudest).

So this is about changing corporate culture – a massively difficult objective. Financial institutions are fragmented by design and therefore it is very easy for data to become fragmented, even within specific business units. Databases reside everywhere. They are easy to replicate and firms do it all the time. It’s easy to fall into the ‘us’ versus ‘them’ syndrome and we therefore have an arduous journey ahead of us as it relates to both the practical and political aspects of data management.

Business case and funding

The composite business case for effective enterprise-wide data management is both logical and compelling, but it is hard to quantify in spreadsheet terms. The problem is that the true benefits of EDM are systemic and the downstream gains are intertwined with other processes. As a result, many firms have built their business cases on low-level objectives such as cost containment and reducing staff levels – pushed from the bottom up with little chance of actually changing corporate culture, taking a monolithic approach to centralisation, and expecting universal adherence to the new way of operating from powerful business units.

Combine the business case and metrics challenge with the fact that EDM-related funding is also vertically aligned, and it becomes difficult to shake budget dollars from business units that operate based on short-term payback and book benefits as line items on their spreadsheets. EDM funding models are a difficult gauntlet to run because the real business benefits are not easily captured in dollar metrics. The bottom line reality is that without a strong business case and adequate funding, EDM is just another good idea competing with lots of other good ideas as a management priority.

EDM implementation

EDM is a new area of activity for the financial industry and there is no strategic or tactical roadmap associated with programme implementation. The process clearly starts with making sure the requirements are precisely defined and specifically understood. This deals with the relationship between front office activities, back office processing, account management and internal reporting. Getting all the involved stakeholders aligned and speaking the same language is essential and difficult.

Establishing the operating model for EDM can also be somewhat daunting. This area is all about data models, business rules, content precedence, data stewardship, security and entitlements, and metadata management. In many ways, this is the ‘dark art of data content management’ – an area that is still a mystery to many in our industry.
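To make ‘content precedence’ and ‘golden copy’ a little more concrete, here is a minimal sketch of attribute-level precedence rules consolidating multiple vendor feeds into a single golden record. The vendor names, fields and values are invented purely for illustration; a production operating model would add validation, audit trails and stewardship workflow around this core idea.

```python
# Hypothetical sketch of attribute-level vendor precedence: for each data
# element, trust a ranked list of sources and take the first non-null value.
PRECEDENCE = {
    "price":    ["vendor_a", "vendor_b"],  # prefer vendor_a's pricing
    "maturity": ["vendor_b", "vendor_a"],  # prefer vendor_b's terms data
}

def build_golden_copy(feeds: dict) -> dict:
    """feeds maps vendor name -> {attribute: value}; returns the golden record."""
    golden = {}
    for attr, vendors in PRECEDENCE.items():
        for vendor in vendors:
            value = feeds.get(vendor, {}).get(attr)
            if value is not None:
                golden[attr] = value
                break  # first non-null value wins for this attribute
    return golden

feeds = {
    "vendor_a": {"price": 101.25, "maturity": None},
    "vendor_b": {"price": 101.30, "maturity": "2030-06-15"},
}
print(build_golden_copy(feeds))  # {'price': 101.25, 'maturity': '2030-06-15'}
```

The design point is that precedence is set per attribute, not per vendor: a firm may trust one source for prices and another for terms and conditions on the same instrument.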

The third core challenge is technological. And while I’m desperately trying to portray EDM as a content problem, there is a huge technology dimension. Firms must have a functional platform and harmonisation among systems. They must have a robust messaging infrastructure. And they must be able to reconcile legacy systems and processes while still maintaining business continuity.

Core identification standards

Precise identification of financial instruments, legal entities and data elements is the foundation of effective EDM. Here’s my take on the status of the big three:

Instrument identification: This was all the rage during the T+1 threat, largely because of the challenges of multiple listings. As a result there was a lot of effort by the Association of National Numbering Agencies (ANNA) to clean up the ISIN assignment and maintenance process, and significant action by data vendors to fill the gaps created by multiple listings.
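For readers unfamiliar with the standard ANNA maintains: an ISIN (ISO 6166) is a 12-character code – a two-letter country prefix, a nine-character national identifier and a Luhn check digit – so basic integrity can be verified mechanically. A minimal validation sketch (the ISIN shown is Apple’s, used purely as a well-known example):

```python
def isin_checksum_valid(isin: str) -> bool:
    """Validate an ISIN's structure and check digit (ISO 6166 / Luhn)."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
        return False
    # Expand characters to digits: '0'-'9' stay as-is, 'A'=10 ... 'Z'=35.
    digits = "".join(str(int(c, 36)) for c in isin.upper())
    # Luhn: double every second digit from the right, summing digit-wise.
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d = d * 2 - 9 if d * 2 > 9 else d * 2
        total += d
    return total % 10 == 0

print(isin_checksum_valid("US0378331005"))  # True  (Apple Inc.)
print(isin_checksum_valid("US0378331006"))  # False (corrupted check digit)
```

A check digit only catches transcription errors, of course; it says nothing about whether the identifier has been assigned correctly or uniquely, which is the harder maintenance problem the article describes.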

Legal entity identification: ISO standards exist for settlement and clearing agencies, custodians, exchanges/trading systems and fund managers but not for issuers, corporate clients, suppliers and funds. And they are clearly needed for counterparty and issuer risk management, regulatory reporting, corporate research, legal agreements, documentation and counterparty identification on transactions.

The problem with legal entity identification standards is commercial, compounded by the restrictions ISO policy places on implementation. Assigning identifiers and managing implementation across the entire global industry requires investment. And no company worth its salt will step up to this plate without a clear commercial return. Nothing worthwhile gets done for free. Cost-based models don’t work. This activity needs to be viewed as a business proposition if we want it implemented. Let’s learn from the securities identification experience; set out the business requirements and commercial ground rules up front and let those inclined to participate apply their commercial creativity before the fact, not after the horse has left the barn.

Data element identification: Much of the recent attention in the ‘standards world’ focuses on the desire to standardise naming conventions, definitions and business relationships of the data elements that make up internal master files and that are contained in feeds from data sources.

Again, the challenges are clear. There are hundreds of data elements delivered from multiple sources around the globe. The lack of standard nomenclature and precise data definitions makes it hard to compare data, hard to automate business processes, hard to feed complex analytics, hard to exchange data, and results in a never-ending process of mapping and cross-referencing. There has been a significant amount of activity but it has been primarily driven by technologists in the form of XML schemas and UML data models. It appears that the industry would rather start with a precise data ontology (i.e. a simple and unambiguous glossary of data element terminology based on practical business requirements) that can be used to simplify mapping, promote comparability and be adopted by suppliers to improve the efficiency of integration.
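The mapping and cross-referencing burden described above can be sketched in a few lines. Here a minimal glossary renames vendor-specific field names onto canonical element terms; the vendor names and fields are hypothetical, invented only to show the shape of the problem a shared ontology would eliminate.

```python
# Hypothetical glossary: per-vendor field names mapped to canonical terms.
# Without an industry ontology, every firm maintains tables like this.
GLOSSARY = {
    "vendor_a": {"cpn_rate": "coupon_rate", "mat_dt": "maturity_date"},
    "vendor_b": {"coupon": "coupon_rate", "maturity": "maturity_date"},
}

def normalise(vendor: str, record: dict) -> dict:
    """Rename a vendor record's fields to canonical glossary terms."""
    mapping = GLOSSARY[vendor]
    return {mapping.get(field, field): value for field, value in record.items()}

print(normalise("vendor_a", {"cpn_rate": 4.5, "mat_dt": "2030-06-15"}))
# {'coupon_rate': 4.5, 'maturity_date': '2030-06-15'}
```

The point of a shared ontology is that the canonical column of this table would be agreed once, industry-wide, instead of rebuilt firm by firm.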

Final word

Let me leave you with a final thought about supply chain management. Financial information is a manufactured product with vendors (e.g. Reuters, IDC, Telekurs, Bloomberg, S&P) still acting as the primary manufacturing agents for financial institutions. The current focus on data precision is propelling many firms to work more closely with these vendors to better understand (and make more transparent) the processes they employ as part of that manufacturing process. This new focus on the data creation process is helping to raise the bar on data quality and is prompting vendors to pay attention to data utilisation rather than just bulk delivery.

We think this is good news for the industry at large. More transparency on data manufacturing shifts the discussion away from traditional definitions of added-value (i.e. acquisition, symbology, normalisation and distribution) which are becoming more commoditised and pushes vendors in the direction of hard-to-get data, analytics, integration into applications and contextual content (i.e. the next level of added value). And after all, overcoming data scarcity, paying more attention to how data is being used and adding secret sauce to make data more functional is what financial institutions want from their data vendors.