Historically, in-house legal teams, IT – or reference data and system architects – and downstream consumers of data have existed in different worlds. The in-house legal role has traditionally been restricted to advisory guidance, both transactional and non-transactional, and has rarely extended to the storage of documentation, still less to the classification of the associated contract metadata and how best to collate, manage and distribute it as required by the wider financial institution.

This blind spot was highlighted by the financial crisis and now, despite increasing recognition of the issues surrounding data quality, the management of legal contracts within the financial services industry remains an area that regulators have identified as not fit for purpose.

Poor data management – a major risk for most financial firms

Data management (of which data quality management is a subset) has been identified as a major risk at board level in most financial firms. Data quality is seen as a causal theme affecting several other processes, such as the quality of regulatory compliance and reporting, as well as business optimisation (for example, RWA reduction and liquidity and collateral optimisation).

The most commonly cited root causes of poor legal data quality are weak governance and processes, inadequate system validations, manual data-entry errors and inadequate inter-system linkages for the flow of data. In the legal data domain in particular, two further factors stand out: (i) consumers of data (for example, a downstream reporting or content-related process) are often not fully aware of the full set of data attributes they need, the upstream sources from which those attributes are taken, or the effect of poor data quality on their processes; and (ii) providers of data – owners of systems, or even of data held in physical form – are often unaware of all the data attributes they hold, their meaning, intended usage or values, the way in which those attributes are sourced or derived, and the full set of downstream consumers of the data.

It is this lack of end-to-end visibility, and ownership, of data flows that accounts for the most difficult and intractable data quality issues in banks today.

The complexity of the legal data game

One of the most complex types of documentation entered into by capital market participants is the International Swaps and Derivatives Association (ISDA) Master Agreement and its related annexes. Most trades in the global derivatives market are documented under ISDA documentation, which is highly complex, technical and modular so that its components can be combined by the contracting parties to achieve the desired commercial and legal effect.

Although successful from a legal perspective, this modular architecture is now causing many data challenges, particularly because, until recently, data extraction from the contract portfolios of financial firms has often been little more than an afterthought. Moreover, because the management of legal data is largely performed outside the legal function, in many institutions the review of this documentation falls to operational teams, typically creating yet another data silo in which the wider context of the documentation is lost.

It is becoming increasingly clear, therefore, that data accuracy issues commonly arise because contract terms are represented within IT systems at an insufficient level of granularity.

As the complexity of the clauses increases, the issue becomes more problematic still. The worry is that many firms are simply oblivious to important nuances in the contractual wording. For regulated financial firms, using collateral beyond what the contractual terms permit is not just a simple contractual breach: it can lead to a failure to segregate client assets, triggering a breach of the associated client asset and client money (CASS) regulation, with significant reputational and financial consequences.

It is, however, not just the legal contract data itself that matters. The management of netting and collateral enforceability flags related to these agreements is key from a risk and capital perspective, and relies on data drawn from long and complex legal opinions on the master netting agreements firms have entered into. Our recent survey highlighted significant shortcomings across the industry in this regard, despite the focus on this area under regulation such as CRD IV. The governance and end-to-end process around this legal data simply cannot be ignored without risking fines and sanctions for non-compliant firms.

Getting the bases right

What is needed is increased ownership, accountability and responsibility for: (i) the storage of legal contracts and opinions; and (ii) the key terms contained within these documents, as required by an institution’s internal and external consumers (both in terms of understanding what these key terms are and ensuring the accuracy of the data captured).
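By way of illustration only, the sketch below (in Python) shows one possible way such key terms might be captured as structured data rather than left buried in the document image. The field names and example values are hypothetical assumptions made for illustration, not a prescribed or industry-standard schema.

# Illustrative sketch only: a possible structured record for the key terms of an
# ISDA Master Agreement. Field names and values are hypothetical examples, not a
# standard or prescribed schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class MasterAgreementKeyTerms:
    counterparty: str                    # legal entity name of the counterparty
    agreement_date: date                 # date the master agreement was executed
    governing_law: str                   # e.g. "English law" or "New York law"
    netting_enforceable: Optional[bool]  # supported by a netting opinion, if known
    collateral_enforceable: Optional[bool]
    csa_threshold: Optional[float]       # credit support threshold, if a CSA exists
    source_document: str                 # reference to the stored contract image


# Example record, with entirely fictitious values
example = MasterAgreementKeyTerms(
    counterparty="Example Bank plc",
    agreement_date=date(2010, 6, 1),
    governing_law="English law",
    netting_enforceable=True,
    collateral_enforceable=True,
    csa_threshold=0.0,
    source_document="DOC-000123",
)

Capturing terms in this structured form is what allows downstream consumers, whether risk, regulatory reporting or collateral management, to rely on the contract data without re-reading the underlying documents.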

To overcome the lack of data ownership, data governance is the key starting point. In this context, three key data-related roles have been identified.

Data custodian. Each system within an institution has a designated business owner, who also acts as ‘custodian’ of the data held in that system. The custodian’s role includes documenting the data, its source, meaning and format; identifying the downstream systems and processes that consume the data directly from the system; and implementing appropriate controls and KPIs to ensure quality.
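As a hypothetical illustration of what such documentation and controls might look like in practice, the sketch below records an attribute’s meaning, format, source and downstream consumers alongside a simple validity KPI. The attribute, permitted values and check are assumptions made purely for illustration.

# Illustrative sketch of what a data custodian might record for each attribute in
# their system: the attribute's meaning, format, source and downstream consumers,
# plus a simple quality KPI. Names and values are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class AttributeRecord:
    name: str                    # attribute as it appears in the system
    meaning: str                 # agreed business definition
    fmt: str                     # expected format or permitted values
    source: str                  # upstream system or manual capture
    downstream_consumers: List[str] = field(default_factory=list)
    quality_check: Callable[[object], bool] = lambda value: value is not None


governing_law = AttributeRecord(
    name="governing_law",
    meaning="Governing law elected in the ISDA Master Agreement",
    fmt="One of: 'English law', 'New York law'",
    source="Legal document review workflow",
    downstream_consumers=["Netting determination", "Regulatory reporting"],
    quality_check=lambda value: value in ("English law", "New York law"),
)

# A simple KPI: the proportion of captured values that pass the custodian's check
sample_values = ["English law", "New York law", None, "English law"]
kpi = sum(governing_law.quality_check(v) for v in sample_values) / len(sample_values)
print(f"governing_law validity KPI: {kpi:.0%}")  # 75%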

Data domain owner. Whilst the data custodian focuses on the data within their own systems, a particular type of data will often exist in, and flow between, multiple systems. Indeed, based upon data models for the banking industry, around 15 such high-level ‘data domains’ can typically be identified in most banks. The role of the data domain owner is therefore one of end-to-end oversight and strategic ownership of each such domain.

Downstream process owner. This individual has additional responsibility for data quality management. If a process consumes data from an upstream process or system, the process owner should know the critical data elements that determine the outcome of the process, the upstream source of those elements, and the way in which they are interpreted and used within the process and its systems. They should also explicitly inform the upstream data custodian that their data is being used in the downstream process, and mutually agree minimum data quality expectations. The quality of this data should then be monitored over time, with issues escalated where necessary.
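The sketch below illustrates, again purely hypothetically, what this feedback loop might look like in code: a set of critical data elements, a minimum completeness expectation agreed with the upstream custodian, and escalation when that expectation is breached. The element names and threshold are assumptions, not prescribed values.

# Illustrative sketch of the downstream process owner's feedback loop: the critical
# data elements a process depends on, a minimum quality expectation agreed with the
# upstream custodian, and escalation when that expectation is breached.
# Element names and the threshold are hypothetical.
from typing import Dict, List

CRITICAL_DATA_ELEMENTS = ["netting_enforceable", "governing_law", "csa_threshold"]
AGREED_MINIMUM_COMPLETENESS = 0.98  # agreed with the upstream data custodian


def completeness(records: List[Dict], element: str) -> float:
    """Share of records in which the critical data element is populated."""
    populated = sum(1 for r in records if r.get(element) is not None)
    return populated / len(records) if records else 0.0


def monitor(records: List[Dict]) -> None:
    """Check each critical data element and flag breaches for escalation."""
    for element in CRITICAL_DATA_ELEMENTS:
        score = completeness(records, element)
        if score < AGREED_MINIMUM_COMPLETENESS:
            # In practice this would raise an issue with the upstream data custodian
            print(f"ESCALATE: {element} completeness {score:.1%} "
                  f"below agreed minimum {AGREED_MINIMUM_COMPLETENESS:.0%}")
        else:
            print(f"OK: {element} completeness {score:.1%}")


# Example run with fictitious records
monitor([
    {"netting_enforceable": True, "governing_law": "English law", "csa_threshold": 0.0},
    {"netting_enforceable": None, "governing_law": "New York law", "csa_threshold": None},
])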

Team play

These three roles together provide a critical, end-to-end feedback loop that makes ownership, accountability and responsibility for data quality management evident and transparent. Moreover, they allow additional processes, standards and controls to be put in place to ensure a greater upfront focus on data, and enable a common data dictionary to be made accessible throughout the organisation. Making sure everyone talks the same language is vital to developing a common understanding of the data.

Unprecedented regulatory focus on processes that depend on end-to-end data quality has now created both a necessity and an opportunity to adopt a far more comprehensive approach. Sticky tape will no longer work.
