    Strong Data Management – An Absolute Necessity

    Banks and businesses have long been plagued by poor data quality, the result of weak technology, a lack of management oversight, and simple human error. Inferior data, left unchecked for too long, has far-reaching consequences – not least its contribution to the 2008 global financial crisis. Banks that establish a strong data management framework will gain a distinct advantage over their competitors and achieve regulatory compliance more efficiently.

    The data wasn’t – and still isn’t – good enough

    “Five years after the financial crisis, firms’ progress toward consistent, timely, and accurate reporting of top counterparty exposures fails to meet both supervisory expectations and industry self-identified best practices. The area of greatest concern remains firms’ inability to consistently produce high-quality data.”1

    This quote from the Progress Report on Counterparty Data by the Senior Supervisors Group summarizes one of the causes of the financial crisis and the reason that the sizeable investments in improving risk management in the years preceding the crisis seem to have been in vain: There was – and still is – not enough good data about the risk to which a bank is exposed.

    Effective risk management that is capable of identifying, assessing, and prioritizing risks is based on a sound infrastructure, powerful analytics, and reliable data. All three ingredients are interconnected and influence each other, as illustrated by Figure 1.

    Figure 1. Three ingredients of effective risk management
    Source: Moody's Analytics
    • Infrastructure comprises not only technical aspects like IT equipment, but also technical organization and processes such as IT governance, roles, responsibilities, and internal policies.
    • Analytics refers to the wide variety of quantitative modeling techniques that have been developed over the past 20 years to better understand the drivers of risk and predict potential losses resulting from credit or market activities.
    • Data includes not only granular information about risk exposure itself, but also the taxonomies that define and categorize that information and the data governance that maintains the accountability and quality of the data.

    As the quote from the Senior Supervisors Group suggests, many banks use seriously flawed data, making meaningful risk management next to impossible. Weak data quality is an impediment not only for risk management, but also for the business of a bank in general. As other risk management experts have pointed out, “If the data quality is poor, the information will be poor and only luck can stop the decisions from being poor.”2

    From important strategic decisions to mundane issues like standard regulatory reports, the story is the same. A business or business function (like risk management) that has to rely on weak data is ultimately set up for failure. A bank that uses good quality data has an opportunity to outpace its competitors.

    What is "good data quality"?

    There have been numerous attempts in the past two decades to define data quality along a series of dimensions, such as accuracy and consistency.3 Depending on the individual needs of an organization, that definition can vary.

    Table 1 shows the typical criteria used by statistics providers like the Statistics and Regulatory Data Division of the Bank of England and Eurostat.4 Most of today’s banks fall short in at least one of these areas, giving rise to serious concerns for risk managers.

    Table 1. Typical criteria used by statistics providers
    Source: Moody's Analytics
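    The specific criteria in Table 1 are not reproduced here, but the general idea of measuring quality along defined dimensions can be illustrated in a few lines of code. The sketch below, in Python, scores three commonly cited dimensions – completeness, validity, and uniqueness – over a small batch of counterparty records; the field names, rules, and sample data are illustrative assumptions, not the criteria actually listed in the table.

```python
from datetime import date

# Illustrative counterparty records; field names are assumptions.
records = [
    {"counterparty_id": "CP001", "exposure": 1_250_000.0, "as_of": date(2015, 6, 30)},
    {"counterparty_id": "CP002", "exposure": None,        "as_of": date(2015, 6, 30)},
    {"counterparty_id": "CP001", "exposure": 1_250_000.0, "as_of": date(2015, 6, 30)},  # duplicate
]

def completeness(rows, field):
    """Share of rows in which the field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def validity(rows, field, check):
    """Share of populated values that pass a business rule."""
    values = [r[field] for r in rows if r[field] is not None]
    return sum(check(v) for v in values) / len(values) if values else 1.0

def uniqueness(rows, key):
    """Share of rows carrying a distinct key value."""
    return len({r[key] for r in rows}) / len(rows)

scores = {
    "completeness(exposure)": completeness(records, "exposure"),
    "validity(exposure >= 0)": validity(records, "exposure", lambda v: v >= 0),
    "uniqueness(counterparty_id)": uniqueness(records, "counterparty_id"),
}
print(scores)
```

    Scores like these only become meaningful when they are tracked against agreed thresholds over time; a single snapshot says little about whether quality is improving or deteriorating.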

    What are the consequences of poor quality data?

    Weak data is a common deficiency in almost all businesses. Still, some companies tolerate a certain level of bad data rather than try to manage or eliminate it, because the sources of poor data quality are myriad and addressing them one by one is a laborious, time-consuming, and expensive exercise.

    Sooner or later, however, bad data begins to proliferate across systems, and discrepancies grow rapidly, which results in a number of issues:5

    • Increased downtime for systems to reconcile data
    • Diversion of resources from areas important for the business
    • Slower deployment of new systems
    • Inability to comply with industry and quality standards
    • Frustrated employees whose activities are hampered by poor data
    • A cumulative increase in costs

    Quantifying the cost of bad data

    There have been several attempts to quantify the cost of bad data quality. The exact cost is difficult to calculate, but research by academics and reports by industry experts provide a number of revealing examples:6

    • According to a 2010 Forbes survey, data-related problems cost companies more than $5 million annually. One-fifth of the companies surveyed estimated losses in excess of $20 million per year.7
    • Gartner research shows that 40% of the anticipated value of all business initiatives is never achieved. Poor data quality in both the planning and execution phases of these initiatives is a primary cause.8
    • Eighty-eight percent of all data integration projects either fail completely or significantly overrun their budgets.9
    • Seventy-five percent of organizations have identified costs stemming from dirty data.10
    • Thirty-three percent of organizations have delayed or canceled new IT systems because of poor data.11
    • Organizations typically overestimate the quality of their data and underestimate the cost of errors.12
    • One telecommunications firm lost $8 million a month because data entry errors incorrectly coded accounts, preventing bills from being sent out.13
    • One large bank discovered that 62% of its home equity loans were being calculated incorrectly, with the principal getting larger each month.14
    • One regional bank could not calculate customer or product profitability because of missing and inaccurate cost data.15

    These findings provide an idea of the extent to which weak data quality can add to a business’s costs. Given that these examples stem from research and industry reports that cover the first decade of the 21st century, one must ask why data quality management has not been addressed more seriously by those responsible.

    There are two main reasons for the data quality deficit

    Experts have repeatedly identified two main reasons for the weak data quality that plagues many banks – the lack of accountability and commitment by the organization’s senior management to address weak data, and the lack of effective technologies to monitor, manage, and correct inaccurate data when needed.

    To address the lack of senior management involvement, the Basel Committee on Banking Supervision (BCBS) outlined a number of new responsibilities. Boards must now determine their risk reporting requirements and be aware of the limitations that prevent a comprehensive aggregation of risk data in the reports they receive.16 Senior management must also ensure that its strategic IT planning process includes both a way to improve risk data aggregation capability and the creation of an infrastructure that remedies any shortcomings against the principles defined by the BCBS.

    These obligations will cover the entire value chain of a bank’s data, because the BCBS requires that senior management understand the problems that limit the comprehensive aggregation of risk data in terms of:

    • Coverage – i.e., are all risks included?
    • Technical aspects – i.e., how far are processes automated rather than manual?
    • Legal aspects – i.e., are there any limitations to sharing data?

    To comply with these requirements, a bank will need to establish strong data governance covering policies, procedures, organization, and roles and responsibilities as part of its overall corporate governance structure.

    By setting up stronger data governance structures, banks will address the first main data quality weakness and define the correct use of, and accountability for, their data. Data governance should also extend the scope of a bank’s IT governance to cover data quality aspects and processes.

    Reaching a new data quality standard: A step-by-step process

    Because data changes constantly, continuous monitoring to maintain its quality will only become more important. Figure 2 outlines the process.17

    Figure 2. Monitoring and maintaining data quality – step-by-step process
    Source: Moody's Analytics

    Once data is extracted from source systems, the next step – profiling – applies rules and error checks to assess the overall quality of the data. Profiling involves identifying, modifying, or removing incorrect or corrupt data, with the goal of retaining just one unique instance of each datum. This step is also supported by the BCBS, which requests that banks “strive toward a single authoritative source for risk data per each type of risk.”18
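    As a rough illustration of this profiling step – under assumed field names, not a description of any particular vendor tool – the sketch below checks each record against a set of rules, sets failing records aside for remediation, and de-duplicates the remainder so that only one authoritative instance per counterparty survives.

```python
# Illustrative profiling rules; each returns True when the record passes.
RULES = {
    "exposure_present":  lambda r: r.get("exposure") is not None,
    "exposure_positive": lambda r: r.get("exposure") is None or r["exposure"] >= 0,
    "id_format":         lambda r: str(r.get("counterparty_id", "")).startswith("CP"),
}

def profile(records, rules):
    """Apply error checks, separate rejects, and keep one record per counterparty."""
    clean, rejected = [], []
    for rec in records:
        failures = [name for name, check in rules.items() if not check(rec)]
        (rejected if failures else clean).append((rec, failures))
    # De-duplicate: keep the most recent record as the single authoritative instance.
    latest = {}
    for rec, _ in clean:
        key = rec["counterparty_id"]
        if key not in latest or rec["as_of"] > latest[key]["as_of"]:
            latest[key] = rec
    return list(latest.values()), rejected
```

    In practice, the rejected records would be routed to data stewards or source-system owners for remediation rather than silently dropped, so that errors are corrected at the source.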

    Based on lessons learned, the set of rules and checks will evolve to avoid the repetition of previously identified data errors. Automating these processes will help maintain data quality and even enhance it by making the data more comprehensive.
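    Continuing the hypothetical sketch above, a lesson learned – say, negative exposures repeatedly entered by one source system – can be turned into a permanent, automated check by enlarging the rule set and re-running the profiling step; the source-system name below is invented for illustration.

```python
# Hypothetical new rule derived from a previously identified error.
RULES["no_negative_legacy_exposure"] = (
    lambda r: not (r.get("source") == "legacy_gl" and (r.get("exposure") or 0) < 0)
)
clean_records, rejected_records = profile(records, RULES)  # reuses the earlier objects
```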

    Implementing effective technologies

    To address the second reason – the lack of effective technologies – banks today have the opportunity to select from a wide range of tools and solutions that support processes to improve and maintain data quality. Most banks already have some components that could form the foundation of a data quality framework, which they could then enhance with new components as required.

    The requirements for effective technologies hinge on speed, scalability, reliability, and adaptability. Speed and scalability speak to the ever-growing volumes of different types of data stored in multiple, siloed systems organized by entity, line of business, risk type, and so on. After identifying which system contains the required data, an expert must extract, standardize, and consolidate it to assess its quality.
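    A minimal sketch of that extraction and consolidation step is shown below, assuming two siloed sources with different field names and units; a real implementation would read from databases or files through proper connectors rather than in-memory lists.

```python
# Illustrative extracts from two siloed systems (assumed layouts).
loans_system   = [{"cpty": "CP001", "amt_eur_k": 1250}]               # amounts in EUR thousands
trading_system = [{"counterparty": "CP002", "exposure_eur": 730_000.0}]

def from_loans(row):
    """Map the loans-system layout onto the standard schema (amounts in EUR)."""
    return {"counterparty_id": row["cpty"], "exposure": row["amt_eur_k"] * 1_000.0}

def from_trading(row):
    """Map the trading-system layout onto the standard schema."""
    return {"counterparty_id": row["counterparty"], "exposure": row["exposure_eur"]}

consolidated = (
    [from_loans(r) for r in loans_system]
    + [from_trading(r) for r in trading_system]
)
```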

    Although many banks employ large numbers of staff to capture, review, and validate data (as well as to find gaps), they still struggle with poor data quality in their core systems as a result of input errors, unchecked changes, and the age of the data (particularly information stored in legacy systems or compiled manually). Technology that reliably maintains data quality by automatically applying error checks and business rules and by supporting sound audit trails and lineage can only result in greater confidence in the data.
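    One way to picture the audit-trail aspect – a hedged sketch, not a reference to any specific product – is to record every automated correction next to the value it changed, so that the lineage of each figure can be reproduced later.

```python
from datetime import datetime, timezone

audit_log = []  # append-only record of automated corrections

def apply_correction(record, field, new_value, rule_name):
    """Apply an automated fix and log old value, new value, rule, and timestamp."""
    audit_log.append({
        "counterparty_id": record.get("counterparty_id"),
        "field": field,
        "old_value": record.get(field),
        "new_value": new_value,
        "rule": rule_name,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    record[field] = new_value
    return record
```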

    As the requirements for information – as well as the information itself – are constantly evolving, effective technology has to be adaptable. The technology should feature flexible processes to aggregate data in different ways and should be able to include new information or exclude outdated information. It should also reflect new developments within the organization as well as any external factors influencing the bank’s risk profile, such as changes in the regulatory framework.

    Summary

    Effective risk management relies on three key ingredients: sound infrastructure, powerful analytics, and reliable data. The last of these – reliable data – has been neglected for too long by too many. Although regulators have repeatedly voiced their concerns, it took a financial crisis to bring about much tighter rules. As a result, banks will have to invest heavily in their data management architecture in the coming years.

    The two main reasons for weak data quality are a lack of senior management commitment and ineffective technology. To benefit from good data management, banks will need to establish strong data governance that sets rules and defines clear roles and responsibilities, while enhancing an existing data quality framework with technologies that offer speed, scalability, reliability, and adaptability.

    Good data management will confer a competitive advantage on those banks that achieve it. The value banks can reap from setting up a better data management framework will be leaner, more efficient, and less expensive processes that lead to faster and more reliable business decisions.

    Sources

    1. Senior Supervisors Group, Progress Report on Counterparty Data, p. 1, 2014.

    2. Mackintoch, J./Mee, P., The Oliver Wyman Risk Journal, Data Quality: The truth isn’t out there, p. 75, 2011.

    3. Haug, A./Albjørn, J.S., Journal of Enterprise Information Management, Vol. 24, No. 3, Barriers to master data quality, pp. 292-293, 2011.

    4. Bank of England, Data Quality Framework, p. 7, 2014.

    5. Marsh, R., Database Marketing & Customer Strategy Management, Vol. 12, No. 2, Drowning in dirty data? It’s time to sink or swim: A four-stage methodology for total data quality management, p. 108, 2005.

    6. Marsh, R., Database Marketing & Customer Strategy Management, Vol. 12, No. 2, Drowning in dirty data? It’s time to sink or swim: A four-stage methodology for total data quality management, p. 106, 2005.

    7. Forbes, Managing Information in the Enterprise: Perspectives for Business Leaders, p. 2, 2010.

    8. Gartner, Measuring the Business Value of Data Quality, p. 1, 2011.

    9. Marsh, R., Database Marketing & Customer Strategy Management, Vol. 12, No. 2, Drowning in dirty data? It’s time to sink or swim: A four-stage methodology for total data quality management, p. 106, 2005.

    10. Marsh, R., Database Marketing & Customer Strategy Management, Vol. 12, No. 2, Drowning in dirty data? It’s time to sink or swim: A four-stage methodology for total data quality management, p. 106, 2005.

    11. Marsh, R., Database Marketing & Customer Strategy Management, Vol. 12, No. 2, Drowning in dirty data? It’s time to sink or swim: A four-stage methodology for total data quality management, p. 106, 2005.

    12. Marsh, R., Database Marketing & Customer Strategy Management, Vol. 12, No. 2, Drowning in dirty data? It’s time to sink or swim: A four-stage methodology for total data quality management, p. 106, 2005.

    13. Eckerson, W. W., The Data Warehouse Institute Report Series, Data Quality and the Bottom Line, p. 9, 2002.

    14. Eckerson, W. W., The Data Warehouse Institute Report Series, Data Quality and the Bottom Line, p. 9, 2002.

    15. Eckerson, W. W., The Data Warehouse Institute Report Series, Data Quality and the Bottom Line, p. 9, 2002.

    16. BCBS, Principles for effective risk data aggregation and risk reporting, p. 7, 2013.

    17. Heale, B., Risk Perspectives, Vol. 4, Data: The Foundation of Risk Management, p. 53, 2014.

    18. BCBS, Principles for effective risk data aggregation and risk reporting, p. 8, 2013.
