    Challenges and Pitfalls of Stress Testing

    This article describes the challenges and pitfalls of stress testing and offers ways to overcome them as banks work to comply with new regulatory guidelines and to establish internal stress testing frameworks.

    Banks around the world have devoted considerable time and resources to complying with the new regulatory guidelines and to establishing internal frameworks so they can perform stress tests for different types of risk, asset classes, and business lines. To successfully embed such a framework for stress testing, banks need to establish an enterprise-wide process that encompasses multiple steps involving a variety of employees, departments, and data sources. The management of such a process is challenging and its complex nature makes it prone to pitfalls and errors. This article describes some of these challenges and offers ways to deal with them.

    At the beginning of every meaningful stress test, financial institutions need to decide what they need to stress, how they will conduct the test, who will be in charge, and what they want to achieve with the results. A stress test has to meet business objectives, such as setting trade limits, allocating capital, or defining the organisation’s risk appetite, and these objectives can differ from regulatory requirements.

    Deciding what needs to be stressed and how

    Many banks still struggle with this initial step. To decide what needs to be stressed, they often align their efforts with regulatory requirements or market best practices rather than deriving them from an internal analysis of their own business and risk profile.

    An obstacle to such an integrated, bank-wide perspective is often the organisational setup that evolved over the last decade. Banks aligned their risk management functions with the key risk categories according to Basel II, leading to a silo organisation in risk management that focuses separately on credit, market, operational, concentration, and liquidity risk. Such a setup has made efficient bank-wide or cross-risk stress testing, as well as its planning and coordination, unnecessarily difficult.

    Two main methods of stress testing have evolved over the years: sensitivity tests and scenario analyses. Sensitivity tests assume that only one risk factor, such as a shift in the yield curve, changes significantly. They are simple in nature and relatively straightforward to implement, but they lack plausibility because they do not take into account interdependencies between risk factors. As a result, scenario analysis has become common practice for stressing different risk categories. A scenario analysis examines the impact on a risk factor, such as the probability of default, of simultaneous changes in macroeconomic variables, such as inflation or GDP, allowing for a more realistic assessment of risk.
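
    The difference between the two approaches can be illustrated with a few lines of code. The following Python sketch, a minimal example rather than a production model, prices a small hypothetical bond portfolio and compares a single-factor sensitivity test (a parallel yield curve shift) with a scenario analysis in which the yield shock is derived from simultaneous macroeconomic moves. The portfolio, shock sizes, and linkage coefficients are illustrative assumptions, not figures from this article.

        # Sensitivity test vs. scenario analysis on a toy bond portfolio.
        # All inputs (portfolio, shocks, linkage coefficients) are assumptions.

        def bond_price(face, coupon, years, yield_rate):
            """Price a fixed-coupon bond by discounting its cash flows."""
            coupons = sum(face * coupon / (1 + yield_rate) ** t
                          for t in range(1, years + 1))
            principal = face / (1 + yield_rate) ** years
            return coupons + principal

        portfolio = [  # (face value, coupon rate, maturity in years)
            (1_000_000, 0.04, 5),
            (2_000_000, 0.03, 10),
        ]

        def portfolio_value(yield_rate):
            return sum(bond_price(f, c, y, yield_rate) for f, c, y in portfolio)

        base_yield = 0.03
        base_value = portfolio_value(base_yield)

        # Sensitivity test: only one risk factor moves (parallel +200bp yield shift).
        sensitivity_pnl = portfolio_value(base_yield + 0.02) - base_value

        # Scenario analysis: macro variables move together and are translated into
        # a yield shock through an assumed linear linkage.
        scenario = {"inflation_shock": 0.02, "gdp_shock": -0.03}
        yield_shock = 0.8 * scenario["inflation_shock"] - 0.5 * scenario["gdp_shock"]
        scenario_pnl = portfolio_value(base_yield + yield_shock) - base_value

        print(f"Sensitivity test P&L: {sensitivity_pnl:,.0f}")
        print(f"Scenario analysis P&L: {scenario_pnl:,.0f}")

    In practice the linkage between macroeconomic variables and market risk factors would be estimated econometrically rather than hard-coded, but the structure of the calculation is the same.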

    Designing meaningful scenarios

    The most common pitfall lies in designing meaningful scenarios that are severe yet plausible at the same time. Depending on the scenario, the results of the stress test may significantly misrepresent the risks to which a bank is actually exposed, because the scenario is not severe enough, is not plausible, or does not address important aspects. The unforeseen problems at the Franco-Belgian bank Dexia in October 2011, three months after it had passed the European Banking Authority’s stress test, and the sudden problems of Ireland’s banks in November 2010, just four months after they had passed the EU stress test, are both good illustrations of this kind of misrepresentation.

    The biggest obstacles in scenario design are the lack of sufficient data and the inability of a human test designer to create a variety of scenarios that do not just stress the obvious and ignore the potential effect of unforeseen events.

    Developing a stress scenario to estimate the potential impact of catastrophic but low-likelihood events on a bank’s portfolio is difficult even for experienced risk managers. Despite a risk manager’s efforts, this kind of thought experiment is prone to two major pitfalls: ignoring plausible scenarios and considering implausible ones. Human creativity is influenced by experience, which leads risk managers to ignore plausible stress scenarios simply because they have not occurred yet. If a risk manager’s imagination is geared toward implausible scenarios – for example, an asteroid hitting the earth – the key purpose of the stress test, to enable better decision making, is jeopardised. What kinds of useful options will the management of a bank derive from the alarming results of a highly implausible stress scenario? How should it approach reverse stress testing, which asks what kinds of plausible circumstances could make a bank’s business model unviable? Interestingly, given the myriad factors that could make a bank’s business unviable, senior management and risk managers tend to consider a big idiosyncratic shock, rather than more likely scenarios, in their reverse stress testing.


    Gathering sufficient data

    The most immediate challenge many banks face is a lack of data. In particular, information from periods of severe stress is rare, yet it is precisely this information that would form the basis for a scenario and help discern the linkage between macroeconomic variables and risk drivers. Given the interdependencies between macroeconomic variables such as GDP, unemployment, inflation, and oil prices, having sufficient data available to understand and properly model behaviour under stress is critical. A lack of sufficient data will eventually lead to a weak and unstable linkage between any scenario and the relevant risk factors, yielding an outcome that may set values at implausible levels. Given that the focus of stress testing is on the tails of the distribution, a lack of data will limit the usefulness of the stress test. If additional data are not available and assumptions have to be made, those responsible for the scenario design or the stress test should run the test under different assumptions to better grasp the potential margins of error.
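
    A practical way to act on that advice is to rerun the same scenario across a range of values for any parameter the data cannot pin down. The following Python sketch, a minimal illustration rather than a real model, does this for a single assumed linkage parameter (the sensitivity of default probabilities to a GDP decline); the baseline PD, exposure, loss given default, and parameter range are all illustrative assumptions.

        # Rerun one stress scenario under different assumed linkage parameters
        # to gauge the margin of error. All numbers are illustrative assumptions.
        import math

        def stressed_pd(baseline_pd, gdp_shock, beta):
            """Map a GDP shock to a stressed PD through an assumed logit linkage."""
            logit = math.log(baseline_pd / (1 - baseline_pd)) - beta * gdp_shock
            return 1 / (1 + math.exp(-logit))

        exposure = 500_000_000   # exposure at default
        lgd = 0.45               # assumed loss given default
        baseline_pd = 0.02
        gdp_shock = -0.04        # severe recession: GDP falls by 4%

        # Sparse stress-period data leaves beta uncertain, so test a range of values.
        for beta in (5.0, 10.0, 15.0):
            pd_stressed = stressed_pd(baseline_pd, gdp_shock, beta)
            expected_loss = exposure * pd_stressed * lgd
            print(f"beta={beta:>4}: stressed PD={pd_stressed:.2%}, "
                  f"expected loss={expected_loss:,.0f}")

    The spread of losses across the parameter range gives a rough but honest indication of how strongly the result depends on an assumption the data cannot support.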

    Even institutions that have enough granular information face data quality problems resulting from insufficient internal IT architecture, inconsistent data and processes, and a lack of accountability among those responsible for entering or auditing the information. Another increasingly important aspect is speed. If the results of a stress test are to be relevant for a business decision, they need to be available within days, if not hours, after the process has started. It is not uncommon for weeks to pass before the results of a stress test reach senior management. In today’s dynamic and volatile markets, being able to consider contingency plans for the business only after several weeks have passed is, at the very least, a competitive disadvantage.

    Linking a scenario with drivers of credit risk such as Expected Default Frequency (EDF™) or Loss Given Default (LGD), and subsequently with the economic capital required to protect a loan portfolio from unexpected losses, is another area of common pitfalls. The behaviour of risk drivers such as EDF or LGD under stress is usually modelled assuming non-linear relationships, but proper parameterisation of the linkage function may suffer from a lack of data or intuition. Similarly, the calculation of economic capital under stress will only yield meaningful results if the bank is able to understand the dynamics of asset correlations during periods of economic stress. Banks often rely on changes in equity correlations as a proxy to capture these dynamics, simply because the data are readily available and easier to measure. However, empirical evidence has shown that equity correlations tend to be too low for financial firms, as well as for utilities and low-credit-quality firms. These deviations lead to a significant underestimation of the amount of economic capital required during stress periods.
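
    The effect of the correlation assumption can be made concrete with the single-factor (Vasicek) model that underlies the Basel IRB capital formula. The Python sketch below compares the capital implied by a lower, equity-based correlation proxy with that implied by a higher correlation of the kind observed in stress periods; the PD, loss given default, and correlation values are illustrative assumptions.

        # Capital under the one-factor Vasicek model for two correlation assumptions.
        # PD, LGD, and both correlation values are illustrative assumptions.
        import math
        from statistics import NormalDist

        N = NormalDist()  # standard normal distribution

        def capital_requirement(pd, lgd, rho, confidence=0.999):
            """Unexpected-loss capital per unit of exposure in the Vasicek model."""
            conditional_pd = N.cdf(
                (N.inv_cdf(pd) + math.sqrt(rho) * N.inv_cdf(confidence))
                / math.sqrt(1 - rho)
            )
            return lgd * (conditional_pd - pd)

        pd, lgd = 0.02, 0.45

        k_proxy = capital_requirement(pd, lgd, rho=0.12)     # equity-correlation proxy
        k_stressed = capital_requirement(pd, lgd, rho=0.24)  # stressed asset correlation

        print(f"Capital with equity-proxy correlation: {k_proxy:.2%} of exposure")
        print(f"Capital with stressed correlation:     {k_stressed:.2%} of exposure")
        print(f"Understatement: {1 - k_proxy / k_stressed:.0%}")

    In this example, doubling the assumed asset correlation roughly doubles the required capital, which is exactly the kind of underestimation that arises when equity correlations are used as a proxy for stressed asset correlations.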


    Communicating the results and turning them into action

    All efforts to create a meaningful stress test will be useless if one key aspect is left out: communication. Internal communication is at least as important as the external communication delivered in regulator-prescribed formats. The stress test has to be easy to communicate: it must be understood by risk managers as well as senior management, and it must illustrate and quantify both the vulnerabilities of an organisation’s current business model and the transmission mechanism from scenario assumptions to potential portfolio impact.

    Ultimately, the results of a stress test will affect the decision-making process. Stress test results need to be benchmarked against the risk appetite of the organisation and should lead to a critical review of its current risk profile. Senior management has to prepare plans for early intervention, such as raising funds, suspending dividends to shareholders, limiting or even eliminating certain business activities, requiring more frequent reporting, replacing responsible managers, or even closing a business line if it can no longer continue in a viable fashion. Senior management’s engagement at this point is critical to endorsing any necessary action plans. Unfortunately, incorporating the results of a hypothetical stress scenario that may never materialise into a company’s strategic business planning is a challenge in its own right.

    Although much has been achieved in the last three to four years and the banks’ stress test frameworks are very different from their pre-crisis versions, risk managers still face and must address numerous challenges and pitfalls before they can turn stress testing into the powerful instrument it can be.
