    Stress Testing – A Return to RAP vs. GAAP?

    Although banks have made significant strides in many areas of stress testing, opportunities for improvement remain. This article discusses several key stress testing modeling aspects that still need to be addressed.

    The Dodd-Frank Act Stress Test (DFAST) and Comprehensive Capital Analysis and Review (CCAR) results were released by the Federal Reserve (the Fed) in mid-March.1 When combined with the methodology documents published by the Fed, a fairly detailed picture emerges of how banks’ abilities to withstand stressed economic downturns are assessed. Banks and other systemically important financial institutions (SIFIs) that submitted or expect to submit stress test results are thoroughly reviewing this documentation to improve their institutional stress testing methodology.

    In order for banks to forecast their balance sheets and income statements, however, they may need to use a methodology different from the one described in that documentation. This article reviews the Fed’s methodology in detail and outlines some areas in which banks and SIFIs may seek to modify their approach.

    The stress testing dilemma: choosing a modeling methodology

    The Fed wants each bank’s stress testing program to become more than just a regulatory compliance exercise – it envisions banks using their programs to better inform strategic and business planning, risk appetite, and advanced risk management practices. Many of the primary methodologies described in the Fed’s published documents, however, make this vision challenging for banks to achieve and appear to be inconsistent with Generally Accepted Accounting Principles (GAAP).

    Conversations with chief risk officers and other top executives at larger banks would seem to confirm this dilemma. Banks struggle with whether to model the stress tests on an accrual accounting basis, which would allow them to best leverage the exercise for their business practices, or to model the stress scenarios using the methodology described by the Fed. Some are also considering whether they should forecast under both methodologies – reminiscent of when banks had to prepare and report financial results under both GAAP and Regulatory Accounting Principles (RAP).

    The pitfalls of an EL-based loss approach

    The loss forecasting methodology explained by the Fed in its documentation is an expected loss, or EL-based, methodology. This methodology is preferred by risk modelers and can most accurately be described as a mark-to-market (MTM) view of accounting, despite the fact that the vast majority of most banks’ loans are governed by accrual accounting. The EL-based loss approach assumes that all losses are realized at the time of default. In reality, banks account for loan “losses” as charge-offs, not EL. Accounting charge-offs usually occur over time, not in full at the time of default. The time lag associated with loan charge-offs can be very important in Commercial Real Estate (CRE) portfolios, and more recently in mortgage portfolios in judicial states, where the foreclosure process can extend to years. In these portfolios, often only 60-70% of the losses are realized as charge-offs in the first four quarters. The two methodologies will equal out over the long term, but nine quarters is typically not enough time for this convergence to happen.
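
    To see why nine quarters is typically not enough, consider a minimal numeric sketch. All figures are hypothetical, and the quarterly profile used here anticipates the loss emergence vector formalized in the next section:

```python
# Hypothetical single cohort: $100M defaults in Q1 with a 40% LGD.
# The EL view recognizes the full loss at default; the accrual view
# charges it off over time. All figures are illustrative.
ead, lgd = 100.0, 0.40           # exposure ($ millions) and loss given default
el_at_default = ead * lgd        # EL view: full $40M recognized at default

# Illustrative profile: share of the LGD charged off in each quarter
# after default (sums to 1.0 over twelve quarters).
profile = [0.20, 0.20, 0.15, 0.10, 0.06, 0.05, 0.04, 0.03, 0.03, 0.05, 0.05, 0.04]

cumulative = 0.0
for q, share in enumerate(profile[:9], start=1):
    cumulative += ead * lgd * share
    print(f"Q{q}: cumulative charge-offs ${cumulative:.1f}M of ${el_at_default:.1f}M EL")
# Roughly 65% of the loss emerges as charge-offs by Q4 (the 60-70% range
# cited above) and only ~86% by Q9, so the two views have not converged.
```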

    Incorporating a loss emergence vector

    The basic EL building blocks – Probability of Default (PD), Loss Given Default (LGD), and Exposure at Default (EAD) – provide 90% or more of what is required to forecast charge-offs instead of EL losses.

    In addition, banks need to incorporate a loss emergence vector, which describes how much of the LGD is realized as a charge-off in each quarter after default (in Q1, Q2, Q3, etc.). The loss emergence vector will vary by loan product type and collateral type. It may also vary based on the location of the collateral, especially in the case of CRE or residential mortgage loans, where a bank’s ability to act on its collateral in judicial states can be much slower. Without the key information provided by the loss emergence vector, banks are reduced to oversimplifying the modeling process and may be overstating losses in a nine-quarter forecasting exercise.
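
    A minimal sketch of applying such a vector across quarterly default cohorts follows. The defaulted balances, the LGD, and the vector itself are hypothetical placeholders; in practice each would be estimated separately by product, collateral type, and location:

```python
# Sketch: convert EL building blocks into quarterly charge-off forecasts by
# convolving each quarter's defaulted balances with a loss emergence vector.
# All inputs are hypothetical placeholders.

# Defaulted balance per forecast quarter (PD x EAD), $ millions.
defaults = [50.0, 60.0, 80.0, 90.0, 85.0, 70.0, 55.0, 45.0, 40.0]
lgd = 0.35

# Loss emergence vector: share of the LGD realized as charge-offs in the
# k-th quarter after default (k = 0 is the quarter of default).
emergence = [0.25, 0.20, 0.15, 0.10, 0.10, 0.08, 0.07, 0.05]

horizon = len(defaults)          # nine-quarter stress horizon
charge_offs = [0.0] * horizon
for d_qtr, bal in enumerate(defaults):
    for k, share in enumerate(emergence):
        t = d_qtr + k
        if t < horizon:          # losses emerging beyond Q9 fall outside the test
            charge_offs[t] += bal * lgd * share

for t, co in enumerate(charge_offs, start=1):
    print(f"Q{t}: forecast charge-offs ${co:.1f}M")
```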

    Forecasting non-performing loans

    The MTM accounting treatment that the Fed employs also negatively affects the ability to accurately forecast non-performing loans (NPLs), a key credit quality metric employed by both banks and regulators. NPL balances are driven primarily by one factor that increases them and three that reduce them. New inflows into NPLs increase the balances; these inflows are modeled using a conditional PD or conditional credit transition matrix approach. The reductions to NPL balances are:

    1. Reclassifications of non-performing loans back to performing
    2. Charge-offs
    3. Payments made on the loan

    Payments can include any payments by the borrower or guarantor or any type of sale of the collateral securing the loan.
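
    Combining the single inflow with the three reductions yields a simple quarterly roll-forward. The sketch below uses a hypothetical inflow series and flat placeholder rates for cures, charge-offs, and payments; a production model would drive the inflow from a conditional PD or transition matrix and pace the outflows with the emergence and principal balance reduction vectors discussed in this article:

```python
# Sketch of an NPL balance roll-forward, $ millions. The inflow series and
# the flat quarterly outflow rates are hypothetical placeholders.
npl = 500.0                                      # starting NPL balance
inflows = [120.0, 150.0, 180.0, 160.0, 130.0, 100.0, 80.0, 70.0, 60.0]
cure_rate, co_rate, pay_rate = 0.05, 0.08, 0.06  # per-quarter outflow rates

for q, inflow in enumerate(inflows, start=1):
    cures = cure_rate * npl       # 1. reclassified back to performing
    charge_offs = co_rate * npl   # 2. charge-offs
    payments = pay_rate * npl     # 3. borrower/guarantor payments, collateral sales
    npl += inflow - cures - charge_offs - payments
    print(f"Q{q}: NPL balance ${npl:.1f}M")
```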

    Modeling payment-related NPL reductions

    The methodology the Fed employs for losses is commonly known to accelerate the reductions to NPLs: by using ELs instead of charge-offs, the NPL balances are reduced more aggressively than what actually occurs in practice at banks. But what about the payment-related reductions?

    To model the payment-related NPL reductions, banks additionally need to employ a principal balance reduction vector, similar to the loss emergence vector. Just as the loss emergence vector is applied to the LGD to accurately model when losses are realized as charge-offs, the principal balance reduction vector is applied to one minus the LGD (the share of the defaulted loan a bank will ultimately be repaid) to identify when the payments on the non-performing loan will be received. The Fed methodology document is silent on this key aspect of NPL modeling. Given that the document covers other aspects of the methodology in adequate detail, we can assume that nothing is done within the Fed’s framework for this aspect of NPL balance forecast modeling. How, then, can banks apply the Fed methodology in a manner that will be helpful to them if the modeling process neither reflects the established market practice of NPL balance forecasting nor provides sufficient guidance on it?
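
    A minimal sketch of how the two vectors pair up for a single defaulted cohort, with hypothetical inputs: the loss emergence vector paces the LGD portion into charge-offs, while the principal balance reduction vector paces the (1 - LGD) portion into payments received.

```python
# Sketch: for a single defaulted cohort, split the balance between the
# portion ultimately lost (LGD, paced by the loss emergence vector) and the
# portion ultimately repaid (1 - LGD, paced by a principal balance reduction
# vector). Both vectors and the cohort size are hypothetical.
ead, lgd = 200.0, 0.40  # $ millions

emergence = [0.25, 0.20, 0.15, 0.12, 0.10, 0.08, 0.06, 0.04]            # applied to LGD
principal_reduction = [0.10, 0.15, 0.15, 0.15, 0.15, 0.12, 0.10, 0.08]  # applied to 1 - LGD

for q in range(len(emergence)):
    charge_off = ead * lgd * emergence[q]
    payment = ead * (1 - lgd) * principal_reduction[q]
    print(f"Q{q + 1}: charge-off ${charge_off:.1f}M, payment received ${payment:.1f}M")
```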

    Regarding ALLL

    The Fed describes a process whereby the allowance for loan and lease losses (ALLL) is simply the sum of the next four quarters of stressed losses. First, we must remember that the Fed’s losses are not charge-offs, and will be higher – sometimes materially higher – than charge-offs in a stressed nine-quarter forecast. Even more important, however, is the fact that the Fed model does not align with how the ALLL is estimated by banks. Accounting firms would frown upon an ALLL method that is based on a forecasted four quarters of severely adverse stressed forward losses.

    The ALLL is designed to account for incurred but unrealized losses. It is not intended to be a reserve against stressed potential losses, which is how the Fed is modeling the ALLL forecast. A proper framework would align with a bank’s three primary ALLL components: FAS 5 pool reserves, FAS 114 reserves, and an unallocated reserve. These could be modeled as a function of the ELs for each portfolio at each future reporting period for the FAS 5 pool reserve, a function of the forecasted NPL balances at each future reporting period for the FAS 114 reserve, and a separate overlay for the forecasted unallocated reserves. As with the loss forecasting component of the stress test, it is impossible for a bank to produce a single ALLL forecast that serves both DFAST / CCAR and the bank’s internal management, unless the bank reverts to maintaining both GAAP and RAP stress testing forecasts.
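
    As a sketch, the contrast between the Fed’s four-quarter sum and a component-based forecast might look like the following; all series, the FAS 114 coverage ratio, and the unallocated overlay are hypothetical stand-ins for a bank’s actual reserve methodology:

```python
# Sketch contrasting the Fed's ALLL proxy (sum of the next four quarters of
# stressed losses) with a component-based forecast. All inputs hypothetical.
stressed_losses = [40.0, 55.0, 70.0, 65.0, 50.0, 40.0, 35.0, 30.0, 28.0]  # $M/quarter
portfolio_el = [30.0, 38.0, 45.0, 42.0, 36.0, 30.0, 26.0, 24.0, 22.0]     # EL per period
npl_balances = [600.0, 700.0, 820.0, 860.0, 840.0, 780.0, 700.0, 640.0, 590.0]

def fed_alll(t):
    # Fed approach: sum of the next four quarters of stressed losses.
    return sum(stressed_losses[t:t + 4])

def component_alll(t, fas114_coverage=0.25, unallocated=10.0):
    fas5 = portfolio_el[t]                      # pool reserve as a function of EL
    fas114 = fas114_coverage * npl_balances[t]  # reserve as a function of NPLs
    return fas5 + fas114 + unallocated          # plus an unallocated overlay

for t in range(6):
    print(f"Q{t + 1}: Fed-style ALLL ${fed_alll(t):.0f}M "
          f"vs component-based ${component_alll(t):.0f}M")
```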

    Top-down forecasting of net charge-offs

    Finally, the 2013 DFAST methodology document describes an alternative loss methodology employed by the Fed to forecast net charge-offs (NCOs). In a one-sentence paragraph on page 44, the second approach is described as follows: “models capture the historical behavior of net charge-offs relative to changes in macroeconomic and financial market variables and loan portfolio characteristics.”2 Alternative methodologies are always good to have, but it appears the methodology the Fed uses could be improved.

    Although it is impossible to say with certainty, the description indicates that the Fed directly models NCOs. Such a process does not work well, especially for commercial and CRE portfolios, because of the long lag between gross charge-offs (GCOs) and recoveries. NCOs represent GCOs minus recoveries. GCOs are a reflection of current credit conditions; recoveries, however, are a function of prior GCOs, and therefore of prior-period economic and credit conditions. When banks perform econometric modeling on NCOs, the true signal of what is happening in the economy and credit cycle is contained in the GCO time series, while the inclusion of recoveries adds noise to the NCO time series. Banks should therefore not include recoveries in this type of econometric modeling, but rather model each loan portfolio’s GCO time series econometrically and then separately forecast that portfolio’s recoveries as a lagged function of prior GCOs.
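
    A sketch of this decomposition, with hypothetical parameters: GCOs are driven off a macro variable (a placeholder linear fit on unemployment), and recoveries are forecast as a lagged function of prior GCOs before netting:

```python
# Sketch: forecast NCOs by modeling GCOs econometrically and recoveries as a
# lagged function of prior GCOs, rather than modeling NCOs directly.
# The macro path, sensitivity, and lag weights are hypothetical.

unemployment = [6.0, 7.5, 9.0, 10.0, 10.5, 10.0, 9.5, 9.0, 8.5]  # stressed path, %
base_gco, macro_beta = 20.0, 6.0  # placeholder "fitted" parameters, $M
recovery_lags = [0.0, 0.05, 0.10, 0.10, 0.05]  # recovery rate on GCOs lagged k quarters

gco_history = [18.0, 19.0, 21.0, 22.0]  # pre-forecast GCOs feed early recoveries
gcos = list(gco_history)

for u in unemployment:
    gcos.append(base_gco + macro_beta * (u - 6.0))  # GCOs reflect current conditions

for t in range(len(gco_history), len(gcos)):
    recoveries = sum(w * gcos[t - k] for k, w in enumerate(recovery_lags))
    nco = gcos[t] - recoveries  # net only after each piece is modeled separately
    print(f"Forecast Q{t - len(gco_history) + 1}: GCO ${gcos[t]:.1f}M, "
          f"recoveries ${recoveries:.1f}M, NCO ${nco:.1f}M")
```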

    This article intends to offer a new perspective on the concept of stress testing for capital adequacy purposes. By reviewing the methodology in detail, we seek to highlight some of the accounting challenges banks and SIFIs may face in applying the principles put forth by the Fed.

    Banks, regulators, and other market participants should be commended for the progress made so far on this important market initiative. Many aspects of stress testing are significant improvements over the previous methodologies banks employed to assess capital adequacy. Specifically, using a nine-quarter horizon, in conjunction with the requirement to stay well capitalized throughout the entire horizon, greatly improves the solvency of the system. This policy, together with the substantial improvement in disclosure and transparency, gives market participants a much higher degree of confidence in financial counterparty transactions and, consequently, improves financial system liquidity. Furthermore, banks have made significant strides in improving their risk governance, cultures, systems, and models. Nonetheless, opportunities for improvement remain. This article covers several key stress testing modeling aspects that need to be addressed. We remain optimistic, however, that regulatory clarity on some of these crucial issues will be forthcoming.

    Sources

    1 Board of Governors of the Federal Reserve System, Dodd-Frank Act Stress Test 2013: Supervisory Stress Test Methodology and Results, March 2013.

    2 Board of Governors of the Federal Reserve System, Comprehensive Capital Analysis and Review 2013: Assessment Framework and Results, March 2013.
