The Financial Accounting Standards Board (FASB) issued Accounting Standards Update (ASU) 2016-13, which introduces the current expected credit loss (CECL) model, in June 2016 to replace the existing incurred loss model. One of the main reasons for issuing the ASU was that the incurred loss model led banks to delay credit loss recognition and set loss allowances that were “too little, too late” during economic downturns. Most stakeholders believe that CECL will have a significant impact on allowances, provisions, earnings, and capital. This article examines CECL’s potential impacts from an empirical perspective. Using historical data (500,000 commercial and industrial (C&I) loans from 15 US banks), we calculate and compare loan- and portfolio-level loss allowances under the incurred loss model and CECL. We find that CECL generally helps alleviate the “too little, too late” problem seen during the financial crisis. However, we observe significant variation in allowances across banks and over time. Loss allowances under CECL are not always higher than those under the incurred loss methodology; the impact of CECL on allowances depends on portfolio characteristics such as loan maturity, the economic cycle, and banks’ lending policies and allowance practices. In addition, we find that CECL generally leads to higher volatility in loss allowances.
The 2008 global financial crisis amplified the need to improve existing financial reporting standards. Specifically, the incurred loss model under US Generally Accepted Accounting Principles (GAAP) for impairment calculation and reporting was criticized by regulators[1] and various market participants for delaying credit loss recognition and setting loss allowances that were “too little, too late” during economic downturns. Under the incurred loss model, banks recognize impairment for financial instruments when credit losses are determined to be “probable and reasonably estimable” as of the reporting date. Current GAAP restricts banks’ ability to record expected credit losses that do not yet meet the “probable” threshold. In practice, banks generally determine the loss allowance amount to be set aside based on historical loss experience, which was low in the years leading up to the financial crisis.
To address the “too little, too late” issue of the existing model, in June 2016, the FASB issued the long-awaited financial accounting standards update ASU 2016-13: “Financial Instruments – Credit Losses: Measurement of Credit Losses on Financial Instruments.” Commonly known as the current expected credit loss (CECL) model, it requires institutions to estimate the expected credit loss over the life of financial instruments, based on historical information, current conditions, and reasonable forecasts, and to set aside lifetime expected credit losses as the loss allowance.
Many stakeholders expect CECL’s impact to be substantive; however, assessments are scarce and typically based on surveys conducted among a small group of banks or studies of synthetic portfolios constructed as of a specific analysis date. This article seeks to shed some light on CECL’s impact from an empirical perspective. Using historical loan data from 15 US banks, we calculate loan- and portfolio-level loss allowances under the incurred loss model and the CECL model at a quarterly frequency from 2003 to 2015. We can then assess how loss allowances would have differed during this 12-year period if CECL had been implemented in 2003.
The loan portfolio data we use for this study comes from the loan accounting system (LAS) data in Moody’s Analytics Credit Research Database (CRD). The CRD is one of the world’s largest historical databases of private-firm, middle-market loan data for C&I borrowers. The dataset collects facility and loan information at the draw level from contributing banks at a quarterly frequency, including origination date/amount, contractual maturity, unpaid balance, delinquency and default status, bank internal rating, and/or probability of default (PD). In addition, we use Moody’s Analytics RiskCalc software, an industry-leading default risk assessment solution for private firms, to generate both the through-the-cycle (TTC) and point-in-time (PIT) forward-looking Expected Default Frequency (EDF) measures for each observation within the loan accounting system database. In total, we have 393,479 unique term loans and 181,933 facility draws from 151,468 borrowers, constituting approximately 3.64 million observations. The mean and median times to maturity across all observations are 2.13 years and 1.58 years, respectively.
We calculate the loan loss allowances compliant with the two impairment models, as far back as our data allows, for each loan in the portfolios of the contributing banks. We then compare the two loss allowance rates with banks’ historical reported net charge-off (NCO) rates.
The CRD does not have actual historical loss allowances from contributing banks. In an effort to achieve a uniform treatment across contributing banks, we calculate historical loss allowances based on the loan-level PD and loss given default (LGD). Most of the CRD-contributing banks submit their internal risk rating for the borrowers and the associated PD or PD range of each rating. To ensure consistency with banks’ loss allowance calculation processes, we use these internally assigned PDs when available. For loans without internal PDs, we use the one-year TTC PD generated by the RiskCalc software. We do not have any information regarding the internal LGD estimates from the CRD contributors. Instead, we use the long-term, loan-level LGD generated by Moody’s Analytics LossCalc software.[2] We calculate the one-year loss rate in quarter t as: one-year TTC PD(t) × LGD(t).
The goal of the incurred loss model is to estimate those loan losses incurred in the portfolio but not yet identified and charged off. Most banks use the loss emergence period (LEP) to adjust the annual loss rate in their loss allowance calculations. For example, assume a bank has a loan portfolio in which it takes two years for a loan to move from loss event to charge-off; the bank then has two years of losses inherent in its portfolio at any given point. If the estimated annual loss rate is 2%, and the bank uses 2% to estimate the allowance for loan and lease losses (ALLL), it will only reserve one year of losses. However, if the bank multiplies the annual loss rate by the LEP of two years, it will reserve for two years of losses. In general, the loss allowance rate under the incurred loss model for each loan is calculated as LEP × one-year TTC PD(t) × LGD(t), and the allowance rate of a loan portfolio is the balance-weighted average of its loan-level allowance rates.
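The loan-level calculation and the balance-weighted portfolio aggregation described above can be sketched as follows. This is a minimal illustration of the stated formulas; the function names, and the PD, LGD, and balance values, are ours, not the article’s:

```python
def incurred_loss_allowance_rate(ttc_pd_1y, lgd, lep_years):
    """Incurred-loss allowance rate for one loan:
    LEP (in years) x one-year TTC PD x LGD."""
    return lep_years * ttc_pd_1y * lgd

def portfolio_allowance_rate(balances, loan_rates):
    """Balance-weighted average of loan-level allowance rates."""
    total = sum(balances)
    return sum(b * r for b, r in zip(balances, loan_rates)) / total

# Illustrative example: two loans, LEP of 2 years
rates = [incurred_loss_allowance_rate(0.02, 0.40, 2.0),  # 2% PD, 40% LGD
         incurred_loss_allowance_rate(0.01, 0.35, 2.0)]  # 1% PD, 35% LGD
port = portfolio_allowance_rate([1_000_000, 3_000_000], rates)
```

A loan with a 2% one-year TTC PD, 40% LGD, and a two-year LEP thus carries a 1.6% allowance rate, and the portfolio rate tilts toward the larger balance.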
Using publicly available information in FR Y-9C reports on C&I portfolio allowance rates for Q1 2013 – Q4 2015, we estimate the LEP for each bank so that the C&I portfolio allowance rate for that bank, as calculated based on PD and LGD, is as close to the publicly available information as possible. The estimated LEP ranges from 1.33 years to 2.55 years across banks, with an average of 1.90 years. Once we estimate the LEP for each bank, we assume the bank used the same LEP prior to Q1 2013, when the C&I portfolio loss allowance rate was not publicly available. Figure 1 compares the modeled C&I portfolio incurred loss allowance rate for the 15 banks in aggregate with the publicly reported loss allowance rates. The green line represents the modeled C&I loss allowance rate, and the blue line represents the actual loss allowance rate collected from banks’ FR Y-9C reports, beginning Q1 2013.[3] The green and blue lines match quite well, which suggests that our model assumptions are reasonable and consistent with the banks’ internal practices for loss allowance calculation under existing GAAP rules.
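Because the modeled allowance rate is linear in the LEP, one simple way to fit it to the reported rates is an ordinary least-squares slope over the calibration window. The sketch below is our own illustration under that assumption; the article does not specify the exact fitting procedure:

```python
def calibrate_lep(modeled_1y_rates, reported_rates):
    """Least-squares LEP: minimize sum_t (LEP * x_t - y_t)^2, where
    x_t is the modeled one-year PD x LGD portfolio rate in quarter t
    and y_t is the reported FR Y-9C allowance rate.
    Closed-form solution: LEP = sum(x*y) / sum(x*x)."""
    num = sum(x * y for x, y in zip(modeled_1y_rates, reported_rates))
    den = sum(x * x for x in modeled_1y_rates)
    return num / den

# Sanity check with synthetic data: if reported rates are exactly
# 1.9x the modeled one-year rates, the fit should recover LEP = 1.9.
x = [0.008, 0.010, 0.012]
y = [1.9 * v for v in x]
lep = calibrate_lep(x, y)
```

The closed form follows from setting the derivative of the squared error to zero; with a single free parameter, no iterative solver is needed.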
We calculate the CECL loss allowance as the lifetime expected credit losses based on Moody’s Analytics RiskCalc PIT PD term structure, Moody’s Analytics LossCalc LGD term structure, and the contractual time to maturity of each individual loan at each quarter t:
LossAllowance_CECL(t) = Σ_{i=1}^{M} LGD(t_i) × (CPD(t_i) − CPD(t_{i−1}))

Here t_i = t + i quarters, t_M is the contractual time to maturity of the loan as of quarter t, and CPD(t_i) is the cumulative probability of default from t to t_i, with CPD(t_0) = 0. We ignore discounting in this calculation, because reliable effective interest rates are not available. We do not apply additional qualitative (Q) factors to the modeled CECL loss allowance above, because the RiskCalc and LossCalc models are point-in-time and incorporate information about future credit environments.
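The lifetime expected-loss sum above can be computed directly from a cumulative-PD term structure and a quarterly LGD term structure. The sketch below is a minimal implementation of that formula with illustrative inputs (the term-structure values are invented, not RiskCalc or LossCalc output):

```python
def cecl_allowance_rate(cum_pd, lgd):
    """Lifetime expected-loss rate for one loan.

    cum_pd[i] is CPD(t_{i+1}): cumulative default probability from the
    reporting quarter t to t + (i+1) quarters, out to contractual maturity.
    lgd[i] is the LGD applying in that quarter. The marginal default
    probability in each quarter is CPD(t_i) - CPD(t_{i-1}), with
    CPD(t_0) = 0. Discounting is ignored, as in the article.
    """
    allowance, prev = 0.0, 0.0
    for cpd_i, lgd_i in zip(cum_pd, lgd):
        allowance += lgd_i * (cpd_i - prev)
        prev = cpd_i
    return allowance

# Illustrative example: a loan maturing in 4 quarters, flat 40% LGD
cpd = [0.005, 0.011, 0.018, 0.026]        # cumulative PD term structure
rate = cecl_allowance_rate(cpd, [0.40] * 4)
```

With a flat LGD the marginal terms telescope, so the result reduces to LGD × CPD(t_M), i.e., 0.40 × 0.026 here; a term-varying LGD breaks that shortcut, which is why the quarter-by-quarter sum is used.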
Figure 2 compares the aggregated loss allowance rates of the C&I portfolios from 15 CRD-contributing banks. We also include the historical one-year NCO rates (i.e., NCO over the next four quarters for each time t) for C&I portfolios publicly available for the same 15 banks. The two sets of allowance rates fluctuate over time and cross each other. However, during the financial crisis, the CECL allowance increased earlier and faster than the incurred loss curve and remained above the NCO line. This suggests our modeled CECL loss allowance is more responsive to market deterioration than the incurred loss approach, and the CECL reserve would have been sufficient to absorb the loss during the following year.
Figure 3 shows results for a set of four individual banks selected from our data sample. We see significant variations across the four banks regarding trends and levels for both loss allowance rates and historical loss experiences over time. As shown in Figure 3, the loss allowances created under CECL are generally sufficient to cover the actual losses banks experienced the following year. However, the CECL loss allowances[4] are not always higher than those seen under the incurred loss model during our analysis period, and the relative change in allowance level varies across banks. The economic cycle is a key driving factor, as indicated by these charts. When retroactively applied to the financial crisis period, the CECL model calls for dramatically higher loss allowance rates than the incurred loss model. Under the current, relatively benign economic conditions, with historically low loss experience, CECL’s loss allowance level remains close to banks’ existing reserve levels. For some banks, the CECL allowance level may even be lower than the incurred loss allowance.
We also attempt to better understand CECL’s impact on each bank by examining their portfolio characteristics, including (but not limited to) loan maturity, industry sector, credit riskiness, and allowance calculation practices. The average LEP used by Bank A is 2.30 years, the highest among the four banks and very close to the average time to maturity of 2.37 years in its C&I portfolio. Given that PIT PD after the financial crisis is generally lower than the long-run average TTC PD, it is not surprising to see the CECL loss allowance level lower than the incurred loss allowance level in the current environment. The same argument applies to Bank B, which has an average LEP of 1.95 and an average time to maturity of 1.94 years. Bank B’s portfolio is slightly riskier than Bank A’s, with a balance-weighted, one-year TTC PD of 3.1%, compared to Bank A’s 2.7%. For Banks C and D, the portfolios’ lifetimes are much longer than the LEP used in the incurred loss model. Banks C and D have average times to maturity of 2.62 years and 2.44 years, respectively, compared to LEPs of 1.60 years and 1.46 years. For those banks, the CECL loss allowance level is nearly always above the allowance level generated by the incurred loss model.
Market participants have argued that the “reasonable and supportable forecast” requirement, CECL’s forward-looking element, may inject additional volatility into banks’ loss allowances. Figure 4 lists the standard deviations of the historical loss allowance rates under the two accounting standards for two different analysis periods. The volatility of the loss allowance rate under the incurred loss model is significantly lower than under the CECL model, in line with general market expectations. In addition, our research suggests that CECL may significantly affect the level and volatility of banks’ earnings and available capital.[5]
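The volatility comparison in Figure 4 amounts to computing the standard deviation of each quarterly allowance-rate series. A minimal sketch, using invented numbers purely for illustration (not the article’s data):

```python
from statistics import pstdev

# Quarterly portfolio allowance-rate series under each model
# (illustrative values only).
incurred_rates = [0.012, 0.013, 0.015, 0.014, 0.013]
cecl_rates     = [0.011, 0.016, 0.024, 0.018, 0.012]

# Population standard deviation of each historical series
vol_incurred = pstdev(incurred_rates)
vol_cecl = pstdev(cecl_rates)
assert vol_cecl > vol_incurred  # the CECL series swings more widely here
```

In this toy series the CECL allowance rises and falls with the cycle while the incurred-loss rate stays nearly flat, reproducing the qualitative pattern the article reports.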
We find that the loss allowance levels under the incurred loss model were indeed “too little, too late” during the economic downturn. The loss allowances estimated under the CECL model are much more responsive to market changes and are generally sufficient to cover banks’ realized losses across different time periods, even without additional qualitative adjustments. The CECL impact varies significantly across banks and over time. The relative changes in loss allowance levels are driven primarily by a portfolio’s loan maturity and credit riskiness, a bank’s lending and allowance practices, and the economic cycle. If CECL were implemented immediately, under the current, more moderate economic conditions, allowance levels might actually decrease for some banks. Our results are in line with market expectations that CECL will generally lead to higher volatility in loss allowances as compared to the incurred loss model.
[1] As an example, see US Government Accountability Office, 2013.
[2] The required inputs for Moody’s Analytics LossCalc software are the borrower’s PD, loan industry sector, secured vs. unsecured status, and evaluation date.
[3] The disaggregated historical loss allowance rates collected from 10-Q and 10-K forms were used for LEP factor calibration but excluded from the aggregate loss allowance rate calculation in Figure 1, for comparison purposes.
[4] Banks can, of course, apply additional qualitative adjustments during loss allowance calculation, which would further increase or decrease the allowance level relative to our modeled results.
[5] See Levy et al., 2017.
American Bankers Association. “FASB’s Current Expected Credit Loss Model for Credit Loss Accounting (CECL): Background and FAQ’s for Bankers.” June 2016.
Baskin, Dorsey and Graham Dyer. “ALLL – Historical Loss Calculation: An exploration of the methods of calculating historical loss experience for the purposes of estimating the allowance for loan and lease losses.” Grant Thornton. May 21, 2015.
Cole, Roger T. “Interagency Policy Statement on the Allowance for Loan and Lease Losses (ALLL).” Board of Governors of the Federal Reserve System, Supervisory Letter SR 06-17. December 13, 2006.
Dwyer, Douglas. “RiskCalc: New Research and Model Validation Results.” Moody’s Analytics presentation. May 2011.
European Banking Authority. “Report on Results from the EBA Impact Assessment of IFRS 9.” November 10, 2016.
Financial Accounting Standards Board. “Financial Instruments – Credit Losses (Topic 326): Measurement of Credit Losses on Financial Instruments.” June 2016.
IFRS Foundation. “IFRS 9 Financial Instruments (Hedge Accounting and amendments to IFRS 9, IFRS 7 and IAS 39): Implementation Guidance.” November 2013.
Levy, Amnon, Xuan Liang, Yanping Pan, Yashan Wang, Pierre Xu, and Jing Zhang. “Measuring and Managing Credit Risk in Earnings Volatility of a Loan Portfolio Under IFRS 9.” Moody’s Analytics whitepaper. March 2017.
US Government Accountability Office. “Financial Institutions: Causes and Consequences of Recent Bank Failures.” Report to Congressional Committees. January 2013.
Scott is a Director in the Regulatory and Accounting Solutions team responsible for providing accounting expertise across solutions, products, and services offered by Moody’s Analytics in the US. He has over 15 years of experience leading auditing, consulting and accounting policy initiatives for financial institutions.