# Mortgage Models for CECL: A Bottom-Up Approach

In this article, we describe how a loan-level modeling approach can be used to forecast credit losses in residential mortgages. We review the challenges a bank may face in complying with the FASB’s recent Accounting Standards Update on reporting credit losses. In particular, we show how historical, current, and forward-looking information can be used to estimate credit losses. We also address other modeling and implementation considerations as they pertain to the estimation of credit losses.

## Introduction

When building and implementing econometric models for different asset classes, a modeler needs to carefully examine the requirements from the perspective of the end users of the models. A trader of whole loans may be more interested in the accurate modeling of loan-level cash flows and exploiting any statistical arbitrage. A servicer is likely to be concerned about delinquency transitions and time to liquidation. Regulatory stress testing requires that the models demonstrate sensitivity to macroeconomic conditions. Risk management requires that the models correctly capture the correlation between different assets in the portfolio.

The recently issued Accounting Standards Update (ASU) by the Financial Accounting Standards Board (FASB) introduces several considerations that banks must incorporate into their estimation of credit losses:

- The bank must use historical information, current information, and forward-looking information to arrive at the loss estimates.
- The losses must be estimated over the life of the loans.
- The effect of prepayments must be accounted for when calculating the losses.

A typical bank’s whole loan retail portfolios consist of residential mortgages and home equity lines of credit (HELOCs), auto loans, credit cards, and other consumer loans. Banks usually have the largest exposure to residential mortgages and HELOCs. Although the number of mortgages is usually smaller than the number of auto loans or credit cards, the balances on mortgages are much larger. Moreover, residential mortgages and HELOCs span several different products, such as fixed-rate and adjustable-rate mortgage (ARM) loans and loans with different terms and maturities. Therefore, compared to other retail assets, mortgages tend to be less homogeneous.

Retail portfolios can be analyzed using a top-down (segment-level) or bottom-up (loan-level) approach. This paper shows how we can use a loan-level modeling framework to arrive at the expected credit losses on residential mortgage portfolios.

## A Loan-Level Framework

A loan-level or bottom-up approach involves constructing econometric models for each loan in the portfolio. Results can be aggregated over all the loans in different cohorts or segments to arrive at segment-level or portfolio-level results.

Loan-level models are usually hazard-rate models and can be constructed in a competing risk framework. The data is naturally organized as panel data; each loan has multiple observations through time. Defaults and prepayments compete with each other in a multi-period setting. Survival models in this framework can be built using a panel logit model.
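As an illustration of this framework, the sketch below (all scores and values are hypothetical, not taken from any fitted model) converts linear predictor scores for the two competing outcomes into monthly hazards via a multinomial logit, with survival as the reference category:

```python
import math

def monthly_hazards(x_default, x_prepay):
    """Competing-risk (multinomial logit) hazards: map the linear scores of
    the default and prepayment outcomes to monthly transition probabilities.
    'Survive another month' is the reference category with score zero."""
    exp_d = math.exp(x_default)
    exp_p = math.exp(x_prepay)
    denom = 1.0 + exp_d + exp_p
    return exp_d / denom, exp_p / denom

# In a fitted model, each score would be built from loan-level drivers, e.g.
# x_default = b0 + b1 * updated_ltv + b2 * delinquency_status + ...
p_default, p_prepay = monthly_hazards(x_default=-6.0, x_prepay=-4.0)
```

Because the two hazards share one denominator, they compete directly: raising the prepayment score mechanically lowers the default probability for the month.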

A bottom-up approach has the advantage that the results are naturally available at the highest level of granularity. The explanatory variables, such as loan and borrower characteristics and macroeconomic variables, are used at the loan level. Likewise, the performance variables, such as defaults, prepayments, cash flows, and losses, are modeled at the loan level. Heterogeneity of the loan characteristics – for example, different mortgage products such as first or second lien loans, adjustable- or fixed-rate loans, low-LTV (loan-to-value) or high-LTV loans, or loans with different levels of credit risk – can be easily accommodated.

Building loan-level models requires reliable historical loan-level data, which can be onerous and expensive to collect and maintain. If the loan-level data is not reliable, the models that are built on it may need to be recalibrated. The implementation can also require additional resources.

Next, we look at some of the considerations in the ASU and see how they can be addressed in this framework.

## Portfolio Segmentation

A joint statement by the Federal Reserve, FDIC, National Credit Union Administration (NCUA), and Office of the Comptroller of the Currency (OCC) clarifies their position on segmentation of the portfolio. Although the standard allows for institutions to measure the expected credit risk on a collective or pool basis provided loans have similar risk characteristics, the statement says, “If a financial asset does not share risk characteristics with other financial assets, the new accounting standard requires expected credit losses to be measured on an individual asset basis.”^{1} In a loan-level modeling approach, this point is automatically addressed because each loan is treated separately. The migration of a loan from one risk bucket to another as a result of changes in the borrower’s credit score can also be accommodated in loan-level models.

## Use of Historical, Current, and Forward-Looking Information

The measurement of expected credit losses is to be based on relevant information about past events, including historical experience, current conditions, and reasonable and supportable forecasts that affect the collectability of the reported amount. Let us now see how these conditions are incorporated into a loan-level analysis.

Past events enter the models in a few different ways. First, terms in the models such as the spread at origination (SATO) – the difference between the interest rate on the mortgage and the prevailing market mortgage rate at loan origination – capture the credit riskiness of the borrower at loan origination. Second, factors such as the change in unemployment rate from loan origination or the change in home prices from loan origination reflect the macroeconomic conditions at loan origination. Third, the trajectory of interest rates and home prices from loan origination produces prepayment opportunities to the borrower. “Burnout,” which captures the unwillingness or inability of the borrower to prepay, is another factor that captures historical macroeconomic information in the model.
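A minimal sketch of these "past event" drivers; the function, its inputs, and the example values are illustrative assumptions, not fields from any particular dataset:

```python
def origination_drivers(note_rate, market_rate_at_orig,
                        unemp_now, unemp_at_orig,
                        hpi_now, hpi_at_orig):
    """Compute SATO plus the changes in unemployment and home prices since
    loan origination -- the historical drivers described above."""
    sato = note_rate - market_rate_at_orig        # spread at origination
    unemp_change = unemp_now - unemp_at_orig      # percentage-point change
    hpa_since_orig = hpi_now / hpi_at_orig - 1.0  # cumulative home price growth
    return sato, unemp_change, hpa_since_orig

sato, du, hpa = origination_drivers(5.25, 4.50, 6.0, 4.5, 220.0, 200.0)
```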

The models themselves are estimated using the default, prepayment, and loss experience in the historical dataset used to build the model. Therefore, as long as the models are built using the reporting institution’s data or are calibrated to it, the models can account for the historical experience.

Paragraph 326-20-55-3 in the ASU provides some commentary on the use of historical data:

*Historical loss information generally provides a basis for an entity’s assessment of expected credit losses. An entity may use historical periods that represent management’s expectations for future credit losses. An entity also may elect to use other historical loss periods, adjusted for current conditions, and other reasonable and supportable forecasts. When determining historical loss information in estimating expected credit losses, the information about historical credit loss data, after adjustments for current conditions and reasonable and supportable forecasts, should be applied to pools that are defined in a manner that is consistent with the pools for which the historical credit loss experience was observed.*

Current conditions enter the models in a few different ways. First, the delinquency status of the mortgage directly affects the probability of prepayment and default on the mortgage. Second, the current outstanding balance is used to determine the updated LTV of the borrower, which is one of the dominant factors in default and loss given default (LGD) models.
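For example, the updated LTV can be computed by marking the collateral to market with a home price index (a standard construction; the function name and inputs below are illustrative):

```python
def updated_ltv(current_balance, orig_value, hpi_at_orig, hpi_now):
    """Updated loan-to-value: current outstanding balance over the collateral
    value marked to market with a home price index (HPI)."""
    current_value = orig_value * (hpi_now / hpi_at_orig)
    return 100.0 * current_balance / current_value

# A loan that amortized to $150,000 on a $200,000 home while local
# home prices rose 10% since origination
ltv = updated_ltv(150_000, 200_000, hpi_at_orig=200.0, hpi_now=220.0)
```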

When we build a loan-level econometric model, we naturally separate out and capture the effects of macroeconomic drivers and loan and borrower characteristics. This ensures that any increase or decrease in the historical loss over different periods has been accounted for through the use of the appropriate driver. Therefore, when the models are used in forecasting the credit losses, an exact match of the loan-level characteristics with the historical data is not necessary. What is necessary, though, is that the data used in building the models is a superset of the data on which the models are run.


Forward-looking information is incorporated through the use of macroeconomic forecasts. These contain future home prices, unemployment rates, and interest rates, which enter the models through different factors such as updated LTV, unemployment rate shocks, and interest rate spreads.

## Estimating Credit Losses Over the Life of the Loan

Expected credit losses are to be calculated over the life of the loan. In this section, we will consider how this calculation can be performed using a discounted cash flow method. In a discounted cash flow method, the loan’s cash flows, such as principal, interest, prepayments, and recoveries, should be estimated over the contractual life of the loan. Note that the expected life of the loan is typically much shorter than the contractual life. For example, a typical 30-year fixed-rate mortgage may only have an average life of 10 years. When projecting cash flows in a competing risk framework, we apply the probabilities of prepayment and default in each period and determine the expected survival probability of the loan at the end of each period. Based on this survival probability, we can calculate the expected cash flows in each month from the reporting date. After discounting the cash flows by the effective interest rate, we arrive at the amount expected to be collected.
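The sketch below implements this calculation for a level-pay mortgage under the simplifying assumption of constant monthly hazards and a flat recovery rate (all parameter values are illustrative; a production system would feed period-by-period hazards from the loan-level models):

```python
def expected_collections(balance, annual_rate, term_months,
                         h_default, h_prepay, recovery_rate):
    """Expected present value of collections over the contractual life,
    applying competing default/prepayment hazards each month and
    discounting at the effective interest rate."""
    r = annual_rate / 12.0
    payment = balance * r / (1.0 - (1.0 + r) ** -term_months)
    survival, pv = 1.0, 0.0
    for t in range(1, term_months + 1):
        interest = balance * r
        # Expected cash this month: scheduled payment if the loan survives,
        # full payoff if it prepays, recoveries on the balance if it defaults.
        cash = survival * ((1.0 - h_default - h_prepay) * payment
                           + h_prepay * (balance + interest)
                           + h_default * recovery_rate * balance)
        pv += cash / (1.0 + r) ** t
        survival *= (1.0 - h_default - h_prepay)
        balance -= payment - interest  # scheduled amortization
    return pv

pv = expected_collections(200_000, 0.05, 360, 0.0002, 0.01, 0.6)
ecl = 200_000 - pv  # allowance = amortized cost minus expected collections
```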

Therefore, the calculation of the expected credit losses over the life of the loan does not pose any additional challenges as long as hazard-rate models are applied in a multi-period setting.

## Estimating Losses for Loans with Low Credit Risk

Paragraph 326-20-30-10 states:

*An entity’s estimate of expected credit losses shall include a measure of the expected risk of credit loss even if that risk is remote, regardless of the method applied to estimate credit losses.*

When we use a loan-level econometric model, the model is estimated over the universe of loans with different credit qualities. Therefore, although the historical loss on a small set of good loans may be zero, the models are likely to estimate a low but non-zero probability of default for a large set of similar loans. The actual loss estimate may or may not be zero, depending on the estimated value of the collateral at the time of default or liquidation.

## Incorporating Prepayments in the Analysis

Mortgages have a built-in prepayment option whereby a borrower can choose to refinance a mortgage by borrowing from one financial institution at a lower rate and paying off the existing loan. Additionally, the borrower can pay off the mortgage by selling the house. Most mortgages do not have a prepayment penalty, so a borrower is free to exercise this option depending on the prevailing interest rates and other factors. In a competing risk framework, the conditional prepayment and default hazard rates are estimated. They compete with each other when implemented in a multi-period framework. If prepayments rise, then fewer loans are available to default. As a result, the cumulative or lifetime probability of default of the loans decreases.
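The effect is easy to demonstrate numerically. Under illustrative constant hazards (an assumption made only to keep the example short), raising the prepayment hazard lowers the cumulative probability of default:

```python
def lifetime_pd(h_default, h_prepay, months=360):
    """Cumulative (lifetime) probability of default when default and
    prepayment compete period by period."""
    survival, cum_pd = 1.0, 0.0
    for _ in range(months):
        cum_pd += survival * h_default  # defaulting now requires surviving so far
        survival *= (1.0 - h_default - h_prepay)
    return cum_pd

pd_slow = lifetime_pd(0.0005, h_prepay=0.005)  # slow prepayment environment
pd_fast = lifetime_pd(0.0005, h_prepay=0.020)  # fast prepayment environment
```

With the same monthly default hazard, the fast-prepayment environment leaves fewer surviving loans available to default, so its lifetime PD is markedly lower.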

When calculating the credit losses over the expected life of the loan, no special considerations are needed to determine the expected life. When performed over the contractual life of the loan, the probabilistic calculations naturally produce the expected life of the loan along with the option-adjusted expected credit losses of the loan. In other words, the effect of prepayments on the estimated life of the loan is accounted for in this modeling approach.

## Use of Reasonable and Supportable Forecasts

The ASU mentions the use of reasonable and supportable forecasts in several places throughout the document. We have already seen that given a macroeconomic forecast, the calculation of the expected credit loss using a discounted cash flow method is fairly straightforward. The next question is how one can arrive at a reasonable and supportable macroeconomic forecast.

There are three different ways in which one can justify a reasonable forecast. One could use a baseline forecast such as the one produced by Moody’s Analytics. Alternatively, one can use a set of scenarios that cover economic expansions and recessions and assign a probability weight to each scenario. The expected credit loss would be a probability-weighted sum of the expected credit losses on the set of scenarios. A third possibility is the use of a full-blown Monte Carlo simulation of the economic scenarios. In this method, one could calculate the loss for each scenario and calculate the average or expected value over the entire set of randomly simulated scenarios.
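The second approach reduces to a probability-weighted sum; the scenario losses and weights below are purely illustrative:

```python
def weighted_ecl(scenario_losses, weights):
    """Probability-weighted expected credit loss over a small set of
    macroeconomic scenarios."""
    assert abs(sum(weights) - 1.0) < 1e-9, "scenario weights must sum to 1"
    return sum(loss * w for loss, w in zip(scenario_losses, weights))

# e.g. baseline, upside, and recession losses for a portfolio (in dollars)
ecl = weighted_ecl([1_200_000, 900_000, 3_500_000], [0.5, 0.2, 0.3])
```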


All three methods ensure that the calculations are done in an “average” or “expected” sense, as opposed to performing the calculations on stress or extreme scenarios. The use of a small set of probability-weighted scenarios, as opposed to a single baseline forecast, can help address the effect of any nonlinearities in the loss models while limiting the complexity of forecasting scenarios.

## Modeling with Little Performance History

Not all financial institutions have a long enough performance history or high-quality loan-level data to build or calibrate loan-level models. Since the ASU requires that the models consider the historical performance of the reporting institution, we need to consider a few options depending on the quality and quantity of available data.

When we talk of the data used for building the models, we need to consider two dimensions. The first is the cross-sectional dimension, which determines how rich the data is in relation to loan terms and conditions (such as the types of mortgage products, the LTV distribution, and the interest rates on the mortgages); borrower characteristics (such as FICO scores and income and employment verification); geographic distribution; and other variables (such as the type of property and the distribution across vintages). The second is the time dimension, which defines the length of the performance history and the availability of dynamic variables such as outstanding balance, delinquency status, interest rate, default and prepayment status, and realized losses.

The cross-sectional information defines the domain of applicability of the model. For example, if historical lending has been over a narrow range of FICO scores or origination LTVs, applying the model to a FICO or LTV value outside that range may be problematic. Similarly, if lending is limited to a certain geographic region of the US, applying a model built with this data to lending in other regions may be hard to justify. Data along the time dimension helps us account for business cycles containing economic expansions and recessions. Knowing how the mortgages behaved during the recent financial crisis helps us tease out the relationship between the default rate and large declines in home prices or high levels of unemployment. If we only had a performance history of a few years when home prices were rising and the unemployment rate was falling, we would have to extrapolate the behavior of the models to periods that may contain a fall in home prices or a rise in unemployment rates.

This presents us with a few choices for building and implementing models. For example, one could build a model using an industrial-strength dataset that spans the entire US, covers different loan and borrower characteristics, and has a sufficiently rich performance history. If a reporting institution such as a bank does not have any historical data, either because the data was not collected in the past or because the bank started lending only recently, one has no choice but to use the model built on industry data as a proxy for estimating the bank’s credit losses.

If, on the other hand, the bank has only a limited data history, we would not be able to infer the dependence of default rates, prepayment rates, or LGD on the economic cycle from this data alone. However, we could infer this from the model built using industry data while refitting or calibrating the model to the bank’s data. In other words, we could keep some of the model coefficients from the industry model but re-estimate the remaining coefficients from the bank’s data. In this manner, we capture the economic cycles as well as the bank’s underwriting using a combination of the two datasets.
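One simple version of this recalibration keeps the industry model's slope coefficients fixed, treats the resulting per-loan score as an offset, and re-estimates only an intercept shift on the bank's data. The sketch below does this for a logit model by gradient ascent on the log-likelihood (the scores and outcomes are made up for illustration):

```python
import math

def refit_intercept(industry_scores, outcomes, lr=0.5, iters=500):
    """Re-estimate only the intercept shift of a logit model on the bank's
    data, holding the industry coefficients fixed: each loan's industry
    score enters as an offset, and the single free parameter is fitted by
    gradient ascent on the Bernoulli log-likelihood."""
    a = 0.0
    for _ in range(iters):
        grad = 0.0
        for score, y in zip(industry_scores, outcomes):
            p = 1.0 / (1.0 + math.exp(-(a + score)))
            grad += y - p  # d logL / da for one observation
        a += lr * grad
    return a

# Hypothetical bank data: defaults are more frequent than the raw
# industry scores imply, so the fitted shift comes out positive.
scores = [-3.0, -2.5, -3.5, -2.0, -3.0, -2.8]
defaults = [1, 0, 0, 1, 0, 1]
shift = refit_intercept(scores, defaults)
```

In a production setting, the same idea is usually expressed as a GLM fitted with the industry score as an offset, which also allows selected slope coefficients to be re-estimated alongside the intercept.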


As a third possibility, the bank may have a rich and long enough data history. In that case, the bank may choose to build a model exclusively using its own data. This ensures that the models use the bank’s performance history and are tuned to the bank’s underwriting standards.

## Using Models with Little or Unreliable Data

The FASB received feedback from small financial institutions, such as community banks and credit unions, that the implementation of the ASU could be complex. Several financial institutions do not have reliable loan-level data to even make use of a standard loan-level model. In this case, we need to explore what options such institutions may have to calculate expected credit losses.

Consider a situation in which the loan-level model uses several fields, but a credit union only has a few pieces of information for each loan. For example, this information could be limited to original LTV, vintage, type of mortgage, and FICO score at origination. More details such as the type of property or the level of income and employment documentation, although used by a standard model, are not recorded by the credit union. There are two possibilities: one can apply typical, mean, or median values of the unknown factors to the loan-level model, or one can consider reasonable distributions of the unknown factors to arrive at an estimate of the uncertainty in the credit loss.
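A sketch of the first possibility, using a deliberately simplified toy loss model (the field names, the model itself, and the candidate values for the missing documentation field are all hypothetical):

```python
import statistics

def ecl_over_missing_field(known_fields, candidate_values, loss_model):
    """Run the loss model over a reasonable range of values for a field the
    institution did not record, and report the mean and the spread of the
    resulting loss estimates as a measure of the uncertainty."""
    losses = [loss_model({**known_fields, "doc_type": v})
              for v in candidate_values]
    return statistics.mean(losses), min(losses), max(losses)

def toy_loss_model(loan):
    """Toy stand-in for a loan-level model: full documentation halves loss."""
    base = loan["balance"] * 0.02 * (loan["ltv"] / 80.0)
    return base * (0.5 if loan["doc_type"] == "full" else 1.0)

mean_loss, lo, hi = ecl_over_missing_field(
    {"balance": 100_000, "ltv": 85}, ["full", "low"], toy_loss_model)
```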

Consider another example of lack of quality loan-level data. Suppose a credit union only has a few pieces of information at an aggregate level for different segments of its portfolio and the total exposure for each segment. One may know that the average FICO score of fixed-rate loans is 650, the average LTV at origination is 85, and the total exposure is $50 million. Again, as in the previous example, one can use the known data along with estimates, typical values, or ranges for the other data fields to estimate the credit losses for each segment separately. By knowing the total exposure, we can arrive at the expected credit loss for each segment.
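A sketch of this segment-level fallback, with a hypothetical loss-rate surface in average FICO and origination LTV (the functional form and all numbers are invented for illustration):

```python
def segment_ecl(segments, loss_rate_model):
    """Aggregate-data fallback: apply a loss-rate model to each segment's
    average characteristics and scale by the segment's total exposure."""
    return sum(exposure * loss_rate_model(avg_fico, avg_ltv)
               for avg_fico, avg_ltv, exposure in segments)

def toy_loss_rate(fico, ltv):
    """Toy lifetime loss rate: rises as FICO falls and origination LTV rises."""
    return max(0.0, 0.05 * (700 - fico) / 100.0 + 0.02 * (ltv - 80.0) / 20.0)

# (average FICO, average origination LTV, total exposure) per segment
total_ecl = segment_ecl([(650, 85, 50_000_000), (720, 70, 30_000_000)],
                        toy_loss_rate)
```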

## Conclusion

This paper shows how a loan-level approach can be used to estimate the expected credit losses over the life of a loan. It addresses a few of the items in the implementation of this approach. With the use of appropriately built and calibrated models, the method can be used not only for accounting for credit losses, but also in risk management and stress testing.

###### Notes

1 FRB, FDIC, NCUA, and OCC joint statement, 2016.

###### Sources

Board of Governors of the Federal Reserve System, Federal Deposit Insurance Corporation, National Credit Union Administration, and Office of the Comptroller of the Currency. “Joint Statement on the New Accounting Standard on Financial Instruments – Credit Losses.” June 17, 2016.

Financial Accounting Standards Board. “Financial Instruments – Credit Losses (Topic 326).” FASB Accounting Standards Update. June 2016.

###### SUBJECT MATTER EXPERTS

#### Dr. Shirish Chinchalkar

Managing Director, Consumer Credit Analytics

Shirish is experienced in numerical and high-performance computing and computational finance. He has worked on Monte Carlo methods, numerical optimization, and parallel computing. At Moody’s Analytics, he works in the economic and structured analytics group on the Portfolio Analyzer platform for analyzing residential mortgages, auto loans, and asset-backed securities.

###### As Published In:

Devoted to the convergence of risk, finance, and accounting disciplines with regard to the new impairment standard, Financial Instruments – Credit Losses, commonly known as the current expected credit loss (CECL) approach.

###### Related Insights

- **CECL Quantification: Retail Portfolios** – In this webinar, our experts discuss the important considerations in the modeling and implementation of the CECL standard for retail portfolios. Learn more about loan-level modeling approaches that can be used to forecast credit losses for retail portfolios and how to leverage existing risk measurement practices.

- **CECL Quantification: Retail Portfolios Webinar Slides** – In this webinar, our experts discuss the important considerations in the modeling and implementation of the CECL standard for retail portfolios. Learn more about loan-level modeling approaches that can be used to forecast credit losses for retail portfolios and how to leverage existing risk measurement practices.

- **Complying with IFRS 9 Impairment Calculations for Retail Portfolios** – This article discusses how to address the specific challenges that IFRS 9 poses for retail portfolios, including incorporating forward-looking information into impairment models, recognizing significant increases in credit risks, and determining the length of an instrument's lifetime.

- **Stress Testing for Retail Credit Portfolios: A Bottom-Up Approach** – This article focuses on model building from a bottom-up perspective of mortgages and home equity lines of credit to underscore the importance of loan-level analytics.

- **Account Level Retail Modeling** – Why are mortgages complicated? In this presentation, Moody's Analytics expert Dr. Shirish Chinchalkar shares the challenges to modeling a retail portfolio – and how to solve them.