Learn how the Comprehensive Capital Analysis and Review and Dodd-Frank Act Stress Tests will impact banks in 2014 and how banks can best prepare for the changes.
When reviewing CCAR banks with over $50 billion in assets, regulators will increasingly focus on infrastructure and automation in 2014. The report issued by the Fed in August 2013, titled Capital Planning at Large Bank Holding Companies: Supervisory Expectations and Range of Current Practice, emphasizes process control, automation, and integration, as well as various elements of the stress test forecast itself.1
In my view, there are three topics that these banks will find particularly interesting in the report. The first topic, covered extensively in the report, is the lack of integration of loss estimation within the Pre-Provision Net Revenue (PPNR) calculation. Since this integration has proven a weakness for many banks, it will attract more attention.
A second area of increased focus is on challenger models. These models are not the primary models used for the derivation of the CCAR results; instead, they are used to challenge the production models, ensuring they provide consistent and accurate results.
Finally, there is an increased emphasis on validating those production models. As the CCAR banks prepare for the tests, these are a few of the areas on which they may choose to focus.
The DFAST banks under $50 billion in assets are asked to file their first submission on March 31, 2014. As a result, participants have many questions about the types of estimation practices they should use for credit risk and loss estimation. The elements of the stress test that deserve the greatest attention pertain to loss estimation. Typically, smaller banks implement basic top-down models, which are appropriate for pools of assets that are fairly homogeneous. That said, these banks would benefit from learning more about heterogeneous asset classes and those with what I like to call “chunkier” exposures, such as C&I and CRE.
I anticipate that regulators are expecting more advanced methodologies for loss estimation than simple top-down models that try to make an existing probability of default (PD) sensitive to macroeconomic factors in a simple regression-based model.
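To make the contrast concrete, the simple top-down approach described above can be sketched as follows. This is a minimal illustration with entirely hypothetical figures: a portfolio default rate is converted to logit scale and regressed on macroeconomic drivers, then the fitted sensitivities are applied to an adverse scenario. It is the kind of simple regression-based model that, in my view, regulators will consider insufficient for heterogeneous asset classes.

```python
import numpy as np

def logit(p):
    return np.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical quarterly history: portfolio default rates and macro drivers.
pd_hist = np.array([0.010, 0.012, 0.018, 0.030, 0.025, 0.015, 0.012, 0.011])
unemployment = np.array([5.0, 5.4, 6.8, 9.5, 9.0, 7.5, 6.5, 6.0])   # percent
gdp_growth = np.array([2.5, 1.8, -0.5, -3.0, -1.0, 1.5, 2.2, 2.6])  # percent

# Fit logit(PD) = b0 + b1*unemployment + b2*GDP growth by least squares.
X = np.column_stack([np.ones_like(unemployment), unemployment, gdp_growth])
beta, *_ = np.linalg.lstsq(X, logit(pd_hist), rcond=None)

# Apply the fitted sensitivities to an assumed adverse scenario.
adverse = np.array([1.0, 10.5, -4.0])  # intercept, unemployment %, GDP growth %
stressed_pd = inv_logit(adverse @ beta)
print(f"Stressed PD: {stressed_pd:.4f}")
```

The weakness is visible in the structure itself: a single pool-level PD moves with two macro factors, with no visibility into sector, obligor, or rating-grade composition.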
Regulators are also keen to see how models at smaller banks are supported by the data and the risk rating process within those organizations. Many smaller banks still struggle with creating dual risk rating systems and the data layer to support the loss estimation. Therefore, I anticipate these banks will seek improved methodologies around their models in mid-2014.
With regard to PPNR, it is crucial that banks integrate the estimated losses, and the scenarios producing those losses, on the credit risk side. As losses begin to emerge, loans migrate to non-performing or non-accrual status, creating a drag on a bank’s net interest margin. This loss emergence process needs to be better integrated.
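The mechanics of that drag can be sketched in a few lines. This is a hypothetical illustration, not a production approach: assumed quarterly migration rates move balances to non-accrual status, and interest income falls because non-accrual balances earn nothing.

```python
# Hypothetical sketch: drag on interest income as loans migrate to
# non-accrual status over a nine-quarter horizon.
balance = 1_000.0          # $MM performing loan balance (assumed)
quarterly_yield = 0.05 / 4 # assumed 5% annual coupon, quarterly accrual
migration_rates = [0.002, 0.004, 0.008, 0.012, 0.010, 0.007, 0.005, 0.004, 0.003]

interest_income = []
nonaccrual = 0.0
for rate in migration_rates:
    moved = balance * rate      # balance migrating to non-accrual this quarter
    balance -= moved
    nonaccrual += moved
    # Only the remaining performing balance accrues interest.
    interest_income.append(balance * quarterly_yield)
```

Integrating loss estimation with PPNR means these migration rates come from the same scenario-conditioned loss models driving the provision, rather than being assumed independently by the finance group.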
That’s a great question. When systems, platforms, and models are built correctly, liquidity risk can be accommodated more naturally. That is not, however, the regulatory expectation or direction. In my opinion, liquidity risk will not be brought into the CCAR or DFAST framework. That said, I do expect that sometime soon the regulators will publish new rules providing additional guidance about modeling expectations for liquidity risk. Liquidity risk will remain a separate exercise, and it has the potential to be every bit as challenging as the CCAR process.
Banks typically deal with cash flows and time horizons. For example, with asset liquidity, they have to consider different economic scenarios and how quickly they can liquidate unencumbered liquid assets. A second consideration is treatment across asset classes, as liquidation horizons change with the economic scenario. Moving into a crisis, the liquidity that banks thought they had on day one could be gone by day fifteen, as the scenario works its way through their balance sheet and the market.
Modeling challenges also exist with collateral valuation. The time windows are not spread evenly over nine quarters; instead, they cover 1-5, 6-15, 15-30, 60, or 90 days. Banks have to produce liquidity cash flows over these short-term horizons while simultaneously revaluing their assets and any posted collateral, as well as any unencumbered collateral that may be “called” to further collateralize secured financing arrangements.
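A minimal sketch of that bucketed calculation, with entirely hypothetical figures and an assumed flat haircut, might look like this: expected inflows and outflows are netted within each short-term window, and unencumbered collateral is counted only at its stressed value.

```python
# Hypothetical example: net liquidity position by short-term bucket,
# revaluing unencumbered collateral under an assumed scenario haircut.
buckets    = ["1-5d", "6-15d", "15-30d", "60d", "90d"]
inflows    = [120.0, 90.0, 60.0, 150.0, 200.0]   # $MM expected inflows
outflows   = [100.0, 110.0, 95.0, 140.0, 180.0]  # $MM contractual outflows
collateral = [50.0, 40.0, 30.0, 60.0, 80.0]      # $MM unencumbered collateral at par

haircut = 0.25  # assumed stressed-scenario valuation haircut

nets = []
cumulative = 0.0
for bucket, cin, cout, coll in zip(buckets, inflows, outflows, collateral):
    # Collateral can be monetized only at its stressed (post-haircut) value.
    net = cin - cout + coll * (1 - haircut)
    nets.append(net)
    cumulative += net
    print(f"{bucket}: net {net:+.1f}, cumulative {cumulative:+.1f}")
```

In practice the haircut itself would vary by asset class and deepen with the scenario, which is exactly the revaluation burden described above.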
There are other challenges as well, such as including a contingency funding plan in the scenario, determining how those contingent sources of funds might change, and recognizing that off-balance-sheet vehicles may not be a direct obligation but rather a moral recourse or indirect obligation. There is a range of additional issues that banks may not necessarily consider in the context of the CCAR process. It is going to be a learning exercise as the regulators promulgate the new policy and the industry becomes familiar with a higher set of supervisory expectations around liquidity management. It might be a good idea to acquaint yourself with SR 12-17, which deals with the new continuous monitoring and supervisory framework for larger banks; it discusses the need for better resiliency and resolvability and how the Fed plans to implement that policy.
In most banks we have worked with, the process is not streamlined and is very time-consuming. For now, improvements are slow. To overcome these issues, banks need to develop an automation plan and strategy while recognizing that the problems will not be quickly fixed. The reason the CCAR exercise is so time-consuming is that it involves every element of an organization. It is an enterprise-wide stress test that spans all of the data across the full, consolidated balance sheet, requiring banks to capture that data and forecast balance sheets, income statements, and regulatory capital under different periods of stress, over two-plus years.
Banks have not been investing in automation, business process management, and workflow tools, let alone a data mart that contains the data necessary to estimate the models and produce analytics. A significant investment is required to create that automation. Unfortunately, there is no simple solution to this complex problem, which is not going away. Banks will have to make investments to get it right.
One lesson learned from our interactions with banks is the benefit of a dedicated stress testing architect. That role would endeavor to understand the current-state environment. The stress testing architect would be a decision maker who can break down organizational bottlenecks and move a bank from its current state to a better state. Why “better state” instead of “future state”? In my opinion, no one to date has a clear vision of what a future-state design looks like. It varies from bank to bank, given how differently systems have evolved and how organizations are structured, including their type of automation and architecture.
Finally, I recommend that banks establish a Stress Testing and Capital Planning Office, which is a group that functions as the “go-to” center of competence for keeping up with policies, expectations around internal controls, governance, research, writing procedures, and validation plans. The validation or governance function would ensure both primary models and feeder models into the primary models are being validated and assessed in the proper fashion. The office may include elements of technical writing, particularly around policy and procedure.
The Stress Testing Office would most likely contain the stress testing architect, but the office itself would fulfill a variety of different duties. Other facets of the office include data management and workflow control. One example is making sure internal audit groups responsible for checking elements of the stress testing process are performing their duties and putting a harness around the entirety of the exercise to make sure it is working properly. The office may report to the board, keeping the board abreast of the stress testing program quality, and also serve as the regulatory liaison to ensure the regulators are given all the necessary information about the stress testing and capital planning process.
The Stress Testing Office may fit within the finance group, risk management group, or form a part of regulatory reporting with the understanding that headcount is needed to manage the entirety of the process in a structured and efficient manner.
Increasing the quality of loss estimation models across all asset classes is an area of primary focus for 2014 and will most likely continue indefinitely. Banks need to think more about back-testing and benchmarking models. On the PPNR side, the integration of loss estimation within the margin element of the PPNR calculation will be important.
When estimating a balance sheet over a nine-quarter period, banks will have different estimates of new business production, or origination strategy, as they move through a base case, more adverse, or idiosyncratic scenario. Many finance groups that typically generate a base-case new business origination strategy are not accustomed to estimating credit-adjusted new business volumes and their associated Basel risk-weighted asset (RWA) categories. They are not comfortable estimating the credit distribution of that new business volume – especially under adverse economic conditions – and need input from credit risk and the originating business line. When banks have to produce a pro forma forecast that estimates risk-weighted assets at every quarter end, they will have to estimate business volume under the different scenarios, with the added dimension of the credit quality of that new origination. This includes understanding not only volume, but also the maturity of the new business, the product category, and the pricing spread (e.g., fixed or floating, relative to the index rate). Supervisors will no longer accept a simple qualitative overlay or average risk ratings by account type, as they have so far; instead, they will look for quantitative estimation of these levels with more specificity.
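The difference between an average risk rating and a credit distribution is easy to show. Below is an illustrative sketch with hypothetical rating grades, risk weights, volumes, and mix assumptions: new-origination RWA is computed grade by grade, and the mix shifts toward weaker credits in the adverse scenario.

```python
# Hypothetical risk-weight mapping by internal rating grade (illustrative only).
risk_weight = {"investment": 0.50, "pass": 1.00, "criticized": 1.50}

# Assumed new-origination volume ($MM) and credit-quality mix by scenario.
scenarios = {
    "base":    {"volume": 500.0,
                "mix": {"investment": 0.40, "pass": 0.50, "criticized": 0.10}},
    "adverse": {"volume": 350.0,
                "mix": {"investment": 0.25, "pass": 0.55, "criticized": 0.20}},
}

def new_business_rwa(volume, mix):
    """RWA contribution of new origination, computed grade by grade."""
    return sum(volume * share * risk_weight[grade] for grade, share in mix.items())

for name, s in scenarios.items():
    print(f"{name}: new-business RWA = {new_business_rwa(s['volume'], s['mix']):.2f}")
```

Note how the adverse case originates less volume yet carries a higher average risk weight (0.975 versus 0.85), which is precisely the effect a single average rating by account type would miss.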
On the non-interest revenue / non-interest expense side, the estimation of those levels typically comes from the finance group or from fairly simple regressions based on past data. Regulators are concerned about how this estimation is impacted by stressful conditions. The legal costs associated with mortgage put-backs, for example, might increase substantially in another housing crisis, as might foreclosure costs and other litigation and operational expenses. Another example would be a catastrophe in the money fund industry that sent depositors rushing into banks. This might sound like a great outcome for banks, but there is such a thing as being too liquid. The regulatory expectation is that the forecast will be more sensitive to the economic scenarios and that more thought will be put into quantifying these elements.
On the data management side, there has been a significant under-investment in automating the data layers. Banks need to better automate that process, link to different core systems, and bring that data into a common framework for purposes of submitting the required regulatory reports.
The issue of forecasting capital and RWA is potentially going to be an area of increased focus for CCAR banks. I do not think it will be an issue for DFAST banks, given that many Basel I reporting rules are still in play. For larger banks, creating a better process for forecasting RWA is going to be an important piece, and new business volumes and spreads will need to be linked up more tightly.
The banks with over $50 billion in assets have been hit hard for a number of years on the analytics side. The focus on infrastructure, governance, and process should be a high priority. Several of the banks that published results in March 2013 were hit with concerns regarding process and governance, and in my opinion that trend will continue. The Capital Planning at Large Bank Holding Companies: Supervisory Expectations and Range of Current Practice report makes the same point. Judging by the number of weaknesses identified in the report, it is clear that the supervisory authorities have plenty of ammunition. Larger banks should still keep a laser-like eye on analytics, but also think clearly about infrastructure, as well as tricky areas like new business volumes and creating more sensitivity around the non-interest revenue / non-interest expense pieces.
For the banks with under $50 billion in assets, the problem is a little easier. The regulators expect the processes around loss estimation for credit risk to be far more advanced than other elements of the stress test, especially for banks in the $10 to $50 billion asset category.
Organizations often focus on minimizing cost rather than on understanding how best to model a particular asset class. With C&I loans, for example, banks have to model stressed PDs at a minimum by sector, and certainly from a bottom-up rather than a top-down perspective. Banks often need to invest in better rating systems. If that means acquiring a new spreading tool that allows them to apply a PD or loss given default (LGD) model to estimate those two quantities, then they need to invest. They may also need to simultaneously invest in new analytics and techniques to transform those measures into stressed measures.
In my opinion, regulators will not be satisfied with top-down modeling approaches for asset classes that have idiosyncratic features, even for banks with $10-50 billion in assets. Supervisory authorities have been expecting enhanced rating systems for years. DFAST banks that still have not invested in them may learn a harsh lesson in 2014 about what is and is not satisfactory. That is why the banks with under $50 billion in assets should primarily focus on credit risk analytics and rating systems, as well as allowance methodologies.
Explores how North American financial institutions can leverage stress testing regulations to add value to their business, for compliance and beyond.
Thomas Day, Senior Director, Mehna Raissi, Director, and Chris Shayne, Director discuss credit risk management and loss modeling in a stress testing environment.
May 2014 WebPage Thomas Day, Mehna Raissi, Chris Shayne
In this paper, we first provide a background on stress-testing, discuss infrastructure challenges and issues related to legacy data and remediation requirements, including the costs and benefits of improved data management and the challenges of managing multiple hierarchies and reporting dimensions required by the Supervisory Authorities. Next, we cover data governance issues, the data requirements of meeting U.S. stress-testing mandates, and the basic elements of a sound data management infrastructure.
March 2014 Pdf Thomas Day, John Haley
In this webinar, originally recorded on September 17, 2013, Thomas Day discusses best practices for expected loss and pre-provision net revenue forecasting, integration of stress testing into your business architecture, and transforming stress testing from a regulatory exercise to a strategic management tool.
December 2013 WebPage Thomas Day
This article provides a summary of the mid-cycle stress test results, including observations about scenarios, loss estimates and PPNR, disclosures, and areas for improvement.
November 2013 WebPage Thomas Day
This article discusses two conceptual approaches for modeling stressed credit losses: top-down and bottom-up. It highlights the benefits and challenges of using each approach and regulatory expectations.
November 2013 WebPage Thomas Day, Anna Krayn
Stress Testing Webinar Series: Macroeconomic Conditional Pre-provision Net Revenue (PPNR) Forecasting
This webinar discusses the primary challenges confronting banks when forecasting macroeconomic conditional pre-provision net revenue (PPNR), best practices for forecasting macroeconomic conditional PPNR, and the tools and techniques used by Moody’s Analytics to address the challenges.
October 2013 WebPage Thomas Day, Dr. Amnon Levy, Robert Wyle
In this Moody's Analytics webinar, Thomas Day and other Moody's Analytics experts discuss Macroeconomic Conditional Loss Forecasting. Given the criticality of loss estimation, and the need for different models by asset class, we cover loss estimation for Retail Exposures (non-mortgage), Structured Portfolios, Wholesale C&I (non-public), and Wholesale (public).
October 2013 Pdf Thomas Day, Dr. Cristian deRitis, Luis Amador
Banks' funding activities and liquidity management will be the focus of increased regulator attention in coming years. This presentation helps risk managers meet best practices.
October 2013 Pdf Thomas Day
This article provides a summary of the 2013 CCAR and Dodd-Frank Act Stress Tests, and compares the results with the 2012 stress tests.
September 2013 WebPage Thomas Day