Integrating Macroeconomic Scenarios into a Stress Testing Framework
This article describes the three principles that need to be understood and analyzed for banks to have a realistic chance of integrating alternative scenario work into their stress testing workflow.
The integration of alternative scenarios into stress testing frameworks remains an open challenge for most financial institutions. Existing processes rely heavily on manual spreadsheet work that carries inefficiencies and computational challenges.
The first principle refers to the ability to define and understand the shape and nature of any given scenario. It relates to practical questions, such as: What is driving the shocks in a given scenario? Which provided assumptions need translation into a portfolio? How are we going to manage the mapping/translation of the assumptions into all remaining macroeconomic and financial series?
The second principle relates to the cross-consistency of the forecasts, answering queries such as: How can we ensure that the shapes of all economic time series are in line with each other? What happens if it is difficult to reconcile the dynamics of a given variable with others that are being calculated in our models? Achieving a consistent set of economic parameters is not a trivial task and can pose formidable challenges to existing models and systems.
The third principle refers to the severity (or probability) of any given scenario. Many stress testing frameworks are centered on probability thresholds that need to be measured at a certain point in a distribution. Systems should be able to rank-order scenarios and potentially compute likelihoods of alternative scenario forecasts. The comparison of a specific scenario against the 2009-2010 crisis is a recurring practical request. Stress testing frameworks should be able to handle such queries.
These three principles illustrate the features necessary to develop an integrated stress testing system, but should not be seen as “sufficient” conditions that automatically guarantee integration.
(A) First principle: scenario definition
The first important building block for a scenario-driven stress test is the ability to characterize or define a scenario. In practice, this translates into a combination of (i) qualitative guidelines for the scenario’s shape (the so-called “narratives”) and (ii) quantitative projections of specific macroeconomic and financial series (the so-called “assumptions” or “known shocks”). The extent of provided assumptions varies widely – from just a handful of key economic factors to a fairly comprehensive list of financial and macroeconomic drivers. The practical challenge for the modeler is to map (i.e., translate) the given assumptions into a larger set of economic and financial variables that will be used at a later stage in the stress testing process.
This critical starting point can be achieved by splitting the task into two steps:
- (A.1) Expanding the assumptions to other “core” macro series.
- (A.2) Using the outputs of the core model to extend the shocks to a large set of economic and financial parameters. This second step is achieved through the use of so-called “satellite models.”
The next two sections describe these steps with the help of some mathematics. The underlying logic is simple. The process starts with a set of assumptions, captured in a vector x. The next step consists of translating these shocks into a group of “core” macroeconomic and financial series, collected in the vector y. With the information from the pair (x,y) we populate values for a long list of parameters, grouped into what we refer to as the vector of satellite models: z=(z^{1},z^{2},…,z^{S}). The final output of the scenario phase is summarized in the triplet (x,y,z). This combined vector is the starting point for the modeling of credit, market, liquidity, and operational risk parameters.
(A.1) Core macroeconomic model
The stress testing program starts with a set of assumptions, grouped in a vector of exogenous variables x∈R^{N}. A typical economic model will contain relationships between:
- These series
- Other core macro variables, for instance y∈R^{M}
- Random variables or shocks (e.g., residuals in empirical models or “innovations” in structural macro models), for instance ε∈R^{T}
The target variables to calculate are those contained in the vector of endogenous variables y∈R^{M}. These targets are related to the assumptions embedded in x through a generic system of equations. We represent this system as the mapping F:R^{N}×R^{T}×R^{M}→R^{M} so that F(x,ε,y)=0. With a fixed value for the vector of assumptions x, the modeler will obtain projections for the remaining economic variables and potential paths for the residuals grouped in ε. The analyst faces a risk of multiplicity; i.e., having several solutions for (ε,y) that are consistent with the vector x.
Time-series models are a good example of standard core macro models. These equations contain lags of the endogenous variables together with (x,ε) as right-hand-side drivers of the model:
(x_{t},y_{t})=f(x_{t-1},x_{t-2},...,x_{t-L},y_{t-1},y_{t-2},...,y_{t-L},ε_{t})
This system highlights the key challenge for solving core models: the set of variables x_{t} still appears on the left-hand side of the equation. In other words, the system defines the relationship between x and y implicitly – through a generic implicit function F(x,ε,y)=0 – and not necessarily by following an explicit functional form, such as y=f(x,ε). Solving implicit functions can require modelers to work around local solutions and run simulations to understand the shape of the vectors (ε,y) around a given value of x. The problem of multiplicity present in this phase is very hard – perhaps impossible – to overcome in practice.
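As a concrete sketch of this implicit-function problem, consider a hypothetical bivariate VAR(1) core model (all coefficients below are illustrative, not estimates). Fixing the path of x pins down the residual in the x-equation at each date, while the y-equation still admits many residual paths; a minimum-norm choice (setting the free shock to zero) selects one of the many admissible solutions, which is exactly the multiplicity issue described above.

```python
import numpy as np

# Hypothetical bivariate VAR(1) core model: w_t = A w_{t-1} + eps_t,
# with w_t = (x_t, y_t). A scenario pins down the path of x; we solve
# for a pair (eps, y) consistent with it.
A = np.array([[0.8, 0.1],
              [0.3, 0.6]])

x_path = np.array([1.0, 1.5, 2.0, 1.8])   # assumed scenario path for x
w_prev = np.array([0.0, 0.0])             # last observed historical values

y_path, eps_path = [], []
for x_t in x_path:
    pred = A @ w_prev                 # model-implied forecast before shocks
    eps_x = x_t - pred[0]             # residual needed to hit the assumption
    eps_y = 0.0                       # minimum-norm choice: one of many
                                      # admissible solutions (multiplicity)
    y_t = pred[1] + eps_y
    y_path.append(y_t)
    eps_path.append((eps_x, eps_y))
    w_prev = np.array([x_t, y_t])

print(np.round(y_path, 4))
```

Any other path for eps_y would yield a different, equally admissible y-path, which is why simulations around a given x are often needed to map out the solution set.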
The completion of phase (A.1) provides the modeler with a full set of macro projections for all core economic series, grouped in the pair (x,y). These vectors will be used as drivers on several satellite models, as described in section (A.2).
(A.2) Satellite macroeconomic and financial models
Having completed the calculation of all core macro series, a natural next step is to run satellite models to expand the forecasts to a larger set of economic and financial variables. Some market risk portfolios, for example, will require hundreds if not thousands of “parameter estimates” to carry on with the stress test. From a mathematical viewpoint, the key distinction between these satellite equations and the core model is that satellite variables are typically derived directly and explicitly from the values of the core variables.
Figure 1. Phases of the scenario workflow
Source: Moody's Analytics
Figure 2. Satellite equations positioned around a core macroeconomic model
Source: Moody's Analytics
In formal terms, consider a group of S satellite models, labeled s∈{1,2,3,…,S}. Each of these equations is such that the endogenous variables, z^{s}∈R^{Ps}, can be obtained as an explicit mapping of the core economic variables: z^{s}=g^{s} (x,y). In other words, there are no feedback effects between satellite variables, z^{s}, and core macro series, (x,y).
To illustrate the concept of satellite models, the previous time-series example can be extended to the behavior of z^{s}_{t}, with lags of (x,y,z) and a residual term, μ^{s}_{t}, as potential explanatory variables:

z^{s}_{t}=g^{s}(x_{t},x_{t-1},…,x_{t-L},y_{t},y_{t-1},…,y_{t-L},z^{s}_{t-1},z^{s}_{t-2},…,z^{s}_{t-L},μ^{s}_{t})
With z^{s}_{t} as the only variable on the left-hand side of the equation, the relationship is unidirectional: from (x,y) to z. This simple time-series satellite model allows neither interactions with other satellite variables nor any feedback between z^{s}_{t} and the economic series in (x,y).
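A minimal sketch of one such satellite equation, using synthetic data and hypothetical coefficients: the explicit mapping z=g^{s}(x,y) is fitted by least squares on history and then applied, strictly one-way, to the scenario paths of the core variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic history of core drivers (x, y) and a satellite series z,
# e.g., a market spread assumed to load on the core variables plus noise.
T = 200
x = rng.normal(size=T)
y = 0.5 * x + rng.normal(scale=0.5, size=T)
z = 2.0 + 1.5 * x - 0.8 * y + rng.normal(scale=0.1, size=T)

# Estimate the explicit satellite mapping z = g(x, y) by least squares.
X = np.column_stack([np.ones(T), x, y])
beta, *_ = np.linalg.lstsq(X, z, rcond=None)

# Apply the fitted mapping to scenario paths for (x, y): no feedback
# from z back into the core model.
x_scn = np.array([1.0, 2.0, 1.5])
y_scn = np.array([0.4, 0.9, 0.7])
z_scn = np.column_stack([np.ones(3), x_scn, y_scn]) @ beta
print(np.round(z_scn, 2))
```

In a production setting each of the S satellite models would be one such fitted mapping, possibly with lags of (x,y,z) among the regressors as in the time-series example above.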
The final output of these two phases can be summarized in the triplet (x,y,z)=(x,y,z^{1},z^{2},…,z^{S})=(x,y,g^{1} (x,y),g^{2} (x,y),…,g^{S} (x,y)). These elements represent the starting point of the stress testing process that is needed to carry on with the subsequent modeling and analytical steps (such as the calibration of market and credit risk parameters to the scenario assumptions, mapping of scenario parameters to liquidity metrics and balance sheet items, etc.). Properly defining a scenario is paramount, as it serves as a necessary building block for an integrated stress testing framework.
(B) Second principle: consistency of macroeconomic scenarios
A critical challenge for implementing specific scenarios into a stress testing process is ensuring the overall consistency of the economic variables that characterize any given scenario. The severity of the shocks, timing of the recession, speed of the recovery and other dimensions need to match across all variables to obtain a consistent scenario. The practical challenge – translated into our mathematical set-up – amounts to “solving” the system F(x,ε,y)=0, having started with the assumptions imposed on the vector x. But what if these assumptions contradict the expected shape or behavior predicted by the macro models? The error term (or random variable) ε serves as a residual term to try to “close the gap” between the values of y and x. In practice, if it is hard to make sense of the relationship between elements of y and x (i.e., if the error term is very significant), modelers tend to isolate those “problematic” variables and build specific satellite equations for them. Under such circumstances, some of the relationships described in F(x,ε,y)=0 are no longer valid, and those variables are instead modeled through satellite equations.
The math can be summarized as follows: (a) a subset of endogenous variables (the so-called “problematic” series), say y̅, gets mapped through satellite models: y̅=g^{s̅}(x); (b) the remaining ones, say y̿, are solved through the conditional sub-system:

F(x,ε,y̅,y̿)=0, with y̅=g^{s̅}(x) held fixed; i.e., solving F(x,ε,g^{s̅}(x),y̿)=0 for (ε,y̿).
The ability to identify, document, and fix scenario inconsistencies is crucial to the stress testing scenario-building process. Underestimating this challenge can lead to an inordinate amount of workload for the analysts who are responsible for subsequent phases of the stress testing process. Making sense of the stressed numbers will be nearly impossible if the starting assumptions carry tangible inconsistencies.
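The carve-out just described can be sketched as follows (all coefficients are hypothetical): a “problematic” endogenous series is overridden by its own satellite equation in x, and the remaining endogenous series is then solved through the conditional sub-system, taking the overridden values as given.

```python
import numpy as np

# Sketch: core system with assumption x and two endogenous series
# y = (y1, y2). Suppose y1 proves hard to reconcile with x, so it is
# carved out and mapped through its own satellite equation y1 = g(x),
# while y2 is still solved through the conditional sub-system.
def g_satellite(x_t):
    # hypothetical satellite fit for the "problematic" series
    return 0.9 * x_t - 0.2

x_path = np.array([1.0, 1.2, 0.8])
y2_prev = 0.0
y1_path, y2_path = [], []
for x_t in x_path:
    y1_t = g_satellite(x_t)                          # satellite override
    y2_t = 0.5 * y2_prev + 0.4 * x_t + 0.3 * y1_t    # remaining equation,
                                                     # conditional on y1_t
    y1_path.append(y1_t)
    y2_path.append(y2_t)
    y2_prev = y2_t
print(np.round(y1_path, 3), np.round(y2_path, 3))
```

The original core equation for y1 is deliberately dropped here; documenting which relationships were overridden, and why, is part of the consistency audit trail discussed above.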
(C) Third principle: scenario severity/probability
Many stress testing frameworks are built around specific probabilities/severities of the initial shocks/scenarios. To this end, how can banks compute the probability of a given macroeconomic scenario? The multidimensional nature of macro forecasts makes the assignment of a single probability not a trivial exercise. Not only do banks face several variables within a scenario, but they also observe them over subsequent points in time. The analyst must produce a mapping of this multi-variate object into a probability, a number between 0 and 1. In mathematical terms, the task is to translate the values of the vector (x,y,z) into p=P(x,y,z)∈[0,1].
In practice, this is typically handled through simulation exercises. The steps involved in the process are as follows:
1. (C.1) Simulating the core macroeconomic model: Use the core macroeconomic model to simulate paths for the variables (x,y). Simulations are typically based on shocks to the random variable ε, combined with shocks to the parameter estimates themselves (usually referred to as “coefficient uncertainty”). Each realization of the shocks (i.e., each simulation), say j∈{1,2,…,J}, will produce a set of values for the vector (x^{j},y^{j},ε^{j}).
2. (C.2) Running the simulations through satellite models: For each realization of the shocks, the vector of simulated economic parameters, (x^{j},y^{j}), is mapped into all satellite models. Each satellite model can bring its own source of additional uncertainty (through μ^{s}_{t} in the example of a time-series satellite model). The outcome of this second step is summarized with the set of simulated paths: (x^{j},y^{j},z^{1,j},z^{2,j},…,z^{S,j})=(x^{j},y^{j},z^{j}), for all j∈{1,2,…,J}.
3. (C.3) Computing scenario probabilities/severities: After completing the simulation exercise, the final step is to compute a probability for each simulated path: (x^{j},y^{j},z^{j}). To simplify things, decompose the vector of macroeconomic and financial series into observations over time: (x^{j},y^{j},z^{j})=(x^{j}_{t},y^{j}_{t},z^{j}_{t}), for all points in time t. Assuming the last observed historical value is at t̃, the probability P(x^{j}_{t>t̃},y^{j}_{t>t̃},z^{j}_{t>t̃}) needs to be computed for each simulation j∈{1,2,…,J}.
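Steps (C.1) through (C.3) can be sketched on a toy two-variable core model with one satellite equation (all coefficients and shock scales below are illustrative): simulate J joint paths, push each through the satellite mapping, and then position a candidate scenario within the simulated distribution.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy core model w_t = A w_{t-1} + eps_t with hypothetical coefficients.
A = np.array([[0.8, 0.1],
              [0.3, 0.6]])
J, T = 5000, 8
w0 = np.array([0.0, 0.0])

peaks = np.empty(J)       # peak of the second core series per simulation
z_peaks = np.empty(J)     # peak of the satellite series per simulation
for j in range(J):                       # (C.1) simulate core paths
    w, path = w0, []
    for _ in range(T):
        w = A @ w + rng.normal(scale=0.5, size=2)
        path.append(w)
    path = np.array(path)
    z = 1.5 * path[:, 0] - 0.8 * path[:, 1]   # (C.2) satellite mapping
    z_peaks[j] = z.max()
    peaks[j] = path[:, 1].max()

# (C.3) severity of a candidate scenario: the share of simulations with
# a lower peak, i.e., the scenario's position in the simulated cloud.
scenario_peak = 2.0
severity = (peaks < scenario_peak).mean()
print(round(severity, 3))
```

Coefficient uncertainty, omitted here for brevity, would add an outer draw of the model parameters themselves before each path simulation.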
Consider a practical example where the first variable in the vector of original economic assumptions, x^{1}_{t}, represents the unemployment rate at time t. The simplest probability metric is constructed by rank-ordering the simulations according to their forecasted peak unemployment rate; in other words, by comparing the projected peak rate against the overall distribution of unemployment rates observed in history, as well as in any simulated path. This approach reduces the dimension of the problem to a single number: the positioning of the simulated peak unemployment rate within its overall distribution. The problem with such a simplistic probability metric is that it misses the potential duration of the crisis and ignores economic sectors beyond labor market conditions. A more robust probability/scoring mechanism combines alternative severity dimensions into an overall score. Extending the unemployment rate example, alternative score loadings could include: (a) number of periods with an increasing unemployment rate, (b) number of periods with unemployment rates above a long-term average, (c) predicted peak of the unemployment rate, and (d) number of periods with the unemployment rate within the top 25% of the sample (top quartile). These loadings (factors) are assigned specific weights in the final score.
In mathematical terms, let q∈{1,2,…,Q} represent the set of alternative score loadings (or marginal factors), each with an associated marginal score, score^{q}, and weight w_{q}. The total score can be computed as a convex combination of these marginal scores:

Score = w_{1}·score^{1} + w_{2}·score^{2} + … + w_{Q}·score^{Q}, with w_{q}≥0 and w_{1}+w_{2}+…+w_{Q}=1.
To complete the final step, scores are mapped into probabilities.
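A sketch of this scoring step, with hypothetical loadings and equal weights: the four unemployment-rate loadings listed earlier are computed per simulated path, standardized, combined through a convex combination, and mapped to probabilities by each score's rank in the simulated distribution (an empirical CDF).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic unemployment-rate paths: a random walk around a 5% long-run
# average (purely illustrative numbers).
J, T = 1000, 12
paths = np.cumsum(rng.normal(scale=0.3, size=(J, T)), axis=1) + 5.0

def loadings(u):
    # (a) periods rising, (b) periods above the long-run average,
    # (c) predicted peak, (d) periods in the top quartile of the sample
    return np.array([
        (np.diff(u) > 0).sum(),
        (u > 5.0).sum(),
        u.max(),
        (u > np.quantile(paths, 0.75)).sum(),
    ])

w = np.array([0.25, 0.25, 0.25, 0.25])             # convex weights, sum to 1
raw = np.array([loadings(u) for u in paths])
raw = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # put loadings on one scale
scores = raw @ w                                   # convex combination

# Empirical-CDF mapping: a score's rank becomes the probability of
# observing a less severe outcome among the simulations.
probs = scores.argsort().argsort() / (J - 1)
```

The weights w are a policy choice; in practice they would be calibrated so that benchmark historical episodes land at the intended severity percentiles.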
This framework will help analysts attach probabilities to each and every economic scenario that is being proposed. For those frameworks that are built around severity/probability thresholds, this foundation helps integrate the scenario phase into the overall stress testing program.
The integration of alternative scenarios into stress testing frameworks remains an ongoing challenge for most financial institutions. The three principles discussed in this article help illustrate how to develop an integrated stress testing system, but should not be seen as automatically guaranteeing integration. Banks need to understand and analyze the three principles to have a realistic chance of integrating their alternative scenario work into their stress testing workflow.