Integrating Macroeconomic Scenarios into a Stress Testing Framework
This article describes three principles that banks need to understand and analyze to have a realistic chance of integrating alternative scenario work into their stress testing workflows.
The integration of alternative scenarios into stress testing frameworks remains an open challenge for most financial institutions. Existing processes rely heavily on manual spreadsheet work that carries inefficiencies and computational challenges.
The first principle refers to the ability to define and understand the shape and nature of any given scenario. It relates to practical questions, such as: What is driving the shocks in a given scenario? Which provided assumptions need translation into a portfolio? How are we going to manage the mapping/translation of the assumptions into all remaining macroeconomic and financial series?
The second principle relates to the cross-consistency of the forecasts, answering queries such as: How can we ensure that the shapes of all economic time series are in line with each other? What happens if it is difficult to reconcile the dynamics of a given variable with others that are being calculated in our models? Achieving a consistent set of economic parameters is not a trivial task and can pose formidable challenges to existing models and systems.
The third principle refers to the severity (or probability) of any given scenario. Many stress testing frameworks are centered on probability thresholds that need to be measured at a certain point in a distribution. Systems should be able to rank-order scenarios and potentially compute likelihoods of alternative scenario forecasts. The comparison of a specific scenario against the 2009–2010 crisis is a recurring practical request. Stress testing frameworks should be able to handle such queries.
These three principles illustrate the features necessary to develop an integrated stress testing system, but should not be seen as “sufficient” conditions that automatically guarantee integration.
(A) First principle: scenario definition
The first important building block for a scenario-driven stress test is the ability to characterize or define a scenario. In practice, this translates into a combination of (i) qualitative guidelines for the scenario’s shape (the so-called “narratives”) and (ii) quantitative projections of specific macroeconomic and financial series (the so-called “assumptions” or “known shocks”). The extent of provided assumptions varies widely – from just a handful of key economic factors to a fairly comprehensive list of financial and macroeconomic drivers. The practical challenge for the modeler is to map (i.e., translate) the given assumptions into a larger set of economic and financial variables that will be used at a later stage in the stress testing process.
This critical starting point can be achieved by splitting the task into two steps:
 (A.1) Expanding the assumptions to other “core” macro series.
 (A.2) Using the outputs of the core model to extend the shocks to a large set of economic and financial parameters. This second step is achieved through the use of so-called “satellite models.”
The next two sections describe these steps with the help of some mathematics. The underlying logic is simple. The process starts with a set of assumptions, captured in a vector x. The next step consists of translating these shocks into a group of “core” macroeconomic and financial series, collected in the vector y. With the information from the pair (x,y) we populate values for a long list of parameters, grouped into what we refer to as the vector of satellite models: z=(z^{1},z^{2},…,z^{S}). The final output of the scenario phase gets summarized in the triplet (x,y,z). This combined vector is the starting point for the modeling of credit, market, liquidity, and operational risk parameters.
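The two-step flow above can be sketched in a few lines of code. This is a minimal, illustrative skeleton only: the function names (`expand_core`, `run_satellites`) and the toy linear relationships are hypothetical, not part of any production model or library.

```python
# Illustrative skeleton of the scenario phase: assumptions x are expanded
# into core series y (step A.1), then fanned out to satellite variables z
# (step A.2). All coefficients and function names are hypothetical.

def expand_core(x):
    """Step (A.1): map the assumptions x to core macro series y."""
    # Toy rule: each core series reacts linearly to the shocks in x.
    return [0.5 * x[0], -0.3 * x[0] + 0.1 * x[1]]

def run_satellites(x, y):
    """Step (A.2): explicit satellite mappings z^s = g^s(x, y)."""
    g1 = lambda x, y: x[0] + 2.0 * y[0]          # satellite model 1
    g2 = lambda x, y: 0.5 * y[1] - 0.25 * x[1]   # satellite model 2
    return [g1(x, y), g2(x, y)]

x = [1.0, -2.0]            # vector of scenario assumptions
y = expand_core(x)         # core projections
z = run_satellites(x, y)   # satellite projections
scenario = (x, y, z)       # the triplet (x, y, z) that feeds the risk models
```

The key structural point the sketch illustrates is directionality: `z` is computed from `(x, y)` but never feeds back into them.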
(A.1) Core macroeconomic model
The stress testing program starts with a set of assumptions, grouped in a vector of exogenous variables x∈R^{N}. A typical economic model will contain relationships between:
 These series
 Other core macro variables, for instance y∈R^{M}
 Random variables or shocks (e.g., residuals in empirical models or “innovations” in structural macro models), for instance ε∈R^{T}
The target variables to calculate are those contained in the vector of endogenous variables y∈R^{M}. These targets are related to the assumptions embedded in x through a generic system of equations. We represent this system as the mapping F: R^{N} × R^{T} × R^{M} → R^{M} so that F(x,ε,y)=0. With a fixed value for the vector of assumptions x, the modeler will obtain projections for the remaining economic variables and potential paths for the residuals grouped in ε. The analyst faces a risk of multiplicity; i.e., having several solutions for (ε,y) that are consistent with the vector x.
Time-series models are a good example of standard core macro models. These equations contain lags of the endogenous variables together with (x,ε) as right-hand-side drivers of the model:
(x_{t},y_{t}) = f(x_{t-1},x_{t-2},...,x_{t-L},y_{t-1},y_{t-2},...,y_{t-L},ε_{t})
This system highlights the key challenge for solving core models: the set of variables x_{t} still appears on the left-hand side of the equation. In other words, the system defines the relationship between x and y implicitly – through a generic implicit function F(x,ε,y)=0 – and not necessarily by following an explicit functional form, such as y=f(x,ε). Solving implicit functions can require the modelers to work around local solutions and run simulations to understand the shape of the vectors (ε,y) around a given value of x. The problem of multiplicity present in this phase is very hard – perhaps impossible – to overcome in practice.
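One standard way to solve such an implicit system numerically is fixed-point iteration: hold x at its assumed scenario values and iterate the endogenous vector y until the equations are mutually satisfied. The sketch below uses a toy two-equation linear system (the coefficients and update rules are purely illustrative, not an actual macro model):

```python
# Minimal sketch: solve an implicit core system F(x, eps, y) = 0 by
# fixed-point iteration, with x fixed by the scenario assumptions.
# The two toy equations and their coefficients are hypothetical.

def solve_core(x, eps, tol=1e-10, max_iter=1000):
    y = [0.0, 0.0]  # initial guess for the endogenous variables
    for _ in range(max_iter):
        # Toy implicit relationships rearranged into update rules:
        #   y1 = 0.4*y2 + 0.2*x + eps1
        #   y2 = 0.3*y1 - 0.1*x + eps2
        y_new = [0.4 * y[1] + 0.2 * x + eps[0],
                 0.3 * y[0] - 0.1 * x + eps[1]]
        if max(abs(a - b) for a, b in zip(y_new, y)) < tol:
            return y_new
        y = y_new
    raise RuntimeError("core model did not converge")

# With the assumption x = 1.0 and zero residuals, the iteration settles
# on the unique y consistent with both equations.
y = solve_core(x=1.0, eps=[0.0, 0.0])
```

For this contraction the iteration converges quickly; real macro systems may be non-contractive or admit multiple solutions, which is exactly the multiplicity risk discussed above.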
The completion of phase (A.1) provides the modeler with a full set of macro projections for all core economic series, grouped in the pair (x,y). These vectors will be used as drivers on several satellite models, as described in section (A.2).
(A.2) Satellite macroeconomic and financial models
Having completed the calculation of all core macro series, a natural next step is to run satellite models to expand the forecasts to a larger set of economic and financial variables. Some market risk portfolios, for example, will require hundreds if not thousands of “parameter estimates” to carry on with the stress test. From a mathematical viewpoint, the key distinction between these satellite equations and the core model is that satellite variables are typically derived directly and explicitly from the values of the core variables.
Figure 1. Phases of the scenario workflow
Source: Moody's Analytics
Figure 2. Satellite equations positioned around a core macroeconomic model
Source: Moody's Analytics
In formal terms, consider a group of S satellite models, labeled s∈{1,2,3,…,S}. Each of these equations is such that the endogenous variables, z^{s}∈R^{Ps}, can be obtained as an explicit mapping of the core economic variables: z^{s}=g^{s} (x,y). In other words, there are no feedback effects between satellite variables, z^{s}, and core macro series, (x,y).
To illustrate the concept of satellite models, the previous time-series example can be extended to the behavior of z^{s}_{t}, with lags of (x,y,z) and a residual term, μ^{s}_{t}, as potential explanatory variables:
z^{s}_{t} = g^{s}(x_{t},x_{t-1},...,x_{t-L},y_{t},y_{t-1},...,y_{t-L},z^{s}_{t-1},...,z^{s}_{t-L},μ^{s}_{t})
With z^{s}_{t} as the only variable on the left-hand side of the equation, the relationship is unidirectional: from (x,y) to z. This simple time-series satellite model allows neither interactions with other satellite variables nor any feedback between z^{s}_{t} and the economic assumptions in (x,y).
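Rolled forward in time, a single satellite equation of this type looks like the following sketch. The coefficients, the one-lag structure, and the sample paths are illustrative assumptions, not estimated values:

```python
# Sketch of one satellite time-series equation: z_t depends only on the
# current core series (x_t, y_t), its own lag z_{t-1}, and a residual mu_t.
# Coefficients a, b, c are hypothetical, not estimated.

def satellite_step(x_t, y_t, z_prev, mu_t):
    """z_t = a*x_t + b*y_t + c*z_{t-1} + mu_t (unidirectional: no feedback)."""
    a, b, c = 0.2, 0.5, 0.8
    return a * x_t + b * y_t + c * z_prev + mu_t

# Roll the equation forward over a given scenario path for (x, y).
x_path = [1.0, 1.2, 0.9]
y_path = [0.5, 0.4, 0.6]
z_path, z = [], 0.0
for x_t, y_t in zip(x_path, y_path):
    z = satellite_step(x_t, y_t, z, mu_t=0.0)
    z_path.append(z)
```

Because the core path (x, y) is an input that the recursion never modifies, the satellite layer can be run independently for each of the S models, which is what makes this stage cheap relative to solving the core system.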
The final output of these two phases can be summarized in the triplet (x,y,z)=(x,y,z^{1},z^{2},…,z^{S})=(x,y,g^{1} (x,y),g^{2} (x,y),…,g^{S} (x,y)). These elements represent the starting point of the stress testing process that is needed to carry on with the subsequent modeling and analytical steps (such as the calibration of market and credit risk parameters to the scenario assumptions, mapping of scenario parameters to liquidity metrics and balance sheet items, etc.). Properly defining a scenario is paramount, as it serves as a necessary building block for an integrated stress testing framework.
(B) Second principle: consistency of macroeconomic scenarios
A critical challenge for implementing specific scenarios into a stress testing process is ensuring the overall consistency of the economic variables that characterize any given scenario. The severity of the shocks, timing of the recession, speed of the recovery, and other dimensions need to match across all variables to obtain a consistent scenario. The practical challenge – translated into our mathematical setup – amounts to “solving” the system F(x,ε,y)=0, having started with the assumptions imposed on the vector x. But what if these assumptions contradict the shape or behavior predicted by the macro models? The error term (or random variable) ε serves as a residual term to try to “close the gap” between the values of y and x. In practice, if it is hard to make sense of the relationship between elements of y and x (i.e., if the error term is very significant), modelers tend to isolate those “problematic” variables and build specific satellite equations for them. Under such circumstances, some of the relationships described in F(x,ε,y)=0 are no longer valid and they are modeled through satellite equations.
The math can be summarized as follows: (a) a subset of endogenous variables (the so-called “problematic” series), say y̅, gets mapped through satellite models: y̅ = g^{s̅}(x); (b) the remaining ones, say y̿, are solved through the conditional subsystem:
F(x, ε, y̅, y̿) = 0, conditional on y̅ = g^{s̅}(x)
The ability to identify, document, and fix scenario inconsistencies is crucial to the stress testing scenario-building process. Underestimating this challenge can create an inordinate workload for the analysts who are responsible for subsequent phases of the stress testing process. Making sense of the stressed numbers will be nearly impossible if the starting assumptions carry tangible inconsistencies.
(C) Third principle: scenario severity/probability
Many stress testing frameworks are built around specific probabilities/severities of the initial shocks/scenarios. To this end, how can banks compute the probability of a given macroeconomic scenario? The multidimensional nature of macro forecasts makes the assignment of a single probability not a trivial exercise. Not only do banks face several variables within a scenario, but they also observe them over subsequent points in time. The analyst must produce a mapping of this multivariate object into a probability, a number between 0 and 1. In mathematical terms, the task is to translate the values of the vector (x,y,z) into p=P(x,y,z)∈[0,1].
In practice, this is typically handled through simulation exercises. The steps involved in the process are as follows:
1. (C.1) Simulating the core macroeconomic model: Use the core macroeconomic model to simulate paths for the variables (x,y). Simulations are typically based on shocks to the random variable ε combined with shocks to the parameter estimations or model (usually referred to as “coefficient uncertainty”). Each realization of the shocks (i.e., each simulation), say j∈{1,2,…,J}, will produce a set of values for the vector (x^{j},y^{j},ε^{j}).
2. (C.2) Running the simulations through satellite models: For each realization of the shocks, the vector of simulated economic parameters, (x^{j},y^{j}), is mapped into all satellite models. Each satellite model can bring its own source of additional uncertainty (through μ^{s}_{t} in the example of a time-series satellite model). The outcome of this second step is summarized with the set of simulated paths: (x^{j},y^{j},z^{1,j},z^{2,j},…,z^{S,j})=(x^{j},y^{j},z^{j}), for all j∈{1,2,…,J}.
3. (C.3) Computing scenario probabilities/severities: After completing the simulation exercise, the final step is to compute a probability for each simulated economic path: (x^{j},y^{j},z^{j}). To simplify things, decompose the vector of macroeconomic and financial series into observations over time: (x^{j},y^{j},z^{j})=(x^{j}_{t},y^{j}_{t},z^{j}_{t}), for all points in time t. Assuming the last observed historical value is t̃, the probability P(x^{j}_{t>t̃},y^{j}_{t>t̃},z^{j}_{t>t̃}) for each simulation j∈{1,2,…,J} needs to be computed.
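The three steps above can be sketched with a deliberately simple Monte Carlo exercise. Here a single core variable follows a toy AR(1) process, and severity is read off as the rank of a path's peak within the simulated distribution; the model, parameters, and function names are all illustrative assumptions:

```python
import random

# Sketch of the simulation exercise (C.1)-(C.3): draw shocks, run a toy
# one-variable core model, and rank a summary statistic (the peak of each
# simulated path) against the full simulated distribution.

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_path(horizon=8):
    """(C.1): one simulated path of a core variable under random shocks."""
    y, path = 0.0, []
    for _ in range(horizon):
        y = 0.7 * y + random.gauss(0.0, 1.0)  # toy AR(1) with unit shocks
        path.append(y)
    return path

# (C.2) is trivial here (no satellite layer); collect J simulated peaks.
J = 5000
peaks = sorted(max(simulate_path()) for _ in range(J))

def severity_rank(peak):
    """(C.3): fraction of simulations whose peak falls below this value."""
    return sum(1 for p in peaks if p < peak) / len(peaks)

# A scenario whose simulated peak sits near the 95th percentile of the
# distribution is ranked as more severe than roughly 95% of the paths.
p = severity_rank(peaks[int(0.95 * J)])
```

In a production setting the simulated object would be the full vector (x^{j}, y^{j}, z^{j}), and coefficient uncertainty would add a second layer of random draws around the model parameters themselves.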
Consider a practical example where the first variable on the vector of original economic assumptions, x^{1}_{t}, represents the unemployment rate at time t. The simplest probability metric is constructed by rank-ordering the simulations according to their forecasted peak unemployment rate. In other words, comparing the projected peak rate against the overall distribution of unemployment rates observed in history, as well as in any simulated path. This reduces the dimension of the problem to a single number: the positioning of the simulated peak unemployment rate against its overall distribution. The problem with such a simplistic probability metric is that it misses the potential duration of the crisis and ignores other economic sectors beyond labor market conditions. A more robust probability/scoring mechanism will combine alternative severity dimensions into an overall score, illustrated by extending the unemployment rate example with alternative score loadings, such as: (a) number of periods with an increasing unemployment rate, (b) number of periods with unemployment rates above a long-term average, (c) predicted peak of the unemployment rate, and (d) number of periods with the unemployment rate within the top 25% of the sample (first quartile). These loadings (factors) are assigned specific weights in the final score.
In mathematical terms, let q∈{1,2,…,Q} represent the set of alternative score loadings (or marginal factors). The total score can be computed as a convex combination of these marginal scores:
Score = Σ_{q=1}^{Q} w_{q}·score_{q}, with w_{q} ≥ 0 and Σ_{q=1}^{Q} w_{q} = 1
To complete the final step, scores are mapped into probabilities.
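Putting the four unemployment loadings together, the convex-combination score can be sketched as follows. The weights, the scaling of the peak loading, and the sample path are all hypothetical choices made for illustration:

```python
# Sketch of the convex-combination severity score for the unemployment
# example: four marginal loadings (a)-(d), each normalized to [0, 1],
# combined with hypothetical weights that sum to one.

def severity_score(path, long_term_avg, q75):
    """Combine loadings (a)-(d) into one score via a convex combination."""
    horizon = len(path)
    # (a) share of periods with a rising unemployment rate
    rising = sum(1 for a, b in zip(path, path[1:]) if b > a) / (horizon - 1)
    # (b) share of periods above the long-term average
    above_avg = sum(1 for u in path if u > long_term_avg) / horizon
    # (c) predicted peak, scaled by an assumed 25% ceiling
    peak = max(path) / 25.0
    # (d) share of periods in the top quartile of the sample
    top_quartile = sum(1 for u in path if u > q75) / horizon
    weights = [0.25, 0.25, 0.30, 0.20]  # hypothetical weights, sum to 1
    loadings = [rising, above_avg, peak, top_quartile]
    return sum(w * l for w, l in zip(weights, loadings))

# Unemployment path in percent; long-term average 5.5%, 75th percentile 7.0%.
score = severity_score([5.0, 6.0, 8.0, 9.5, 9.0, 8.0], 5.5, 7.0)
```

Because the weights are non-negative and sum to one, the total score inherits the [0, 1] range of its loadings, which makes the subsequent mapping of scores into probabilities straightforward (for instance, via the score's rank within the simulated distribution).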
This framework will help analysts attach probabilities to each and every economic scenario that is being proposed. For those frameworks that are built around severity/probability thresholds, this foundation helps integrate the scenario phase into the overall stress testing program.
The integration of alternative scenarios into stress testing frameworks remains an ongoing challenge for most financial institutions. The three principles discussed in this article help illustrate how to develop an integrated stress testing system, but should not be seen as automatically guaranteeing integration. Banks need to understand and analyze the three principles to have a realistic chance of integrating their alternative scenario work into their stress testing workflow.
Dr. Juan M. Licari, Dr. Olga Loiseau-Aslanidi, Dr. José Suárez-Lledó
