    Implementation of Least-Squares Monte Carlo (LSMC) in a Life Insurance Context – A Case Study

    Introduction

    This case study first introduces Value-at-Risk (VaR) and its use for assessing economic capital, and focuses on the challenges of its implementation in the insurance sector. It addresses the idea of using analytical approximations (liability proxy functions) with market-consistent liability valuations to reduce the computational burden of the 1-year VaR calculation. It then provides a high-level overview of the proxy fitting methods and their relative strengths and weaknesses, as well as outlining the key criteria a good fitting method is expected to meet.

    This case study then reviews the experiences of Generali Deutschland in developing liability proxy functions for a 1-year Value-at-Risk assessment of a complex, long-term life insurance business. The scope of this case study covers the calculation of the probability distribution forecast of different life insurance business units’ own funds by means of LSMC, as well as the deduction of the respective risk capital. It then presents a validation and interpretation of the results.

    About Generali Deutschland Group

    With a premium income of about €17.2 billion and over 13.5 million customers, the Generali Deutschland Group is the second-largest primary insurance group in the German market. The Generali Deutschland Group includes companies such as Generali Versicherungen, AachenMünchener, CosmosDirekt, Central Krankenversicherung, Advocard Rechtsschutzversicherung, Deutsche Bausparkasse Badenia and Dialog, as well as the Group-owned service providers Generali Deutschland Informatik Services, Generali Deutschland Services, Generali Deutschland Schadenmanagement and Generali Deutschland SicherungsManagement.

    1 VaR and Life Insurers

    The 1-year Value-at-Risk of a market-consistent balance sheet has emerged as the standard quantitative definition of economic capital, with many of the world’s leading multinational insurance groups using VaR to develop models to assess internal economic capital. It is the fundamental measure of the Pillar I solvency capital requirement (SCR) in Solvency II and also features in the International Association of Insurance Supervisors’ (IAIS) suggested solvency capital methodology. Furthermore, Solvency II’s Internal Model requirements demand that the model is capable of providing the full probability distribution of the year-end net asset value, as well as the specific 99.5th VaR percentile.

    VaR initially emerged in the banking sector in the 1990s as a basis for risk capital definition. Its implementation in the insurance sector has its own particular challenges:

    • Market-consistent liability valuation. Market-consistent life insurance liability valuations cannot be obtained from a Bloomberg screen. Moreover, the complexity of some insurance liabilities (particularly the long-term path-dependent guarantees found in the life insurance business) means that their valuation functions cannot be obtained from standard derivative pricing libraries. In these respects, the insurance liabilities are no different from the exotic OTC derivatives that are incorporated into a bank’s VaR implementations. However, insurance liabilities are typically more illiquid and longer-term than most of the instruments on banks’ balance sheets. These characteristics shift the valuation process from an objective mark-to-market exercise to a more subjective market-consistent valuation. The market-consistent valuation process can require firms to make difficult assumptions in areas such as the extrapolation of market prices (yield curves, option-implied volatilities), the impact that illiquidity has on a market price, and volatility and correlation for risks where virtually no relevant market price can be obtained (e.g., real estate, private equity). It is even more difficult to make these assumptions when the modelling challenge pertains not only to current valuation assumptions, but also to codifying how they behave across the full range of 1-year risk factor distributions used in economic capital assessment. For example, how does the illiquidity discount embedded in long-term illiquid assets change after equities fall 40%?
    • Use of a probability-based risk measure. Historically, the calculation of regulatory solvency capital has generally not used confidence levels that explicitly target a level of probability. Instead, it has relied on actuarial concepts such as margins for prudence. While the 1-year horizon that has emerged as the standard for economic capital assessment for insurance groups is radically short-term compared to the actuary’s traditional focus on the ability to fund liability cash flows as they fall due, it is still significantly longer-term than the projection horizons used in banks’ VaR calculations (which typically use a 10-day horizon). Having a projection horizon that is many times longer arguably results in relevant historical data series that are many times smaller. Put simply, it is hard not to feel some discomfort about basing the estimation of 1-in-200 year events on the analysis of 50 years of historical data.
    • Technology. Some important life insurance liabilities are sufficiently complex as to require the use of a Monte Carlo simulation in their market-consistent liability valuation. The VaR requirement for the calculation of the 1-year-ahead probability distribution of the market-consistent balance sheet therefore creates a nested stochastic modelling requirement. As illustrated in Figure 1, thousands of 1-year “real-world” risk factor simulations are generated, and each of these real-world simulations requires a market-consistent valuation that itself requires thousands of simulations. Insurance groups’ liability valuation software systems are generally not capable of supporting the running of millions of liability cash flow simulations and were never intended for such a task. Implementing the 1-year VaR assessment therefore demands either a major technology upgrade or a computational approximation.
    Figure 1. 1-year VaR and the nested stochastic problem
    Source: Moody's Analytics

    The calculation requirements of the nested stochastic implementation are highly demanding. While some firms explore IT solutions that can support the simulation requirements of the nested stochastic problem, the overwhelming majority use a form of approximation. The specific approaches can differ significantly, but they can generally all be described as ways of approximating how the market-consistent liability valuation changes as a function of the balance sheet’s risk factors. These functions can be referred to as liability proxy functions. The next section discusses the statistical methods that can be used to parameterize these functions.
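
    To make the computational saving concrete, the minimal sketch below (in Python, with made-up risk factors and coefficients – it is not any firm's actual proxy) shows how a fitted proxy polynomial is evaluated directly on each 1-year real-world scenario, replacing the inner market-consistent simulation loop of Figure 1.

```python
import numpy as np

# Hypothetical proxy polynomial with made-up coefficients for two risk factors:
# 1-year equity return and a parallel interest rate shift. In a full nested
# stochastic run, each scenario below would instead need thousands of inner
# market-consistent simulations.
def proxy_liability_value(equity_return, rate_shift):
    return (100.0
            - 25.0 * equity_return           # liability falls as equities rise
            - 300.0 * rate_shift             # liability falls as rates rise
            + 40.0 * equity_return ** 2      # convexity from embedded guarantees
            + 150.0 * equity_return * rate_shift)

rng = np.random.default_rng(0)
equity = rng.normal(0.0, 0.20, 10_000)       # 10,000 one-year real-world scenarios
rates = rng.normal(0.0, 0.01, 10_000)

liabilities = proxy_liability_value(equity, rates)
print("99.5th percentile liability value:", round(np.percentile(liabilities, 99.5), 1))
```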

    2 Methodologies for Proxy Fitting

    The previous section introduced the idea of using analytical approximations (liability proxy functions) with market-consistent liability valuations to reduce the computational burden of the 1-year VaR calculation. An economic capital assessment is only as accurate as its liability proxy functions, so the development of these functions is a crucial element of the economic capital calculation. A number of different statistical fitting methods have emerged for obtaining the liability proxy functions. This section provides a high-level overview of these methods and their relative strengths and weaknesses.

    Before surveying the various methods, this section outlines the key criteria a good fitting method is expected to meet:

    • Accuracy. This is the most fundamental and obvious requirement of the fitting method – the function is required to provide an accurate description of how market-consistent liability valuations behave as a function of a set of risk factors. Additional color can be added to this requirement: the functions are being used to estimate the 1-year 99.5% VaR of the market-consistent balance sheet, so it is particularly important that the function behaves well in the tail of risk factor scenarios. The risk factor scenarios associated with the 99.5th percentile outcome will be a function of the balance sheet’s asset-liability mismatches and risk management strategies. However, in general we can postulate that the function will be required to provide accurate performance in extreme risk factor scenarios, including complex joint risk factor events (e.g., option-implied equity volatility increasing when equity markets fall). This means that if liabilities have non-linear risk factor exposures and / or risk factor exposures that change as other risk factors change, the fitting method’s ability to capture these features is likely to be an important determinant of its performance against this criterion.
    • Measurability of errors. To assess the accuracy of a liability proxy function, firms will perform validation valuations – i.e., use several full sets of market-consistent simulations to re-value the liability in various stress scenarios. This approach can be applied to any function irrespective of the fitting methodology and should identify any major issues with the proxy function’s accuracy. However, it will also be helpful to have a meaningful measure of the statistical quality of the fit, particularly in the tails. This can be useful in making objective choices in the fitting process, for example in determining the level of parsimony that can be obtained and avoiding over-fitting.
    • Fitting efficiency. All of the liability proxy-fitting methods use liability simulations to produce the fitted function. The key rationale for the use of liability proxy functions has been lost if the number of simulations required for an accurate fit is similar to the total number of simulations that would be required in the full nested stochastic simulation implementation. More generally, if two proxy-fitting methods meet the other fitting criteria equally well, and one requires a materially smaller number of simulations to obtain the same accuracy, firms would regard it as the superior method.
    • Ease of implementation. This can be a catchall criterion, but it is important in the context of the demand for a frequent, robust, and fast calculation of economic capital at regular intervals. Does the fitting method require the modeler to make difficult and subjective choices that can materially impact the result? Can the fitting process be easily automated? Is the fitting process easy to communicate to regulators and senior managers? Is the fitting process replicable by others? The need for unexpected manual intervention in the fitting process has negative implications for all of these questions.
    • Use as a practical management tool. The Internal Model is an element of a principle-based solvency assessment process and, as such, it is expected that it will be used to inform management decision-making. This investment can provide useful information to the business and Solvency II’s Use Test codifies this requirement. In the context of a life insurance business, it is likely to be more widely supported if the method can positively answer questions, such as:

      - Is the fitting method sufficiently accurate and robust so that it can estimate how economic capital may behave across different scenarios?
      - Can the fitting method be easily extended to provide multi-time-step estimates of the liability value behavior?
      - Can the fitting method easily reflect different dynamic management action algorithms?

    2.1 Stress Test and Correlate

    The Stress Test and Correlate method is the simplest liability proxy fitting method. Indeed, it is so simple that it at first does not appear to fit the structure described above, where thousands of real-world simulations of the balance sheet are generated with the aid of liability proxy functions.

    The method entails calculating economic capital for each of the risk factors to which the balance sheet is exposed. Each risk factor is stressed to its 99.5th percentile value and a full market-consistent liability valuation by simulation is done under that stress. The economic capital requirements produced for each risk factor are then aggregated using a correlation matrix that describes the risk factors’ joint dependencies.

    It can be shown that this method is equivalent to assuming the liability proxy function is a linear function of the risk factors (no higher-order or cross-terms in the polynomial risk factor function) and that the risk factors are jointly normally distributed with 99.5th percentiles equal to those assumed in the stress tests and with correlations as described in the economic capital correlation matrix. Figure 2 illustrates the linear function fit to an insurance guarantee cost.

    Figure 2. Actual guarantee cost and linear function fitted using a stress test and correlate method
    Source: Moody's Analytics
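
    As an illustration of the aggregation step, the sketch below uses made-up stand-alone capital figures and correlations (not any firm's calibration) to compute the aggregate requirement as the square root of the correlation-weighted sum of the stand-alone requirements – the calculation implicit in the linear-proxy, joint-normal assumptions described above.

```python
import numpy as np

# Hypothetical stand-alone 99.5% capital amounts per risk factor, e.g. from full
# stressed market-consistent valuations (interest rate, equity, lapse).
standalone_scr = np.array([120.0, 80.0, 40.0])
correlation = np.array([
    [1.00, 0.25, 0.10],
    [0.25, 1.00, 0.00],
    [0.10, 0.00, 1.00],
])

# Equivalent to assuming a linear proxy function and jointly normal risk factors.
aggregate_scr = np.sqrt(standalone_scr @ correlation @ standalone_scr)
print(f"Aggregated SCR: {aggregate_scr:.1f}")
```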

    This method’s likely performance is appraised against the criteria set out at the start of the section:

    • Accuracy. The method implicitly assumes a linear liability proxy function (i.e., no higher-order or cross-terms are assumed in the function) and the linear terms of the function are set consistently with the 99.5th stress valuations produced for each risk factor. Naturally, the accuracy of this proxy method will depend on the extent to which non-linear terms are present in the ‘true’ value of the liability. The inaccuracy of this method will tend to increase with the number of risk factors (as the method assumes the liability proxy function’s cross-terms are zero, and these cross-terms will tend to have a bigger impact on the economic capital when there are many combinations of correlated risk factors).
    • Measurability of errors. There is no way of assessing the statistical impact of the linearity simplifications embedded in the Stress Test and Correlate method, other than by running out-of-sample validation tests.
    • Fitting efficiency. The Stress Test and Correlate method requires relatively few simulations to be run in its implementation. A full set of market-consistent simulations is required for each individual risk factor. As is covered later, the relatively small number of total scenarios required by this method is due to the small number of parameters that the function has to fit, rather than because the use of simulations is inherently efficient. For example, if there are ten risk factors and a firm uses 10,000 market-consistent simulations for each valuation, this implies a total of 100,000 simulations for the economic capital assessment. Note that in this example a basic nested stochastic approach would require 10,000 x 10,000 = 100 million simulations (assuming 10,000 1-year real-world simulations are used).
    • Ease of implementation. As the method is well defined, its implementation should be highly automated and without subjectivity or manual intervention (for a given risk factor model / calibration assumption). However, the modeler is restricted to assuming a joint Gaussian dependency structure between risk factors. This may be viewed as highly unrealistic and imprudent for some risk factor combinations (e.g., dependency between equity indices).
    • Use as a practical management tool. This method’s use as a practical management tool is highly restricted. The breadth of assumptions embedded in the method means that it will tend not to provide an accurate description of the full probability distribution for the year-ahead balance sheet. Moreover, there is little prospect of the method being extended to support the development of accurate multi-time-step, multi-year balance sheet projections.

    Overall, this is an ‘entry-level’ method. It does not require much thought and it can be implemented using a relatively small number of simulations. However, its simplifications create the risk of producing material errors in the description of liability value behavior, and its restrictions on risk factor modelling assumptions can limit the modeler to unrealistic and imprudent risk factor assumptions.

    2.2 Curve Fitting

    Curve fitting is a more general liability proxy function fitting technique that has been widely used in Solvency II Internal Models. The basic idea is that the modeler specifies a particular n-parameter risk factor function to describe the one-year-ahead liability valuation, and then performs n full simulation-based market-consistent liability valuations that are used to parameterize the function. There are no constraints on the form of the risk factor function and typically it would be a polynomial function with some higher-order terms and cross-terms.

    The method is fairly transparent and intuitive, but it has two fundamental drawbacks that become increasingly problematic as the liability complexity increases.

    First, it requires the modeler to specify the functional form that the liability valuation will take. The modeler therefore must rely on judgment to determine which parameters are likely to be important and which can safely be assumed to be zero. For complex liabilities that are exposed to many risk factors and where there are many non-linear terms driving the valuation, this can be very challenging. When validation results show a poor quality of fit there is no easy way of identifying what is driving the fitting errors other than extensive trial and error in analyzing the impact of new parameters.

    Second, the simulation efficiency of the method does not scale well for large numbers of risk factors and polynomial terms. Note that the number of possible cross-terms in a polynomial function grows rapidly with the number of variables in the function. For example, suppose the polynomial function is to include all quadratic terms (including cross-terms). Then a 1, 2, 3 and 4 risk factor polynomial has 2, 5, 9 and 14 parameters respectively (excluding the intercept). Recalling the example described in section 2.1, if there are 10 risk factors and the chosen polynomial function has 25 terms, and each full simulation-based market-consistent liability valuation requires 10,000 simulations, then a total of 250,000 simulations are required in the liability proxy function fitting process. The fact that much greater simulation efficiency is possible in fitting processes is demonstrated below.
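
    The parameter counts quoted above can be reproduced with a few lines of arithmetic; the sketch below (illustrative only) also shows the resulting curve-fitting simulation budget for the 10-risk-factor example.

```python
# A full quadratic polynomial in d risk factors (linear, squared, and pairwise
# cross terms, intercept excluded) has d + d + d*(d-1)/2 = d*(d+3)/2 parameters.
def quadratic_terms(d: int) -> int:
    return d * (d + 3) // 2

for d in (1, 2, 3, 4, 10):
    print(d, "risk factors ->", quadratic_terms(d), "parameters")
# 1 -> 2, 2 -> 5, 3 -> 9, 4 -> 14, 10 -> 65

# Curve-fitting budget: one full market-consistent valuation per parameter.
sims_per_valuation = 10_000
n_parameters = 25
print("curve fitting budget:", n_parameters * sims_per_valuation, "simulations")
```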

    Briefly summarizing the curve fitting method against five performance criteria:

    • Accuracy. The flexibility of the method means that, in theory, it is capable of producing highly accurate liability proxy functions. However, for the reasons described above, it may be challenging to produce an implementation of this method that produces highly accurate liability proxy fits for complex non-linear liabilities that are driven by multiple risk factors.
    • Measurability of errors. Similarly to the Stress-Test-and-Correlate method, there is no way of assessing the statistical impact of the simplifications embedded in the modeler’s assumed functional form, other than by running validation tests. The fitting method does not provide statistical information on what fitting errors have arisen or the source of those errors.
    • Fitting efficiency. The total number of simulations required in the fitting process scales linearly with the number of parameters in the function, and the number of candidate parameters grows rapidly with the number of risk factors. We will see below that much greater simulation efficiency can be obtained.
    • Ease of implementation. Once the functional form has been specified by the modeler, the implementation process is reasonably well-defined and mechanical, though the modeler will have some subjective choices to make about where in the risk factor space to fit the function (e.g. if fitting a quadratic function for a liability valuation’s behavior with respect to equity returns, two stress test valuation results are required, and the fit will likely be sensitive to what two equity return stresses are used). However, the bigger implementation challenge lies in the subjective specification of the functional form that is the starting point for the entire fitting process.
    • Use as a practical management tool. The method can provide more information and flexibility than the Stress-and-Correlate method. It is still very difficult to generalize the method to support multi-time-step projection. Perhaps more importantly, the method provides no insight into how the subjective specification of the functional form needs to be revised as management actions change (e.g., does a new product feature, asset strategy, bonus strategy, or assumption on policyholder behavior result in the selected functional form no longer being adequate, and if so, what new parameters need to be included?).

    So, in summary, it is a step forward from the Stress Test and Correlate method, but there are still significant limitations and implementation challenges associated with the method when it is applied to proxy fitting complex multi-risk factor, non-linear liabilities.

    2.3 Least-Squares Monte Carlo

    The Least-Squares Monte Carlo (LSMC) technique has emerged as a more sophisticated statistical method that addresses some of the failings of curve fitting. Its output takes exactly the same form as curve fitting (i.e., a fitted polynomial risk factor function for the year-ahead market-consistent liability value). However, the method addresses two of the key issues that arise in curve fitting: first, it does not require the modeler to make strong assumptions about the form of the function and, second, it can fit a large number of parameters with significantly greater simulation efficiency than the standard curve fitting method. Finally, unlike the other methods discussed, LSMC will also naturally provide objective statistical measures of the quality of fit that the fitted function offers.

    So what is LSMC? LSMC is a general statistical technique that has been used widely in Monte Carlo simulation applications, particularly in the nested stochastic problems that can arise in finance. The basic idea is most easily understood if it is contrasted with the standard curve fitting method. In curve fitting, the n parameters of the polynomial function are fitted by performing n stress test liability valuations. These valuations use the same number of simulations (and same random number seed) as the base case valuation to ensure that they provide an accurate measure of the change in liability valuation that occurs in the stress test. LSMC uses a different fitting strategy: to fit the n parameters, firms run many more stress tests than n (e.g., 10,000 stress tests), but the valuations use very few simulations (e.g., two).

    So LSMC results in many, very inaccurate re-valuations as opposed to curve fitting’s n accurate re-valuations. Firms may wonder why that represents forward progress. The simple reason is because the valuation inaccuracies that occur in the LSMC re-valuations are independent of each other and so their many errors can be averaged out (and the inaccurate valuations are unbiased estimates of the liability value so they converge to the correct answer). This approach is summarized in Figure 3 below.

    Figure 3. Summary of the LSMC approach to 1-year VaR implementation
    Source: Moody's Analytics

    This concept of averaging out independent errors using regression is powerful, particularly when the liability is a function of many risk factors (in statistical jargon, when the fitting space has high dimension). In this case, curve fitting demands firms make difficult choices about where in the high-dimensional space to focus their simulation firepower, but LSMC permits the whole space to be scanned without requiring any guesswork about where it would count the most. This provides two key advantages: it allows many more parameters to be considered in the fitting process and the averaging out through regression results in an inherently more statistically-efficient fitting process. Typical implementations of LSMC would use around 25,000 ‘outer’ simulations and two inner simulations per outer. This results in a total of 50,000 required simulations in the fitting process. Note that curve fitting typically requires around 250,000 simulations (and would result in a function that is less accurate and more difficult to validate).
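
    The following toy sketch illustrates the LSMC principle on a single made-up risk factor (it is not the Moody's Analytics implementation): 25,000 outer stresses each receive only two very noisy inner valuations, yet the regression recovers the underlying valuation function closely because the errors are independent and unbiased.

```python
import numpy as np

# Toy LSMC illustration: the "true" year-ahead liability value is a known function
# of one risk factor; each outer scenario gets only two noisy inner valuations,
# and a least-squares regression averages the noise away.
rng = np.random.default_rng(42)

n_outer, n_inner = 25_000, 2
x = rng.uniform(-1.0, 1.0, n_outer)                 # outer risk-factor stresses

def true_value(x):
    return 100.0 + 20.0 * x - 35.0 * x**2           # unknown in practice

# Each inner valuation is an unbiased but very noisy Monte Carlo estimate.
noisy_valuations = true_value(x)[:, None] + rng.normal(0.0, 30.0, (n_outer, n_inner))
y = noisy_valuations.mean(axis=1)

# Regress on a simple polynomial basis; the fit recovers the true coefficients
# closely even though each individual valuation is wildly inaccurate.
coeffs = np.polynomial.polynomial.polyfit(x, y, deg=4)
print(np.round(coeffs, 1))   # approximately [100, 20, -35, 0, 0]
```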

    The fundamental difference in the fitting strategies of LSMC and curve fitting is illustrated in Figure 4.

    Figure 4. LSMC and curve fitting: contrasting the fitting methods
    Source: Moody's Analytics

    2.4 Replicating Portfolios

    The Replicating Portfolio (RP) approach can be viewed as another variation in the list of liability proxy fitting methods. Again, this approach can best be understood by contrasting it with the basic curve fitting method. One of the key limitations of curve fitting is that it requires the modeler to specify the functional form of the risk factor function. This is a subjective and difficult task for complex non-linear liabilities that are a function of many risk factors. LSMC addresses this issue by using a more powerful statistical approach to the fitting process. The Replicating Portfolio method addresses the curve fitting limitation in a different way: it requires the modeler to specify assets that have similar characteristics to the liabilities, and the fitting process then expresses the liability valuation proxy function as a linear combination of these asset valuations. This combination can be intuitively considered as an asset portfolio. The portfolio weights are determined by finding the combination of weights that produce asset cash flows that best fit the liability cash flows produced under various stochastic fitting scenarios. The aim of the fitting process is to find portfolio weights that produce asset portfolio cash flows that replicate the liability cash flows across all scenarios, hence the term Replicating Portfolio.
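
    A minimal sketch of the fitting step just described is shown below: given matrices of candidate asset cash flows and liability cash flows across fitting scenarios (all made up here), the replicating weights are obtained by least squares.

```python
import numpy as np

# Minimal replicating-portfolio sketch: the portfolio weights are the least-squares
# solution matching candidate asset cash flows to liability cash flows across
# fitting scenarios. All inputs are made up for illustration.
rng = np.random.default_rng(1)

n_scenarios, n_assets = 1_000, 5
asset_cashflows = rng.normal(size=(n_scenarios, n_assets))       # columns = candidate assets
true_weights = np.array([2.0, -1.0, 0.5, 0.0, 0.0])
liability_cashflows = asset_cashflows @ true_weights + rng.normal(0.0, 0.1, n_scenarios)

weights, residuals, *_ = np.linalg.lstsq(asset_cashflows, liability_cashflows, rcond=None)
print(np.round(weights, 2))   # close to the weights used to generate the data
```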

    One of the attractions of this approach is that it arguably can provide a more intuitive form of proxy function. While LSMC will result in a higher-order general polynomial function that may be difficult to explain (e.g., why does the equity return cubic term have a value of x), the RP approach aims to provide portfolio weights in asset holdings that everyone can ‘touch and feel’. Another advantage is that it can make the integrated projection of asset and liability values easier. The RP technique was pioneered by asset risk modelling firms seeking an easy way to project insurance assets and liabilities – in that context, representing liabilities as an asset portfolio is attractive.

    However, the RP approach has a major and fundamental limitation: it requires the modeler to find assets that can produce cash flows that fit well with the cash flow behavior produced by the liabilities. Firms need a liability proxy function because they require an approximate solution to the nested stochastic problem that is created by the need to use Monte Carlo simulations to value complex liabilities. If assets can be readily identified that produce cash flow behavior that replicates the liability cash flows, it is likely that analytical solutions are available for the liability valuation and no nested stochastic problem needs to be solved. On the other hand, if the liabilities are complex, it will usually be very difficult to find assets that can replicate the liability cash flow behavior, unless firms resort to ‘synthetic’ assets (i.e., make up asset structures that behave the same as the liabilities). But then firms need to value those complex assets, and this does not represent any obvious progress in terms of solving the nested stochastic problem. Also, using synthetic assets detracts from the key fundamental benefit of using RP – obtaining an intuitive description of liabilities by expressing them as a portfolio of well-understood assets.

    For these reasons, it is difficult to envisage a scenario where the Replicating Portfolio method is an effective and efficient liability proxy fitting method. This has largely been borne out in practice: most of the insurance firms that started using RP as a 1-year VaR economic capital method have found that its level of accuracy and validation performance is inadequate for the purpose of a Solvency II Internal Model.

    3 A 1-year VaR Life Insurance Case Study: Generali Deutschland

    This section presents a case study of the development of liability proxy functions for a 1-year VaR assessment of a significant and complex life insurance business. In 2011, Generali Deutschland (GD) started testing the LSMC approach for risk aggregation in the life insurance / Solvency II context in cooperation with Barrie & Hibbert (B&H), which was acquired by Moody’s Analytics. The GD team describes their experience in this section.

    GD performed an initial case study for different life business units using a stochastic ALM model environment. The goal was the estimation of the probability distribution forecast (PDF) of the own funds as required in the Solvency II framework. Beyond the one-year PDF application, GD also tested several extended applications of the LSMC technique – for example, fitting extended proxy functions including important management action parameters or model inputs as additional risk factors for validation purposes (see section 4 for an example).

    3.1 Description of the Setting

    The scope of this case study comprises the calculation of the probability distribution forecast of different life insurance business units’ own funds by means of LSMC, as well as the deduction of the respective risk capital. It then presents a validation and interpretation of the results.

    The Prophet ALS model release corresponding to the MCEV 2012 calculations for the respective business units was chosen as the environment for the stochastic ALM calculations. The modelled liabilities likewise reflect the status of the official MCEV 2012 calculations, with the projection starting on December 31, 2012.

    GD calculated the business’s Present Value of Future Profits (PVFP, i.e., the mean of the discounted shareholder profits) with a scenario budget of 50,000 stochastic simulations for the fitting, divided into 25,000 outer (stressed) scenarios with two inner valuation scenarios each. The outer scenarios were stressed with respect to seven risk drivers: four market risks and three underwriting risks. For the market risks, GD took into account two factors of the interest rate movement (a short-term and a long-term shock), the equity performance, and the credit default intensity.

    Risk-neutral inner scenarios are generated using the B&H market-consistent ESG. For the interest rates, the extended Two-Factor Black-Karasinski model has been chosen. The stresses act on the stochastic factors for the short rate (short-term shock) or the mean reversion rate (long-term shock), respectively. Equity performance is modeled by an index built from an excess return on the short rate, using the so-called “time varying deterministic volatility” (TVDV) model, a one-factor lognormal stochastic model. A stress at time zero is implemented through a stressed starting value of the index. Credit risk is modeled according to the JLT model, which uses a rating transition matrix scaled by the “default intensity” Credit_Pi, which is itself modeled risk-neutrally by a Cox-Ingersoll-Ross process. The latter is also the starting point for the credit risk stresses.
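
    For illustration only, the sketch below simulates a Cox-Ingersoll-Ross process of the kind cited above for the default intensity; it is a simple Euler discretization with made-up parameters, not the B&H ESG.

```python
import numpy as np

# Toy Euler discretization of a Cox-Ingersoll-Ross process (all parameters are
# illustrative; a stressed starting value would be applied at time zero).
rng = np.random.default_rng(7)

kappa, theta, sigma = 0.5, 0.02, 0.05     # mean reversion speed, level, volatility
dt, n_steps, n_paths = 1.0 / 12, 120, 1_000
intensity = np.full(n_paths, 0.02)        # starting default intensity

for _ in range(n_steps):
    shock = rng.normal(size=n_paths)
    intensity = np.maximum(
        intensity + kappa * (theta - intensity) * dt
        + sigma * np.sqrt(np.maximum(intensity, 0.0) * dt) * shock,
        0.0,
    )

print("mean terminal intensity:", intensity.mean())
```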

    The underwriting risks – longevity, mortality trend, and lapse – have been chosen as example risks for the case study. In the fitting scenarios, the best estimate rates for these risks are stressed with multiplicative shocks drawn from uniform distributions.

    To evaluate the liability proxy function, real-world distributions for all of the risk factors used are needed. Given the case-study character of this model calculation, generating real-world distributions for the market risks that match the actual assets of GD’s life business units was out of scope. As an approximation, GD used market risk distributions calibrated for generic asset portfolios provided by B&H. For the underwriting risks, normal distributions with suitable widths, in accordance with the current internal model, were chosen.

    The numbers presented in the following sections were calculated on the original ALM models of GD’s business units but were anonymized for this publication. For the presentation of the results, GD chose examples from different business units to illustrate different applications.

    3.2 Fitting Scenarios

    To obtain an optimal result in the regression, the fitting points must be distributed evenly over the complete high-dimensional risk driver space. At the same time, correlations or repeating patterns must be avoided. In our approach, this is ensured by the use of Sobol numbers.

    Figure 5. Different numbers of fitting points
    Source: Moody's Analytics

    As an illustrative example, Figure 5 shows this for two risk drivers: interest rate risk and equity risk. For any number of data points, the space is covered uniformly and the corner areas are well captured. For 25,000 fitting points, the coverage density of the risk driver space is very high, but even 10,000 points provide a reasonable coverage density.
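
    A sketch of how such evenly-spread fitting points can be generated with a Sobol sequence is given below, using SciPy's quasi-Monte Carlo module (the two dimensions and the [-1, 1] fitting range are illustrative; this is not the Proxy Generator workflow).

```python
from scipy.stats import qmc

# Generate quasi-random fitting points with a Sobol sequence; two dimensions
# stand in for the interest rate and equity risk drivers, rescaled to the
# illustrative fitting range [-1, 1].
sampler = qmc.Sobol(d=2, scramble=False)
points_unit = sampler.random(n=16_384)        # a power of two preserves balance
points = qmc.scale(points_unit, l_bounds=[-1, -1], u_bounds=[1, 1])
print(points[:3])
```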

    The LSMC technique requires a small set (usually two simulations) of market-consistent scenarios to be generated for every fitting point. Each of these market-consistent scenario sets needs to be calibrated to the market prices implied by the fitting scenario. This means that thousands of market-consistent ESG re-calibrations must be generated, so the process must be automated to be practical. The Moody’s Analytics Proxy Generator software provides this automation so that a single automated process can produce the entire set of fitting scenarios.

    3.3 Calibration of the Proxy Function

    Once the fitting PVFPs have been produced, the regression is performed to obtain the liability proxy function for the PVFP. The starting positions of the risk drivers for each outer scenario are used as inputs, together with the corresponding estimates of the stochastic PVFP in each outer scenario, obtained by averaging over the two inner scenarios. The basis functions chosen for the regression were Legendre polynomials, which have the beneficial properties of completeness and orthogonality on a finite interval. For practical purposes, all risk driver ranges have been transformed to the interval [-1,1]. The regression was carried out using a forward stepwise model selection algorithm together with the Akaike Information Criterion. The maximum order of the polynomial was limited to six for the main orders and four for the cross orders. As output, the regression produced PVFP proxy functions for each business unit. It is important to note that other metrics, such as the Best Estimate Liabilities (BEL) or the total market value of assets, can also be fitted using the outputs from the same fitting run.
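
    The sketch below illustrates the regression step on a single made-up risk driver: a Legendre basis on [-1, 1] and forward stepwise selection driven by the Akaike Information Criterion. The real fit runs over all seven risk drivers with cross terms; this one-dimensional version only shows the mechanics.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 25_000)
# Made-up "fitting PVFPs": a Legendre-quadratic signal plus heavy valuation noise.
y = 5.0 * legendre.legval(x, [0, 0, 1]) - 2.0 * x + rng.normal(0.0, 4.0, x.size)

def aic(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return y.size * np.log(rss / y.size) + 2 * X.shape[1]

candidates = {k: legendre.legval(x, [0] * k + [1]) for k in range(1, 7)}  # P_1 .. P_6
design, selected = [np.ones_like(x)], []
best_aic = aic(np.column_stack(design), y)

improved = True
while improved:
    improved = False
    for k, term in candidates.items():          # try adding each unused basis term
        if k in selected:
            continue
        score = aic(np.column_stack(design + [term]), y)
        if score < best_aic:
            best_aic, best_k, improved = score, k, True
    if improved:                                # keep the best improving term, if any
        selected.append(best_k)
        design.append(candidates[best_k])

print("selected Legendre orders:", sorted(selected))  # orders 1 and 2 should appear
```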

    3.4 Validation of the Results

    Before using the proxy function for capital estimation purposes, we need to validate the fit. There are several ways to validate the proxy functions step by step.

    3.4.1 Graphical description of the interrelation of risks

    As a first step, the proxy function can be analyzed graphically – only polynomial evaluation is required (e.g., in Excel) – by plotting the PVFP function along one or two risk factor dimensions while leaving all other risk factors at their base values. This reveals the behavior of the proxy function out to the extremes of the fitting range. An example from the case study is presented in Figure 6: the risk factors lapse and long-term interest rate are plotted over their full fitting range, and the behavior of the (stochastic) PVFP estimates in both dimensions can be seen directly. The behavior of the function can then be checked for plausibility, bearing in mind the characteristics of the business unit’s liabilities. Naturally, different guarantee levels in the liabilities and different best estimate lapse rates, reflecting different business mixes, lead to different shapes of the PVFP surface. The plots can be created quickly for the various combinations of risk factors, which also helps reveal mistakes in the data handling (e.g., if wrong input data found their way into the regression). In the LSMC approach followed, market and underwriting risk factors are included in the same proxy function, so the interrelation of both risk types can be investigated.

    Figure 6. PVFP in dependence of interest rate and lapse
    Source: Moody's Analytics

    A good starting point for investigating the plot is checking the corner points and the plausibility of their connection. For example, GD observed a small PVFP for low interest rates and low lapse. Increasing lapse (holding low interest rates fixed) leads to an increase of the PVFP (due to high guarantees for this business unit). The PVFP is very high for high interest rate and low lapse due to high capital gains and lower for high interest rate and high lapse.

    3.4.2 Out-of-Sample Test: Directly calculated PVFP vs. LSMC estimates

    The next step compares proxy function estimates for certain risk driver displacements with values obtained from the stochastic ALM model by performing a full valuation run (e.g., 1,000 simulations) with a scenario set calibrated to the respective risk driver position (leaving all other risk drivers at their base values).

    It is important to check whether the proxy function provides a good estimate for the base case valuation. In the case study, the base scenario from the official MCEV valuation was the starting point for the creation of the fitting scenarios. Thus, the proxy function estimate for the “0” vector (all risk drivers at their MCEV base parameters) is compared to the stochastic PVFP from the official MCEV run; a small deviation of below 2% is observed.

    Validation in single risk factor dimensions

    Out-of-sample tests were performed, for instance, in the interest rate risk driver dimension by calculating six exact points with the ALM model and comparing them to the corresponding proxy function estimates:

    Table 1. Validation for interest rate risk factor
    Source: Moody's Analytics

    As shown by the relative deviations, the proxy function provides accurate results for the interest rate risk factor. Out-of-sample tests can be performed in all risk driver dimensions. For another business unit, GD also showed the validation results for the risk factor lapse (see table 2).

    Table 2. Validation for lapse risk factor
    Source: Moody's Analytics

    The proxy function estimates are also very accurate for the underwriting risks.

    Validation in combined risk factor dimensions

    Out-of-sample tests can also be performed for joint stresses of several risk factors. For this purpose, the proxy function is evaluated by a vector of a combined displacement of the risk factors. This estimate is compared to the result of a full valuation run of the ALM model, which is performed with a stochastic scenario set calibrated to the joint displacement of the risk factors. For one of the business units, GD showed the validation results for joint shocks of risk factors interest rate, equity, and credit (Figure 7).

    Figure 7. Validation results for joint shocks of risk factors interest rate, equity, and credit
    Source: Moody's Analytics

    The proxy function also gives good estimates in combined risk driver displacements.

    There are many more possibilities for validating the proxy functions, such as by deriving confidence intervals. GD also checked that the results were stable toward changes in the regression assumptions, such as the main orders for single or cross terms.

    When assessing the results of the out-of-sample tests, the following points need to be considered:

    • The number of outer scenarios was chosen as 25,000 for testing purposes. The quality of fit might even improve when increasing the number of outer scenarios.
    • The validation is performed against the stochastic ALM model. The validation points received from a valuation run with 1,000 simulations still have an inherent simulation error.

    3.5 Results derived from the LSMC Proxy Function

    Once the proxy function is validated, it can be used for capital calculations. For example, the following quantities can be estimated by evaluating the polynomial:

    • The base stochastic PVFP
    • The empirical PVFP distribution (PDF)

      - The VaR at any confidence level desired
      - Other risk measures, such as the expected shortfall

    • The SCR as distance between base PVFP and VaR at 99.5% level

    As an example in this case study, GD inserted 50,000 realizations of the risk factors into the proxy function to derive the PDF (a small sketch of this calculation follows the list below):

    • The VaR at 99.5% level is given by the 251st worst PVFP
    • The expected shortfall at 99.5% level is given by the mean over the 250 worst PVFPs
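
    The sketch below shows this calculation on placeholder numbers (the PVFP realizations are randomly generated here, not GD's results); in practice the 50,000 values come from evaluating the fitted proxy polynomial on real-world risk factor draws.

```python
import numpy as np

rng = np.random.default_rng(11)
pvfp = rng.normal(300.0, 80.0, 50_000)          # placeholder PVFP realizations
base_pvfp = 310.0                               # placeholder base stochastic PVFP

pvfp_sorted = np.sort(pvfp)                     # ascending: worst outcomes first
var_99_5 = pvfp_sorted[250]                     # the 251st worst PVFP
expected_shortfall = pvfp_sorted[:250].mean()   # mean of the 250 worst PVFPs
scr = base_pvfp - var_99_5                      # SCR = distance from base PVFP to the VaR

print(round(var_99_5, 1), round(expected_shortfall, 1), round(scr, 1))
```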

    Below is an example for one of GD’s business units (Figure 8).

    Figure 8. Base PVFP, PDF, VaR, expected shortfall
    Source: Moody's Analytics

    4 Extended Case Study: Model Parameters as Additional Risk Drivers

    One of the remarkable strengths of the LSMC approach is its broad applicability to the calculation of sensitivities – for example, with respect to management actions and the underlying assumptions or parameters describing the situation at the start of the projection – and the investigation of their impact on economic capital figures. Without LSMC techniques, determining the SCR even for a handful of sensitivities would require great effort, because even under a standard formula approach one complete stochastic run would be needed for each parameter and each stress to be considered in the SCR.

    The use of LSMC methods simplifies this task significantly. GD can perform a usual fitting run, supplemented by one or more additional Sobol-driven factors scaling the parameters under investigation over a certain range. By inserting values into the resulting polynomial (i.e., an extended proxy function), the SCR can be computed easily for any combination of parameter values within the fitting range. As an example of this method, this section describes how to examine the influence of the initial buffer situation on the Solvency II coverage of a German life insurer.

    4.1 Motivation

    For German life insurance business, the value RfB (Rückstellungen für Beitragsrückerstattung, or reserves for premium refunds) plays an important role as a buffer.

    Therefore, it is useful to have an idea of how the Solvency II coverage ratio is affected if the RfB is reduced or increased, especially as management can influence the size of the RfB and its components by allocating parts of the gross surplus to the overall RfB and by setting the amount of next year’s tied RfB through the bonus declaration.

    Background

    The RfB is the German policyholder bonus reserve. Part of it – the so-called “tied RfB” – covers the bonus payments officially declared for the following year. Another part of it – the terminal bonus reserve – covers the future terminal bonus payments for contracts that mature after the following year. The remaining part of the RfB – the so-called “free RfB” – is not tied to any particular payment to a particular policyholder. The free RfB is meant to cover future bonus payments to policyholders. However, under German law, the insurer can ask for the regulator’s approval to use parts of the free RfB to avoid bankruptcy in a crisis. Thus, the free RfB is an important buffer protecting shareholders of a German insurer, to a certain extent, from future capital injections.

    4.2 Fitting

    In order to obtain a proxy function for the PVFP including the main risk drivers (interest rate, credit risk, and equity stress) and a parameter that determines the size of the RfB at projection start, GD performed a fitting run using a uniform distribution for the scaling of the RfB.

    In this process, the range for the change in the RfB is -100 m (corresponding to parameter -1) to +100 m (corresponding to parameter +1), so that the RfB varies between 784 m and 984 m for a base value of 884 m.

    Thus, a function was obtained:

    PVFP = f(Interest Rate, Equity, Credit, Initial RfB Level)

    4.3 Results

    4.3.1 Impact on the Base valuation

    Evaluating the proxy function with the “zero” vector (all parameters correspond to their base calibration) leads to an estimate of the base PVFP:

    PVFPbase = f(0,0,0,0) = 312 m

    GD examined the influence of the RfB at projection start on the unstressed PVFP by evaluating the values f(0,0,0,x).

    Table 3. Impact of the RfB on the base valuation PVFP
    Source: Moody's Analytics

    As expected, the stochastic PVFP increases with the size of the initial RfB.

    4.3.2 Application in a Solvency II Context

    The influence on a single SCR can be observed from Figure 9 using interest rate as a first example.

    Figure 9. PVFP in dependence of interest rate and starting RfB
    Source: Moody's Analytics

    With an assumption on the interest rate shock (approximately a 20 bp parallel shift down), GD calculated the interest rate SCR shown in Table 4.

    Table 4. Impact of the RfB on the interest rate SCR
    Source: Moody's Analytics

    In this case, the interest rate SCR decreases with increasing RfB. This is plausible: low interest rates lead to a larger number of adverse scenarios, in which the RfB can be used as a buffer to avoid shareholder injections (see above).

    Note that all values were estimated directly with the proxy function. Further single SCRs (equity, lapse, etc.) can be derived in the same way, provided the corresponding risk factors are included in the fitting run. These single SCRs can be aggregated with a variance-covariance approach to obtain an overall SCR for a given size of the RfB.

    Alternatively, the influence of the initial RfB on the SCR can be estimated with a full probability distribution approach. Assuming a real-world distribution for all risk drivers used in the fitting (i.e., interest rate, credit, and equity), a PVFP distribution is obtained for every value of the starting RfB (within the fitting range) inserted into the proxy function. From this distribution, the Value-at-Risk, the expected shortfall, and the SCR can be deduced.
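
    The sketch below illustrates this full-distribution approach with a made-up extended proxy polynomial (it is not GD's fitted function): for each of three starting-RfB levels, the proxy is evaluated on real-world draws of the market risk drivers, and the base PVFP, 99.5% VaR, and SCR are read off.

```python
import numpy as np

# Made-up extended proxy: PVFP = f(interest rate, equity, credit, RfB shift),
# with risk drivers and the RfB shift expressed in fitting units.
def proxy_pvfp(ir, eq, credit, rfb):
    return 312.0 + 250.0 * ir + 80.0 * eq - 60.0 * credit + 30.0 * rfb - 40.0 * ir**2

rng = np.random.default_rng(5)
n = 50_000
ir, eq, credit = rng.normal(0, 0.4, n), rng.normal(0, 0.4, n), rng.normal(0, 0.4, n)

for rfb in (-1.0, 0.0, 1.0):                  # -100 m, base, +100 m in fitting units
    pvfp = proxy_pvfp(ir, eq, credit, rfb)
    base = proxy_pvfp(0.0, 0.0, 0.0, rfb)
    var_99_5 = np.percentile(pvfp, 0.5)       # 99.5% VaR = 0.5th percentile of PVFP
    print(f"RfB shift {rfb:+.0f}: base PVFP {base:.0f}, SCR {base - var_99_5:.0f}")
```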

    When evaluating the function for three different starting RfBs, the following PVFP distributions are obtained.

    Figure 10. Density distribution of the PVFP for different RfB levels

    Red line: RfB = 804 m
    Black line: RfB = 884 m (base)
    Green line: RfB = 964 m

    Source: Moody's Analytics

    These distributions lead to the following Solvency II key figures:

    Table 5. Solvency II key figures for different RfB levels
    Source: Moody's Analytics

    Altogether, there is a double impact on the Solvency II coverage ratio. On the one hand, the base PVFP in the numerator increases as more initial RfB is available. On the other hand, the SCR in the denominator decreases.

    Conclusion

    This case study reviews the experiences of Generali Deutschland in developing liability proxy functions for a 1-year Value-at-Risk assessment of a complex, long-term life insurance business.

    The Generali experience has highlighted the statistical accuracy and practical implementation of the Least-Squares Monte Carlo approach to proxy fitting. Taking everything into consideration, the Generali team has concluded that the LSMC technique can be used for several important applications in the Solvency II context, such as the one-year VaR estimation from a full distribution of own funds (PDF) or extended applications such as quantifying the impact of parameter/assumption changes on economic capital numbers.

    In the case study, the LSMC proxies were validated in detail and were shown to produce accurate and robust estimates. A treatment of market, credit, and underwriting risks in a single approach was shown to be feasible and the approach requires less expert judgment and manual implementation intervention than other proxy methods.

    Having achieved very good validation results, the Generali team has repeated the 1-year VaR case study in three consecutive MCEV environments and is now implementing LSMC as their Solvency II Internal Model target methodology.

    Given the flexibility of the LSMC approach, it may also be applied to multi-year projections of capital requirements in the ORSA context.
