With regulators pushing for investment in stress testing, this is an opportunity for many bankers to change the way the business does its bottom-up planning, monitoring, and control.
Which came first, the chicken or the egg? Did regulators invent stress testing for banks, or has the banking community always undertaken it?
Whatever the ‘official’ answers, there still seems to be an element of debate, or denial, around both questions. Different answers may be correct in different circumstances. Either way, to debate the correctness of any given response is to miss the point.
There is no doubt that the crisis that started in 2007, and which evolved into the economic downturn, took banks by surprise. Worse, most financial institutions (the ‘egg’ for our purposes) were ill-prepared for such a turn of events, and initially had no idea how to react. Each day brought new and unwelcome surprises. With each new revelation, there would be a collective sigh of incredulity from the public at large. In short, banks were unprepared, not just for the specific circumstances of this particular crisis, but also generally for managing an evolving set of stress events.
The regulators (the ‘chicken’ in this illustration) have targeted this lack of preparedness with their stress testing programmes. Regulators need to assess the impact of different scenarios on the wider economy, as a component of their macroprudential supervision. They also want to avoid the institution-specific ignorance that prevailed at the heart of the financial crisis. By forcing banks to undertake stress testing, they are raising standards within and across the industry, for both macro- and micro-prudential purposes.
The banks’ need to allocate the additional resources (human and technical) in order to comply with these regulatory requirements is inevitably a source of resentment. At a time of change and cost constraint, the additional burden is an unwelcome overhead. But when banks reflect on whether such frustration is with the regulators, or whether it is with their own inability to respond, it usually turns out to be largely the latter.
Banks are gradually discovering that these competencies add value for their own purposes. Indeed, stress testing, or at least scenario analysis, is something that has always occurred in banks – just not on a scale or to a level of sophistication commensurate with what is now increasingly recognised as necessary. Banks are therefore finding that the new, additional capabilities help both with day-to-day management of enterprise risk and with the planning and monitoring processes.
Daily decisions within banks regularly take scenario analysis into consideration. At the most basic level, credit analysis is all about ‘what if…?’: What if the client fails? What if the customer loses their job? What if they do not win that critical contract? What if the key employee/director of a corporate client leaves? Are there enough reserves within this corporate client to allow it to weather a downturn?
At the other extreme, the annual medium-term planning round is about working out the best strategy for capital and resource allocation over the period ahead, in light of what has happened over the last 12 months, and considering different scenarios for what might happen over the next three to five years. All this is a form of stress testing; i.e., considering the outcome for individual situations (whether in respect of customers, business units, or across the enterprise as a whole, and whether for credit risk, liquidity risk, market risk, operational risk, or any of the other risks encountered by a bank) in light of prevailing and anticipated scenarios.
The key ingredient missing from such routine and traditional stress analysis is ‘aggregation’. The real challenge is aggregating scenarios for individual borrowers, liquidity positions, or capital requirements. In short, this is about bottom-up information analysis; taking individual data, combining it with other data, modelling it to transform it into meaningful information, and then further aggregating it for business intelligence purposes.
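To make the idea of bottom-up aggregation concrete, the following is a minimal illustrative sketch, not any bank's actual methodology: obligor-level credit data (exposure, probability of default, loss given default) is combined with simple scenario overlays and rolled up into a portfolio-level expected-loss figure. All obligor names, figures, and scenario multipliers are hypothetical.

```python
# Hypothetical obligor-level inputs: exposure at default (EAD),
# baseline probability of default (PD) and loss given default (LGD).
obligors = [
    {"name": "Corp A", "ead": 5_000_000, "pd": 0.010, "lgd": 0.45},
    {"name": "Corp B", "ead": 2_000_000, "pd": 0.025, "lgd": 0.40},
    {"name": "Corp C", "ead": 8_000_000, "pd": 0.005, "lgd": 0.55},
]

# Illustrative scenario overlays: multiplicative stresses to PD and LGD.
scenarios = {
    "baseline":      {"pd_mult": 1.0, "lgd_mult": 1.0},
    "mild_downturn": {"pd_mult": 2.0, "lgd_mult": 1.1},
    "severe_stress": {"pd_mult": 4.0, "lgd_mult": 1.25},
}

def portfolio_expected_loss(obligors, pd_mult, lgd_mult):
    """Aggregate obligor-level expected losses into a portfolio total."""
    return sum(
        o["ead"] * min(o["pd"] * pd_mult, 1.0) * min(o["lgd"] * lgd_mult, 1.0)
        for o in obligors
    )

for name, s in scenarios.items():
    el = portfolio_expected_loss(obligors, s["pd_mult"], s["lgd_mult"])
    print(f"{name}: expected loss = {el:,.0f}")
```

In practice the same pattern recurs at each level of the hierarchy: obligor results aggregate into business-unit results, which aggregate into enterprise-wide figures, with the scenario definitions held consistent throughout.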
Banks have long known that such business intelligence would mean the planning process is much better informed. Armed with such insights, the key areas of stress analysis – the quantification of risk appetite, allocation of capital, targeting of an appropriate balance between risk and reward, funding / liquidity planning, asset and liability management, etc. – would all be much more robust.
Because the necessary capabilities for all this have been missing, banks have in turn lacked a robust platform from which to assess how to manage the consequences of different stress scenarios. Nor have they had the ingredients for defining early warning indicators. So monitoring the evolution of the balance sheet or P&L, and spotting signs of deterioration (or at least change) early enough in the cycle to allow corrective levers to be pulled, has been a process of trial and error.
The reason these competencies have been missing is twofold. On the one hand, the technology – the necessary measures and associated computing power – has only been around for the last ten or fifteen years. On the other hand, there have been (and continue to be) so many other competing pressures for investment spend that, certainly in a period of growth, the business case for other demands on available resources (e.g., customer service, product, or market share) was deemed the priority.
Which is why, with regulators forcing the pace on investment in stress testing, this is an opportunity that many bankers relish: the chance to change the way the business does its bottom-up planning, monitoring, and control, with a clear conscience. In the past, those with a responsibility for risk management within the organisation, from board level downward, might have wished for more resources in order to undertake such bottom-up analysis. Today, with banks being required to deliver on these things for regulatory compliance purposes, there is a window of opportunity for these wishes to come true. This is about the prioritisation of resources. Whilst historically the business benefits of stress testing might have been recognised, now the investment in the necessary competencies can be legitimately prioritised. Many rightly argue that the banks that implement these capabilities will be arming themselves with clear competitive advantages.
Ultimately, there is one overarching benefit to a greater investment in stress testing capabilities for internal business purposes (as opposed to regulatory compliance purposes). Banks generally exist in order to provide a return to their owners, the shareholders. Shareholders generally require returns that are robust, growing, and sustainable. They also want to have faith in the business management to deliver on these things. That faith is underpinned by transparency of information and by a strong track record. Stress testing for internal management purposes is ultimately about the generation of such business intelligence. It supports transparency and, if acted upon (with the right governance, and with appropriate monitoring and controls so that the consequences of evolving and often unexpected change can be acted upon), ensures that the track record is clearly in evidence.
To summarise, the banking community has always undertaken forms of stress testing, but the recent regulatory emphasis on it as an organisational competency is changing the way banks approach the management of risk across the enterprise. This benefits the banks, their shareholders, and also the wider economy. As for the case of the chicken and egg, science suggests that it is the egg that came first; it is not possible to genetically modify a living/breathing creature (at least, not without modern science), and therefore the evolution of the chicken into the form currently recognised in nature has to have been through a process of mutation or modification during growth in the egg. And yet the debate will continue…