Funds transfer pricing (FTP) is of growing concern to banks and regulators. But what does FTP have to do with stress testing? A comprehensive FTP framework can help organizations use the results of stress tests to forecast their P&L across departments and lines of business, ensuring that each unit’s strategy aligns with that of the greater organization.
Virtually all banks use funds transfer pricing, yet no single set of practices is common to all of them. Consequently, regulators are asking banks to improve their FTP systems. Banks are developing comprehensive frameworks to meet these demands, but stress testing has typically been left out of those frameworks.
Even though many banks do not believe that stressed scenarios should factor into the pricing of an FTP transaction, integrating stress testing into an FTP framework is now more important than ever.
While each bank has its own organizational structure, some FTP components can be valued using market prices – for those components, the methodology is the same for all banks. But the costs and risks connected with other types of transactions are generally not monitored until a critical loss occurs, as happened recently with rising liquidity funding costs and persistently low interest rates.
The liquidity crisis severely impacted FTP systems. Once banks took liquidity funding costs into account, they realized that some of their transactions were barely profitable.
In 2010, the European Banking Authority (EBA) published guidelines on liquidity cost allocation, emphasizing that banks should have “robust strategies, policies, processes, and systems” for liquidity risk management that should “include adequate allocation mechanisms of liquidity costs, benefits, and risks.”1
But the liquidity components of the FTP – also called liquidity transfer pricing – are not the only components that need to be carefully monitored. Due to tough economic conditions and the cost of regulatory compliance for both capital and liquidity ratios, the overall P&L of some business units has dropped dangerously close to zero.
The main goals of a robust and consistent FTP system are to correctly allocate the P&L to each line of business and to forecast different costs in different scenarios. The general framework will generate P&L for different departments and, depending on the way this framework is built, guide each department to a specific P&L strategy. It is therefore critical that this framework be aligned with a bank’s overall strategy to help incentivize its teams to make profitable business decisions and better manage its overall P&L.
Unfortunately, moving to a comprehensive framework will increase the costs allocated to each line of business, as it will reveal new FTP components that were either previously hidden or not monitored. Internal costs will be higher for all transactions. This is why some transactions may appear to have a negative P&L, a change banks will need to explain to all business units. The framework will then become more than a technical tool – it will indicate the need for a strong change management plan.
Banks should take several factors into account when designing a comprehensive FTP framework.
If a bank wants to learn which transactions are profitable, it must calculate FTP at the transaction level. (Aggregation obscures the characteristics of individual transactions, so funds should be aggregated prudently, if at all.) For example, a global FTP rate assigned to a line of business without taking into account the way the principal is amortized would misstate the maturity mismatch.
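To make the amortization point concrete, the following is a minimal sketch of a transaction-level FTP calculation under a matched-maturity funding approach. The funding curve, cash flows, and the `ftp_rate` helper are all hypothetical illustrations, not any bank’s actual methodology.

```python
# Sketch: transaction-level FTP for an amortizing loan, assuming a
# hypothetical matched-maturity funding curve. All figures are illustrative.

def ftp_rate(cash_flows, curve):
    """Weighted-average transfer rate: each principal repayment is
    funded at the curve rate for its own maturity.
    cash_flows: list of (year, principal repaid) tuples."""
    weighted = sum(amt * t * curve(t) for t, amt in cash_flows)
    total = sum(amt * t for t, amt in cash_flows)
    return weighted / total

# Illustrative upward-sloping funding curve (annual rate by year).
curve = lambda t: 0.01 + 0.002 * t

# 3-year loan amortizing in equal annual principal repayments...
amortizing = [(1, 100.0), (2, 100.0), (3, 100.0)]
# ...versus the same principal repaid as a single bullet at year 3.
bullet = [(3, 300.0)]

print(ftp_rate(amortizing, curve))  # lower: shorter average funding maturity
print(ftp_rate(bullet, curve))      # higher: all funds borrowed to year 3
```

A single global rate applied to both transactions would charge one of them for a maturity mismatch it does not have, which is why the calculation belongs at the transaction level.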
Another pitfall of an FTP calculation is the use of inconsistent methodologies across a bank. Most of the time, banks use different methodologies for each line of business. This can result in incentives and behaviors that are not necessarily aligned with the firm’s overall strategy. At any time, the sum of the P&L across all lines of business must equal the P&L of the overall firm.
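The consistency requirement above can be sketched as a simple invariant: internal FTP charges must net to zero across units, so they redistribute P&L without creating or destroying it. The business units, amounts, and the treasury-as-central-counterparty setup below are hypothetical assumptions for illustration.

```python
# Sketch: checking that internal FTP charges net to zero, so that
# line-of-business P&L sums to the firm's P&L. Figures are hypothetical.

external_pnl = {"retail": 120.0, "corporate": 80.0, "treasury": -30.0}

# Every internal FTP charge to one unit is a credit to another
# (here the treasury acts as the central "transfer" unit).
ftp_charges = [("retail", "treasury", 25.0), ("corporate", "treasury", 15.0)]

pnl = dict(external_pnl)
for payer, receiver, amount in ftp_charges:
    pnl[payer] -= amount
    pnl[receiver] += amount

# Internal transfers redistribute P&L but never create or destroy it.
assert abs(sum(pnl.values()) - sum(external_pnl.values())) < 1e-9
print(pnl)
```

Inconsistent methodologies across lines of business break exactly this invariant: a charge booked to one unit that is not credited to another makes the sum of unit P&Ls drift away from the firm’s P&L.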
Finally, the framework should be updated as frequently as possible. A system that is not updated regularly risks mispricing new transactions, especially as markets tend to move very quickly.
Once this framework has been put into place, P&L can be calculated at the line-of-business level. The total P&L of the bank can be divided along lines of business, if the “transfer” units in charge of managing the different types of costs/risks are taken into account.
There are now different departments or profit centers, none of which is only a cost center. They will each be charged for what they cost, making it easier to calculate their P&L. The ability to concretely measure risk is very important from an analysis point of view.
To better drive business, it is also critical to run simulations to forecast P&L under different scenarios, including stress test scenarios.
The 2008 financial crisis prompted risk managers to focus on assessing risks under stressed scenarios. Regulators, as well as the top management within organizations, are now asking for P&L metrics as a standard output of any stress testing exercise.
If organizations want to analyze the results of their stress test reports for their impact on P&L, they need to build a comprehensive framework like the one previously described. However, they are likely to run into two stumbling blocks.
First, FTP is not ordinarily calculated using a stress testing scenario. Banks that use this methodology will likely be less competitive than banks that do not because their FTP costs are higher, leading to higher client rates. In other words, banks that calculate FTP with this framework are accounting for an additional cost that other banks might not consider.
However, this cost is real and should be measured. Take, for example, the likelihood of a customer drawing on a line of credit, which is higher in a stressed scenario. The cost of liquidity would then rise, resulting in a loss for the treasury. If this scenario is not part of a bank’s FTP framework, it should – at a minimum – be part of its risk appetite framework, so that the bank is aware of the real risks the scenario presents.
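The line-of-credit example can be quantified with a back-of-the-envelope calculation. The drawdown probabilities and funding spreads below are hypothetical assumptions chosen to illustrate the gap, not calibrated figures.

```python
# Sketch: liquidity cost of a committed credit line under a base vs. a
# stressed scenario. Probabilities and spreads are hypothetical.

def liquidity_cost(commitment, draw_prob, funding_spread):
    """Expected annual cost of funding the drawn portion of the line."""
    return commitment * draw_prob * funding_spread

commitment = 1_000_000.0  # committed credit line

base = liquidity_cost(commitment, draw_prob=0.20, funding_spread=0.005)
stress = liquidity_cost(commitment, draw_prob=0.60, funding_spread=0.020)

print(f"base cost:   {base:,.0f}")    # 1,000
print(f"stress cost: {stress:,.0f}")  # 12,000
# An FTP charge based only on the base scenario leaves the difference
# unpriced -- a loss the treasury absorbs when the stress materializes.
```

Whether the stressed cost is charged through FTP or only tracked in the risk appetite framework, it should at least be made visible to the unit originating the commitment.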
Second, it is very important to be able to measure and forecast the P&L of each business unit for various scenarios. For example, low interest rates tend to result in a lowering of the commercial margin. If, on the asset side of the business, a client asks for lower rates, the bank is more or less forced to comply to keep the customer happy and stay competitive. But on the liability side, there is a barrier that cannot be breached: 0%. No customer would accept negative rates for a checking account.
Because of this tightened margin – the difference between the rate received on the asset side and the rate paid on the liability side – it is important to measure FTP rates and forecast them with a high degree of accuracy.
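The margin-compression mechanism above can be sketched in a few lines: asset rates follow the market down, but the liability rate is floored at 0%, so the spread between them narrows. All rates and the `deposit_spread` parameter are illustrative assumptions.

```python
# Sketch: commercial margin compression in a falling-rate environment,
# with the liability rate floored at 0%. All rates are illustrative.

def commercial_margin(asset_rate, market_rate, deposit_spread):
    # Deposits are normally paid below the market rate, but customers
    # will not accept a negative rate on a checking account.
    liability_rate = max(0.0, market_rate - deposit_spread)
    return asset_rate - liability_rate

# Normal environment: the floor does not bind, margin is 3.5%.
print(commercial_margin(asset_rate=0.04, market_rate=0.02, deposit_spread=0.015))
# Low-rate environment: the floor binds and the margin compresses to 1.5%.
print(commercial_margin(asset_rate=0.015, market_rate=0.005, deposit_spread=0.015))
```

Because the floor makes the margin a non-linear function of rates, forecasting it accurately requires rerunning the FTP calculation under each rate scenario rather than shifting a single baseline result.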
A bank’s senior management and operational teams tend to view stress testing as only a regulatory compliance or reporting obligation, not as a benefit to their day-to-day work. But they must acknowledge that stress testing scenarios are important for measuring extreme events and their consequences, particularly FTP components, which have historically been neglected (e.g., liquidity risk before the subprime crisis).
Ignoring certain risks is dangerous for a bank’s risk management operations, even if those risks are unlikely to be realized. Neglecting to price these risks at the P&L of the bank will lead to incorrect incentives for operational teams and, ultimately, affect the organization’s profitability. Risks can also be managed or monitored using a risk appetite framework, but for consistency the risk appetite should be reflected in the FTP price so that all units can see the direct financial impact of the risk on their P&L.
In conclusion, banks will find that investing in a comprehensive framework proves more effective in the long run, as losses will be smaller. If losses do occur, they will already have been measured and priced for all business units. FTP is one of the most efficient tools for spreading risk appetite to all teams across the bank.
1 EBA, Guidelines on liquidity cost benefit allocation, October 2010.
2 European Parliament and Council Directive, Directive 2009/111/EC, September 2009.
3 Financial Services Authority, FTP Treasurer Letter, July 2010.
Juan M. Licari, PhD, is Chief International Economist with Moody's Analytics. As the Head of Economic and Credit Research in EMEA, APAC and Latin America, Juan and his team specialize in generating alternative macroeconomic forecasts and building econometric tools to model credit risk portfolios.