BaFin published supervisory principles for the use of algorithms in the decision-making processes of financial institutions. The principles are intended to promote the responsible use of big data and artificial intelligence and to facilitate control of the associated risks. They constitute preliminary ideas for minimum supervisory requirements related to the use of artificial intelligence and form the basis for discussions with various stakeholders; they can also serve as guidance for entities under the supervision of BaFin. Additionally, BaFin and Bundesbank aim to publish a discussion paper by mid-July on the use of machine learning in Pillar 1 and Pillar 2 models; the focus will be on models that require regulatory approval in the context of solvency supervision or that are subject to prudential review.
To formulate the principles as precisely as possible, an algorithm-based decision-making process has been broken down into two phases—development and application. The development phase examines how the algorithm is selected, calibrated, and validated. For instance, principles for the development phase relate to the relevant data strategy as well as documentation to ensure clarity for both internal and external parties. In the application phase, the results of an algorithm must be interpreted and included in the decision-making process. This can be done either automatically or by involving experts. In all cases, a functioning mechanism must be established, comprising elements such as sufficient checks and feedback loops back to the development phase. Aside from these two phases, certain overarching principles are important for the creation and application of the algorithm, and these include:
- Clear management responsibility. Senior management must have sufficient technical expertise. Also, reporting lines and reporting formats must be structured to ensure that communication is risk-appropriate and geared to the specific requirements of the target audience, from the modeler right up to senior management. Moreover, the business-wide strategy for using algorithm-based decision-making processes should be reflected in the IT strategy. There must also be staff with the necessary technical knowledge in the independent control functions.
- Appropriate risk and outsourcing management. Senior management is responsible for establishing a risk management system that has been adapted for the use of algorithm-based decision-making processes. If applications provided by a service provider are used, senior management must also set up an effective outsourcing management system. Responsibility, reporting, and monitoring structures must be set out clearly within this context. Measures to minimize cyber-security risks should also be adapted if required, and the complexity and data dependency of the algorithm must be considered.
- Prevention of bias. Bias must be prevented to be able to reach business decisions that are not based on systematically distorted results, to rule out bias-based systematic discrimination of certain groups of customers, and to rule out any resulting reputational risks. This is a key principle since such biases may occur from the development of the process to its application.
- Ruling out types of differentiation prohibited by law. In the case of certain financial services, the law stipulates that certain characteristics may not be considered for differentiation purposes—that is, to calculate risk and prices. If conditions are systematically set on the basis of such characteristics, there is a risk of discrimination. Such a risk also exists if these characteristics are replaced with a close proxy. This would be associated with increased reputational risks and, in some cases, legal risks. In certain circumstances, BaFin might consider it necessary to take measures to address violations of statutory provisions. Companies should, therefore, establish (statistical) verification processes to rule out discrimination.
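The statistical verification processes mentioned above can start from very simple group-level checks. As a minimal, hypothetical sketch in Python: the "four-fifths" disparate-impact ratio compares approval rates between two customer groups. The threshold, group labels, and sample data below are illustrative assumptions, not part of the BaFin principles.

```python
# Illustrative sketch of a group-level bias check on approval decisions.
# The four-fifths (0.8) threshold is a common heuristic, not a BaFin rule.

def selection_rate(decisions):
    """Share of positive (approval) decisions in a list of 0/1 outcomes."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.
    Values well below 1.0 suggest one group is approved far less often."""
    lower, higher = sorted([selection_rate(group_a), selection_rate(group_b)])
    return lower / higher

# Hypothetical approval outcomes (1 = approved) for two customer groups.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # selection rate 0.75
group_b = [1, 0, 0, 1, 0, 0, 1, 0]   # selection rate 0.375

ratio = disparate_impact_ratio(group_a, group_b)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.375 / 0.75 = 0.50
if ratio < 0.8:  # four-fifths heuristic
    print("potential disparate impact; investigate the model")
```

In practice such a check would run on the institution's actual decision data and be complemented by significance tests and proxy-variable analysis, but the basic idea—comparing outcome rates across groups defined by protected characteristics—is the same.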
These principles represent a milestone in the efforts of BaFin and Bundesbank to create legal and application certainty for the responsible use of big data and artificial intelligence in the financial sector.
Keywords: Europe, Germany, Banking, Big Data, Artificial Intelligence, Supervisory Principles, Cyber Risk, Algorithms, Outsourcing, Pillar 1, Pillar 2, Regtech, BaFin