Christopher Woolard of FCA on AI and Future of Regulation
Christopher Woolard of FCA spoke at a conference in London, focusing on the question of whether decisions that materially affect people's lives can be outsourced to a machine and what this might mean for the future of regulation. He mentioned that, as a regulator, FCA considers the use of artificial intelligence (AI) in financial services from the three main perspectives of continuity, public value, and collaboration. He reinforced that regulators, academics, industry, and the public need to work together to develop a shared understanding that will determine the approach to be taken to answer the questions artificial intelligence poses in the years ahead. These developments involve "big and complex questions" that "go beyond the day-to-day operations of FCA and other regulators," with the key question being "how can we ensure the regulatory framework adapts to the changing economic, demographic, and political environment in which it operates?"
Highlighting the results of a joint survey by FCA and BoE to assess the state of play on artificial intelligence, Mr. Woolard mentioned that the use of artificial intelligence in the firms it regulates is best described as nascent. The technology is employed largely for back-office functions, with customer-facing applications still largely in the exploration stage. He also highlighted that the risks presented by artificial intelligence will be different in each of the contexts in which it is deployed. After all, the risks around algorithmic trading are very different from those that arise when artificial intelligence is used for credit rating purposes or to determine the premium on an insurance product. FCA does not have one universal approach to harm across financial services because harm takes different forms in different markets and, therefore, has to be dealt with on a case-by-case basis; it will be the same with artificial intelligence. He added that, if firms are deploying artificial intelligence and machine learning, they need to ensure they have a solid understanding of the technology and the governance around it.
Next, he highlighted the growing consensus around the idea that algorithmic decision-making needs to be "explainable." For example, if a mortgage or life insurance policy is denied to a consumer, the reasons for denial need to be explained. However, the questions are what the extent and level of explainability should be (to a consumer or an informed expert) and what takes precedence: the accuracy of a prediction or the ability to explain it. The challenge is that explanations are not a natural by-product of complex machine learning algorithms. It is possible to "build in" an explanation by using a more interpretable algorithm, but this may dull the predictive edge of the technology. This is why FCA has partnered with The Alan Turing Institute to explore the transparency and explainability of artificial intelligence in the financial sector, said Mr. Woolard. Through this project, FCA wants to move the debate from the high-level discussion of principles (which most now agree on) toward a better understanding of the practical challenges on the ground that machine learning presents.
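To make the interpretability-versus-accuracy trade-off concrete, the short Python sketch below contrasts an inherently interpretable model (whose coefficients double as an explanation) with a more complex one on synthetic data. It is purely illustrative: the data, features, and model choices are assumptions for this example, not anything described by FCA, BoE, or The Alan Turing Institute.

    # Minimal sketch of the interpretability/accuracy trade-off discussed above.
    # Assumes scikit-learn is available; the binary "approve/deny" data here is
    # synthetic and purely illustrative.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic binary decision data (for example, approve/deny a product).
    X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Interpretable model: each coefficient is a direct, human-readable reason
    # ("this feature pushed the decision toward denial"), but the fit may be coarser.
    interpretable = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # More complex model: often more accurate, but explanations must be
    # reconstructed after the fact (for example, with surrogate models).
    black_box = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

    print("Logistic regression accuracy:", interpretable.score(X_test, y_test))
    print("Gradient boosting accuracy:  ", black_box.score(X_test, y_test))
    print("Built-in explanation (coefficients):", interpretable.coef_.round(2))

In this toy setting, the simpler model yields explanations "for free" while the more complex model typically needs additional tooling to justify an individual decision, which is the practical tension the speech describes.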
With regard to collaboration, he emphasized that these innovations cannot be developed in isolation and that the problems artificial intelligence and machine learning have the potential to solve are cross-border, cross-sector, and sometimes cross-agency. He also highlighted that, at an international level, FCA is leading a workstream on machine learning and artificial intelligence for IOSCO, exploring issues around trust and ethics and what a framework for financial services might look like. FCA is also looking inward and asking whether it can do anything differently as a regulator to ensure that it is ready for the challenges of the future. He added that, at a basic level, firms using this technology must keep one key question in mind: not just "is this legal?" but "is this morally right?" Regulators have a range of powers and tools to tackle these issues now but, with the increasing use of technology, those tools may need to be updated for a fully digital age. "This is something we will be thinking about in our own work on the 'Future of Regulation.'" FCA is taking a fundamental look at how it carries out conduct regulation and shapes the regulatory framework going forward, in what it calls its "Future of Regulation" project.
Related Link: Speech
Keywords: Europe, UK, Banking, Insurance, Securities, Artificial Intelligence, Machine Learning, Regtech, Future of Regulation, FCA