IOSCO Proposes Guidance on Artificial Intelligence/Machine Learning
IOSCO has proposed guidance on the regulation and supervision of the use of artificial intelligence and machine learning by market intermediaries and asset managers. The consultation proposes six measures to assist IOSCO members in creating appropriate regulatory frameworks to supervise market intermediaries and asset managers that use artificial intelligence and machine learning. Annexes to the consultation report offer information on the guidance published by supranational bodies (such as the IMF and FSB) and discuss how various regulators worldwide are addressing the challenges created by artificial intelligence and machine learning. The comment period on this consultation will expire on October 26, 2020.
IOSCO surveyed and held roundtable discussions with market intermediaries and conducted outreach to asset managers to identify how artificial intelligence and machine learning are being used and the associated risks. IOSCO has identified the use of artificial intelligence and machine learning by market intermediaries and asset managers as a key priority. The potential risks and harms identified in relation to the development, testing, and deployment of artificial intelligence and machine learning relate to governance and oversight; algorithm development, testing, and ongoing monitoring; data quality and bias; transparency and explainability; outsourcing; and ethical concerns. The report proposes the following six measures to assist IOSCO members in creating appropriate regulatory frameworks to supervise market intermediaries and asset managers that use artificial intelligence and machine learning:
- Regulators should consider requiring firms to have designated senior management responsible for the oversight of the development, testing, deployment, monitoring, and controls of artificial intelligence and machine learning. This includes requiring firms to have a documented internal governance framework with clear lines of accountability. Senior management should designate an appropriately senior individual (or group of individuals), with the relevant skill set and knowledge, to sign off on the initial deployment and substantial updates of the technology.
- Regulators should require firms to adequately test and monitor the algorithms to validate the results of an artificial intelligence and machine learning technique on a continuous basis. The testing should be conducted in an environment that is segregated from the live environment prior to deployment, to ensure that artificial intelligence and machine learning behave as expected in stressed and unstressed market conditions and operate in a way that complies with regulatory obligations (a minimal illustrative sketch of such pre-deployment testing follows this list).
- Regulators should require firms to have adequate skills, expertise, and experience to develop, test, deploy, monitor, and oversee the controls over the artificial intelligence and machine learning that the firm utilizes. Compliance and risk management functions should be able to understand and challenge the algorithms that are produced and to conduct due diligence on any third-party provider, including on the level of knowledge, expertise, and experience present.
- Regulators should require firms to understand their reliance on, and manage their relationship with, third-party providers, including monitoring their performance and conducting oversight. To ensure adequate accountability, firms should have a clear service-level agreement and contract in place clarifying the scope of the outsourced functions and the responsibility of the service provider. This agreement should contain clear performance indicators and should also clearly determine sanctions for poor performance.
- Regulators should consider what level of disclosure of the use of artificial intelligence and machine learning should be required of firms. Regulators should consider requiring firms to disclose meaningful information to customers and clients about their use of artificial intelligence and machine learning that impacts client outcomes. Regulators should also consider what type of information they may require from firms to ensure that they can have appropriate oversight of those firms.
- Regulators should consider requiring firms to have appropriate controls in place to ensure that the data on which the performance of the artificial intelligence and machine learning depends is of sufficient quality to prevent biases and is sufficiently broad for a well-founded application of artificial intelligence and machine learning (see the data-quality sketch after this list).
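To illustrate how the testing measure might translate into practice, the following Python sketch shows a minimal pre-deployment validation harness that runs a candidate model against unstressed and stressed test scenarios in a segregated environment and checks the results against a sign-off tolerance. The function names, scenario labels, and threshold are hypothetical assumptions for illustration; the consultation does not prescribe any particular implementation.

```python
# Illustrative sketch only: a minimal pre-deployment validation harness of the kind
# the consultation's testing measure contemplates. All names (validate_before_deployment,
# the scenario labels, the 0.05 tolerance) are assumptions, not IOSCO requirements.
from dataclasses import dataclass
from typing import Callable, Dict, List, Sequence, Tuple


@dataclass
class ScenarioResult:
    scenario: str
    error_rate: float
    within_tolerance: bool


def validate_before_deployment(
    model_predict: Callable[[Sequence[dict]], Sequence[int]],
    scenarios: Dict[str, Tuple[Sequence[dict], Sequence[int]]],
    max_error_rate: float = 0.05,  # hypothetical sign-off threshold
) -> List[ScenarioResult]:
    """Run the candidate model against stressed and unstressed test scenarios in a
    segregated (non-live) environment and record whether each stays within the
    tolerance agreed by the designated senior owner."""
    results = []
    for name, (features, expected) in scenarios.items():
        predictions = model_predict(features)
        errors = sum(1 for p, e in zip(predictions, expected) if p != e)
        error_rate = errors / max(len(expected), 1)
        results.append(ScenarioResult(name, error_rate, error_rate <= max_error_rate))
    return results


if __name__ == "__main__":
    # Trivial stand-in for the real model, used only to show the harness running.
    dummy_model = lambda rows: [1 for _ in rows]
    scenarios = {
        "unstressed_market": ([{"spread": 0.01}] * 100, [1] * 100),
        "stressed_market": ([{"spread": 0.25}] * 100, [0] * 100),
    }
    for r in validate_before_deployment(dummy_model, scenarios):
        print(f"{r.scenario}: error_rate={r.error_rate:.2%}, pass={r.within_tolerance}")
```

In this sketch, a failure in the stressed scenario would block sign-off until the model is retrained or the tolerance is formally revisited, mirroring the accountability chain described in the governance measure.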
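Similarly, the data-quality measure could be supported by automated checks of the kind sketched below, which flag columns with excessive missing values and groups that are too thinly represented in the training data. The pandas-based approach, column names, and thresholds are assumptions for illustration only, not requirements from the report.

```python
# Illustrative sketch only: simple data-quality and coverage checks of the kind the
# data-control measure describes. Column names and thresholds are assumed values.
import pandas as pd


def data_quality_report(df: pd.DataFrame, group_column: str,
                        max_missing_share: float = 0.02,
                        min_group_share: float = 0.10) -> dict:
    """Flag columns with excessive missing values and groups that are too thinly
    represented for a well-founded application of the model."""
    missing_share = df.isna().mean()                              # fraction missing per column
    group_share = df[group_column].value_counts(normalize=True)   # representation per group
    return {
        "columns_with_excess_missing": missing_share[missing_share > max_missing_share].to_dict(),
        "underrepresented_groups": group_share[group_share < min_group_share].to_dict(),
        "passes": bool(missing_share.le(max_missing_share).all()
                       and group_share.ge(min_group_share).all()),
    }


if __name__ == "__main__":
    # Toy training set with one missing income value and an imbalanced region column.
    sample = pd.DataFrame({
        "income": [50_000, None, 72_000, 61_000, 58_000],
        "region": ["north", "north", "north", "north", "south"],
    })
    print(data_quality_report(sample, group_column="region"))
```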
The proposed guidance, if implemented, should help ensure that firms have adequate control frameworks to appropriately use artificial intelligence and machine learning. The use of artificial intelligence and machine learning will likely increase as the technology advances, and it is plausible that the regulatory framework will need to evolve in tandem to address the associated emerging risks. Therefore, this report, including its definitions and guidance, may need to be reviewed and/or updated in the future.
Comment Due Date: October 26, 2020
Keywords: International, Banking, Securities, Artificial Intelligence, Machine Learning, Guidance, Fintech, Regtech, Governance, Big Data, IOSCO