AI could oil the wheels of financial markets, but risks must be managed, FSB warns
The use of artificial intelligence (AI) and machine learning could make financial markets more efficient and more interconnected - but only if the potential risks are properly managed, an international financial body has warned.
The Financial Stability Board (FSB), which monitors and makes recommendations about the global financial system, noted in a report on Wednesday that financial institutions are already actively using AI and machine learning to assess credit quality, to price and market insurance contracts, and to automate client interactions.
Hedge funds, broker-dealers and other firms are using these technologies to find signals for higher, uncorrelated returns and to optimise trade execution.
And both public and private sector institutions could use these technologies for regulatory compliance, surveillance, data-quality assessment and fraud detection.
These developments could benefit the financial system, the FSB noted.
For example, AI and machine learning could lead to more efficient processing of information on credit risks and lower-cost customer interaction.
The internal, or back office, applications of AI and machine learning could also improve risk management, fraud detection and compliance with regulatory requirements, potentially at lower cost.
In portfolio management, more efficient processing of information by AI and machine learning applications could boost the efficiency and resilience of financial markets, correcting price misalignments earlier and reducing crowded trades.
Finally, if adopted by regulators and supervisors, these technologies have the potential to increase supervisory effectiveness and improve systemic-risk analysis in financial markets, the FSB said.
However, it added: "As with any new product or service, there are important issues around appropriate risk management and oversight.
"One risk is that the use of AI and machine learning could create 'black boxes' in decision-making, raising complicated issues.
"In particular, it may be difficult for human users at financial institutions - and for regulators - to grasp how decisions, such as those for trading and investment, have been formulated."
The FSB added: "Moreover, the communication mechanism used by such tools may be incomprehensible to humans, thus posing monitoring challenges for the human operators of such solutions."
The network effects and scalability of new technologies could also give rise to third-party dependencies.
"This could in turn lead to the emergence of new systemically important players that could fall outside the regulatory perimeter," it warned.
As with any new product or service, it will be important to assess uses of AI and machine learning in view of their risks, including adherence to relevant protocols on data privacy, conduct risks and cybersecurity, the FSB noted.
"Adequate testing and 'training' of tools with unbiased data and feedback mechanisms is important to ensure applications do what they are intended to do."