
Can algorithms destabilize the entire financial system?


The World Economic Forum’s new report, Navigating Uncharted Waters, was produced in collaboration with Deloitte and published today. It is based on more than 10 months of in-depth research, global workshops and input from companies such as RBS, Morgan Stanley, UBS, JP Morgan, Credit Suisse, Microsoft, BlackRock and the New York Stock Exchange.

The report warns that the widespread adoption of artificial intelligence (AI) has the potential to create a fundamentally different kind of financial system in which human-machine interactions grow even as humans struggle to understand the opaque behavior of AI systems.

“As a result, crises and critical events may occur more frequently and market shocks may intensify,” the report noted.


"Emerging risks will no longer sit neatly within a single regulated institution, but will instead be dispersed across an interconnected set of actors that includes small niche fintech companies and large technology companies."

As a result, the report suggests that supervisors and regulators will need to reinvent themselves as centers of system-wide intelligence.

The rise of AI-powered systems has some people worried that machines are becoming too complex to understand. Others fear that the people behind the machines are the biggest cause for concern.

The report warns that optimizing algorithms that compete with each other can inadvertently destabilize markets. For example, two AI systems can continuously bid against each other, optimizing their actions to achieve a single goal such as the highest market price or return.

"The average market price continues to rise as they repeatedly outbid each other, until one bidder can no longer sustain its bids because of profitability constraints," the report warned. "Over time, this competitive optimization may leave players' balance sheets deteriorating, encourage riskier behavior to maintain profitability, or push them out of the market altogether."
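The dynamic the report describes can be sketched in a few lines of Python. This is a hypothetical toy model, not anything from the report itself: profitability constraints are reduced to a simple budget cap per bidder, and each agent's "optimization" is just outbidding the rival by a fixed increment.

```python
# Toy model of two competing bidding agents (hypothetical parameters).
# Each agent keeps outbidding the other while the next bid stays within
# its profitability constraint, modelled here as a simple budget cap.

def simulate_bidding_war(budget_a, budget_b, start_price=100.0, increment=1.0):
    """Alternate bids between agents A and B until one drops out.

    Returns the winning agent and the final price reached.
    """
    price = start_price
    bidder = "A"
    while True:
        next_bid = price + increment
        cap = budget_a if bidder == "A" else budget_b
        if next_bid > cap:  # constraint binds: this agent drops out
            winner = "B" if bidder == "A" else "A"
            return winner, price
        price = next_bid  # outbid the rival
        bidder = "B" if bidder == "A" else "A"

winner, final_price = simulate_bidding_war(budget_a=150.0, budget_b=120.0)
print(winner, final_price)  # A 121.0
```

Even in this stripped-down version, the price ratchets well above its starting point and the weaker bidder exits once its constraint binds, mirroring the balance-sheet deterioration and market exit the report warns about.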

Rob Galaski, Deloitte Canada partner and global leader of banking and capital markets consulting, said the use of AI in financial services will require openness to a fundamentally new way of safeguarding the ecosystem, different from the tools of the past.

"To accelerate the pace of AI adoption in industry, institutions must take the lead in developing and offering new frameworks that address new challenges, working with regulators along the way," he said.

The report notes that AI-first financial services firms face higher risks by deploying emerging technologies without regulatory clarity. But they also stand to gain the most.

"AI offers financial services providers the opportunity to build on the trust their customers place in them to broaden access, improve customer outcomes and drive market efficiency," said World Economic Forum head of financial services Matthew Blake.

"This can offer competitive advantages to individual financial firms while improving the wider financial system if implemented appropriately."

Algorithmic bias is a major concern for financial institutions, regulators and customers around the use of AI in financial services. AI's ability to rapidly process new and different types of data raises the concern that AI systems may develop unintended biases over time; coupled with their opaque nature, such biases may go undetected.

Despite these risks, AI also provides an opportunity to reduce unfair discrimination and exclusion, for example by analyzing alternative data that can be used to assess customers with "thin files" whom traditional systems cannot evaluate for lack of information.
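A minimal sketch of what a thin-file fallback might look like, assuming a scorer that falls back to alternative data (here, rent and utility payment history) when the traditional bureau score is missing. The field names, weights and ranges are purely illustrative and are not taken from the report.

```python
# Hypothetical thin-file scoring: use the bureau score when available,
# otherwise fall back to alternative payment-history data. Weights and
# field names are illustrative assumptions, not from the report.

def credit_score(applicant):
    if applicant.get("bureau_score") is not None:
        return applicant["bureau_score"]  # traditional path
    # Alternative-data path for thin-file applicants.
    score = 500
    score += 10 * applicant.get("months_rent_on_time", 0)
    score += 5 * applicant.get("months_utilities_on_time", 0)
    return min(score, 850)  # cap at the conventional maximum

thin_file = {
    "bureau_score": None,
    "months_rent_on_time": 24,
    "months_utilities_on_time": 18,
}
print(credit_score(thin_file))  # 500 + 240 + 90 = 830
```

The point of the sketch is the branch structure: an applicant who would be unscorable on traditional data alone still gets assessed, which is the inclusion benefit the report highlights.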

"Given that AI systems can act autonomously, they can likely learn to engage in collusion without any instruction from their human creators, and perhaps even without any explicit, traceable communication," the report noted.

"This challenges traditional regulatory constructs for detecting and prosecuting collusion and may require a review of existing legal frameworks."
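How pricing agents could converge on supra-competitive prices without any explicit communication can be illustrated with a deliberately simplified sketch. This is a hypothetical rule-based model, not a learning system: each agent independently follows a "match the rival, probe a small increase at parity" rule, and the pair drifts to a collusive price with no message passing between them.

```python
# Hypothetical illustration of tacit collusion: two pricing agents, each
# following the same independently chosen rule, drift from a competitive
# price to a supra-competitive one without any communication.

COST = 1.0            # competitive (marginal-cost) price
MONOPOLY_PRICE = 10.0 # illustrative upper bound

def next_price(own_last, rival_last):
    if own_last < rival_last:
        return rival_last  # match the rival upward
    if own_last == rival_last:
        return min(own_last + 0.5, MONOPOLY_PRICE)  # probe a small increase
    return rival_last  # rival undercut: drop back to parity

p_a, p_b = COST, COST  # both start at the competitive price
for _ in range(50):
    p_a, p_b = next_price(p_a, p_b), next_price(p_b, p_a)

print(p_a, p_b)  # both reach 10.0
```

No instruction, signal or traceable communication passes between the agents; the collusive outcome emerges from each one's independent response to observed prices, which is exactly why the report argues that existing detection and prosecution frameworks may need review.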


