
CBA taps AI to address transaction-based abuse


The Commonwealth Bank of Australia (CBA) has announced that it will deploy new artificial intelligence technology and machine learning techniques to proactively detect financial abuse committed through its platforms.

CBA’s general manager of community and customer vulnerability, Justin Tsuei, said technology-facilitated abuse was a serious problem for the bank, describing it as “totally unacceptable behaviour”.

“The new model, which uses advanced artificial intelligence and machine learning techniques, allows us to provide a more targeted and proactive response than ever before,” Mr Tsuei explained.

The bank’s new AI model complements the simpler block filter CBA introduced in 2020, which blocked transaction descriptions containing language the bank identified as threatening, harassing or abusive.
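CBA has not described how the 2020 filter is implemented, but a block filter of this kind is conceptually a keyword check run over the free-text description field before a payment is processed. The sketch below is a minimal illustration of that idea; the term list, function name and matching rules are assumptions, not CBA's actual filter.

```python
import re

# Illustrative sketch only: CBA has not published its blocklist or matching
# rules. The placeholder terms stand in for language the bank classifies as
# threatening, harassing or abusive.
BLOCKED_TERMS = ["placeholder_threat", "placeholder_insult"]

# Word-boundary, case-insensitive matching so ordinary words that merely
# contain a blocked term are not caught.
_PATTERN = re.compile(
    r"\b(?:" + "|".join(re.escape(term) for term in BLOCKED_TERMS) + r")\b",
    re.IGNORECASE,
)

def is_description_blocked(description: str) -> bool:
    """Return True if a free-text transaction description contains a blocked term."""
    return bool(_PATTERN.search(description))

# A payment whose description trips the filter would be rejected before it
# reaches the recipient.
print(is_description_blocked("Rent for March"))  # False
```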

“We want to ensure that our customers feel safe when using our platforms, and it is our responsibility to do everything we can to ensure the right security measures are in place in our channels,” Mr Tsuei said.

CBA framed the move as an extension of the bank’s broader strategy around deploying AI-powered technology and innovation across its digital channels.

“As Australia’s leading digital bank, we are constantly looking for new and better ways to improve our products, channels and services,” Mr Tsuei said.

CBA estimated that over 100,000 abusive transaction descriptions were blocked in the three-month period from May to the end of July 2021.

“With this new model, we are not only able to proactively detect potential abuse in transaction descriptions, but we can do so at an incredible scale,” said Mr Tsuei.

The bank revealed that the new AI-powered detection model identified 229 senders of potentially serious abuse across NetBank and the CommBank app, whose cases were then manually reviewed by the bank.
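CBA has not disclosed the new model's design, but the flag-then-review workflow it describes can be sketched as scoring each description with a trained classifier and escalating senders whose messages repeatedly exceed a risk threshold. In the illustration below, the scoring function, threshold values and data fields are all assumptions.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Transaction:
    sender_id: str
    description: str

# Assumed values for illustration; CBA has not published how its model
# scores descriptions or decides when to escalate a sender.
RISK_THRESHOLD = 0.8     # per-description score treated as potentially abusive
ESCALATION_COUNT = 3     # repeated hits before a sender goes to manual review

def flag_senders_for_review(
    transactions: Iterable[Transaction],
    score_description: Callable[[str], float],
) -> set[str]:
    """Count high-risk descriptions per sender and return the senders whose
    messages repeatedly exceed the risk threshold."""
    hits: dict[str, int] = defaultdict(int)
    for tx in transactions:
        if score_description(tx.description) >= RISK_THRESHOLD:
            hits[tx.sender_id] += 1
    return {sender for sender, count in hits.items() if count >= ESCALATION_COUNT}

# Stand-in scorer: a production system would call the trained text
# classification model here instead.
def toy_scorer(description: str) -> float:
    return 1.0 if "placeholder_abusive_phrase" in description.lower() else 0.0
```

Senders returned by a routine like this would then land in a human review queue, matching the manual review step the bank describes.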

In some extreme circumstances, CBA said it will go so far as to terminate a customer’s banking relationship if they continue to violate its policies by engaging in abusive, threatening or harassing behaviour through transaction descriptions.

“Using AI technology and machine learning techniques to help us tackle a serious problem like technology-facilitated abuse shows how we can use innovative technology to create a safer banking experience for all customers, especially those in vulnerable circumstances such as victims of domestic and family violence,” said Mr Tsuei.

