In a world first, CBA is sharing its AI model to help reduce the misuse of technology

Angela McMillan, Client Solicitor at CBA Group, said: “Financial abuse occurs when money is used to control a partner, and it is one of the most powerful ways to keep someone trapped in an abusive relationship.

“Unfortunately, we see perpetrators using all sorts of ways to circumvent existing measures, such as using the message field to send abusive or threatening messages when making a digital transaction.

“We developed this technology because we noticed that some customers were using transaction descriptions as a way to harass or threaten others.

“Using this model, we can examine unusual transaction activity and identify patterns and situations that are considered high risk so the bank can investigate and take action.”

The model detects about 1,500 high-risk cases annually.

“By sharing our source code and model with any bank in the world, we will help financial institutions gain better insight into technology-facilitated abuse. This can help inform what action a bank may choose to take to help protect customers.”

The use of AI demonstrates how innovative technology can create a safer banking experience for all customers, especially those living in vulnerable circumstances.

The model and source code are available this week on GitHub, the world’s largest platform for hosting source code. The model was created by CBA, and the source code was developed in partnership with the bank’s exclusive AI partner, a global leader in AI.

The AI model complements the bank’s automatic blocking filter, introduced in 2020 across its digital banking channels to stop transaction descriptions that include threatening, harassing or abusive language.

In an effort to combat the misuse of technology, the bank has implemented the following:

  • An automatic filter that blocks abusive, threatening or offensive words in digital payment transactions. It has blocked nearly a million transactions since its introduction in 2020
  • Artificial intelligence and machine learning to detect more insidious forms of abuse in transactions, which the bank can then manually review and act on. The model is fully operational and has detected more than 1,500 cases annually since its introduction in 2021
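To illustrate the first measure, here is a minimal sketch of a keyword-based filter for transaction descriptions. The word list, function name and matching rules are illustrative assumptions, not CBA’s actual implementation, which has not been described in detail here.

```python
import re

# Placeholder blocklist; a real deployment would use a far larger,
# curated list and more sophisticated matching (hypothetical example).
BLOCKED_TERMS = {"threat", "hate"}

def should_block(description: str) -> bool:
    """Return True if the free-text payment description contains a blocked term."""
    words = re.findall(r"[a-z']+", description.lower())
    return any(word in BLOCKED_TERMS for word in words)
```

A filter like this only catches exact word matches; the press release notes that perpetrators circumvent such measures, which is where the machine-learning model in the second measure comes in, flagging unusual patterns for manual review rather than relying on a fixed word list.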

The announcement follows the bank’s trial with NSW Police earlier this year to refer perpetrators of financial abuse to the police, with customer consent.

For more details on Next Chapter, visit:

Anyone concerned about their finances due to domestic or family violence or coercive control can call the Next Chapter team on 1800 222 387 for support – regardless of who they bank with.

If you or someone you know is experiencing domestic or family violence, call 1800RESPECT (1800 737 732) or visit

In case of an emergency or if you do not feel safe, always call 000.

¹ A May 2023 community survey of over 10,000 Australians on financial exploitation, commissioned by the Commonwealth Bank of Australia ABN 48 123 123 124 and conducted online by YouGov.

