Your Payment Text Field Messages Are Being Watched, So Don’t Be an Egg

We’ve all done it. Transferred a friend some money after they paid for lunch and left a ‘funny’ message or remark in the payment text field. I for one have had my fair share of sexual innuendos and notes pointing to illegal activities attached to a $20 payment that was actually for a burger. But Austrac is watching, and while it doesn’t want you to stop having fun, it wants to stop this feature from being used for abuse.

In early 2017, the Australian Transaction Reports and Analysis Centre (Austrac) stood up a public-private initiative to follow the money trail in a bid to “harness and turbo-charge the collective knowledge of government and industry”.

Comprising the financial sector, non-government organisations, law enforcement and national security agencies, the Fintel Alliance was formed to combat money laundering and terrorism financing, with a mandate to collectively stop criminals.

Part of its remit has since stretched to cover more domestic issues, such as the misuse of the payment text field.

In a report published Friday, the Fintel Alliance said it has identified an increase in the misuse of payment text fields in financial transactions as a method of criminal communication or abuse, rather than for their primary purpose of transferring funds.

“Instead the transaction text fields are being used with increasing frequency to communicate for the purpose of stalking, harassing and threatening behaviour, or to avoid law enforcement scrutiny,” it wrote.

Payment text fields that make explicit threats to a victim, or that contain profanities considered abusive or offensive, are commonly identified by financial service providers. However, Austrac said a significant challenge in identifying genuine instances of criminal communication is the high volume of ‘false positives’ detected when cross-referencing payment text fields against pre-established lists of terms deemed inappropriate.

There’s a whole bunch of criteria the banks consider when determining something is off, such as payment frequency, value (for example, a dozen $1 transactions in quick succession) and the known relationship between the two parties.
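The report doesn’t publish the banks’ actual rules, but the frequency-and-value criterion described above can be sketched as a toy check. The class, function names and thresholds here are all made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Payment:
    sender: str
    recipient: str
    amount: float
    message: str

def is_suspicious_pattern(payments: list[Payment],
                          low_value_threshold: float = 5.0,
                          burst_size: int = 10) -> bool:
    """Flag a burst of many low-value payments between the same parties,
    a pattern where the message field, not the money, may be the point.
    Thresholds are illustrative, not real bank parameters."""
    low_value = [p for p in payments if p.amount <= low_value_threshold]
    return len(low_value) >= burst_size
```

A dozen $1 transfers in quick succession would trip this rule, while a single $20 lunch repayment would not. Real systems would combine many such signals with the relationship history mentioned above.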

But when it comes to finding abuse, AI is used to trawl payment text field messages for known ‘bad’ words, as well as to determine sentiment.

Words or phrases that have a dual meaning often appear in the payment text field and present detection challenges. For example, the words ‘pig’ and ‘dog’ have an everyday meaning and appear in legitimate, non-threatening payment text fields but can also be used in a threatening or abusive manner. And they’re not even the ‘swears’ that can actually be terms of endearment in Aussie slang.
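The false-positive problem with dual-meaning words is easy to demonstrate with a naive term-list match of the kind the report describes. This is a deliberately simplistic sketch, not any bank’s actual filter, and the term list is just the two example words from the report:

```python
import re

# Dual-meaning words cited in the Fintel Alliance report as detection challenges.
TERM_LIST = {"pig", "dog"}

def naive_flag(message: str) -> bool:
    """Flag a payment message if any word appears on the term list,
    with no regard for context or sentiment."""
    words = re.findall(r"[a-z']+", message.lower())
    return any(word in TERM_LIST for word in words)
```

Here `naive_flag("money for the dog groomer")` and `naive_flag("you dog, watch yourself")` both return `True`, even though only the second is threatening, which is exactly why the banks layer sentiment analysis on top of simple keyword matching.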

Westpac announced in February that it would no longer allow messages containing abuse to be sent in transaction descriptions, saying it wanted to create a safer digital banking experience for customers and send a clear signal that abusive messages in payment transactions will not be tolerated. The Commonwealth Bank has been doing similar work, developing a machine learning model to help it further identify payments that contain abusive transaction descriptions.


If you or someone you care about needs support, please call Lifeline Australia on 13 11 14, the National Sexual Assault, Domestic Family Violence Counselling Service on 1800 737 732, or MensLine Australia on 1300 789 978.

If life is in danger, call 000.