Machines far from ready to replace humans when it comes to tackling money laundering
LONDON • When HSBC Holdings thwarted a US$500 million (S$691 million) central bank heist, sophisticated computer software didn't raise the alarm.
The funds flowed undetected from Angola's reserves to a dormant company's account in London. It was a teller at a suburban bank branch who became suspicious, declined a request to transfer US$2 million, and triggered a review that uncovered the scam, according to one account of the episode.
That was two years ago, and the finance industry's battle to stop the illicit transfer of as much as US$2 trillion a year around the globe hasn't become any easier. At least a half-dozen lenders in Europe have found themselves at the centre of fresh allegations of dirty money schemes in the past year. The wave of scandals - at Denmark's Danske Bank, Deutsche Bank, and others - is undermining confidence in the industry well beyond the individual institutions involved.
Financial services executives have had little choice but to significantly step up compliance efforts; more than one in 10 now spend in excess of 10 per cent of their annual budgets on compliance, according to financial adviser Duff & Phelps. Banks are eager to find ways to bring that spending down - management, employees and shareholders never want to spend on what are effectively internal cops.
Today, there's a sense that compliance growth may be peaking. About two-thirds of globally systemically important institutions - a leading indicator for the industry - expect the size of their compliance teams to stay the same or shrink, according to a Thomson Reuters Regulatory Intelligence report. The largest companies want the flexibility to grow or scale back their teams as needed, the report said.
That has led to buzz that banks are deploying artificial intelligence (AI) to replace surveillance staff. HSBC last year started using AI to screen transactions, and the two biggest Nordic banks have said they are replacing compliance staff with algorithms. Online banking start-ups, such as Revolut, which rely on computerised efficiency to compete with established lenders, are finding compliance a challenge they need to address.
So far, machines are confined to simple know-your-customer (KYC) applications and are far from ready to replace humans, says Professor Tom Kirchmaier, a visiting fellow at the London School of Economics' Centre for Economic Performance. He is not optimistic that a major advance is afoot, either. "There's a lot of talk but no action," he says.
Ask ING Group, which last year paid €775 million (S$1.2 billion) to settle an investigation by a Dutch prosecutor into alleged money laundering and other corrupt practices. Even though the bank uses machine learning to filter out false alerts on potential bad actors, the lender has had to ramp up the number of individuals handling KYC procedures. It has tripled compliance personnel in the Netherlands over eight years; staff dedicated to KYC account for 5 per cent of total employees.
Banks and tech companies need to overcome a number of obstacles for AI to succeed in tackling money laundering.
For starters, they need better customer data, which is often neither current nor consistent, especially when a bank spans multiple jurisdictions. Enhancing the quality and frequency of data gathering is a crucial first step.
Banks are also constrained in their ability to detect bad behaviour, with or without computers, because competitors and national law enforcement agencies won't share data. Across Europe, for example, regulation and enforcement are split along national borders.
Lenders would benefit from a common European anti-money-laundering regulator, data sharing among banks, and a more open dialogue with bank supervisors, Citigroup analysts wrote in a note to clients in June.
When banks do share information, it's often unhelpful. They tend to over-report suspicious activity to the relevant agencies to shed responsibility, but the enforcement authorities typically don't provide their findings to the financial companies. What's more, banks, wanting to shield bigger clients from unnecessary scrutiny, often under-report activity they should be flagging, according to Prof Kirchmaier. That leads to potentially suspicious transactions being classified as normal.
The algorithms learn to replicate those types of decisions.
In short, the historical data set available to train the machines is misleading, complicating their ability to learn detection. Criminals, by contrast, are constantly adapting their ways, finding new routes for their cash when existing ones are blocked off. Catching tomorrow's money launderers requires anticipating where they will move next. Will they trade gold or crypto assets? When parameters change even slightly, AI struggles to stay ahead of the criminals.
Trust in financial services after the 2008 crisis is taking a very long time to rebuild. Banks are wary that they risk teaching machines to stereotype customers based on where they come from or where they do business.
"Ethical concerns associated with AI are rightfully restraining banks' full embrace of machine learning," says chief product officer Alexon Bell at Quantexa, a London-based data analytics company that counts HSBC among its customers.
Regulators, frustrated with the slow speed of change, have encouraged banks to deploy more technology. One thing seems clear: Compliance spending at banks may be moving away from employing humans to adopting new software. But for now, those living and breathing internal cops are here to stay.
A version of this article appeared in the print edition of The Straits Times on August 22, 2019, with the headline 'The one job in banking the robots can't take'.