The saying goes that our eyes are the window to the soul. Perhaps, over time, they will serve a less romantic purpose – as windows to making money.
Researchers at Carnegie Mellon University in Pittsburgh, one of the leading institutions for artificial intelligence (AI) research, have embarked on a study using facial recognition algorithms to track the expressions of traders. Their goal: finding correlations between mood swings and market swings. If the traders look enthusiastic, it might be time to buy. Are there more furrowed brows than usual? Could be time to sell. A provisional US patent application for the system was filed in September 2022.
“The market is driven by human emotions,” said Professor Mario Savvides, the project’s lead scientist. “What came to us is, can we abstract things like expression or movements as early indications of volatility?”
The main phase of the study will take place over 12 months beginning in the third quarter of 2023, and involve about 70 traders at investment firms mostly located in the United States. They will all have cameras mounted on their computers to record their faces and gestures throughout the day.
The cameras will be linked to software from Oosto, a facial recognition firm that hopes to develop an alert system for trends in traders’ faces, or a volatility index it can sell to investment firms. Footage of each individual will stay on that trader’s own computer or physical premises; only data and numbers representing expressions and gestures will be uploaded to the researchers.
A person’s face can be mapped with 68 points that frequently change position, according to Prof Savvides, who co-authored a study on facial “landmarks” in 2017. His system will also track a trader’s gaze to see whether they are talking to a colleague or looking at their screen, and note whether their peers are doing the same thing.
“We have a whole toolbox of searching algorithms that we will be testing to see if they correlate to a market signal,” he said.
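To make the idea concrete, here is a minimal, hypothetical sketch — not the researchers’ actual pipeline — of one signal such a toolbox might test: given a sequence of 68-point facial landmark frames, compute a per-frame “expression activity” score as the mean displacement of landmarks between consecutive frames. Spikes in a score like this are the kind of feature that could then be checked for correlation with market volatility. The function name and the scoring rule are illustrative assumptions.

```python
# Hypothetical sketch: score facial "expression activity" from
# 68-point landmark frames as mean landmark displacement per frame.
# This is an illustration, not the study's actual method.
import math

NUM_LANDMARKS = 68

def expression_activity(frames):
    """frames: list of frames, each a list of 68 (x, y) landmark tuples.
    Returns one mean-displacement score per consecutive frame pair."""
    scores = []
    for prev, curr in zip(frames, frames[1:]):
        total = sum(math.dist(p, c) for p, c in zip(prev, curr))
        scores.append(total / NUM_LANDMARKS)
    return scores

# Example: two identical frames, then one where every landmark
# shifts by (3, 4) -- a displacement of exactly 5 per point.
base = [(float(i), float(i)) for i in range(NUM_LANDMARKS)]
shifted = [(x + 3.0, y + 4.0) for x, y in base]
print(expression_activity([base, base, shifted]))  # [0.0, 5.0]
```

In practice the landmark coordinates would come from a face-tracking model rather than hand-built lists, and a real study would test many such features, as Prof Savvides describes.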
Advertisers already use facial analysis to gauge how exciting an ad is, retailers use it to see how bored customers are, and hiring managers use it, rather creepily, to determine whether a job candidate is enthusiastic enough.
Trading algorithms have for years tried to harness information from the weather, social media or satellites, but there is something a little demeaning about the traders themselves being exploited for data. The researchers are also arguably putting traders into a never-ending feedback loop where their actions and decisions become derivative and their notoriously lemming-like behaviour is amplified.
If you thought the market was already driven by a herd-like mentality, this will probably make it worse – but that is also how the market works.
“Everyone on the street talks,” said one trader in London (not part of the study) who said they would find such alerts about their peers’ sentiment useful. “The whole part of doing what we do is discuss ideas and share information... Non-verbal communication is massive.”
Years back, trading floors were loud places where people would often talk on three or four phone lines at the same time; now, many communicate over chat rooms, and talking is minimal.
Facial analysis of the kind being used at Carnegie Mellon opens a bigger can of worms. Last summer, Microsoft said it would retire its facial analysis tools, which estimated a person’s gender, age and emotional state, admitting that the system could be unreliable and invasive.
This might not matter too much for traders who are eager to lap up whatever data they can for an edge. But this study – if successful – could embolden research into analysing faces for other purposes, like assessing one’s emotional state during a work meeting.
“If you are doing a business deal over Zoom, can you have an AI read the face to tell if someone is calling your bluff, or being a hard negotiator?” asked Prof Savvides. “It is possible. Why not?” BLOOMBERG