“The market is driven by human emotion,” said Mario Savvides, the project’s lead scientist. “What occurred to us is, can we abstract things like expression or movement as early indications of instability? Everyone gets excited, or everyone shrugs or scratches their head or leans forward… Does everyone have a reaction within five seconds?”
The main phase of the study will take place over a 12-month period, starting in the third quarter of 2023, and will involve around 70 traders at investment firms located primarily in the US. They will all have cameras mounted on their computers to record their faces and gestures throughout the day, according to Savvides. The cameras will be linked to software from Israeli company Oosto, formerly known as AnyVision Interactive Technologies Ltd., which hopes to develop an alert system for trends in traders’ faces, or a volatility index, that it can sell to investment firms.
Oosto, which makes facial recognition scanners for airports and workplaces, declined to name the companies involved in the research, but said those companies would get early access to any new tools that came out of the research. Each individual’s footage will remain on their own computer or in their physical space; only data and numbers representing their expressions and gestures will be uploaded to researchers.
Facial-analysis software typically tracks 68 distinct “landmark” points on the human face, which change position frequently, said Savvides, who co-authored a 2017 study on facial landmarks.
The system will also track a trader’s gaze to see whether they’re talking to a colleague or looking at a screen, and whether their peers are doing the same. “We have a whole toolkit of search algorithms that we will test to see if they correlate with a market signal,” Savvides said. “We are looking for a needle in a haystack.”
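The study’s published details stop at “search for correlations,” but the basic idea can be sketched. The following is a purely illustrative toy, not the researchers’ method: it aggregates hypothetical per-trader reaction scores into a single signal (the fraction of traders reacting at each moment) and checks whether that signal correlates with a market volatility series. All names, thresholds and data here are invented for illustration.

```python
import numpy as np

def reaction_signal(scores: np.ndarray) -> np.ndarray:
    """Fraction of traders whose per-frame reaction score exceeds a threshold.

    scores: array of shape (n_frames, n_traders), values in [0, 1].
    """
    return (scores > 0.5).mean(axis=1)

def correlate_with_market(signal: np.ndarray, volatility: np.ndarray) -> float:
    """Pearson correlation between the aggregated reaction signal and a
    per-frame market volatility series (both 1-D, same length)."""
    return float(np.corrcoef(signal, volatility)[0, 1])

# Hypothetical data: 100 time steps, 70 traders (the study's cohort size).
rng = np.random.default_rng(0)
vol = rng.random(100)                 # stand-in volatility series
scores = rng.random((100, 70)) * 0.4  # baseline: muted reactions
scores[vol > 0.8] += 0.5              # traders react when volatility spikes

sig = reaction_signal(scores)
r = correlate_with_market(sig, vol)
print(round(r, 2))  # strongly positive, because the toy data was built that way
```

In real data the correlation would be the unknown being hunted; here it is baked in, which is exactly the “needle in a haystack” the researchers describe looking for.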
Advertisers already use facial analytics to gauge how engaging an ad is, retailers use it to see how bored customers are, and hiring managers use it, rather creepily, to determine whether a job candidate is enthusiastic enough.
Probing the stock market seems more dystopian at first glance. Trading algorithms have tried for years to use information from the weather, social media or satellites, but there is something a little demeaning about traders themselves being mined for data. The research also arguably puts traders in an endless feedback loop in which their actions and decisions become derivative and their famous lemming-like behavior is reinforced. If you thought the market was already driven by a herd mentality, this will probably make it worse. But that’s how the market works.
“Everyone on the street is talking,” said one trader in London (not part of the study), who said they would find such alerts about the mood of their peers useful. “A big part of what we do is discussing ideas and sharing information… Nonverbal communication is massive.” Years ago, trading floors were loud places where people often talked on three or four phone lines at once; now most people communicate through chat rooms and conversation is minimal.
But the study also points to another inconvenient phenomenon. Facial recognition is here to stay, and its more controversial cousin, facial analysis, may be too. Despite all the concerns that have been raised about facial recognition, including the mistakes it can make as a surveillance tool, tens of millions of us still use it to unlock our phones without hesitation.
Facial analysis, as used by Carnegie Mellon, opens a bigger can of worms. Last summer, Microsoft Corp. pledged to retire its facial-analysis tools that estimated a person’s gender, age and emotional state, admitting that the system could be unreliable and invasive.(1) But this study, if successful, could spur research into analyzing faces for other purposes, such as assessing a person’s emotional state during a work meeting.
“If you’re doing a business deal on Zoom, can AI read your face to tell if someone’s calling your bluff or being a tough negotiator?” Savvides asked. “It’s possible. Why not?”
Zoom Video Communications Inc. introduced a feature last year that tracks sentiment during a recorded work meeting. The software, called Zoom IQ and aimed at sales professionals, scores meeting participants from 0 to 100, with anything above 50 indicating greater engagement in the conversation. The system does not use facial analysis; instead it tracks signals such as talk time and how long a person waits to respond, and offers a rating at the end of the meeting.
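Zoom does not publish how Zoom IQ computes its 0–100 score, so the following is a hypothetical illustration only: one plausible way to fold the two signals the article mentions, talk-time share and response latency, into a single bounded score. The weights, cap and function name are all invented.

```python
def engagement_score(talk_seconds: float, meeting_seconds: float,
                     avg_response_delay: float, max_delay: float = 10.0) -> float:
    """Hypothetical 0-100 engagement score (not Zoom's actual formula).

    Higher share of talk time and quicker responses raise the score;
    delays are capped at max_delay seconds so one long pause can't
    drive the score below zero.
    """
    talk_share = min(talk_seconds / meeting_seconds, 1.0)
    promptness = 1.0 - min(avg_response_delay / max_delay, 1.0)
    return round(100 * (0.5 * talk_share + 0.5 * promptness), 1)

# An active participant vs. a quiet one in a 30-minute meeting.
print(engagement_score(900, 1800, 1.5))  # talks half the time, replies fast -> 67.5
print(engagement_score(60, 1800, 8.0))   # mostly silent, slow to reply -> 11.7
```

Even this toy version shows why critics call such scores pseudoscientific: the number depends entirely on arbitrary weights and caps, yet it is presented as a single authoritative figure.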
More than two dozen human rights organizations have called on Zoom to stop working on the feature, arguing that sentiment analysis is based on pseudoscience and is “inherently biased.” A Zoom spokesperson said the company still sells the software and that it “turns customer interactions into meaningful insights.”
You could argue that the Carnegie researchers shouldn’t care what a facial-analysis tool tells them about the emotions of traders; they just need to find the patterns that correlate and feed those numbers into a search algorithm. But the downside of turning emotions into numbers is just that: it risks devaluing one of the most fundamental characteristics of being human. It would be better if it didn’t catch on.
(1) Amazon continues to sell facial-analysis software that estimates someone’s gender and predicts whether they’re happy, confused, disgusted and more.
This column does not necessarily reflect the opinion of the editorial board or of Bloomberg LP and its owners.
Parmy Olson is a technology columnist for Bloomberg Opinion. She is a former Wall Street Journal and Forbes reporter and the author of “We Are Anonymous.”
More stories like this are available at bloomberg.com/opinion