Meta tests facial recognition tool to detect scams misusing celeb images, recover hacked accounts

FILE PHOTO: The logo of Meta Platforms' business group is seen in Brussels, Belgium, on Dec 6, 2022. PHOTO: REUTERS/Yves Herman

SINGAPORE – A group of celebrities, content creators and politicians worldwide, including key public figures in Singapore, will be enrolled in a facial recognition trial by Meta to identify scam advertisements that misuse their likeness.

The system will compare the faces in the ads with their profile pictures to detect any unauthorised use on Facebook and Instagram – both run by the US tech giant.

Meta will inform a group of some 50,000 public figures through in-app notifications that they are being enrolled in the service. They can opt out at any time, Meta said in an announcement on Oct 22.

In a separate security feature, Meta is also using the same facial recognition technology to verify the identity of users to recover hacked accounts faster.

Meta’s director of global threat disruption, Mr David Agranovich, said on Oct 21 that the pilot, which enrols popular accounts, aims to combat fraudulent ads that use real images of famous people to bait users into engaging with them. These ads typically lead to scam sites, where victims are asked to share personal information or carry out transactions.

The ploy, commonly called “celeb-bait”, is against Meta’s policies and undermines trust in its social media platforms, Mr Agranovich told the media.

The trial aims to enrol users who have been targeted by celeb-baiting, he said.

Suspected ads containing a photo of a public figure’s face will be run through facial recognition software that compares the face in the ad with the public figure’s Facebook and Instagram profile pictures.

“If we confirm a match and that the ad is a scam, we’ll block it,” said Meta.

“We immediately delete any facial data generated from ads for this one-time comparison regardless of whether our system finds a match, and we don’t use it for any other purpose,” it added.

The review is automated rather than performed manually, which Meta said makes it more accurate and allows it to run in real time. It added: “If the system detects a match, we will check to confirm if the ad is a scam. If we confirm that the ad is a scam, we will take it down.”
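Meta has not published how this works under the hood, but the flow it describes – a one-time facial comparison against profile pictures, followed by a separate scam check, with the derived facial data deleted either way – can be pictured roughly as follows. The Python sketch below is purely illustrative: the function names, the cosine-similarity scoring and the 0.8 threshold are assumptions, not Meta’s implementation.

```python
import numpy as np


def face_embedding(image_bytes: bytes) -> np.ndarray:
    """Placeholder for a face recognition model that maps a face to a vector."""
    # Deterministic dummy embedding so the sketch runs end to end.
    rng = np.random.default_rng(abs(hash(image_bytes)) % (2**32))
    return rng.standard_normal(128)


def is_scam_ad(ad_image: bytes) -> bool:
    """Placeholder for Meta's separate check that the ad itself is a scam."""
    return False


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def screen_ad(ad_image: bytes, profile_images: list[bytes], threshold: float = 0.8) -> str:
    """Compare the face in a suspected ad with a public figure's profile pictures."""
    ad_vec = face_embedding(ad_image)
    try:
        for profile in profile_images:
            if cosine_similarity(ad_vec, face_embedding(profile)) >= threshold:
                # A face match alone is not enough: the ad must also be confirmed a scam.
                return "blocked" if is_scam_ad(ad_image) else "allowed"
        return "allowed"
    finally:
        del ad_vec  # the facial data from this one-time comparison is discarded either way
```

In Meta’s description this screening runs automatically across ad inventory; here, a single call such as screen_ad(ad, [pic1, pic2]) stands in for that pipeline.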

Meta is no stranger to the use of automated software to review content. The new system aims to increase the speed and accuracy of detecting scams targeting well-known users.

There has been no shortage of scam ads that have slipped past such systems, including fake crypto ads misusing images of local celebrities such as actress Rebecca Lim and pop star JJ Lin, demonstrating the challenges for publishers of monitoring the sheer volume of digital ads distributed on their sites.

Artificial intelligence-generated videos of politicians such as Prime Minister Lawrence Wong and Senior Minister Lee Hsien Loong have also been used by fraudsters to promote investment scams and other schemes.

Meta has also come under pressure from Australian regulators to crack down on celeb-baiting, after at least 100 reports of such content from the Australian Financial Crimes Exchange since April.

In early October, Meta removed around 8,000 scam ads, many of which deceived consumers into investing in fake schemes.

When asked, Mr Agranovich said Meta did not have figures on the system’s accuracy, but early tests have shown promising results.

Public figures will be notified in advance that their accounts have been chosen for the pilot initiative, and have the option to opt out of the test, said Meta.

It did not provide details on the number of Singapore accounts that will be enrolled.

Recovering accounts

The same technology is being used to verify video selfies as a means for users to regain access to hacked accounts. Launching progressively from Oct 21, the recovery system compares users’ video selfies against their profile pictures on the accounts they are trying to access.

The solution addresses the frustrations of victims who have long criticised social media platforms for their sluggish response in verifying and recovering compromised accounts. For content creators and businesses in particular, social media accounts and their thousands of followers can be vital to their livelihoods.

Meta plans to use video selfies to speed up account recovery and make it more difficult for fraudsters to use fake documents to mimic their victims’ identities.

Similar technology has been adopted to unlock mobile phones and for identity verification on Singpass.

Video selfies uploaded to Meta will be encrypted and stored securely. Meta said: “It will never be visible on their profile, to friends or to other people on Facebook or Instagram.

“We immediately delete any facial data generated after this comparison regardless of whether there’s a match or not.”
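Account recovery follows the same broad pattern, with frames from the video selfie taking the place of the ad image. Again, the sketch below is a hedged illustration rather than Meta’s code: the frame sampling, the matching score and the deletion step are assumptions based on the description above.

```python
import numpy as np


def face_embedding(image_bytes: bytes) -> np.ndarray:
    """Placeholder face recognition model (same assumption as in the earlier sketch)."""
    rng = np.random.default_rng(abs(hash(image_bytes)) % (2**32))
    return rng.standard_normal(128)


def verify_recovery_selfie(selfie_frames: list[bytes],
                           profile_images: list[bytes],
                           threshold: float = 0.8) -> bool:
    """Return True if any frame of the video selfie matches a profile picture on the account."""
    frame_vecs = [face_embedding(f) for f in selfie_frames]
    try:
        for ref in (face_embedding(p) for p in profile_images):
            for vec in frame_vecs:
                score = float(np.dot(vec, ref) / (np.linalg.norm(vec) * np.linalg.norm(ref)))
                if score >= threshold:
                    return True
        return False
    finally:
        frame_vecs.clear()  # per Meta, facial data is deleted whether or not there is a match
```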
