Ads pushed to people interested in knowing 'How to burn Jews'? Another Facebook facepalm


Facebook said in a statement that it had removed the ability to buy targeted marketing based on topics such as "Jew haters" and "how to burn Jews".
PHOTO: AFP

SAN FRANCISCO (Reuters) - Facebook Inc this week stopped advertisers from targeting messages to people interested in topics such as "Jew haters" and "how to burn Jews" after journalists inquired about it, the news organisation ProPublica reported on Thursday (Sept 14).

ProPublica, a non-profit outlet based in New York, said it found the topics in Facebook's self-service ad-buying platform and paid US$30 (S$40.40) to test them with its own content.

Another category it found was "History of 'why Jews ruin the world.'"

The anti-Semitic categories were created by an algorithm rather than by people, ProPublica reported. Some 2,300 people had expressed interest in them.

Facebook, the world's largest social network, said in a statement that it had removed the ability to buy targeted marketing based on those topics and believed the use of the topics in ad campaigns had not been widespread.

Along with Alphabet Inc's Google, Facebook dominates the fast-growing market for online advertising, in part because it lets marketers target their ads based on huge volumes of data.

Facebook, though, has had difficulty ensuring that advertisers on its self-service system comply with its terms and conditions.

Last year, ProPublica reported that Facebook allowed advertisers to exclude users by race when running housing or other ads, despite a prohibition on such discrimination under the US Fair Housing Act of 1968.

Facebook last week said an operation likely based in Russia spent US$100,000 on thousands of US ads promoting social and political messages over a two-year period through May, fueling concerns about foreign meddling in US elections.

The company said it shut down 470 "inauthentic" accounts as part of an internal investigation into those ads.

The anti-Semitic targeting categories likely were generated because people listed those themes on their Facebook profiles as an interest, an employer or field of study, ProPublica reported.

Mr Rob Leathern, product management director at Facebook, said in a statement on Thursday that sometimes content appears on the network that "violates our standards".

"In this case," he went on, "we've removed the associated targeting fields in question. We know we have more work to do, so we're also building new guardrails in our product and review processes to prevent other issues like this from happening in the future."

Facebook said it was considering other changes to its advertising platform, such as adding more reviews of targeting categories before they show up in the self-service platform.
