Letter of the week: AI chatbots shouldn’t be allowed to spread untruths

Laws need to be strengthened to protect individuals and institutions against defamatory content and untruths churned out by generative AI, says the reader.

PHOTO: EPA-EFE

I was astonished to read that when Straits Times reporter Osmond Chia fed the question "Who is Osmond Chia?" into Meta AI's chatbot, it spat out a list of criminal charges under his name – instances of the chatbot confusing his name with the crime headlines he has reported (Ever looked yourself up on a chatbot? Meta AI accused me of a workplace scandal, May 20).

This cannot be a satisfactory situation.

Imagine an employer being fed erroneous information linking a potential hire to unsavoury matters that have nothing to do with him beyond, say, a shared name or the AI algorithm's confusion, as in Mr Chia's case.

Surely laws need to be strengthened to protect individuals and institutions against defamatory content and untruths churned out by generative AI? I don’t see how it is fair to let these tech companies get away with reputational murder.

While the aggrieved party has the right to sue the tech firm, the reality is that people may not even be aware that disparaging information about them is lurking out there.

The onus shouldn’t be on people to ask about themselves to ensure that the tech bots haven’t maligned them.

Peh Chwee Hoe
