Deepfake nude apps are ruining lives and have no place in app stores

The technology has unleashed a wave of harassment and exploitation. A coordinated approach is needed to tackle this escalating crisis.

Deepfake-enabling tools and platforms, with their accessibility and potential for harm, have escalated into a significant threat.

PHOTO: LIANHE ZAOBAO

Chew Han Ei

Soon, creating and sharing sexually explicit “deepfakes” will become a criminal offence in Britain. The move, announced on Jan 7 by the government, is aimed at tackling a surge in the proliferation of such images, mainly targeting women and girls. The problem isn’t confined to the UK.

Powered by artificial intelligence (AI), deepfake technology can create hyper-realistic yet entirely fabricated images and videos. Pornographic images can be digitally doctored to bear the likeness of an unsuspecting, innocent person, and such fakes have become a widespread tool for exploitation.
