Why are we not doing more about deepfakes and the online abuse of women and girls?

“Nudify” apps are turning underage girls, as well as celebrities, into victims of fake porn images. Not enough is being done to hold tech giants accountable.

With a single good image of a person’s face, it is now possible, in just a half-hour, to make a 60-second sex video of that person.

ILLUSTRATION: NYTIMES

Nicholas Kristof

Alarms are blaring about artificial intelligence (AI) deepfakes that manipulate voters, like the robocall sounding like President Joe Biden that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.

Yet there’s a far bigger problem with deepfakes that we haven’t paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike. One recent study found that 98 per cent of deepfake videos online were pornographic, and that 99 per cent of those targeted were women or girls.
