Cheap AI tools fuel teen-driven rise in deepfake sex crimes in South Korea
Among suspects linked to such crimes, nearly 62 per cent were teenagers.
PHOTO: AFP
SEOUL - A sharp rise in artificial intelligence-generated sex crimes in South Korea is being driven largely by teenagers, according to police, in what officials describe as a troubling intersection of cheap deepfake technology, digital manipulation and underage offenders.
Between November 2024 and October 2025, South Korean police apprehended 3,557 individuals for cybersexual violence.
The country’s National Office of Investigation revealed on Nov 16 that, among suspects linked to these deepfake-related crimes, nearly 62 per cent were teenagers, making minors the largest offender group in a criminal trend typically associated with adult actors and organised networks.
The South Korean authorities take a notably broad view of deepfake-related crime.
Unlike narrower international definitions, their approach includes not only synthetic pornography but also child sexual exploitation cases involving AI-assisted deception, blackmail or manipulation.
That framing expands the legal reach of deepfake laws and reflects how digital tools are being weaponised well beyond face-swopping apps or celebrity image misuse.
Two recent cases reported in local media illustrate the scale and methods involved.
In one, a 15-year-old boy produced and distributed 590 deepfake porn videos featuring female celebrities.
He operated three Telegram channels with more than 800 users.
In another, four teenagers, including a 17-year-old ringleader, were arrested after luring victims via social media by claiming fake videos of them were already circulating.
They used that lie to pressure victims into creating actual explicit content of themselves. These four alone produced 79 illegal recordings over 10 months.
The surge in cases follows a major legal change.
In October 2024, South Korea amended its sex crime legislation to remove the need to prove intent to distribute deepfake material.
Possession and viewing of such content are now punishable, which aligns the law more closely with the strict liability standards seen in some jurisdictions for child abuse content.
Out of 1,827 deepfake-related offences identified under this broader legal category, police pursued enforcement in 1,462 cases, resulting in 1,438 arrests and 72 formal detentions.
Enforcement operations included expanded undercover work, the use of deepfake detection software, and cooperation with international platforms such as Telegram.
The authorities also took steps to reduce secondary harm to victims.
The police submitted more than 36,000 requests to national regulators to remove harmful videos and referred more than 28,000 victims to the country’s digital sex crime support centre.
The government plans to continue its crackdown through October 2026, targeting not just creators and distributors but also consumers of illegal content.
Recognising the high rate of teenage involvement, police are developing prevention education with the Ministry of Education and pushing for closer cooperation with online platforms to enforce distribution bans.
“Cybersexual crimes are becoming more covert and more technologically advanced,” said Mr Park Woo-hyun, who heads the cyber investigation division at the Korean National Police Agency.
“These are serious offences that destroy victims’ dignity, and we will continue doing everything possible to eradicate them.” THE KOREA HERALD/ASIA NEWS NETWORK

