Deepfake sex crimes widespread among teens in South Korea
SEOUL – Recent progress in video technology has had some alarming effects in South Korea, as a growing number of tech-savvy youngsters are using deepfake technology to produce sexual images of people, often their own peers, without their consent.
A recent report showed that there were 180 criminal cases related to deepfake images in 2023.
Of the 120 people punished for those crimes, 91 – or 75.8 per cent – were teenagers, according to a report compiled by Representative Cho Eun-hee of the People Power Party, which used data provided by the National Police Agency.
Both the number of deepfake-related crimes and the percentage of such crimes perpetrated by teens have been trending upwards. In 2022, there were 156 deepfake crime cases, and teens accounted for 61 per cent of convictions.
“These digital sex crimes that inflict irreversible damage on the victims are spreading among teens, as if it were a game,” Ms Cho said, calling for systemic reforms to prevent such crimes.
Spreading of deepfake crime
On Aug 21, the Busan Metropolitan Office of Education said four middle school students were being investigated by the police for using deepfake technology to digitally clone the faces of 18 students and two teachers. They made some 80 pornographic images of the victims, which they shared via mobile messenger apps.
There were 12 cases of students spreading pornographic deepfakes of fellow students in Busan in 2023, but there have been 15 cases in the first six months of 2024 alone.
On Jeju Island, the police recently caught a teenage student at an international school making deepfake pornography using the faces of at least 11 of his fellow students.
Deepfake crimes can take various forms. Sometimes the images are used to bully a victim, but they are also created to make money.
In 2022, a high school student was found guilty of selling pornography – including doctored photos of real people – to 110 people online, in exchange for gift certificates.
An official at the state-run Sunflower Centre, which provides counselling for victims of sexual abuse, told local media that while the overwhelming majority of cases involve male students victimising females, there have also been reports of students victimising others of the same gender.
South Korean teens have easy access to artificial intelligence (AI) services. A survey of 2,261 teens published in May by the National Information Society Agency found that 77.5 per cent of teenagers in the country said they knew about generative AI, and over half – 52.1 per cent – said they had used it.
Generative technology itself can be used to create all kinds of images, written content and music, and is a tool used across a number of industries. Police data indicates that very few of these tech-savvy Korean teens use it for illegal purposes.
But like any tool, it can be harmful in the wrong hands, and there has been growing concern over the harm that can be inflicted by its abuse.
Concern over wrongful use of AI
Despite an increasing number of teens being punished for using deepfake technology, the penalties are seldom very serious. This is partly because punishment for minors is generally lighter.
Under the law, adults who, with intent to disseminate, edit or process false video, audio or photo content of another person in a form that causes sexual shame, against that person’s will, can be punished by up to five years in prison or a fine of up to 50 million won (S$50,000).
But the Hankyoreh newspaper recently reported that the actual punishment is usually far more lenient than the law suggests.
The paper’s analysis of 46 court verdicts related to fake videos showed that of the 18 defendants indicted solely for spreading fake videos – excluding those also charged with other crimes – only one received a prison term; 15 got off with suspended sentences and two were fined.
In one case, a young man who doctored a photo of his teenage cousin and shared it on a mobile messenger app was initially sentenced to two years in jail, but an appellate court reduced the punishment to a suspended term.
The court said in a verdict that the defendant was young, had no former criminal record, and that his parents had pledged to monitor him.
The law also requires proof of intent to disseminate, meaning the punishment of those who merely possess such deepfake pornography falls into a legal grey area.
As such, criminal experts stress that the government should take steps to educate students to make sure they realise the severity of the crime.
Professor of criminal psychology Lee Soo-jung, at Kyunggi University, said in an interview with local media that lessons on computer technology, such as coding, must cover the legal and ethical aspects of its use.
She stressed that such education must commence at a young age, as the risk of perceiving such crimes as merely pranks is higher in younger children. THE KOREA HERALD/ASIA NEWS NETWORK