Europol warns of uptick in AI-aided child abuse images
THE HAGUE - Artificial intelligence-linked images of child sex abuse are on the rise, Europe’s policing agency warned on July 22, saying the material makes it increasingly difficult to identify victims and perpetrators.
Criminals have been using artificial intelligence (AI) tools and services to carry out crimes ranging from online fraud and cyberattacks to creating explicit images of children, Europol said.
“Cases of AI-assisted and AI-generated child sexual abuse material have been reported,” the Hague-based agency said.
“The use of AI which allows child sex offenders to generate or alter child sex abuse material is set to further proliferate in the near future,” Europol added in a 37-page report on the current online threats facing Europe.
The production of artificial abuse images increases “the amount of illicit material in circulation and complicates the identification of victims as well as perpetrators”, Europol said.
More than 300 million children a year were victims of online sexual exploitation and abuse, researchers at the University of Edinburgh said in May.
Offences ranged from so-called sextortion, where predators demand money from victims to keep images private, to the abuse of AI technology to create “deepfake” videos and pictures, the university’s Childlight Global Safety Institute said.
The advent of AI has caused growing concern around the world that the technology can be used for malicious purposes such as the creation of so-called deepfakes – computer-generated, often realistic images and videos based on a real template.
“The volume of self-generated sexual material now constitutes a significant and growing part of child sexual abuse material online,” Europol said.
“Even in cases when the content is fully artificial and there is no real victim depicted, AI-generated child sex abuse material still contributes to the objectification and sexualisation of children,” it said. AFP