WASHINGTON • Dr Free Hess, a paediatrician and mother, had learnt about the chilling videos over the summer when another mum spotted one on YouTube Kids.
She said that minutes into the clip from a children's video game, a man appeared on the screen - giving instructions on how to commit suicide.
"I was shocked," Dr Hess said, noting that since then, the scene has been spliced into many more videos from the popular Nintendo game Splatoon on YouTube and YouTube Kids, a video app for children.
Dr Hess, from Ocala, Florida, has been blogging about the altered videos and working to get them taken down amid an outcry from parents and child health experts, who say such visuals can be damaging to children.
One on YouTube shows a man popping into the frame. "Remember, children," he begins, holding what appears to be an imaginary blade to the inside of his arm. "Sideways for attention. Longways for results."
"I think it's extremely dangerous for our children," Dr Hess said about the clips on Sunday in a phone interview with The Washington Post. "I think our children are facing a whole new world with social media and Internet access. It's changing the way they're growing, and it's changing the way they're developing."
A recent YouTube video viewed by The Post appears to include a spliced-in scene showing Internet personality Filthy Frank. It is unclear why he was edited into these clips, but his fans have been known to put him in memes and other videos.
Ms Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company works to ensure that it is "not used to encourage dangerous behaviour and we have strict policies that prohibit videos which promote self-harm".
"We rely on both user flagging and smart detection technology to flag this content for our reviewers," Ms Faville added. "Every quarter, we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views."
The videos come amid mounting questions about how YouTube, the world's largest video-sharing platform, monitors and removes problematic content.
YouTube has long struggled to keep the platform free from such material - removing hateful and violent videos, banning dangerous pranks and cracking down on child sexual exploitation. YouTube said last month it was rebuilding its recommendation algorithm to prevent it from promoting videos that include conspiracy theories and other bogus information, though the videos would remain on the site.
Dr Nadine Kaslow, a past president of the American Psychological Association, told The Post that it is a "tragic" situation in which "trolls are targeting children and encouraging children to kill themselves".
Dr Kaslow, who teaches at Emory University's school of medicine, said that some children who are more vulnerable may be drawn to such grim content.