Disturbing content found on YouTube Kids

Parents say the app has videos with popular characters in violent situations or disturbing imagery that is not child-friendly

Nurse Staci Burns' three-year-old son, Isaac (centre), chanced on a clip on YouTube Kids that featured renderings of the popular Paw Patrol rescue dogs caught in a nightmarish situation. With them is her younger son, Asher, 10 months. PHOTO: NYTIMES

NEW YORK • It was a typical night in Ms Staci Burns' house outside Fort Wayne, Indiana. She was cooking dinner while her three-year-old son, Isaac, watched videos on the YouTube Kids app on an iPad.

Suddenly he cried out: "Mummy, the monster scares me."

When Ms Burns walked over, Isaac was watching a video featuring crude renderings of the characters from Paw Patrol, a Nickelodeon show popular among pre-schoolers, screaming inside a car. The vehicle hurtled into a light pole and burst into flames.

The 10-minute clip, Paw Patrol Babies Pretend To Die Suicide By Annabelle Hypnotized, was a nightmarish imitation of an animated series in which a boy and a pack of rescue dogs protect their community from troubles such as runaway kittens and rock slides.

Parents and children have flocked to Google-owned YouTube Kids since it was introduced in early 2015. The app's more than 11 million weekly viewers are drawn in by its seemingly infinite supply of clips, including those from popular Disney and Nickelodeon shows, and by the knowledge that the app is supposed to contain only child-friendly content.

But the app contains dark corners too, as videos that are disturbing for children slip past its filters, either by mistake or because people have found ways to fool the YouTube Kids algorithms.

In recent months, parents such as Ms Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes.

Many have taken to Facebook to warn others and share video screenshots showing moments ranging from a Claymation Spider-Man urinating on Elsa of Frozen to Nick Jr characters in a strip club.

"My poor little innocent boy, he's the sweetest thing," said Ms Burns, a nurse who credits the app with helping Isaac to learn colours and letters before other boys his age. "And then there are these horrible evil people out there that just get their kicks off making stuff like this to torment children."

Mr Malik Ducard, YouTube's global head of family and learning content, said the inappropriate videos were "the extreme needle in the haystack", but that "making the app family-friendly is of the utmost importance to us".

While the offending videos are a tiny fraction of YouTube Kids' universe, they are another example of the potential for abuse on digital media platforms that rely on computer algorithms, rather than humans, to police the content that appears in front of very young people.

And they show how rules that govern at least some of the content on children's television fail to extend to the digital world.

When videos are uploaded to YouTube, algorithms determine whether they are appropriate for YouTube Kids. The videos are continually monitored after that, said Mr Ducard, in a process that is "multi-layered and uses a lot of machine learning". Several parents said they expected the app to be safer because it asked during set-up whether their child was in pre-school or older.

Mr Ducard said that while YouTube Kids might highlight some content, such as Halloween videos in October, "it isn't a curated experience". Instead, "parents are in the driver's seat", he said, pointing to the ability to block channels, set usage timers and disable search results.

Parents are also encouraged to report inappropriate videos, which someone at YouTube then manually reviews, he said.

He noted that in the past 30 days, "less than .005 per cent" of the millions of videos viewed in the app were removed for being inappropriate. "We strive," he added, "to make that fraction even lower."

Most videos flagged by parents were uploaded to YouTube in recent months by anonymous users with names such as Kids Channel TV and Super Moon TV.

The videos' titles and descriptions feature popular character names and terms including "education" and "learn colours". They are independently animated, presumably to avoid copyright violations and detection.

Some clips uploaded as recently as August have millions of views on the main YouTube site and run automatically placed advertisements, suggesting they are financially lucrative for the makers as well as YouTube, which shares in advertising revenue. It is not clear how many of those views came through YouTube Kids.

Using automation for online advertising has turned Google into a behemoth worth more than half a trillion dollars.

The company has faced a new wave of criticism in the past year for lacking human oversight after its systems inadvertently funded fake news sites and hateful YouTube videos.

Some parents have taken to deleting YouTube Kids. Others, like Ms Burns, still allow its use, just on a more limited, supervised basis.

"This is a children's application - it's targeted at children," said Ms Crissi Gilreath, a mother of two in Oklahoma, "and I just can't believe that with such a big company, they don't have people whose job it is to filter and flag."


A version of this article appeared in the print edition of The Straits Times on November 06, 2017, with the headline 'Disturbing content found on YouTube Kids'.