After several months of controversy surrounding its content, the Google-owned video platform announced on its blog new features that allow parents to create a whitelisted, non-algorithmic version of its Kids app. Beyond troubling comments, some YouTubers were posting disturbing and sometimes violent videos with keywords targeting kids.
Once the new setting is turned on, users can pick collections from trusted creators such as PBS and Kidz Bop, or themed collections curated by YouTube Kids itself.
YouTube now uses algorithms to decide which videos can appear on YouTube Kids.
YouTube hit a rough patch in late 2017, with advertisers pulling away from the service after news reports showed child predators using videos of young children as de facto chat rooms, and after outcry over YouTube creators using children's characters like Elsa from Frozen or Nickelodeon's Peppa Pig and splicing in non-kid-friendly language and themes. The post pointed out that the YouTube Kids app team has focused on features that give parents more control over the content available on the app.
Launched three years ago, YouTube Kids is this week adding video collections from "trusted partners" and the YouTube Kids team. Lastly, YouTube Kids is adding even more security to its search-off feature. But the practice of addressing problem videos only after children have already been exposed to them has troubled child advocates, who want the more controlled option to be the default.
Currently, children can use the app's search function to watch videos, but the videos available through search are not subject to human review; instead, they are selected by a special algorithm trained to return appropriate results, according to YouTube. It may be laborious, but with this new feature you'll be able to lock down what your kids are watching to only what you have pre-approved.
One of these changes will go into effect this week, while the other two will start sometime later.
The new settings give parents additional choices to further customize the YouTube Kids app.
Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, saw the move as broadly positive, but said that more parental controls do not absolve Google of its responsibility to keep inappropriate content out of the YouTube Kids app.