Google has been attempting to make its services more child-friendly in recent months, starting with changes to the Play Store and now by creating a YouTube Kids app, in an attempt to ensure that children don’t come across inappropriate content.
However, it turns out that the YouTube app designed specifically for children has come under fire for surfacing “inappropriate” content. Two child advocacy groups have raised concerns with the Federal Trade Commission, as reported by The Wall Street Journal.
The complaint claims that videos found through the app contained explicit language, along with references to sex and drugs. Speaking to The Wall Street Journal, a lawyer working with the child advocacy groups said: “Google promised parents that YouTube Kids would deliver appropriate content for children, but it has failed to fulfill its promise”.
The BBC got in touch with YouTube and received the following statement: “We work to make the videos in YouTube Kids as family friendly as possible and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.”
The YouTube Kids app does come with some form of parental control. Google initially offered the app as a way for children to access the streaming site while only seeing curated, age-appropriate content.
KitGuru Says: Google has never been that good at manually reviewing YouTube cases, which is why the much-maligned Content ID system is in place. I’m not too surprised that a few bad videos managed to slip through to the YouTube Kids app. Obviously, Google is going to have to clamp down on this if it wants to keep the child safety groups at bay; we all know what they can be like at times…
Via: The BBC