Recent news stories and blog posts have highlighted the underbelly of YouTube Kids, Google’s child-friendly version of the wide world of YouTube. While all content on YouTube Kids is meant to be suitable for children under the age of 13, some inappropriate videos using animations, cartoons, and child-focused keywords manage to slip past YouTube’s algorithms and in front of kids’ eyes. Now YouTube is implementing a new policy in an attempt to make the whole of YouTube safer: it will age-restrict inappropriate videos masquerading as children’s content in the main YouTube app.
The reasoning behind this decision has to do with the relationship between the main YouTube app and YouTube Kids (which has its own dedicated app). Before any video appears in the YouTube Kids app, it’s filtered by algorithms meant to separate appropriate children’s content from anything inappropriate or in violation of YouTube’s policies. YouTube also has a team of human moderators who review any videos flagged in the main YouTube app, whether by volunteer Contributors (users who flag inappropriate content) or by automated systems that detect recognizable children’s characters in questionable videos.
If a human moderator finds that a video isn’t suitable for the YouTube Kids app, it will be age-restricted in the main YouTube app; no age-restricted content is allowed in the YouTube Kids app at all. As for those using the main YouTube app, age-restricted content cannot be viewed by anyone not logged into a YouTube account, anyone under the age of 18, or anyone with Restricted Mode turned on. According to a report from The Verge, YouTube says this policy has been in the works for some time and is not a response to the recent online outcry.
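For readers who think in code, here is a minimal sketch of that viewing rule (in Python, with entirely hypothetical names; it reflects only the three conditions described above, not YouTube’s actual implementation):

```python
from dataclasses import dataclass

@dataclass
class Viewer:
    signed_in: bool        # logged into a YouTube account?
    age: int               # account holder's age
    restricted_mode: bool  # Restricted Mode enabled?

def can_view_age_restricted(viewer: Viewer) -> bool:
    # All three conditions must hold: signed in, 18 or older,
    # and Restricted Mode off. Failing any one blocks playback.
    return viewer.signed_in and viewer.age >= 18 and not viewer.restricted_mode

# A signed-out adult is still blocked, since sign-in is required.
print(can_view_age_restricted(Viewer(signed_in=False, age=30, restricted_mode=False)))  # False
print(can_view_age_restricted(Viewer(signed_in=True, age=30, restricted_mode=False)))   # True
```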