YouTube has been at the center of plenty of controversy over the course of 2017. Back in January, advertisers began pulling out due to questionable content uploaded to the site, which led to stricter policy changes and lots of demonetized videos. Now, YouTube has found itself in the same position again, as disturbing videos disguised as 'family friendly' content have come to light. To combat this, the site will be hiring as many as 10,000 people to start manually reviewing content.
Up until now, YouTube has largely relied on its algorithm to monitor and flag content automatically. However, in this case, many channels were taking advantage of the YouTube Kids app and bypassing the systems the site has in place. As a result, YouTube CEO Susan Wojcicki has said that the company will keep a closer eye on videos going forward, which is why thousands of people are being brought on board to manually review videos.
The idea is that by manually flagging content as appropriate or inappropriate in large numbers, the YouTube algorithm will be trained to react faster. Apparently, YouTube’s machine learning algorithm is already capable of taking down 70 percent of ‘extremist’ content within eight hours of upload.
By appointing a large number of people to flag inappropriate content aimed at children, the algorithm should catch up and learn how to counter those videos too. Wojcicki's post ends with a promise of greater transparency from YouTube in 2018. This will include reports containing data on the flags it receives, alongside explanations for removed videos or comments that violate YouTube policies.
KitGuru Says: The YouTube algorithm has been frustrating for many content creators on the platform. It takes time for the algorithm to learn what should be demonetized/removed and what shouldn't, which has had a knock-on effect on creators financially. Perhaps with some helping hands manually reviewing content to help train the algorithm, these issues will pop up less often.