Get ready for extremist videos with off-centre camera work, lighting overlays and shrunken content – Google and Facebook are now automatically taking down videos thought to be related to terrorism, so the uploaders may need to find a way to get around it.
Although neither Facebook nor Google has announced the move, Reuters is reporting sources ‘familiar with the process' who claim that this is exactly what the two tech giants are now doing. Considering the two companies account for the majority of online video viewing through Facebook and YouTube, they are likely to have the biggest impact, if their methods prove effective.
The system used to find and automatically remove extremist content is much the same as the algorithm used to take down copyright-protected content, or to let copyright holders apply adverts to it. The automated system searches for unique identifiers of known videos and, if it finds a match, the content is automatically deleted.
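The matching idea described above can be sketched in a few lines. This is a hypothetical illustration only: real systems like YouTube's Content ID use robust perceptual fingerprints that survive re-encoding and cropping, not the simple byte-level hash used here, and the function names below are invented for the example.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual fingerprint: here just a SHA-256 digest."""
    return hashlib.sha256(data).hexdigest()

# Blocklist built from fingerprints of previously removed videos.
known_extremist = {fingerprint(b"known-bad-video-bytes")}

def should_remove(upload: bytes) -> bool:
    """Auto-flag an upload whose fingerprint matches the blocklist."""
    return fingerprint(upload) in known_extremist

# An exact re-upload matches the blocklist...
print(should_remove(b"known-bad-video-bytes"))   # True
# ...but even a one-byte alteration produces a different hash and
# slips past this naive version, which is why off-centre framing and
# overlays can defeat simple matching.
print(should_remove(b"known-bad-video-bytesX"))  # False
```

This also shows why the article expects uploaders to tweak their videos: any transformation that changes the computed identifier defeats exact matching, which is why production systems invest in fuzzier perceptual hashing.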
At least this means terrorist videos will be as infuriating to watch as pirate streams now. Source: Wikimedia
Although this system may not work against brand-new or altered content, it should make it much harder for extremist groups like DAESH to keep posting the same videos time and again. In particular, it's hoped that videos of violent acts can be kept off most major video-sharing sites, preventing people from stumbling across them.
None of the sources would confirm what identifiers are used to find the videos, nor how often human double-checking is employed. That latter point may be important, though, as automated systems have in the past been quite heavy-handed with takedowns of suspected copyright-infringing videos.
Classification can be difficult in some instances too, as some content which may be identified as extremist is not technically illegal. Even though Facebook and Google are employing similar techniques to take the videos down, they may draw the line in different places on what actually constitutes extremist material.
KitGuru Says: As much as I understand why the content is being taken down, it still makes me a little uneasy. “Extremist” is a very subjective word. I'd much rather they use the automated tools to apply advertising for gay rights groups, or some other form of pro-acceptance organisation, to the videos instead.