YouTube recently said that it deleted almost 5 million videos uploaded to its platform in the fourth quarter of last year alone. The Google-owned video-sharing platform removed most of these videos before any viewer could see them, which is quite a feat. This push may be a direct result of the sharp criticism the company has faced from various governments over inappropriate and extremist content posted on the site.
YouTube was also entangled in controversy when advertisers such as Procter & Gamble Co and Under Armour Inc boycotted the platform after their ads appeared alongside videos they did not approve of.
YouTube recently commented that automating content-policy checks through software has been "paying off", and that this automation is what enabled such a large number of removals.
Another 1.6 million videos flagged by the automated systems were removed by a human review team, though not before some users had already seen them. Beyond these, a further 1.6 million videos were missed by the automated software entirely and were taken down only after users reported them.
Facebook has faced a similar problem, labelling or removing 1.9 million pieces of extremist content. And though YouTube has kept stricter regulation at bay for now with these recent steps, the site's ad revenue remains unaffected.
YouTube has taken serious steps to prevent extremist content on the website: it promptly removes videos that are extremist in nature and flags their uploaders. It also bans terrorist organisations that have been identified as such by governments, and it downranks videos with graphic content.
YouTube has had a difficult time monitoring videos that spread fake news. To that end, the company has promoted "authoritative sources", such as reputed media organisations, to the top of the feed. It also plans to display Wikipedia descriptions alongside videos to debunk false claims.
Videos that show a child in danger, or that use a cartoon character inappropriately, have also been a priority for removal. YouTube has even clamped down on comments that make inappropriate references to children.