Facebook took action on 1.9 million pieces of content related to the Islamic State (IS) and Al Qaeda in the first quarter of 2018, twice as many as in the last quarter of 2017.
The key part is that Facebook found the vast majority of this content on its own.
“In Q1 2018, 99 per cent of the IS and Al Qaeda content we took action on was not user reported,” Monika Bickert, Vice President of Global Policy Management at Facebook, said in a blog post late on Monday.
“Taking action” means that Facebook removed the vast majority of this content and added a warning to a small portion that was shared for informational or counter-speech purposes.
“This number likely understates the total volume, because when we remove a profile, Page or Group for violating our policies, all of the corresponding content becomes inaccessible.
But we don’t go back through to classify and label every individual piece of content that supported terrorism,” explained Brian Fishman, Global Head of Counterterrorism Policy at Facebook.
Facebook now has a counter-terrorism team of 200 people, up from 150 in June 2017.
“We have built specialised techniques to surface and remove older content. Of the terrorism-related content we removed in Q1 2018, more than 600,000 pieces were identified through these mechanisms,” the blog post said.
“We’re under no illusion that the job is done or that the progress we have made is enough,” said Facebook.
“Terrorist groups are always trying to circumvent our systems, so we must constantly improve,” the company added.