Facebook has been battling the fake news problem for a while now. However, a recent turn of events linked to the 2016 US Presidential elections reveals that fake ads are a major concern too. After an investigation confirmed that Facebook had inadvertently sold ad space to several Russia-controlled Pages, which were spreading divisive messages to influence the election results, the company is looking to hire 1,000 more people to manually read and review ads, according to reports.
Facebook would also invest more in machine learning to “better understand when to flag and take down ads,” and would tweak its advertising policies so that content with even “subtle expressions of violence” is removed from its platform. The Mark Zuckerberg-led firm would also conduct thorough background checks on advertisers buying political ad space on Facebook. This comes after reports revealed that Facebook had allowed political advertisers to target ads with keywords like “Jew hater”.
Divisive and harmful ads sold to about 470 “inauthentic” Russia-controlled Pages earned Facebook roughly $100,000, but raised significant doubts over its intent and ability to curb divisive messaging that is indirect or veiled. “The vast majority of ads run by these accounts didn’t specifically reference the US presidential election, voting or a particular candidate. Rather, the ads and accounts appeared to focus on amplifying divisive social and political messages across the ideological spectrum touching on topics from LGBT matters to race issues to immigration to gun rights,” Facebook wrote on its blog.
After all, listening tools can only do so much, so Facebook’s decision to add more human reviewers to the checking, scanning, and reviewing of ad content makes sense. “We constantly update our systems and monitor for malicious activity and we have been forthcoming in what we’ve found,” a Facebook statement said. These new hires would be made “in the next few years,” according to reports. Prior to this, Facebook announced in May that it would also beef up its non-advertising “organic content” monitoring division: over 3,000 people are expected to be hired for the purpose, whose sole job would be to weed out videos of murder, suicide, and violence from Facebook.