Facebook said today it will give users the right to appeal decisions if the social network decides to remove photos, videos or written posts deemed to violate community standards.
Plans to roll out an appeals process globally in the coming months came as Facebook provided a first-ever look at the internal standards used to decide which posts go too far in terms of hateful or threatening speech.
“This is part of an effort to be more clear about where we draw the line on content,” Facebook public policy manager in charge of content Siobhan Cummiskey told AFP.
“And for the first time we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake.”

The move to involve Facebook users more in standards for removing content comes as the social network fends off criticism on an array of fronts, including its handling of people’s data, the spread of “fake news,” and whether politics has tainted content removal decisions.
California-based Facebook already lets people appeal removal of profiles or pages. The appeal process to be built up during the year ahead will extend that right to individual posts, according to Cummiskey.
The new appeal process will first focus on posts removed on the basis of nudity, sex, hate speech or graphic violence.
Notifications sent regarding removed posts will include buttons that can be clicked to trigger appeals, each of which will be handled by a member of the Facebook team. While software is used to help find content violating standards at the social network, humans will handle appeals, and the goal is to have reviews done within a day.
“We believe giving people a voice in the process is another essential component of building a fair system,” vice president of global product management Monika Bickert said.
“For the first time, we are publishing the internal implementation guidelines that our content reviewers use to make decisions about what’s allowed on Facebook.” Some 7,500 content reviewers are part of a 15,000-person team at Facebook devoted to safety and security, according to Cummiskey, who said the team is expected to grow to 20,000 people by the end of this year.
“It’s quite a tricky and complex thing drawing lines around what people can and cannot share on Facebook, which is why we consult experts,” said Cummiskey, whose background includes work as a human rights attorney. (AFP)
This is published unedited from the PTI feed.