Facebook faces backlash after man posts murder, confession videos

It took Facebook over two hours to pull down videos by a man in Cleveland about his ‘mission to commit murder’.

Facebook has become the internet’s go-to place for all things information (and misinformation), but it can be difficult to control the kind of content that flows through the platform. Amid the ongoing controversy over not doing enough to stop the spread of fake news on its network, Facebook is facing strong backlash over a murder video that was recorded and posted on Sunday.

On Easter Sunday, a 34-year-old man named Steve Stephens drove around downtown Cleveland on what he called a mission to commit murder. As soon as he shot dead 74-year-old Robert Godwin and posted the video of the crime on Facebook, he had millions of people as witnesses and audience. He followed the pre-murder and murder videos with a live video confessing to the crime. Facebook is accused of taking too long – over two hours – to pull the video from the website, even as Ryan A. Godwin, the victim’s grandson, pleaded on social media, asking users to stop sharing the video and report it instead.

While Facebook has checks and measures that allow users to report obscene, crime-related, offensive and other such posts, owing to its huge userbase it is impossible to immediately remove every post deemed offensive. In his blog post, Facebook’s VP of Global Operations, Justin Osofsky, noted that as a result of the event, the company is now reviewing its reporting process to make sure that people can report videos and other material that violates its standards as easily and quickly as possible. ALSO READ: Facebook fake news: If Wikipedia is democratic, so is Facebook

Talking about the Cleveland case, Osofsky said that Facebook did not receive a report about the first video, in which Stephens announced his intention to commit the crime. A report about the second video, in which the crime was committed, came in more than an hour and 45 minutes after it was posted, while the report about the live video in which he confessed to the crime was received only after the broadcast had ended. Although Facebook disabled Stephens’ account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind, the company acknowledged that “we need to do better.”

It is worth mentioning that this isn’t the first time Facebook’s name has been tainted by what people share on the website in violation of ethical codes and laws. Last year, a woman live-streamed the death of her boyfriend, who was shot by police during a traffic stop. Earlier this year, three men in Sweden were arrested on suspicion of raping a woman and streaming the assault live to a private Facebook group. And in February, two radio journalists in the Dominican Republic were fatally shot during a Facebook Live broadcast.

In 2016, CEO Mark Zuckerberg said that Facebook will be all video in the next five years. Working in this direction, the company has opened up channels for pushing more video-based content with its Live Video and Snapchat-like Stories features. In hindsight, however, the task of moderating content has become tougher than ever before. On one hand, these features are proving a boon on the advertising side, generating higher revenue than text- or photo-based ads; on the other hand, monitoring what users share still needs a lot of improvement.

As of now, the company’s content monitoring system uses a combination of artificial intelligence and human checkers to scrutinize content. However, the process is such that it needs a minimum number of users to report a piece of content as objectionable before it is reviewed by Facebook’s team of content moderators, which then decides whether it violates the website’s standards and should be pulled down. ALSO READ: Facebook’s new tool will make it harder to share fake news
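For readers curious what such a report-threshold pipeline might look like, here is a minimal, purely illustrative sketch in Python; the threshold value, class name and functions are hypothetical assumptions for the sake of the example, not Facebook’s actual implementation.

from collections import defaultdict

# Hypothetical sketch of a report-threshold moderation queue.
# Names and the threshold are illustrative assumptions only.
REPORT_THRESHOLD = 5  # assumed minimum user reports before human review

class ModerationQueue:
    def __init__(self, threshold=REPORT_THRESHOLD):
        self.threshold = threshold
        self.report_counts = defaultdict(int)  # post_id -> number of reports
        self.review_queue = []                 # posts waiting for a human moderator

    def report(self, post_id, reason):
        """Record a user report; escalate to human review once the threshold is hit."""
        self.report_counts[post_id] += 1
        if self.report_counts[post_id] == self.threshold:
            self.review_queue.append((post_id, reason))

    def review_next(self, violates_standards):
        """A human moderator decides whether the next queued post breaks the rules."""
        if not self.review_queue:
            return None
        post_id, reason = self.review_queue.pop(0)
        decision = "removed" if violates_standards(post_id) else "kept"
        return (decision, post_id)

The key property of such a design, and the flaw the article goes on to describe, is that nothing happens until enough reports arrive and a human makes a judgment call.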

The flaw in such a mix of human and machine intelligence is that there can be multiple takes on a particular subject, and a Facebook moderator could make decisions that are not really in the public interest. An example came last year, when an iconic Pulitzer Prize-winning photo from the Vietnam War was pulled down because it depicted a naked girl running away from a napalm strike. The conflict there was over nudity on the website. After sharp criticism, Facebook restored the photo, acknowledging that it was more than just nudity: it had historic relevance and served public awareness.

While Osofsky assured in his post that the company is working on improving its review processes, the latest crime video incident comes at a time when Facebook is preparing for the 2017 edition of its F8 developers’ conference, which has the world’s eyes on it. At the annual event, Facebook introduces new products and details its strategy for the near future. This year, the Zuckerberg-led company is expected to focus strongly on the issue of fake news and how it can tackle the spread of objectionable content of any form.

  • Published Date: April 18, 2017 10:08 AM IST