After Facebook published a research paper in which it had manipulated the News Feeds of nearly 700,000 users for a week in the name of science, one of the data scientists involved in the research has said, in a post on Facebook, that the “paper may not have justified all of this anxiety.”
It seems as if Facebook doesn’t understand the repercussions of manipulating its users’ emotions by selectively exposing them to posts with either negative or positive emotions. “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook,” Adam D I Kramer, a data scientist at Facebook who wrote and designed the experiment, explained in the post.
However, an experiment like this can have wide-ranging repercussions, not just for Facebook’s users but also for stakeholders like governments. We live in an age where governments are trying to exert as much control over the Internet as they can, and it is not just about snooping into our emails, tweets and Facebook posts. Governments globally are looking at ways they can use online media to propagate their own ends, and this propaganda can come in many forms, including censorship, Photoshopped images and much more. There is a reason why online companies including Facebook are trying to convince governments to let them release statistics related to government requests for user data.
Facebook’s experiment is not a clear-cut case of censorship: the posts were not deleted but simply omitted from the user’s News Feed, and they remained available on the original poster’s profile page. But when users are unaware their News Feed is being manipulated, they are unlikely to check their friends’ walls to see whether something was posted that never appeared in their News Feed.
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Kramer writes.
Facebook still doesn’t get it. This “anxiety” is not about how Kramer and his team described the research. It is about Facebook’s willingness to manipulate the News Feed with the intent of altering its users’ emotions.
Kramer’s explanation in its entirety follows.