From being the world’s largest online social network to possessing mind-altering capabilities, Facebook is much more powerful than we probably think. According to a research paper published in the Proceedings of the National Academy of Sciences, Facebook altered the News Feeds of nearly 700,000 users without their knowledge to study the impact of continuous exposure to positive and negative posts.
The paper’s abstract describes the experiment: “In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.”
Facebook targeted 689,003 users and manipulated their News Feeds by filtering the emotional content they were exposed to. The lab rats (at least that’s how it seems Facebook treats unsuspecting users) were divided into two groups: one whose News Feed was manipulated to show fewer positive posts, and the other fewer negative posts. Their own posts were then monitored to see whether the skewed proportion of negative or positive posts in their News Feed affected their behavior. [Spoiler: It did.]
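The kind of filtering described above can be sketched roughly as follows. This is an illustrative Python sketch, not Facebook’s actual code: the real study classified posts using the LIWC word-count software and varied the omission rate per user, whereas the word lists, the `omit_prob` parameter, and the function names here are invented for illustration.

```python
import random

# Illustrative word lists; the actual study used LIWC's dictionaries,
# not hand-picked sets like these.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def contains_any(post, words):
    """Crude check for whether a post contains any word from a set."""
    return any(w in post.lower().split() for w in words)

def filter_feed(posts, condition, omit_prob=0.5, rng=random.random):
    """Return the posts a user would see under one experimental condition.

    condition: "reduce_positive" or "reduce_negative".
    omit_prob: chance that a matching emotional post is withheld
    (an assumed example value; the study varied this per user).
    """
    shown = []
    for post in posts:
        if condition == "reduce_positive" and contains_any(post, POSITIVE_WORDS):
            if rng() < omit_prob:
                continue  # withhold this positive post
        elif condition == "reduce_negative" and contains_any(post, NEGATIVE_WORDS):
            if rng() < omit_prob:
                continue  # withhold this negative post
        shown.append(post)
    return shown
```

Note that nothing is ever rewritten: a post either appears or it doesn’t, which is exactly why the manipulation was invisible to the users involved.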
Facebook didn’t alter the content of any post; it only changed which posts these users could see, using a selective algorithm. Even so, that algorithm could be a powerful tool for altering users’ emotions about much more than their personal lives. When Facebook launched the selective algorithm, it said the new News Feed would show users more of the content they would like to see. What Facebook didn’t reveal was that it also gave the company greater control over showing users the content it wants them to see. More alarming still is that Facebook is a willing participant in manipulating what users see, and it doesn’t find this unethical.
It isn’t hard to believe that Facebook could be used as a tool to alter the mood of larger communities (think countries) toward their governments. Manipulating the Facebook News Feed could become part of the handbook for organizing successful political coups, assuming Facebook colludes with governments, a possibility not so difficult to accept after the Snowden leaks about the PRISM program.
With advertising being Facebook’s core revenue stream, think about the possibilities. Imagine being targeted with depressing posts continuously on your News Feed and then with advertisements from pharma companies selling anti-depressants. You wouldn’t have any case against Facebook, since when you signed up for the service you agreed to terms that let Facebook experiment from time to time with what you see in your News Feed.
Unlike the physical world, where you can choose not to hang out with or talk to people you’d rather avoid, you have no such control on Facebook. Even if you delete or mute those people, Facebook can always surface posts in your News Feed that have the same effect. The only option left is to delete your profile, the equivalent of locking yourself in your room in the physical world.