For one week in early 2012, Facebook ran a somewhat creepy social psychology experiment on about 690,000 users of the website. In conjunction with researchers at Cornell University and the University of California, San Francisco, the social media company attempted to influence the emotional state of users by manipulating which posts showed up in a person’s news feed. Specifically, it reduced the amount of “emotional content” in the news feed, in some cases reducing only negative content and in other cases reducing only positive content. As reported in the study, “These results indicate that emotions expressed by others on Facebook influence our own emotions.” At the risk of sounding unprofessional: well, duh.
It is in our nature to be affected by the emotions of those around us, whether we are cognizant of it or not. Words, attitudes, appearance, and body language offer clues as to whether someone is happy or sad, and the human psyche more often than not leads us to empathize with those around us. Facebook, however, demonstrated “emotional contagion” strictly through text content. Those who saw fewer positive posts themselves wrote fewer positive posts, and those who saw fewer negative posts in turn produced fewer negative posts. Put loosely, users who saw fewer negative posts were slightly happier.
Honestly, I'm not at all surprised, and at first glance I am not terribly disturbed by it. It's a matter of remembering who the customer is. I'm not paying Facebook for a service, so I am not the customer ... I am the product being consumed. Since I understand that, I can keep it in mind when deciding what to share and what not to share. In truth, I think Facebook did the world a service by revealing, in a controversial way, what all media do. As the study report states, “Because people’s friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging.” So yes, Facebook manipulates news feeds because the business is best served when users (the product) are actively engaged. It's not all that different from traditional media; no one would deny that all media pick and choose what to report.
I do not find it in the least bit surprising that Facebook would try such an experiment. What is disturbing, though, is thinking about how this idea could be used in some really unnerving ways. Facebook has somewhere around a billion users from all around the globe. It's not too much of a stretch to think that certain three-letter-acronym agencies could compel the site to use this capability with the express goal of inducing dissatisfaction with a particular government or (if you are a conspiracy theorist) with a candidate running against an entrenched incumbent.
Before running wild with such ideas, though, it is worth looking at the actual results of the experiment. Yes, Facebook was able to demonstrate a change in the content of posts by the manipulated users, but the effect was on the order of a tenth of a percent. Statistically measurable, but hardly overwhelming.
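That combination of "measurable but tiny" is exactly what you'd expect with a sample of roughly 690,000 people: at that scale, even a sub-0.1% shift clears statistical significance. Here is a minimal sketch of that idea, with made-up numbers (the per-arm counts, baseline positive-word rate, and spread are illustrative assumptions, not figures from the paper):

```python
# A minimal sketch (not the study's data or code) of why an effect "on the
# order of a tenth of a percent" is still statistically significant with
# hundreds of thousands of users. All numbers here are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n = 345_000                 # assumed users per arm (control vs. manipulated feed)
baseline_rate = 0.047       # assumed fraction of words that are "positive"
tiny_shift = 0.0005         # assumed ~0.05 percentage-point bump in the treated arm

control = rng.normal(baseline_rate, 0.02, n)
treated = rng.normal(baseline_rate + tiny_shift, 0.02, n)

t_stat, p_value = stats.ttest_ind(treated, control)
print(f"mean difference: {treated.mean() - control.mean():.5f}")
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
# With n this large the p-value is vanishingly small even though the
# difference itself is trivial -- "statistically measurable but hardly
# overwhelming."
```

The point of the sketch is only that sample size, not effect size, is doing the heavy lifting in a result like this.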
TL;DR? Facebook performed a research experiment that borders on creepy, proved that it could (minutely) manipulate users' emotional states, and reminded us once again that if you are not paying, you are not the customer, and you should keep that in mind.