In January 2012, Facebook, along with researchers from Cornell University and the University of California, San Francisco, conducted an experiment on over half a million users’ News Feeds to determine the emotional effects of posts on users. Facebook also wanted to determine whether users who saw an increased number of positive posts would be more engaged and less likely to quit the site.
The experiment ran for one week in January 2012, during which the News Feeds of 689,003 users were filtered to show a higher proportion of posts with positive or negative emotional content. The filters applied only to News Feeds; users could still see all of their friends’ posts by visiting their pages.
From the study:
Because people’s friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging.
The study can be read in full here: Experimental evidence of massive-scale emotional contagion through social networks.
The experiment showed that emotions can be contagious, spreading among friends through Facebook posts:
- Users who saw an increased proportion of positive posts responded by posting more positive posts and posting more frequently.
- Users who saw an increased proportion of negative posts responded by posting more negative posts and posting less frequently.
According to the study, “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”
⇒ Also see, It’s Time to Spring Clean Your Facebook! for information about a similar report from Public Library of Science which showed the effect of positive and negative Facebook posts on emotions.
A storm of controversy has erupted over how this research was conducted. Participants in the study were never expressly informed that their News Feeds were being altered or that their emotions were being studied by Facebook. The users whose feeds were filtered were neither asked for consent nor notified that they were part of an experiment, and Facebook has never revealed whose feeds were altered for the study.
Further, Facebook has not revealed what impact the study has had on our News Feeds. Perhaps our News Feeds have been filtered to omit more negative posts ever since the study was conducted. Perhaps Facebook’s News Feed algorithm didn’t change post-study. We just don’t know.
We may not care. USA Today reports that while people are upset over Facebook altering News Feeds without the informed consent of its users, no one seems to be quitting the site in a huff over the controversy: No one mad enough to quit Facebook over research study.
How do you feel about Facebook altering News Feeds for a social experiment? Vote in today’s Wonder of Tech poll and let us know your thoughts:
⇒ Learn how you can take charge of your News Feed, Fix Your Flooded Facebook Feed (Without Unfriending Anyone).
What do you think about Facebook’s social experiment? Do you think Facebook should have informed affected users of its experiment with their News Feeds? What responsibility should social media sites have to their users? Do you think your News Feed has been filtered to show more positive posts? Share your thoughts in the Comments section below!