Facebook Emotional Contagion Experiment


Part of a series on Facebook / Meta.

Updated Jul 02, 2014 at 12:06PM EDT by Brad.

Added Jul 01, 2014 at 03:37PM EDT by Don.



Overview

The Facebook Emotional Contagion Experiment was a psychological study conducted by researchers from Facebook and Cornell University to test whether an emotional bias in the News Feed content shown to a Facebook user could affect that user's own emotional state. Upon its publication in June 2014, the paper was criticized for manipulating Facebook users' emotions without their consent and for not having been approved in advance by Cornell's ethics review board.

Background

On June 17th, 2014, the Proceedings of the National Academy of Sciences[1] (PNAS) published a paper titled "Experimental evidence of massive-scale emotional contagion through social networks," authored by Adam D. I. Kramer of Facebook's Core Data Science Team, Jamie E. Guillory of the Center for Tobacco Control Research and Education at the University of California, San Francisco, and Jeffrey T. Hancock of the Departments of Communication and Information Science at Cornell University. As part of the experiment, Facebook data scientists altered the News Feed algorithm for randomly selected users, reducing the number of posts containing positive or negative terms to see whether people would respond with more negative or positive status updates of their own. The study found that increased exposure to one type of emotional content led to increased expression of that same type.


[Figure: Mean number of positive (upper) and negative (lower) emotion words (percent) generated by people, by condition (Control vs. Experimental; Negativity Reduced and Positivity Reduced). Kramer A. D. I. et al., PNAS 2014, 111:8788-8790. © 2014 National Academy of Sciences.]

"when positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

Notable Developments

News Media Coverage

On June 26th, the science and technology magazine New Scientist[4] published an article reporting on the findings of the experiment. In the following days, other news sites published articles about the experiment and the growing public backlash, including International Business Times,[9] The Telegraph,[10] The Guardian,[11] RT,[12] The Washington Post,[13] Business Insider,[14] Salon,[15] AV Club,[16] Forbes[17] and the BBC.[18]

Online Reaction

On June 27th, OpenNews director of content Erin Kissane posted a tweet[7] urging her followers to delete their Facebook accounts and calling on Facebook employees to quit their jobs (shown below).


Erin Kissane (@kissane): "Get off Facebook. Get your family off Facebook. If you work there, quit. They're f------ awful." (541 retweets, 543 favorites; 11:33 PM, 27 Jun 2014)

On June 28th, Redditor hazysummersky submitted a link to an AV Club[16] article about the experiment to the /r/technology[3] subreddit, where it gained over 3,500 points and 1,300 comments within the first 72 hours. That same day, Department of Better Technology CEO Clay Johnson tweeted[8] that he found the Facebook experiment "terrifying" (shown below).


Clay Johnson (@cjoh): "In the wake of both the Snowden stuff and the Cuba twitter stuff, the Facebook 'transmission of anger' experiment is terrifying." (152 retweets, 62 favorites; 9:43 AM, 28 Jun 2014)

Ethics Committee Approval Controversy

On June 28th, 2014, The Atlantic quoted PNAS editor Susan Fiske, who said she had been concerned about the study but was told that the authors' "local institutional review board had approved it." On June 29th, Forbes[6] published an article about the ethics review controversy, pointing to the portion of Facebook's data use policy that states the company may use user data for "internal operations, including troubleshooting, data analysis, testing, research and service improvement" (shown below).


[Screenshot of Facebook's data use policy] "…information we receive about you: as part of our efforts to keep Facebook products, services and integrations safe and secure; to protect Facebook's or others' rights or property; to provide you with location features and services, like telling you and your friends when something is going on nearby; to measure or understand the effectiveness of ads you and others see, including to deliver relevant ads to you; to make suggestions to you and other users on Facebook, such as suggesting that another user add you as a friend because the user imported the same email address as you did, or suggesting that your friend tag you in a picture they have uploaded with you in it; and for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

On June 30th, Cornell University[5] published a statement explaining that, because the research was conducted independently by Facebook, the university's Institutional Review Board had determined that no review was required.

"Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required."

Adam Kramer's Apology

On June 29th, Kramer posted a Facebook[2] status update explaining the methodology behind the experiment and apologizing for any anxiety caused by the paper. Within the first 48 hours, the post gathered more than 750 likes and 140 comments.


Adam D. I. Kramer in Floyd, VA, June 29 at 4:05pm:

"OK so. A lot of people have asked me about my and Jamie and Jeff's recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody's posts were "hidden," they just didn't show up on some loads of Feed. Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it. And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it: the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we've always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we've learned from the reaction to this paper."
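A quick back-of-the-envelope check of the figures Kramer cites; the weekly word count below is an assumed value used only for illustration.

```python
# Back-of-the-envelope check of the figures cited in the apology post.
sample_fraction = 1 / 2500                          # "about 0.04% of users, or 1 in 2500"
print(f"Sample fraction: {sample_fraction:.2%}")    # -> 0.04%

# Stated effect size: one fewer emotional word per thousand words over the following week.
words_written_per_week = 5000                       # assumed volume, for illustration only
fewer_emotional_words = words_written_per_week * (1 / 1000)
print(f"Fewer emotional words per week: {fewer_emotional_words:.1f}")  # -> 5.0
```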

Search Interest

Not available.

External References
