Facebook Emotional Contagion Experiment

Part of a series on Facebook / Meta.

Overview

The Facebook Emotional Contagion Experiment was a psychological study conducted by researchers from Facebook and Cornell University to test whether an emotional bias in the news feed content shown to a Facebook user can affect that user's own emotional state. Upon its publication in June 2014, the paper was criticized for manipulating Facebook users' emotions without their consent and for failing to gain prior approval from Cornell's ethics committee.

Background

On June 17th, 2014, the Proceedings of the National Academy of Sciences[1] (PNAS) published a paper titled "Experimental evidence of massive-scale emotional contagion through social networks," authored by Adam D. I. Kramer of Facebook's Core Data Science Team, Jamie E. Guillory of the Center for Tobacco Control Research and Education at the University of California and Jeffrey T. Hancock of the Departments of Communication and Information Science at Cornell University. As part of the experiment, Facebook data scientists altered randomly selected users' news feeds by skewing the number of positive or negative terms shown to them, to see whether those users would respond with more negative or positive status updates of their own. The study found that increased exposure to terms of one emotional type resulted in increased expression of that same type.


Figure: Mean percentage of positive (upper) and negative (lower) emotion words generated by people, by condition (Control vs. Experimental; Negativity Reduced and Positivity Reduced). Kramer, A. D. I., et al. PNAS 2014;111:8788-8790. ©2014 National Academy of Sciences.

"when positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

Notable Developments

News Media Coverage

On June 26th, the science and technology magazine New Scientist[4] published an article reporting on the findings of the experiment. Over the following days, other news sites published articles about the experiment and the growing public backlash, including International Business Times,[9] The Telegraph,[10] The Guardian,[11] RT,[12] The Washington Post,[13] Business Insider,[14] Salon,[15] AV Club,[16] Forbes[17] and the BBC.[18]

Online Reaction

On June 27th, OpenNews director of content Erin Kissane posted a tweet[7] urging her followers to delete their Facebook accounts and calling on Facebook employees to quit their jobs (shown below).


"source":https://twitter.com/kissane/statuses/482728344656809984

On June 28th, Redditor hazysummersky submitted a link to an AV Club[16] article about the experiment to the /r/technology[3] subreddit, where it gained over 3,500 points and 1,300 comments in the first 72 hours. That same day, Clay Johnson, CEO of the Department of Better Technology, tweeted[8] that he found the Facebook experiment "terrifying" (shown below).


"source":https://twitter.com/cjoh/statuses/482882070101106688

Ethics Committee Approval Controversy

On June 28th, 2014, The Atlantic quoted PNAS editor Susan Fiske, who said she had been concerned about the study but was told that the authors' "local institutional review board had approved it." On June 29th, Forbes[6] published an article about the ethics review controversy, which pointed to the section of Facebook's Data Use Policy stating that the company may use user data for "internal operations, including troubleshooting, data analysis, testing, research and service improvement" (shown below).


Excerpt from Facebook's Data Use Policy: "[We use] the information we receive about you: as part of our efforts to keep Facebook products, services and integrations safe and secure; to protect Facebook's or others' rights or property; to provide you with location features and services, like telling you and your friends when something is going on nearby; to measure or understand the effectiveness of ads you and others see, including to deliver relevant ads to you; to make suggestions to you and other users on Facebook, such as: suggesting that your friend add you as a friend because the user imported the same email address as you did, or suggesting that your friend tag you in a picture they have uploaded with you in it; and for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

On June 30th, Cornell University[5] published a statement indicating that, because the research was conducted independently by Facebook, the university's review board had determined that no review was required.

"Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required."

Adam Kramer's Apology

On June 29th, Kramer posted a Facebook[2] status update explaining the methodology behind the experiment and apologizing for any anxiety caused by the paper. In the first 48 hours, the post gathered more than 750 likes and 140 comments.


"source":https://www.facebook.com/akramer/posts/10152987150867796

Search Interest

Not available.

External References
