It intentionally manipulated users’ emotions without their knowledge.
By Katy Waldman
Facebook has been experimenting on us. A new paper in the Proceedings of the National Academy of Sciences reveals that Facebook intentionally manipulated the news feeds of almost 700,000 users in order to study “emotional contagion through social networks.”
The researchers, who are affiliated with Facebook, Cornell, and the University of California–San Francisco, tested whether reducing the number of positive messages people saw made those people less likely to post positive content themselves. The same went for negative messages: Would scrubbing posts with sad or angry words from someone’s Facebook feed make that person write fewer gloomy updates?
They tweaked the algorithm by which Facebook sweeps posts into members’ news feeds, using a program to analyze whether any given textual snippet contained positive or negative words. Some people were fed primarily neutral to happy information from their friends; others, primarily neutral to sad. Then everyone’s subsequent posts were evaluated for affective meanings.
The upshot? Yes, verily, social networks can propagate positive and negative feelings!
The other upshot: Facebook intentionally made thousands upon thousands of people sad.
Facebook’s methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. “If you are exposing people to something that causes changes in psychological status, that’s experimentation,” says James Grimmelmann, a professor of technology and the law at the University of Maryland. “This is the kind of thing that would require informed consent.”