Facebook Admits Doing “Emotional Contagion” Study Wrong

October 6, 2014 | Marina Galperina

Facebook released an “update on changes” they have been working on for the last three months. The internet empire faced a lot of criticism for their “emotional contagion” study when it was revealed that nearly 700,000 Facebook users had their feeds unethically and possibly illegally manipulated to contain more “positive” or “negative” posts without their knowledge.

“We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism,” Facebook’s newsroom wrote, apparently shocked that their users were mad when they found out that their feelings were secretly being toyed with and studied. “It is clear now that there are things we should have done differently.” (Translation: “We fucked up.”) They also admit that they “failed to communicate clearly why and how we did it.” (Translation: “Lied.”)

Facebook is very busy with stuff (“from systems infrastructure to user experience to artificial intelligence to social science”), but don’t worry, now they’re going to be more “responsible.” They even presented a padded-out but ultimately shallow outline of how they will accomplish this.

TL;DR: Facebook will now first consider not fucking with you before they fuck with you, and will make sure “future studies” (in which they may or may not be fucking with you) involve only the most expert staff mining your sweet, sweet, quantifiable personal “data,” like emotions.

Guidelines: we’ve given researchers clearer guidelines. If proposed work is focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin. The guidelines also require further review if the work involves a collaboration with someone in the academic community.

Review: we’ve created a panel including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy and policy teams, that will review projects falling within these guidelines. This is in addition to our existing privacy cross-functional review for products and research.

Training: we’ve incorporated education on our research practices into Facebook’s six-week training program, called bootcamp, that new engineers go through, as well as training for others doing research. We’ll also include a section on research in the annual privacy and security training that is required of everyone at Facebook.