Facebook said Thursday that it has changed how it conducts experiments on users, giving its researchers more guidance and adding internal reviews. But the company declined to discuss other details of the changes, which some outsiders called inadequate. The changes follow the disclosure in June of an earlier experiment in which Facebook researchers altered the news feeds of nearly 700,000 users, omitting either positive or negative posts to study how emotions spread on the social network. The disclosure unleashed widespread criticism that the researchers had not notified users.
After raising a storm of controversy for manipulating its users' emotions in the name of scientific research, Facebook announced new steps to consider future studies more carefully.

Facebook was widely criticized in June after researchers published the results of a study that measured the impact of showing almost 700,000 users more positive or negative stories in their news feeds. The study found that the manipulations had a tiny but statistically significant effect on the emotional content of users' own posts.

"We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Mike Schroepfer, Facebook's Chief Technology Officer, said in a blog post on Thursday. "It is clear now that there are things we should have done differently."

Under the new policy, Facebook researchers will have to seek approval for sensitive projects from a new committee that will include members from the company's engineering, research, legal, privacy and policy teams. Notably absent are any reviewers from outside Facebook.

The company also promised to train employees better on its research policies. And Facebook will publish all research derived from experiments on its users in a new section of its website.

But the new policy did not add a requirement to get users' consent before altering their feeds, one of the most criticized aspects of the happiness study. Facebook's lengthy terms of service, which all users must accept before joining, include consent for "research," though that term was added in 2012, after the controversial emotions study was conducted.

"The policy is a good start, but it falls well short of what Facebook is legally required to do," says James Grimmelmann, a law professor at the University of Maryland. "It still does not treat users as people who are entitled to make their own decisions about whether to take part in research projects."

The study that set off the controversy, "Experimental evidence of massive-scale emotional contagion through social networks," was conducted by a member of Facebook's core data science team and two researchers from Cornell University. For one week in 2012, they altered how many positive or negative items appeared in the news feeds of 689,003 Facebook users, then measured how many positive and negative words appeared in those users' own posts. Users who saw fewer positive posts used 0.1% fewer positive words in their own posts, and users who saw fewer negative posts used 0.07% fewer negative words, the study found.
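
To make those percentages concrete, here is a minimal sketch of the kind of measurement the paper describes: counting a user's words against fixed lists of positive and negative terms and reporting each category's share. The word lists, function name, and sample posts below are illustrative assumptions, not the study's actual instrumentation.

# Minimal sketch of the word-rate measurement the study describes.
# The short word lists are illustrative assumptions; a real analysis
# would use a validated lexicon with thousands of entries.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "good"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "bad"}

def emotion_word_rates(posts):
    """Return (positive %, negative %) of all words across a user's posts."""
    total = positive = negative = 0
    for post in posts:
        for word in post.lower().split():
            word = word.strip(".,!?;:\"'")  # drop surrounding punctuation
            total += 1
            if word in POSITIVE_WORDS:
                positive += 1
            elif word in NEGATIVE_WORDS:
                negative += 1
    if total == 0:
        return 0.0, 0.0
    return 100 * positive / total, 100 * negative / total

# Hypothetical posts from one user during the experiment week.
posts = ["Had a great day, feeling happy!", "Traffic was awful this morning."]
pos_pct, neg_pct = emotion_word_rates(posts)
print(f"positive: {pos_pct:.2f}%  negative: {neg_pct:.2f}%")

A 0.1% shift in a rate like this, aggregated over hundreds of thousands of users, is the scale of effect the study reported.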

Critics said Facebook should have obtained explicit consent from users involved in the study, and they noted the potential harm of such manipulations, given that roughly one in ten Americans suffers from a mood disorder such as depression.

Facebook is hardly the only Internet company that runs experiments on its users, though most such efforts are directed at improving business results, not academic research. Twitter (TWTR) announced this week that it was funding a new research center at the Massachusetts Institute of Technology and would give the university access to its entire archive of tweets. The deal did not involve any manipulation of users' Twitter streams.