
Facebook said Thursday that it has changed how it conducts experiments on users, giving its researchers more guidance and adding internal reviews. But the company declined to discuss other details of the changes, which some outsiders called inadequate. The move follows the disclosure in June of an earlier experiment in which Facebook researchers altered the news feeds of nearly 700,000 users, omitting either positive or negative posts to study how emotions spread on the social network. The published study found a tiny but possibly significant effect of the manipulation on users' happiness, and the disclosure unleashed widespread criticism that the researchers had not notified users.
"We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Mike Schroepfer, Facebook's Chief Technology Officer, said in a blog post on Thursday. "It is clear now that there are things we should have done differently."
Under the new policy, Facebook researchers will have to seek approval for sensitive projects from a new committee that will include members from the company's engineering, research, legal, privacy and policy teams. Notably absent are any reviewers from outside Facebook.