In the wake of harsh criticism Facebook faced after a study revealed last June how the social network titan had manipulated posts in its feeds, the company says future research on its 1.3 billion users will be “subjected to greater internal scrutiny from top managers, especially if they focused on ‘deeply personal topics’ or specific groups of people,” reports The New York Times.
Facebook was oblique about how it will make amends. The company neither revealed what guidelines it will use to assess whether future research is appropriate, nor said whether it will seek users’ permission for studies like the emotion-manipulation experiment, which ignited a blaze of controversy when news of it broke this past summer.
“In the study, Facebook changed the number of positive and negative posts that a half-million users saw in their news feeds to assess the impact on the emotional tone of their future posts,” writes The New York Times.
In a blog post on Thursday, Facebook’s CTO Mike Schroepfer said that the company had “taken to heart the comments and criticism” it received after the study was unveiled.
“It is clear now that there are things we should have done differently,” he said. “For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.”
Schroepfer, who turned down The New York Times’ request for an interview, said that Facebook researchers would be given clearer guidelines, but he did not explain what those guidelines are.
He did say that “the company’s engineers will get ethics training as part of their six-week boot camp when they join the company. And more intrusive research will now be reviewed by a panel of high-ranking Facebook officials, including people involved in the legal, policy and privacy arenas.”
Source: The New York Times, “Facebook Promises a Deeper Review of Its User Research”
Image: Thinkstock