The journal that published the Facebook mood-manipulation study regrets the way the study was conducted. Facebook issued an apology for accessing the content of 700,000 people’s pages, but the company’s second-in-command said she has no regrets.
The Proceedings of the National Academy of Sciences journal concluded that manipulating the content appearing on the Facebook pages of about 700,000 people without their prior consent may have violated some principles of academic research.
However, as a non-scientific, profit-driven company, Facebook wasn’t obliged to comply with scientific ethics, the journal’s editor-in-chief, Inder Verma, wrote a day after the initial article appeared.
“It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out,” he stated.
The “editorial expression of concern” appeared in the journal on Thursday.
It came after Facebook’s Chief Operating Officer Sheryl Sandberg apologized — although she wasn’t sorry about the experiment itself, she said, merely about the way it was carried out.
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Sandberg told the Wall Street Journal.
The experiment carried out by Facebook consisted of manipulating the content that appeared in the news feeds of a small fraction of the social network’s almost 1.3 billion users, AP reported. The study was conducted in January 2012 and aimed to show that people’s moods could spread like an “emotional contagion” based on what they were reading.
The results were published a month ago, but the global outrage began only a couple of days ago, after blogs and essays in the New York Times and the Atlantic raised questions about whether the study was ethical.
Facebook data scientist Adam Kramer tried to explain why the study was undertaken in the first place.
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”
Among those involved in developing the study was Cornell University researcher Jeffrey T. Hancock, who has been connected to a Department of Defense-funded program on using the military to curb civil unrest. This news triggered online outrage.
UK authorities are currently probing whether the experiment broke UK data protection laws. If a violation is proven, the world’s most popular social network could face a 500,000-pound (US$857,000) fine.