WASHINGTON (AFP) ― Facebook secretly manipulated the feelings of nearly 700,000 users to study “emotional contagion,” prompting anger and putting the social networking giant on the defensive.
For one week in 2012, Facebook tampered with the algorithm used to place posts into users’ news feeds to study how this affected their mood, all without their explicit consent or knowledge.
The researchers wanted to see if the number of positive or negative words in messages the users read determined whether they then posted positive or negative content in their status updates.
The study, conducted by researchers affiliated with Facebook, Cornell University and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.
News of the study spread ― along with anger and disbelief ― when the online magazine Slate and The Atlantic website wrote about it Saturday.
“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study authors wrote.
“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
While other research has used metadata to study trends, this experiment appears to be unique because it manipulated what users saw in order to measure their reaction.
The social network, which counts more than 1 billion active users, said in a statement that “none of the data used was associated with a specific person’s Facebook account.”
“We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible,” it added.
“A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends or information from pages they follow. We carefully consider what research we do and have a strong internal review process.”
In the paper, the researchers said the study “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook.”
But that did nothing to stem the growing anger of Facebook users.
“#Facebook MANIPULATED USER FEEDS FOR MASSIVE PSYCH EXPERIMENT ... Yeah, time to close FB acct!” one Twitter user said.
Other tweets used words like “super disturbing,” “creepy” and “evil,” as well as angry expletives, to describe the psychological experiment.
Susan Fiske, a Princeton University professor who edited the report for publication, said the researchers assured her the study had been approved ahead of time by an ethics review board.
“They approved the study as exempt, because it is essentially a pre-existing dataset, part of FB’s ongoing research into filtering users’ news feeds for what they will find most interesting,” she told AFP in an email.
“Many ethical issues are open to debate, and this one seems to have struck a nerve.”
Katherine Sledge Moore, a psychology professor at Elmhurst College, said the study was fairly standard overall, especially for so-called “deception studies,” in which participants are given one purpose for the research when they provide initial consent and told later what the study is really about.
In this case, however, the study’s subjects did not even know they were in it.
“Based on what Facebook does with their newsfeed all of the time and based on what we’ve agreed to by joining Facebook, this study really isn’t that out of the ordinary,” Moore said.
“The results are not even that alarming or exciting.”