A new Wall Street Journal story probes the frequency and casualness with which Facebook ran experiments with the explicit aim of manipulating users’ emotions. Some commentators pooh-poohed the concern, saying that companies try to influence customers all the time. But the difference here is that manipulation usually takes place in a selling context, where the vendor’s aim, to persuade you to buy its product, is clear. Here, the experiment exposed first, which skewed the mix of posts in nearly 700,000 Facebook users’ news feeds, was conducted in a context where participants had no reason to question the information they were being given.
Facebook’s conduct fell so far below acceptable standards for conducting research that it would probably have been deemed criminal had it been funded by Federal grants. As Jaron Lanier, an independent scientist at Microsoft, pointed out in a New York Times op-ed:
Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. Facebook’s generic click-through agreement, which almost no one reads and which doesn’t mention this kind of experimentation, was the only form of consent cited in the paper. The subjects in the study still, to this day, have not been informed that they were in the study. If there had been federal funding, such a complacent notion of informed consent would probably have been considered a crime. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care…
It is unimaginable that a pharmaceutical firm would be allowed to randomly, secretly sneak an experimental drug, no matter how mild, into the drinks of hundreds of thousands of people, just to see what happens, without ever telling those people. Imagine a pharmaceutical researcher saying, “I was only looking at a narrow research question, so I don’t know if my drug harmed anyone, and I haven’t bothered to find out.” Unfortunately, this seems to be an acceptable attitude when it comes to experimenting with people over social networks. It needs to change.
And in case you think this study fell within the ambit of market research, think again. From Forbes:
Defenders of the Facebook study including my colleague Jeff Bercovici say that everyone on the Internet is doing A/B testing — showing users two versions of something to see which resonates more based on how they click, share, and respond. But the Facebook study with its intention to manipulate the Facebook environment for unknowing users to see whether it made them feel elated or depressed seems different to me than the normal “will this make someone more likely to buy this thing” kind of testing. “They actually did a test to see whether it would have a deleterious effect on their users,” says Pam Dixon of the World Privacy Forum. “This isn’t A/B testing. They didn’t just want to change users’ behaviors, they wanted to change their moods.”
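For readers unfamiliar with the mechanics, garden-variety A/B testing really is that mundane: users are deterministically split into buckets, each bucket sees a different version of some element, and the experimenter compares clicks or sign-ups. A minimal sketch in Python follows; the experiment name, variants, and function here are hypothetical illustrations, not Facebook’s actual tooling.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into one arm of an experiment.

    Hashing (experiment name + user id) gives a stable, roughly uniform
    assignment without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Typical use: compare click-through on two button labels.
variant = assign_variant("user-12345", "signup-button-copy")
label = "Join now" if variant == "A" else "Sign up free"
```

The emotion study was structurally similar, random assignment and all, but the variable being manipulated was the emotional tenor of the feed itself, and the outcome measured was users’ moods, which is exactly the distinction Dixon is drawing.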
The Forbes article also points out that Facebook’s user agreement didn’t include “research” as a possible use of user information while the study was underway; Facebook didn’t add that language until May 2012, four months after the project was completed. Note that Facebook was also negotiating a consent decree with the Federal Trade Commission over “unfair and deceptive” practices concerning member privacy. But since that decree wasn’t inked until August 2012, Facebook appears to have believed it could continue to play fast and loose.
Tonight, the Wall Street Journal reports that Facebook has a team of two dozen researchers, the Data Sciences Group, whose job is to run experiments on Facebook users; it has conducted over 1,000 tests since 2007. For instance, as part of an anti-fraud study, Facebook locked some users out of their accounts and required them to prove they were legitimate. From the Journal:
“There’s no review process, per se,” said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013 who worked on the study that required users to prove they were real. “Anyone on that team could run a test,” Mr. Ledvina said. “They’re always trying to alter people’s behavior.”
He recalled a minor experiment in which he and a product manager ran a test without telling anyone else at the company. Tests were run so often, he said, that some data scientists worried that the same users, who were anonymous, might be used in more than one experiment, tainting the results….
One published study deconstructed how families communicate, another delved into the causes of loneliness. One test looked at how social behaviors spread through networks. In 2010, the group measured how “political mobilization messages” sent to 61 million people caused people in social networks to vote in the 2010 congressional elections.
The interest in the effectiveness of political messaging is troubling given Facebook’s connections to the Department of Defense. From SCG News:
In the official credits for the [emotions] study conducted by Facebook you’ll find Jeffrey T. Hancock from Cornell University. If you go to the Minerva Initiative website you’ll find that Jeffrey Hancock received funding from the Department of Defense for a study called “Cornell: Modeling Discourse and Social Dynamics in Authoritarian Regimes”. If you go to the project site for that study you’ll find a visualization program that models the spread of beliefs and disease.
Cornell University is currently being funded for another DoD study right now called “Cornell: Tracking Critical-Mass Outbreaks in Social Contagions” (you’ll find the description for this project on the Minerva Initiative’s funding page).
The Department of Defense’s investment in the mechanics of psychological contagion and Facebook’s assistance have some very serious implications, particularly when placed in context with other scandals which have broken in the past two years.
In other words, researchers whom the Department of Defense funds to understand how ideas and news go viral are doing very similar work for Facebook. The cross-pollination is extensive, and it means Facebook users are contributing directly not just to the surveillance state’s stock of data, but to the perfection of its methods.
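For context, the “social contagion” in those grant titles refers to a standard class of network-diffusion models, in which an idea hops between connected users much as a disease does. The toy independent-cascade simulation below is the textbook abstraction; the graph, adoption probability, and all names are invented for illustration and are not drawn from the actual Minerva-funded tools.

```python
import random

def spread(graph, seeds, p=0.2, seed=42):
    """Toy independent-cascade model: each newly persuaded node gets
    one chance to convert each of its neighbors with probability p."""
    rng = random.Random(seed)
    adopted = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for node in frontier:
            for nb in graph.get(node, []):
                if nb not in adopted and rng.random() < p:
                    adopted.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return adopted

# A small friendship graph as an adjacency list.
graph = {
    0: [1, 2], 1: [0, 3], 2: [0, 3, 4],
    3: [1, 2, 5], 4: [2, 5], 5: [3, 4],
}
print(f"Idea reached {len(spread(graph, [0]))} of {len(graph)} users")
```

Understanding which seed users and network structures make such cascades take off is precisely the kind of knowledge that would make targeted “political mobilization messages” more effective.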
This revelation is unlikely to have any meaningful impact on Facebook in the US. But European countries have much stricter privacy rules, and this news is likely to intensify political pressure to curb US technology giants’ reach and activities. So while Facebook users overseas may eventually see some concrete protections put in place, don’t hold your breath waiting for changes stateside.