Above are the results of a controversial study in which Facebook altered the News Feeds of its users in order to determine if “emotional states [could] be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” Robinson Meyer explains:
For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves. …
Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
Meyer notes that the experiment was “almost certainly legal”. But Katy Waldman doubts anyone could argue that users really consented:
Here is the only mention of “informed consent” in the paper: The research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” That is not how most social scientists define informed consent. … So there is a vague mention of “research” in the fine print that one agrees to by signing up for Facebook. As bioethicist Arthur Caplan told me, however, it is worth asking whether this lawyerly disclosure is really sufficient to warn people that “their Facebook accounts may be fair game for every social scientist on the planet.”
Katie Collins notes:
In the Code of Ethics and Conduct published by the British Psychological Society, it is stated that psychologists should: “Ensure that clients, particularly children and vulnerable adults, are given ample opportunity to understand the nature, purpose, and anticipated consequences of any professional services or research participation, so that they may give informed consent to the extent that their capabilities allow.”
Adrienne LaFrance reports that the study did go through an institutional review board – a committee the scientific community uses to vet research involving human subjects. The approval was “on the grounds that Facebook apparently manipulates people’s News Feeds all the time”. Ha! Laurie Penny sounds the alarm:
Nobody has ever had this sort of power before. No dictator in their wildest dreams has been able to subtly manipulate the daily emotions of more than a billion humans so effectively.
There are no precedents for what Facebook is doing here. Facebook itself is the precedent. What the company does now will influence how the corporate powers of the future understand and monetise human emotion. Dr Adam Kramer, the man behind the study and a longtime member of the company’s research team, commented in an excited Q & A that “Facebook data constitutes the largest field study in the history of the world.” …
Emotional engineering is, and always has been, Facebook’s business model. It is the practice of making itself socially indispensable that has ensured that, for many millions of people, Facebook has become the default front page of the internet. Their newsfeed is literally that – it’s the first place many of us go to find out what’s been happening in the world, and in the worlds of those we love, those we like, and those we once met at a party and got an awkward friend request from two weeks later.
Leonid Bershidsky reminds us that Facebook’s ongoing daily behavior isn’t exactly beyond reproach either:
An algorithm called EdgeRank scores each post on a number of criteria, such as how frequently a News Feed owner interacts with its author and the quality of that interaction (a comment is more valuable than a “like”). The higher-ranked posts go to the top of the feed. That’s why a typical user doesn’t see everything her friends are posting — just what Facebook decides she’d be interested in seeing, plus paid advertising (which is also supposed to be targeted). You can tweak the settings to make posts appear in their “natural” order, but few people bother to do it, just as hardly anyone ever reads Facebook’s data use policy: buried among its 9,000 words, there is a sentence that says research is a legitimate use. … Facebook manipulates what its users see as a matter of policy.
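To make the ranking idea concrete, here is a minimal, purely illustrative sketch of the kind of per-post scoring Bershidsky describes. The weights, field names, and decay formula are invented for illustration; this is not Facebook’s actual EdgeRank.

```python
from dataclasses import dataclass

# Toy weights: a comment counts for more than a "like", as Bershidsky notes.
# These numbers are invented for this sketch only.
INTERACTION_WEIGHTS = {"comment": 3.0, "share": 2.0, "like": 1.0}

@dataclass
class Post:
    author: str
    interactions: list[str]  # viewer's past interactions with this author, e.g. ["like", "comment"]
    age_hours: float

def score(post: Post, affinity: dict[str, float]) -> float:
    """Hypothetical score: viewer affinity for the author, times interaction
    quality, decayed by the post's age. Not the real EdgeRank formula."""
    quality = sum(INTERACTION_WEIGHTS.get(i, 0.5) for i in post.interactions)
    decay = 1.0 / (1.0 + post.age_hours)
    return affinity.get(post.author, 0.1) * quality * decay

def rank_feed(posts: list[Post], affinity: dict[str, float]) -> list[Post]:
    # Higher-scored posts go to the top, so the viewer sees only what the
    # ranking surfaces rather than everything friends actually posted.
    return sorted(posts, key=lambda p: score(p, affinity), reverse=True)
```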
Kashmir Hill raises an eyebrow at the site’s response to the backlash:
Mid-day on Sunday, Facebook data scientist Adam Kramer, who helped run the study, also commented on it through a post on his Facebook page. … Kramer says, essentially, that the reason he and his co-researchers did this study was to make Facebook better. “[W]e care about the emotional impact of Facebook and the people that use our product,” he writes. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
Kramer sounded a wee bit apologetic: “In hindsight, the research benefits of the paper may not have justified all of this anxiety.” He said that Facebook is working on improving its internal review practices for approving experiments like this and that it will “incorporate what we’ve learned from the reaction to this paper.”
Meanwhile, Charlie Warzel flags some pointed criticism questioning whether the study’s methods were even sound:
Dr. John Grohol, founder of the psychology site Psych Central, said he sees two major flaws in the study, starting with its use of the sentiment analysis tool, the Linguistic Inquiry and Word Count application (LIWC 2007). It’s a software program that linguists and psychologists commonly use in their research, and it’s a well-understood, widely used tool, but it was never designed to be used for small bits of text. …
Furthermore, Grohol said, the study, while focused on exploring emotional contagion, doesn’t actually measure the moods it’s trying to capture. “They never went to Facebook users and had them fill out a mood questionnaire. Instead the authors were making strange judgement calls based on content of status updates to predict a user mood,” he says, noting that the authors would likely need some other tool or survey to accurately gauge something as complex as emotional state.
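A crude word-count tally, in the spirit of (but not identical to) LIWC-style analysis, shows why Grohol is skeptical about applying this approach to short status updates: simple counting ignores context and negation entirely. The word lists below are made up for this sketch, not LIWC’s actual dictionaries.

```python
# Deliberately crude positive/negative word lists, invented for illustration.
POSITIVE = {"happy", "great", "love", "good"}
NEGATIVE = {"sad", "angry", "terrible", "bad"}

def tally(status: str) -> tuple[int, int]:
    """Count positive and negative words, ignoring context and negation --
    which is exactly the problem on short texts."""
    words = status.lower().split()
    pos = sum(w.strip(".,!") in POSITIVE for w in words)
    neg = sum(w.strip(".,!") in NEGATIVE for w in words)
    return pos, neg

# "I am not having a great day" tallies one positive word and zero negative
# ones, even though the sentence expresses the opposite mood.
print(tally("I am not having a great day"))  # -> (1, 0)
```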
And Tyler Cowen wonders if we should even care:
Clearly plenty of ads try to manipulate us with positive emotions, and without telling us. There are also plenty of sad songs, or for that matter sad movies and sad advertisements, again running an agenda for their own manipulative purposes. Is the problem with Facebook its market power? Or is the problem the sheer and unavoidable transparency of the notion that Facebook is inducing us to pass along similar emotions to our network of contacts, thus making us manipulators too, and in a way which is hard for us to avoid thinking about?