Facebook ran an A/B test. Get out the torches and pitchforks!
One of the things the Internet is best at is righteous indignation. The crowd can take up its pitchforks and torches in a heartbeat. Virtual lynchings are becoming the norm and, just as when lynchings happened IRL, they often target innocent victims. The media, quick to jump on a meme gaining velocity, spreads the stories with equal disregard for critical thinking or fact checking.
Consider fast-food chain KFC, vilified and pilloried when the crowd became outraged—outraged, I tell you—that an employee in a Mississippi restaurant asked the grandmother of a young girl disfigured in a pit bull attack to leave because her appearance was upsetting other diners. The public bashing of KFC led the chain to donate $30,000 to the girl’s care, find her a specialist, apologize, and promise action. It took a couple days to figure out that the incident never happened. In fact, the restaurant regularly serves patients at a nearby hospital whose appearance is far worse. (To its credit, and in a great PR move, KFC will stand by its donation.)
Today’s torches-and-pitchforks uprising is aimed at a popular target, Facebook. And after reading a dozen articles and analyses—most expressing horror and disgust—I can only conclude that the mob mentality in this case has infected the Net for one simple reason: It’s Facebook. But the mob has definitely formed: Search the keywords “Facebook,” “psychological,” and “experiment” in Google News and you’ll find more than 2,500 results.
News Feeds were manipulated to test user response to sentiment
You’ve probably already heard the story, and the characterizations have perhaps set your blood boiling. If not, here’s the recap:
Facebook tweaked the algorithms that dictate what users see in their News Feeds for almost 700,000 of its users to determine if the sentiment of content affected their own posts. Not surprisingly, an increase in negative tone led users to write negative posts, while a bump in positive content resulted in more positive updates.
The experiment was published in the Proceedings of the National Academy of Sciences. The groundwork for the study comes from three earlier studies of how users’ emotions are affected on Facebook, none of which produced the reaction this one did (though, to be fair, those studies analyzed data without manipulating the content users saw).
The outraged and the media have labeled the study a “psychological experiment,” though the Facebook representative who led the study is a data scientist, not a psychologist. Working with the data scientist, Adam Kramer, were a Cornell University professor and a UCSF post-doctoral fellow. It’s worth pointing out that neither of Kramer’s two colleagues, nor Cornell or UCSF, has been targeted for any abusive or inappropriate behavior. Apparently, neither colleague saw any ethical or moral problem with the research methodology, despite their affiliation with upstanding research institutions. In fact, according to an Atlantic report, their university review boards also approved it.
But because the experiment was conducted in secret—based on Facebook’s terms of service, which give Facebook the right to conduct such experiments—the pitchfork-and-torches crowd has swarmed into the streets.
A regular practice everywhere
Meanwhile, an article posted to RepCap points out that the team at Visual Website Optimizer has been using A/B testing software for years, discovering which case studies inspire people to spread the word, produce leads, and drive sales. The article points out that this A/B testing produced one post that generated more than $13,000 in hard dollars and 92 assisted sales conversions.
Let’s be clear about A/B testing. It’s a psychological experiment to see how people respond emotionally to one approach to content versus another. A/B testing is popular among email marketers, who use it to find the subject line most likely to get someone to open the email. A/B testing, according to Wikipedia, “is a simple randomized experiment with two variants, A and B, which are the control and treatment in the controlled experiment.”
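Mechanically, there’s nothing exotic going on. Here’s a minimal sketch in Python of what an email subject-line A/B test boils down to: deterministically bucket each recipient into control (A) or treatment (B), then run a two-proportion z-test on the open rates to decide whether B really outperformed A. The function names and the numbers in the example are hypothetical, not taken from any of the tests described above.

```python
import hashlib
from statistics import NormalDist

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into 'A' (control) or 'B' (treatment).

    Hashing the ID (rather than calling random.choice) means the same user
    always lands in the same bucket across sessions.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def z_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: is variant B's conversion rate really higher?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    return (p_b - p_a) / se

# Hypothetical subject-line test: 5,000 recipients per variant.
# Variant A: 600 opens (12%); variant B: 700 opens (14%).
z = z_score(conv_a=600, n_a=5000, conv_b=700, n_b=5000)
p_value = 1 - NormalDist().cdf(z)  # one-sided: is B better than A?
print(f"z = {z:.2f}, one-sided p = {p_value:.4f}")
```

With those (made-up) numbers the difference clears conventional significance, so the marketer ships subject line B to everyone else on the list—and, notably, at no point does anyone ask the recipients for permission.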
If psychological experiments among bloggers and email marketers aren’t enough, then look to Upworthy, whose writers produce up to 25 headlines for every post, then employ testing tools to determine which will provoke the desired emotional response. That’s right: Upworthy engages in a psychological experiment with every single post it writes.
In each of these cases—and thousands upon thousands more—nobody asks the audience if it’s okay to run the experiment on them. Outrage over this standard practice is reserved for Facebook. Why? Because it’s Facebook, the social network everybody loves to hate.
When you stop to consider that the Visual Website Optimizer blog post case study convinced people to spend money, you might conclude that it’s far more nefarious than the Facebook experiment, which just wanted to figure out how people react to what they see. But, of course, that’s not how the knee-jerk crowd reacted. David Holmes, writing for Pando Daily, labeled the experiment “unethical.” He fretted over Facebook’s ability and willingness to wield its power to shift users’ emotional states.
You know, like advertising tries to do with every single TV spot ever produced.
Of course, he’s also vexed that Facebook conducted the experiment without telling users. Again, though, no A/B tester anywhere, ever, tells users.
Did the test rise to the level of “psychological experimentation” as we usually think of it?
In fact, the labeling of the algorithm tweak that produced the results as a “psychological experiment”—a label never used in the National Academy of Sciences paper—is inflammatory. When we talk of psychological experiments, we’re usually talking about things like the Stanford Prison Experiment, in which Philip Zimbardo cast students as prisoners and guards, resulting in guards abusing prisoners, who suffered anxiety and stress after just a few days. Or the Milgram Obedience Experiment, in which participants were instructed to shock a subject each time he gave a wrong answer to a query; most were willing to deliver the highest voltage despite the pain the study subject appeared to be suffering.
A subtle and short-term adjustment to the sentiment of the posts you see on Facebook hardly rises to the level of these studies. The Facebook test is far closer to headline A/B tests to determine which delivers the desired behavior on the part of the audience. Yet labeling the Facebook test a “psychological experiment” automatically elevates it to the level of the Asch Conformity Experiment.
If Upworthy had run a similar experiment to see which headline you clicked and which you passed, and published the results in the same place, the collective response of the Internet probably would have been, “Huh; how about that?”
Tempest in a teacup
The arguments I’ve heard against Facebook’s behavior fall into a few categories:
- They didn’t get permission—As I noted, nobody gets permission in A/B testing. Ever. And there’s a simple reason for that. If I tell you that I’m going to see if slightly more positive or negative posts produce a particular response, your behavior will change. The purity of the sample is polluted and the results invalidated.
- It’s an abuse of Facebook’s enormous power—What’s the goal of the research? Is it, as VentureBeat’s Mark Sullivan asks, “to mess with people’s feeds and moods on a regular basis?” Again, it’s head-shakingly amazing to me that I even need to point out that this is what every online publisher does every single minute of every single day. Even my blog posts, tweets and Facebook updates are designed to get you to react. Those institutions with the resources conduct a cumulative billions of dollars worth of research to figure out how to do it best.
- Facebook shouldn’t mess with your News Feed—You don’t see every post from every Facebook friend or from every page you’ve liked. (Among some Facebook marketers, there is a remarkable belief that someone who likes your brand page has signaled that they want every single update from that page.) What you get is an algorithmically curated collection based on what you’re interested in. So Facebook manipulates your News Feed and always has. The algorithm is routinely tweaked. (That’s why some marketers believe their organic reach has fallen.) I’m just not bothered that they would run a test to see if positive updates in the Feed result in positive updates from the user. (Other, that is, than the obvious, “Well, duh” reaction I had to the study.)
- Facebook is large and it’s not a mailing list, which alters the rules—What rules? Where’s the size cutoff? Where’s the guideline that says it’s okay for this company to run A/B tests but based on this clearly delineated criteria, it’s not okay for that company to do it?
- What chutzpah Facebook had to publish the results in a scientific journal!—Keep in mind, Cornell and UCSF researchers were involved, too, and they all thought the results were interesting. (More interesting than I did, frankly.) But clearly nobody thought it was an ethical or moral breach that would lead to a virtual lynching. (Of Facebook. Again, neither UCSF nor Cornell has been called out for their participation.)
Facebook is as matter-of-fact about the experiment as I am. Here’s what the company said in a statement to Forbes:
We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.
Ultimately, I’m untroubled by the experiment: it is hardly uncommon among marketing organizations, it did no harm, it was not used to get people to spend money or otherwise behave differently than they otherwise might have, and it did not rise to what most people consider “psychological experimentation.” It may be a tempest in a teacup, but that won’t stop the vitriol from the pitchfork-and-torch mob. After all, if you’re not hating on Facebook, you’re not one of the cool kids.
Get off Facebook. Get your family off Facebook. If you work there, quit. They're fucking awful.
— Erin Kissane (@kissane) June 28, 2014
06/29/14