What Everyone’s Getting Wrong About Facebook’s ‘Creepy’ Study

If you’ve been anywhere near this great, big internet of ours over the past few days, you’ve probably heard about Facebook’s outrageous breach of trust. Except, at least this time, Facebook might not be entirely in the wrong.

The Study

In 2012, Facebook decided to spend a week running a series of experiments on its users, the kind of user testing companies do constantly and something Google once had a whole product arm devoted to.

In the study, researchers took about 690,000 English-speaking users’ timelines (roughly 0.05 per cent of all Facebook users), split them in half, and removed more negative content than usual from one group’s feeds and more positive content than usual from the other’s. All of which produced some decidedly modest results. For the group that saw more positive status updates (Negativity Reduced), there was a 0.07 per cent increase in the users’ own positive statuses. For the group that saw more negative status updates (Positivity Reduced), there was a 0.01 per cent increase in the users’ own negative statuses. Which sounds small, sure, but in Facebook world, that could still translate to hundreds of thousands of actual users. And that’s what has so many people up in arms.
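If you want a sense of how those tiny percentages play out at Facebook’s scale, here’s a rough back-of-the-envelope sketch in Python. The total user base and daily post volume below are illustrative assumptions for the arithmetic, not figures from the study itself:

```python
# Back-of-the-envelope arithmetic behind the figures above.
# The total-user and daily-post counts are illustrative assumptions,
# not numbers reported by the study.

sampled_users = 690_000        # users whose feeds were filtered
sample_share = 0.0005          # roughly 0.05 per cent of all accounts

implied_total_users = sampled_users / sample_share
print(f"Implied user base: ~{implied_total_users / 1e9:.1f} billion")

# Even a 0.07 per cent shift is large in absolute terms at this scale.
effect_rate = 0.0007                      # 0.07 per cent change in posts
hypothetical_daily_posts = 500_000_000    # assumed, for illustration only
affected_posts = effect_rate * hypothetical_daily_posts
print(f"Posts nudged per day (illustrative): ~{affected_posts:,.0f}")
```

Under those assumptions, even a fraction of a per cent works out to hundreds of thousands of posts, which is why the tiny effect sizes still made headlines.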

The Complaints

“Facebook intentionally made thousands upon thousands of people sad,” said Slate’s Katy Waldman. Adrienne LaFrance at The Atlantic, meanwhile, described Facebook’s researchers as “the puppet masters who play with the data trails we leave online” and “new-level creepy.”

Because while Facebook’s Data Use Policy, which all of us checked and none of us read, does contain a section detailing how Facebook is free to “use the information we receive about you… for internal operations, including troubleshooting, data analysis, testing, research and service improvement,” it’s common knowledge that no one actually reads that stuff. Which means even though Facebook is technically playing by the informed consent rules, it’s a pretty shady way to go about it.

It wasn’t just that Facebook was misleading, though. Facebook may have actively manipulated our emotions. As James Grimmelmann, a Professor of Law at the University of Maryland, wrote:

The study harmed participants….

The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That’s psychological manipulation, even when it’s carried out automatically.

All of which sounds pretty awful. But, fortunately, that’s not quite what happened.

Why They’re Misplaced

Yes, Facebook’s “informed consent” is a pretty weak attempt at letting users know that their News Feeds are fair game for research purposes, but it’s a rule that Facebook doesn’t even technically need to play by in the first place. Since the company isn’t receiving any federal funding for its research, it’s being held up to standards that don’t actually apply — but that it’s fulfilling regardless.

And companies play with our emotions like this all the time. That doesn’t necessarily make it ok, obviously. But it does suggest that perhaps our indignation is a little misplaced. If you’re really that outraged at being given a slightly different experience than others without your express consent, then you have a big problem with the entire marketing industry. Which is valid — lots of people do! But in the grand scheme of things, in this instance, Facebook really didn’t do all that much (to us or otherwise) other than run a slightly flawed experiment.

As Dr. Karen Dill-Shackleford, Program Director of Media Psychology at Fielding Graduate University, explained to Gizmodo:

People do not generally read legalese. They take their chances and manage their investments of time. At the same time, I would call the study “minimal risk” from an IRB perspective (I’m on the IRB and have been for years at different institutions).

It’s worth noting that “minimal risk” is the most innocuous sort of study you can have. It’s essentially saying that, within the course of the study, participants are incurring no more risk than they would in their daily lives.

And the complaint that Facebook “manipulated users’ emotions” is a fairly warped description of what the experiment actually entailed. While manipulation entails active attempts to produce a certain result, Facebook wasn’t actually adding anything to people’s News Feeds. Rather, it was simply removing a certain portion of either positive or negative posts.

Facebook didn’t seek out posts with particularly inflammatory language. It didn’t pick and choose the perfect combination of emotional fuel. It simply made you see fewer posts of a certain nature than you would have otherwise. Every post that remained would still have been there whether you participated in the experiment or not. If that still sounds dangerous to you, then you should have deleted your Facebook a long, long time ago.

What We’re Really Dealing With

The fact of the matter is that Facebook is a free, for-profit, voluntary service. We all agreed to its terms and conditions when we signed up, whether or not we decided it was worth our time to read them. And it’s not like Facebook is really doing anything out of the ordinary here, other than making its private research publicly available.

Google, YouTube, Twitter, eBay — they’re all constantly running little experiments to see how they can affect (i.e. manipulate) us into behaving differently. Without that kind of research, our user experience on those sites would probably be a hell of a lot less enjoyable. They’re running these tests so we’ll want to stay and make them money, not (presumably) for the sheer joy of playing god.

It just so happens that, in this case, Facebook’s research into what makes us more engaged also has some interesting implications for academia at large. And, as Tal Yarkoni, a Research Associate in the Department of Psychology at the University of Texas at Austin, so eloquently explained:

Facebook doesn’t have to share any of its data or findings with the rest of the world if it doesn’t want to; it could comfortably hoard all of its knowledge and use it for its own ends, and no one else would ever be any wiser for it.

The fact that Facebook is willing to allow its data science team to spend at least some of its time publishing basic scientific research that draws on Facebook’s unparalleled resources is something to be commended, not criticised.

I’m not trying to suggest that Facebook is a victim here, or that we don’t have cause to criticise it on a near-daily basis. This, however, is not one of those times. Our outrage is misplaced and potentially even harmful. We should want Facebook to continue making its data available to the public — god knows it has more resources than most.

And if the ultimate result of this study is that we all start seeing fewer angsty posts in our News Feeds, good riddance.

Picture: Jim Cooke

