Facebook Changed the Way It Experiments On Us, Barely

The biggest difference: they'll tell us more clearly exactly how they're messing with our heads.

Facebook is mastering the art of responding to criticism and controversy with empty apologies and meaningless policy changes.

Facebook CTO Mike Schroepfer announced on Thursday that Facebook will be opening a new research website to showcase the way they experiment on users for the sake of “[building] a better Facebook.” As part of the announcement, Mr. Schroepfer outlined the steps they’ll take to prevent another publicity crisis, à la their last round of mass manipulation, but it doesn’t sound to us like they’ll actually be changing the way they do business.

A quick refresher: Facebook ran a study on hundreds of thousands of unwitting users in which researchers manipulated the mix of positive and negative posts that appeared in those users’ feeds, then measured whether the change affected the emotional tone of the users’ own status updates.

The study basically proved nothing. Facebook found that the effects of their mood manipulations were about the size of a rounding error, and the researchers themselves admitted they were barely measurable. Tiny effects or not, the media spent weeks driving back and forth over Facebook while they lay in the proverbial driveway. Facebook never apologized, saying only that they should have “communicated better.”

“Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism,” Mr. Schroepfer wrote in the latest announcement. “It is clear now that there are things we should have done differently.”

So what kind of changes will Facebook be making? They’ll start by establishing clearer guidelines and making sure more people are monitoring the research. They’ll also educate new engineers with workshops and updates to their “annual privacy and security training,” which sounds like the research ethics version of sensitivity training.

Here is what they will not do (besides admit they did anything wrong):

  • Get a third-party monitor: The new group that will keep tabs on Facebook’s experiments will be “a panel including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy and policy teams.” All of these people, presumably, are employees of Facebook.
  • Inform users that they are guinea pigs: Websites A/B test their users all the time, but every ethicist worth their salt has criticized Facebook for running a mood manipulation experiment without telling anyone up front that such manipulation was even possible.
  • Let users opt out: Even without telling users when they are being tested, Facebook could still offer an option buried away in its settings to the effect of: “Click here if you would like us to not toy with your emotions for product development purposes.” No word yet on the release of a do-not-mess-with-my-head button. (A sketch of what such a flag could look like in practice follows this list.)
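
To make the opt-out idea concrete: standard A/B testing typically assigns each user to a variant with a deterministic hash, and honoring an opt-out amounts to checking a single flag before assignment. Below is a minimal sketch in TypeScript under those assumptions; the names (`assignVariant`, `optedOutOfExperiments`, the experiment id) are hypothetical illustrations, not Facebook’s actual internals.

```typescript
// Minimal sketch of deterministic A/B bucketing with an opt-out flag.
// All names are hypothetical; this is not Facebook's actual system.
import { createHash } from "node:crypto";

interface User {
  id: string;
  optedOutOfExperiments: boolean; // the hypothetical do-not-mess-with-my-head setting
}

// Hash the user id together with the experiment name so assignment is
// stable within one experiment but uncorrelated across experiments.
function bucket(userId: string, experiment: string, numBuckets: number): number {
  const digest = createHash("sha256").update(`${experiment}:${userId}`).digest();
  return digest.readUInt32BE(0) % numBuckets;
}

// Opted-out users always get variants[0], the control, i.e. the default feed.
function assignVariant(user: User, experiment: string, variants: string[]): string {
  if (user.optedOutOfExperiments) return variants[0];
  return variants[bucket(user.id, experiment, variants.length)];
}

// Example: a 50/50 mood test that an opted-out user never enters.
const alice: User = { id: "alice-123", optedOutOfExperiments: true };
const bob: User = { id: "bob-456", optedOutOfExperiments: false };
console.log(assignVariant(alice, "feed-mood-test", ["control", "fewer-negative-posts"])); // "control"
console.log(assignVariant(bob, "feed-mood-test", ["control", "fewer-negative-posts"]));
```

The point of the sketch is how little machinery an opt-out requires: one boolean on the user record and one early return before bucketing.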

It could be worse: shortly after Facebook took a beating for their research, OkCupid gleefully announced its own bizarre, prank-like experiments on its users and received far less blowback, largely because its bragging was so unashamed.

Over the years, we’ve come to accept that the biggest tech companies of our time spy on us, sell our data, and use our private information as tools to sell ads. Now that Facebook has made it very clear that they are going to keep running experiments on users — and don’t much care how we feel about it — we’re one step closer to accepting it as a given that mass manipulation is, as OkCupid’s Christian Rudder puts it, another part of “how websites work.”
