How Facebook Automates Irresponsibility

Analysis: Facebook’s Safety Check and News Feed have one thing in common: spin

Facebook’s Safety Check page for last week’s derailment. Screenshot

As Wired pointed out Thursday, Facebook CEO Mark Zuckerberg has been making moves that fit someone eyeing a presidential run. He’s committed to visiting every state. He’s downplayed his past atheism. And he’s used next-level spin to deflect blame for his company’s bungles, preventing anyone from holding it accountable by systematically pinning more responsibility on its users.

Let’s look at a couple of SNAFUs in its products as cases in point.

Safety Check

Something weird happened in New York last week. All of a sudden, New Yorkers were asked to check in as safe in “The Train Accident in Brooklyn.” An LIRR train lost control at Atlantic Terminal and two cars derailed. Because Facebook software automatically activated Safety Check after people nearby posted about it, I had people I barely know, from far away, asking me to check in as “okay.” I didn’t. In a city of more than eight million people, when 104 people get hurt in one place, it’s a safe bet that I’m one of the folks who made it through.

But I don’t blame people for worrying. Facebook has built a system that unthinkingly fosters anxiety.

The image Facebook posted with the incident (above) makes it look like the train accident reverberated across the metro area, when in truth the impact amounted to no more than a dot on that map, maybe one pixel. It looks silly to a New Yorker, but people who aren’t familiar with the place won’t pick up on that.

To put this in perspective, I’m from a Kansas town of 20,000 people. If I were home and a car collision on the edge of town sent one person to the hospital, the odds that I was the one hospitalized (1 in 20,000) would be roughly four times the odds that I was among the injured in Brooklyn (104 in eight million).
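The back-of-the-envelope arithmetic, with both population figures rounded, runs like this:

```python
# Rough odds comparison; both populations are approximate.
nyc_injured, nyc_pop = 104, 8_000_000    # last week's derailment
town_injured, town_pop = 1, 20_000       # hypothetical small-town crash

p_nyc = nyc_injured / nyc_pop            # ~0.0013%
p_town = town_injured / town_pop         # ~0.0050%

print(f"Derailment: {p_nyc:.4%}")                          # 0.0013%
print(f"Small-town crash: {p_town:.4%}")                   # 0.0050%
print(f"Small-town odds are ~{p_town / p_nyc:.1f}x higher")  # ~3.8x
```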

Facebook CEO and founder Mark Zuckerberg asks a question during the CEO Summit of the Americas in Panama City on April 10, 2015. MANDEL NGAN/AFP/Getty Images

Safety Check came about because Facebook engineers, inspired by the 2011 Japanese tsunami, wanted to give users a quick and easy way to broadcast the fact that they were okay. The basic idea here is noble. I was in D.C. on September 11th. While only one building in the Washington area was ultimately hit, in the thick of the tragedy rumors abounded that many had been struck. The phone system got overwhelmed by people calling in and out to check on each other. It took me a long time to get back to everyone who had reached out.

Facebook makes this instant. That’s great. Facebook knows it is great. So late last year it went on a big self-congratulation tour for this and other socially positive features, a tour that landed it this cover story in Wired. But its thin-skinned C-suite can’t weather flak when doing good goes wrong. The first time Facebook activated Safety Check after a terrorist incident was following the shootings in Paris (which killed 129 people), one day after a bombing in Beirut (which reportedly killed 43). The site got pilloried for what was seen as ignoring deaths in a non-Western country.


Rather than work to improve its internal process, the company did what tech companies do: it punted and hid behind spin about empowering its users. As we reported in November, a third-party company now feeds records of large incidents that injure multiple people into Facebook’s back end (Facebook has declined to identify the service). Then, if enough people near one of those disasters start posting about it, the site activates Safety Check. That way, whatever Safety Check does or doesn’t do, no one can blame the company.

Safety Check activates without anyone doing any thinking. It just goes live.
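Facebook hasn’t published the trigger logic, but based on the reporting above it behaves something like this sketch, where the feed format, the post threshold, and the notion of “nearby” are my guesses, not Facebook’s:

```python
# Hypothetical sketch of the automated Safety Check trigger described above.
# Facebook hasn't disclosed its real logic; the threshold below and the
# shape of the incident record are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Incident:
    name: str
    injured: int       # casualty count from the unnamed third-party feed

POST_THRESHOLD = 1000  # hypothetical: posts by nearby users about the incident

def should_activate(incident: Incident, nearby_posts: int) -> bool:
    """Go live once enough nearby users post about a fed-in incident.
    Note what's missing: no human review, and no weighing of the
    casualty count against the size of the surrounding population."""
    return incident.injured > 1 and nearby_posts >= POST_THRESHOLD

derailment = Incident("The Train Accident in Brooklyn", injured=104)
print(should_activate(derailment, nearby_posts=2500))  # True; it just goes live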

Does that mean last week’s train accident wasn’t scary, or that I don’t feel bad for the people hurt or terrified that morning? No, of course not. But it wasn’t such a big deal that people should have been afraid for their friends in New York, and they wouldn’t have been if Facebook hadn’t told them they should.

At least I don’t think it was a big enough deal, not after thinking it through, which is more thinking than anyone at Facebook did about the incident.

If a tech company wants to do good, great, but it still needs to take responsibility for the features it builds.

News Feed

Facebook’s algorithmic abdication on Safety Check looks a lot like Zuckerberg’s initial refusal to admit his site was responsible for the proliferation of fiction masquerading as news (also known as “fake news”). Zuckerberg wants to argue that Facebook merely gives users a way to distribute what they want to share more efficiently. In other words: it’s not us!

Though he has since relented, he yet again turned the news-vetting job over to a third party (well, several).

Annalee Newitz at Ars Technica points out a deeper problem: that Facebook calls its feed a “news feed.” When Facebook first debuted the idea of showing users what their friends were doing, it called the feature a “Wall,” which captured its true purpose well: a place for randos to scrawl. “News” means something, and as people who work in news know, the more readers you have, the more responsibility you carry.

Facebook executives don’t seem to have culturally grown out of the era when people only used the site to size up classmates they might sleep with, even though more people now look at the News Feed than at any news site in the world. Newitz wrote another essay arguing that Facebook should just hire people to watch its feed (it did make one significant related hire recently).

That feed is the new front page of the internet. It’s so powerful that its creator has made sure that he continues to control it, no matter what he decides to do next.

The flip side of power? Responsibility. Facebook isn’t a startup anymore. It’s a big public company. The “move fast and break things” days are over. It’s time to be careful and not hurt anybody.

Taking responsibility for the organization you’re in charge of looks good on a potential candidate for president, but you know what? Forget about that.

Taking responsibility for your company also looks pretty good on a CEO.
