Despite a long career in journalism, I don’t often pontificate publicly about the news. I have a low tolerance for the kind of navel-gazing we in the business habitually engage in. Plus, the rules have changed so much and so quickly there’s a constant danger of applying yesterday’s perspectives to tomorrow’s problems.
But an important point has been overshadowed in the kerfuffle over whether Facebook did—or did not—deliberately suppress news stories of interest to conservatives in its Trending Topics feature. The real issue isn’t whether it engaged in deliberate manipulation. It’s how news sources that tailor themselves to the perceived interests of individual users create an echo chamber—what the experts call a “filter bubble”—in which the users’ assumptions and biases are constantly reinforced.
For a great illustration, take a look at this brilliant feature put together by The Wall Street Journal, providing a continually updated, side-by-side look at Facebook news posts from liberal and conservative sources. The Journal is careful to say these “aren’t intended to resemble actual individual news feeds.” Still, they illustrate just how starkly different the world can appear, depending on the kind of information being shown to you.
In the blue feed, what liberals are more likely to see, you might read that President Obama made “a beautiful speech” at the site of the first atomic bomb blast in Hiroshima last week, “and delivered a message that the entire world needed to hear.” In the red, conservative feed, the speech “spits on the graves of American WWII heroes,” calls the United States “evil” and “disgraces America.”
Click on “Guns,” and you might see something like this headline in the conservative feed: “WH Makes Final Push to Take the Guns.” The story reports that “Vice President Joe Biden recently urged state and local officials to pick up the task of subverting the Second Amendment.”
Some full disclosure is called for here: The company I work for in my day job, SmartNews, has a mobile app, and while we don’t directly compete with the Facebook behemoth, you could say we’re an alternative to it when it comes to news.
But while we, like Facebook, use algorithms to find and select the news we include, we’re optimized to display a wide range of sources, rather than focusing on news custom-tailored for each individual user. That means we sometimes get slammed in the app-store reviews by liberals who hate coming across stories from Fox News or Breitbart, or conservatives who loathe everything the Huffington Post and Mother Jones produce.
For anyone with strong views on politics and society, the stories they see on Facebook provide a much less jarring experience. And that’s exactly the problem.
If all you ever see is stuff that reinforces what you already believe, how exactly are you going to see anything that might call those beliefs into question? “When my information changes, I alter my conclusions,” John Maynard Keynes is supposed to have said. “What do you do?” But if you’re being fed a steady diet of the same information, where is the impetus to rethink conclusions going to come from?
The news-business changes wrought by technology have made things even worse. There are more sources of information than ever before, which in theory is a good thing. But the cacophony also means it’s harder than ever for any individual source to be heard—a situation that rewards those who can yell the loudest, or say the most outrageous things, in order to get noticed.
So we get headlines like “C.I.A Links Top Hillary Donor George Soros to Terrorist Bombing.” (Because, let’s see, he backed Vaclav Havel’s Czech opposition movement in the 1980s.) Or “A Bombshell Letter Reveals Donald Trump’s Hatred for and War on Disabled Veterans”—because he wanted to limit the presence of street vendors on Fifth Avenue.
“Bombshell” seems particularly popular in headlines seeking to stoke anger. “Bombshell New Evidence That Obama’s Birth Certificate Is 100 Percent Fake” read a post that showed up in The Journal’s conservative feed yesterday—the “evidence” being a radio interview with the anti-immigrant sheriff of Maricopa County, Arizona.
Nor is it any accident that the interview occurred nearly a year ago and was widely circulated among birthers at the time. Many of the most inflammatory posts in The Journal’s feeds turn out to be weeks, months or even years old. To a personalization algorithm, these are proven winners, guaranteed to stir the juices of true believers. That they aren’t “news” in the most basic sense of the word—there’s nothing new about them—is irrelevant if the point of the game is to generate clicks rather than to convey relevant information.
It isn’t even the case that filter bubbles are promoting an angry dialogue between people of opposing viewpoints. They are promoting no dialogue at all. We aren’t really screaming at each other; we’re each of us locked in our own cells, screaming at ourselves.