The Effect of Facebook’s Social Media Silo on Itself and You

It turns out the behemoth is dividing communities and stunting social discourse

Facebook effectively creates personalized information bubbles for its users. (Photo: Dan Kitwood/Getty Images)

You reinforce the walls of your personal information bubble. At least, that’s what SUNY Buffalo communication professor Ivan Dylko and his research team found. In “The dark side of technology: An experimental investigation of the influence of customizability technology on online political selective exposure,” Dylko and his coauthors report that people are far more likely to click on and spend time with articles that reflect their pre-existing biases. In their experiments, the only group likely to spend significant time reading articles that challenged their beliefs was the one whose news feed was randomized rather than weighted by user preference or by an algorithm trained on prior user behavior. Ask yourself: Do any of your social media services or search engines work on randomization? Or do they all show you what they think you want to see?


So what? Unlike a certain search engine that weights its results rather than returning organic answers to user queries, social media companies are now shrugging sheepishly and saying, “Yeah, we totally contributed to the siloing of social discourse. Um. Sorry?” (It helps that there’s now a study pointing out their role in the media failures of 2016.)

To watch Mark Zuckerberg reposition himself as a tech executive straining to move outside the Silicon Valley bubble—from the visit-every-state tour to the pro-community manifesto that asks, “How do we help people build an informed community that exposes us to new ideas and builds common understanding in a world where every person has a voice?”—is to watch Facebook try to thread the needle between corporate self-interest and the public interest in things like cohesive public discourse, open access and quality journalism. Farhad Manjoo’s examination of how Facebook is trying to offset its own complicity in the filter bubble sums up the company’s challenge: “The solution to the broader misinformation dilemma—the pervasive climate of rumor, propaganda and conspiracy theories that Facebook has inadvertently incubated—may require something that Facebook has never done: ignoring the likes and dislikes of its users.”

Who cares? Social media investors should. Facebook is considered a great long-term stock buy right now because it has a virtual monopoly on its market—MySpace is an also-ran, Google+ is a niche product, and LinkedIn was bought so it could become a data-collection mechanism for Microsoft’s suite of machine learning-enhanced workplace tools. But Facebook’s monopoly depends on maintaining its base of 1.9 billion active users, all of whom generate monetizable content for the company. If those users perceive a reason to flee—if they feel Facebook is biased, untrustworthy, routinely violating their safety or creating social friction—then the data sets Facebook sells to advertisers and publishers become less valuable, and the company eventually loses value.

We’ve already seen this happen with Twitter. Google, Disney and Salesforce all backed away from buying the microblogging company in 2016 for two reasons: Twitter hasn’t been able to successfully monetize its users (unlike Facebook), and Twitter has a growing reputation among current and former users as being unconcerned with the quality of user experience. That experience doesn’t just include the bullying endemic to Twitter; it also includes the content of the tweets themselves. On Twitter, nobody knows if you’re a robot someone paid to promote a specific ideology.

And you, dear reader, should care about your own filter bubble, too. As a recent New York Times op-ed on the virtues of hate-reading reasoned, “Reading what you hate helps you refine what it is you value, whether it’s a style, a story line or an argument… Defensiveness makes you a better reader, a closer, more skeptical reader: a critic. Arguing with the author in your head forces you to gather opposing evidence. You may find yourself turning to other texts with determination, stowing away facts, fighting against the book at hand. You may find yourself developing a point of view.”

If you are the type of person who has actually woken up out of a fitful sleep saying, “And you’re wrong!”—and I’m going to guess, based on the self-selecting nature of people who read So What, Who Cares, every one of you has lost sleep at least once because someone was wrong on the Internet—then you are going to want to be right when you argue on the Internet or in your daily life. Being right requires identifying your own bubble, admitting why you’ve chosen to live in it, and then looking for the points of view that poke at it to see what breaks and what holds true.

Want more? There’s a whole archive of So What, Who Cares? newsletters at tinyletter.com/lschmeiser. In addition to the news analysis, there are fun pop culture recommendations.
