Facebook’s New Tool to Combat Fake News Is Counterintuitive but Effective

Facebook began testing the idea in April and expanded the scope of application in August after noticing a positive effect. Justin Sullivan/Getty Images

Facebook’s attitude toward fake news has turned 180 degrees in the past 12 months, from flatly denying in November 2016 that it distributed fake news to dedicating a team of 2,000 people to combating misinformation, as reported at its most recent investor meeting. And while the company’s latest effort in this area sounds a bit counterintuitive, it will likely be more effective than previous attempts to curb the spread of false news stories.

Facebook will add a new feature that places a list of related articles under each suspicious news story shared on the platform, the company announced Wednesday. The feature will replace the “disputed flags” on false stories reported by users or third-party fact checkers, a warning system that Facebook implemented in December 2016 and that was later found to have inadvertently worsened the problem.

“Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs—the opposite effect to what we intended,” Tessa Lyons, a product manager at Facebook, said in a statement released Wednesday.

Jay Van Bavel, a psychology professor at New York University, called this a “backfire effect,” where corrections of misinformation actually increase the belief in the false information. “One reason is because people frequently engage in motivated reasoning to confirm their prior beliefs,” Van Bavel told Observer. “Another problem with flagging suspicious news posts is the ‘implied truth’ effect, whereby false stories that fail to get flagged are seen as more valid.”

“Related Articles, by contrast, are simply designed to give more context, which our research has shown is a more effective way to help people get to the facts,” Lyons said.

The idea of recommending related articles isn’t new. Facebook first introduced the tool in December 2013 to boost website traffic—when a user finishes reading a story, a list of articles on the same topic is shown at the bottom of the page. Recommending related articles has, in fact, become a common practice for digital media sites.

But this is the first time Facebook has applied related articles as a tool to combat fake news. With a subtle tweak to the original version, the new system recommends articles before a user reads a shared story, placing the recommendation list beneath the shared link in the user’s news feed. “That should provide people easier access to additional perspectives and information, including articles by third-party fact checkers,” Sara Su, a product manager at Facebook, wrote in a product release.

“I think the best part of Facebook’s strategy is to simply demote fake news in people’s feeds (and demote sources that consistently produce fake news). This will help minimize the need for motivated reasoning and the viral spread of fake news,” Van Bavel said.

Facebook began testing the idea in April and expanded its scope in August after noticing a positive effect: when related articles were placed next to a false news post, fewer people shared the story than when a disputed flag was shown.