Alex Stamos, Facebook’s former chief security officer, is one of the many senior executives Facebook lost this year in the wake of one PR crisis after another and rising tensions within the social media giant’s executive circle.
Last week, Stamos, who left Facebook in August to take a research professorship at Stanford University, was brought back into the spotlight after a bombshell New York Times investigation on Facebook’s handling of Russian meddling in the 2016 presidential election surfaced some previously unreported details. Included in the report? A dramatic conference room episode during which Facebook’s chief operating officer Sheryl Sandberg reportedly yelled at Stamos, “you threw us under a bus,” in September 2017 after he notified Facebook’s board that the company had yet to contain Russian activities.
This weekend, Stamos gave an interview to MSNBC’s Revolution, his first public appearance since the Times story came out, to share his side of what happened inside Facebook during the fallout from Russia’s meddling in the 2016 election.
Stamos didn’t directly comment on the conference-room confrontation with Sandberg described in the Times story, but he acknowledged that, in retrospect, Facebook mishandled the disclosure: the company briefed the board before it had fully gauged the scale of the Russian activity, and failed to convey how serious the issue was.
“When we discovered the bulk of the Russian activity in the summer of 2017, I think that the big mistake the company made was not being much more fulsome about the fact that this was a huge deal,” he said in Sunday’s interview. “We did not have confidence that this was all of it.”
“Nobody lied, and nobody covered anything up,” he continued. “But I feel like the initial way that these things were communicated really set the bar of whether or not a company was going to be seen as part of the solution or part of the problem. Facebook didn’t take that opportunity to say, ‘We’re part of the solution.’”
For the first time, Stamos also shed light on some technical aspects of the Russian activities that most people outside Facebook were unaware of throughout the scandal. For instance, Russian government-sponsored posts accounted for only a tiny percentage of the total fake news posts Facebook caught after the 2016 election. “[The share of] Russian-sponsored posts, while small, was also incredibly important because of why it was on Facebook and who was driving it,” he said.
In addition, Stamos said one important reason it took Facebook so long to combat Russian interference was that the company wanted to solve the problem at its root, rather than reactively deleting compromised accounts and blocking users.
“This kind of organized propaganda was nobody’s job at Facebook when it happened in 2016,” Stamos said. “We could have focused on operational steps, finding 500 Russian accounts and shutting them down, kicking off certain people who posted hate speech. But in Silicon Valley, the truth is, you can’t really make changes unless you change how the product works… We’ve got to think about these kinds of adversarial, harmful uses of the technology we build way earlier.”
Looking ahead to the 2020 presidential election, Stamos cautioned that politically motivated groups could begin using social media to exert influence much earlier than people expect.
“If you read the tea leaves on what the Russians might be interested in [in 2020], it might be getting involved in the Democratic primary very, very quickly. So, we can’t wait a long period of time before we address these issues,” he warned.