Calls to #DeleteFacebook echoed through social media last week after documents became public detailing how Meta provided Nebraska prosecutors with private messages from a 17-year-old who allegedly planned and executed a late-term abortion in April. On Aug. 9, Meta published a statement asserting it received a valid legal warrant that never mentioned abortion.
Meta has been trying to reconcile its users’ expectations of privacy with the inherently public nature of a social network almost since it was founded as Facebook in 2004, pledging privacy improvements as far back as 2009. The Federal Trade Commission got involved in Facebook’s privacy issues in 2011 and again in 2019. Since 2018, Facebook has promised transparency on data usage, control over personal data, and accountability. The company’s goal is “to build much stronger privacy protections for everyone on Facebook,” Zuckerberg said in 2020. Yet he continues to apologize for letting users down and to promise the company will do better.
Facebook wants to protect its users’ privacy, but it must also comply with the laws of the countries where it operates. Often it can’t do both. And while the company says it is committed to protecting users’ privacy, that commitment may not hold when a user is suspected of breaking the law.
Hundreds of thousands of requests from law enforcement for user data
Last year, Facebook received 426,000 requests for user data from law enforcement, one-third of which came from the U.S. The company has repeatedly said it is committed to working with law enforcement, and it frequently cites emergency scenarios like suicide prevention and the recovery of missing children as a reason to do so, though these emergency disclosures only made up about 10 percent of data requests in 2021. It also points to extreme crimes like terrorism, child exploitation, and extortion. In its transparency center, Facebook doesn’t break down what percentage of data requests are related to these major crimes. Meta did not respond to requests for comment.
“If there’s a subpoena or legal court order in connection with a criminal investigation, there’s not much Meta can do about it,” said Ellen Goodman, law professor at Rutgers.
Facebook does have avenues for pushing back against law enforcement to protect its users’ privacy, and it occasionally uses them.
“Online services inevitably must choose whether and when to challenge legal processes—they receive many demands for data from governments around the world,” John Verdi, senior vice president for policy at the Future of Privacy Forum, a nonprofit privacy advocacy organization, said in an email. “Companies typically weigh a range of factors—whether they can notify users, the likelihood a legal challenge will succeed, the nature of the underlying alleged crime, and more.”
A decade ago, Twitter became the first social platform to publicly challenge orders requiring companies to not disclose they were handing over data, saying that their users had a right to know their data was being shared with law enforcement. Just last year, Facebook made news for refusing to comply with a U.S. court order requiring it to release previously deleted accounts related to Myanmar officials’ hate speech towards Rohingya Muslims, which helped spark a genocide. Still, in 2021, Facebook complied with 72 percent of worldwide data requests and 89 percent of U.S.-based requests, according to its transparency center data.
For each court order it receives, Facebook could challenge the accompanying non-disclosure order, as Twitter frequently does; once users are notified, they can assert any objections available to them. In the Nebraska case, however, Facebook didn’t pursue this course. In some cases Facebook can also argue that a warrant is so broad it violates the Fourth Amendment’s protections against unreasonable searches and seizures, but there was no basis for that argument in the Nebraska case.
Facebook can deploy delaying tactics to frustrate law enforcement
Even when presented with a valid court order, Facebook can slow law enforcement’s access to data by requiring the order to be domesticated, a process that in this case would have transferred the warrant from Nebraska to California, where Meta is headquartered. Domestication can be expensive and take months, straining court resources. By requesting it, Facebook can throw sand in the gears, pushing law enforcement agencies toward other avenues for solving their cases and reducing their incentive to keep submitting requests. When the underlying law is controversial, many privacy advocates recommend creating exactly this kind of friction. In the Nebraska case, however, Facebook wasn’t aware the investigation concerned abortion, and it didn’t request domestication.
The previously sealed warrant served on Meta, obtained by the Observer, doesn’t cite the abortion statute and includes a non-disclosure order barring Meta from telling the user it was handing over her data.
The best solution to preventing law enforcement from demanding user information is to “not have that information in the first place,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, a nonprofit protecting digital privacy. “And there are very simple steps Meta could have taken years ago, like end-to-end encrypting all messages by default.”
With end-to-end encryption, only the sender’s and recipient’s devices hold the keys needed to read messages; Meta’s servers handle only scrambled ciphertext, so law enforcement agencies couldn’t order Meta to turn over anything readable. But if messages were end-to-end encrypted, Facebook couldn’t use their contents to target advertising to its users, Galperin said, and advertising is its primary financial model.
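The point can be made concrete with a toy sketch. In the model Galperin describes, the two devices agree on a secret key that never touches the company’s servers, and the relay stores only ciphertext, so there is nothing readable to hand over. The sketch below is an illustration only, not real cryptography (production systems use vetted protocols such as Signal’s, with far stronger parameters); the prime, the key exchange, and the XOR cipher here are all simplified stand-ins.

```python
# Toy illustration of the end-to-end model: the relay server sees only
# ciphertext, while decryption keys exist only on the two endpoints.
# NOT real cryptography -- a demo of the architecture, nothing more.
import hashlib
import secrets

# Toy Diffie-Hellman parameters (a Mersenne prime for brevity; real
# deployments use standardized groups or elliptic curves like X25519).
P = (1 << 127) - 1
G = 3

def keypair():
    """Generate a private exponent and its public value on-device."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    """Both endpoints derive the same 32-byte key; the relay never can."""
    s = pow(other_pub, priv, P)
    return hashlib.sha256(s.to_bytes((s.bit_length() + 7) // 8, "big")).digest()

def xor_stream(key, data):
    """Keystream from SHA-256(key || counter); XOR is its own inverse."""
    out = bytearray()
    for i in range(0, len(data), 32):
        ks = hashlib.sha256(key + (i // 32).to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], ks))
    return bytes(out)

# Alice and Bob each generate keys on their own devices.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Only public keys cross the wire; the relay stores just ciphertext.
ciphertext = xor_stream(shared_key(a_priv, b_pub), b"meet at noon")

# Bob derives the same key from his private exponent and Alice's public key.
plaintext = xor_stream(shared_key(b_priv, a_pub), ciphertext)
print(plaintext)  # b'meet at noon'
```

A subpoena served on the relay in this model yields only the ciphertext bytes, which is why default end-to-end encryption changes what a company can be compelled to produce.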
“It’s the data collection that’s the problem,” Goodman agreed.
The company announced on Aug. 11, days after the Nebraska court order became public, that it would begin testing end-to-end encryption as the default for Facebook Messenger.
“I’ll believe it when I see it,” Galperin said. “Facebook didn’t just discover this was a problem yesterday.”