Content on social media platforms contributed to the suicide of a 14-year-old girl, a British court found Sept. 30. Molly Russell, who died in 2017, interacted with 2,100 posts related to suicide, self-harm and depression in the six months leading up to her death, roughly 12 posts per day, according to data from Meta (META), the parent company of Facebook and Instagram. The teen viewed additional content on Pinterest (PINS).
Suicide is the second leading cause of death for children aged 10 to 14 in the U.S. A Brigham Young University study conducted from 2009 to 2019 found that girls who used social media at least two to three hours per day, and then increased that usage over time, faced a far greater risk of suicide by the end of the study than girls who limited their social media use, and than their male counterparts. While many studies link social media to depression among teens, Meta admitted during the judicial proceedings that it had never studied how depressive and suicidal content on Instagram affected its young users.
Executives from Meta and Pinterest apologized to the teen’s family during proceedings. “We are sorry that Molly saw content that violated our policies, and we don’t want that on the platform,” said Elizabeth Lagone, head of health and wellbeing policy at Meta. Some posts were not safe for children to see, said Judson Hoffman, Pinterest’s head of community operations. “I deeply regret that she was able to access some of the content shown,” he said. Pinterest had emailed the teen “Depression Pins you might like” two weeks after her death.
Since the proceedings were to determine cause of death rather than civil or criminal responsibility, Meta and Pinterest will not be fined or face other penalties. But the case could serve as a precedent for families who sue social media companies in the future over their role in a child's death.
No age verification policies for social media
Facebook, Instagram and Pinterest all ask users to be at least 13 years old to sign up for accounts, in order to comply with the Children’s Online Privacy Protection Act, a 1998 law requiring parental consent for data collection on children younger than 13. But no social media company requires proof of age to sign up, so children can often bypass the restriction and create accounts, exposing them to content too mature for their age. Most social platforms, aside from TikTok, don’t release data on the age breakdown of their users, but surveys estimate that 40 percent of U.S. children aged 8 to 12 use social media; that figure climbs to 95 percent for 13- to 17-year-olds, with half of these users on the internet “almost constantly,” according to Pew Research Center.
Meta said some of the content the teen engaged with violated its policies and that it must balance safety with free expression. In 2019, after the teen’s father went public with the family’s story, Meta added protections such as banning graphic depictions of self-harm. In 2021, the company defended its practices, saying its research showed Instagram actually helps teenage girls with issues like anxiety, loneliness and sadness.