Social media users will likely be exposed to more disinformation, sexually explicit material and hate speech after a federal appeals court on Sept. 16 upheld a Texas law prohibiting social media companies from censoring content hosted on their platforms.
While the law applies to users in Texas, the inherently viral nature of the internet means users across the U.S. and around the world will likely be exposed to “lawful but awful” content, said Carl Szabo, vice president and general counsel of NetChoice, one of the plaintiffs.
In May, the appellate court overseeing Florida came to the opposite decision regarding a similar social media law in that state, determining that platforms have a right to moderate their content. Given that these two federal courts came to opposing conclusions, the matter is almost certainly destined for the Supreme Court, according to Matthew Schruers, president of the Computer and Communications Industry Association (CCIA). The outcome would define the U.S. government’s ability to decide what is and isn’t allowed on social media.
The Texas law, House Bill 20, bars social media companies with more than 50 million active users from removing or flagging content based on its “viewpoint”—a term broad enough to encompass nearly any content, including misinformation about election fraud, for example. Texas Republicans pushed for the law’s adoption in response to the presumed censorship of conservative viewpoints on sites like Twitter, but its implications are much broader. It affects sites that have little to do with social media, including Etsy and Yelp, Szabo said, and it requires platforms to divulge how they moderate their content, including the keywords they look for.
“The danger there is that bad actors would love to have a road map to know how to avoid detection,” said Szabo. NetChoice and CCIA, the organizations that sued Texas over HB 20, both advocate for free enterprise and are funded by big tech and social media companies including Meta, TikTok, Amazon and Google.
Opening the door to videos showing animal abuse and terrorist propaganda
In early October, unless the plaintiffs appeal to a higher court sooner, the federal court will send the case to a lower court to determine how the decision should be implemented. Until then, it is unclear how social media companies will be expected to revise their policies. The decision could prohibit platforms from hiding content such as animal abuse videos and terrorist propaganda, even when that content violates a platform’s own guidelines. It might also mean more election-related disinformation in the month leading up to the midterms, which platforms will be wary of policing, said Samir Jain, director of policy at the Center for Democracy and Technology, a Washington, D.C.-based nonprofit that advocates for individual social media users’ rights.
The law is written in a way that benefits those in political power, not individual users, said Szabo. It is like forbidding Barnes & Noble from choosing what it puts on its shelves, he said. The Texas governor’s office did not return requests for comment.
Other states, including California and Ohio, are grappling with the same question of how First Amendment protections extend to media platforms. While the Texas decision doesn’t directly impact other states, “There is no question that other state legislatures have been watching litigation in Florida and Texas to formulate their own views,” said Schruers, the CCIA president. A Supreme Court decision may moot any need for other states to act.