The First Amendment could get in the way of regulating social media companies
As tech interest groups fight regulations in court battles across the country, they’re casting their content moderation decisions and even their ranking algorithms – the software that decides which posts each user sees upon opening the app or website – as a full-fledged form of expression. And they’re calling on the First Amendment, which protects American citizens and businesses from government restrictions on speech, to keep states out.
From Texas to Florida to Ohio to the U.S. Supreme Court, judges and justices across the country are wrestling with thorny new questions about what constitutes free speech and what rights are really at stake when lawmakers attempt to regulate social media feeds. In the balance are not just right-wing efforts to impose political neutrality on Silicon Valley giants, but also efforts from the left and center to demand greater transparency and hold the companies accountable for amplifying rhetoric that may be harmful or illegal.
“The First Amendment is to some extent up for grabs,” says Geneviève Lakier, a law professor at the University of Chicago and visiting senior fellow at the Knight First Amendment Institute. “These ancient principles are being pushed, pulled and reinvented in light of changing technological conditions and shifting political alignments.”
The legal battles have their roots in controversies over the ever-growing role of social media in shaping political discourse. As platforms such as Facebook, Twitter, YouTube and even TikTok have become influential forums for politicians, activists and the media, they have been criticized – often, but not exclusively, by the left – for stoking misinformation, bigotry and division.
In response, these platforms have developed increasingly sophisticated systems – combining automation and human oversight – to detect and remove posts that violate their rules. In some cases, they’ve also adjusted their feed ranking and recommendation algorithms to try to avoid highlighting content that might be problematic. But these moves have their own detractors, particularly on the right.
On May 11, a federal appeals court stunned the legal establishment by allowing Texas to move forward with a law prohibiting major internet sites from “censoring” users’ posts based on their point of view – whether by removing them or downgrading them algorithmically. Although the 5th Circuit did not explain its ruling, the decision appeared to support the argument of Texas Republicans that the right of individual users to be heard on social media platforms could override the right of tech companies to decide which posts to display.
The tech companies quickly appealed to the Supreme Court, asking it to suspend the law while the case proceeds in a lower court. Justice Samuel Alito is expected to rule on this request in the coming days. While this ruling won’t resolve the case, it will be closely watched as a signal of how the broader debate is likely to unfold in cases across the country.
Meanwhile, on May 23, another federal appeals court took a very different position on Florida’s social media law, which is similar in spirit to Texas’ but differs in detail. In that case, the 11th Circuit upheld a lower court’s decision to suspend large swaths of the Florida law, on the grounds that tech companies’ algorithms and content moderation decisions amount to “expressive activity” protected by the Constitution.
The decision was broadly consistent with decades of legal precedent that the best way to protect free speech is for governments to stay out of it. But it was remarkable to assert that the “curation” of content on social media sites is itself a form of protected speech.
It was also nuanced. While appeals court judges found that many provisions of Florida law were likely to be unconstitutional, they reinstated parts of the law that require tech companies to disclose certain types of information relevant to their content moderation processes.
For example, they found that Florida may require social media platforms to spell out their content moderation standards, show users the number of views on their posts, and give suspended users access to their data. These provisions will now take effect while a lower court continues to hear the case. But the court rejected a provision that would have required platforms to explain to users their reasoning for removing a given post, ruling it would be too burdensome.
Importantly, it also blocked a provision requiring platforms to offer their users the option to opt out of algorithmic ranking and see every post in their feed in chronological order. This decision, again, was based on the First Amendment, suggesting that platforms have a constitutional right to their ranking algorithms and even to “shadow bans” – a colloquial term for hiding posts from certain users or making them harder to find, often without the user knowing.
Mary Anne Franks, a law professor at the University of Miami and author of “The Cult of the Constitution,” is a critic of what is sometimes called “First Amendment absolutism” – the idea that government can almost never interfere with even the most odious speech. She argues that there should be room for reforms that allow tech companies to be held accountable when they host or promote certain types of harmful content.
Still, Franks thinks the 11th Circuit was right to find much of Florida’s law unconstitutional. Requiring social media platforms to offer a chronological feed, she said, would be like requiring bookstores to arrange every book in their storefront in chronological order – a violation of their right to decide which works to display prominently.
That opinion could have implications not only for right-wing attempts to restrict content moderation, but also for bipartisan and progressive proposals to promote more and better content moderation. These include a slew of bills that surfaced or gained momentum after Facebook whistleblower Frances Haugen drew attention to how that company’s algorithms were prioritizing engagement and profits over social responsibility.
Some of these bills would remove the liability protections that internet platforms enjoy under Section 230 of the Communications Decency Act if their algorithms play a role in amplifying certain categories of speech. Others would require social media sites to offer “transparent” alternatives to their default recommendation algorithms. Still others would require them to open their ranking algorithms to scrutiny by researchers or even the Federal Trade Commission.
Based on recent federal court opinions, most, if not all, would likely result in lawsuits from tech groups alleging they violate the First Amendment. It remains to be seen exactly where the courts will draw the line.
“What the 11th Circuit opinion does is start from the presumption that algorithmic ranking, recommending, and amplifying are part of the conduct or speech protected by the First Amendment in which a platform engages,” said Emma Llanso, director of the Free Expression Project at the nonprofit Center for Democracy and Technology, which receives funding from tech companies as well as some tech critics. “And so any regulation of this aspect of what platforms do will potentially face the same First Amendment scrutiny.”
That doesn’t mean it’s impossible to regulate social media algorithms, Llanso said. But it does set a “very high bar” for the government to show a compelling interest in doing so and avoid making these regulations too burdensome.
Following recent court opinions, the kinds of regulations that appear to have the best chance of surviving judicial scrutiny are those that emphasize transparency, Llanso and other experts agreed. For example, a bipartisan bill in Congress that would require large platforms to share data with vetted researchers might have a good chance of surviving the level of scrutiny applied by the 11th Circuit.
But they cautioned that the big underlying legal issues remain open at this time, particularly after the 5th and 11th Circuits took such different positions on Texas and Florida laws.
At the heart of the debate is whether only the speech rights of tech companies are at issue when the government tries to regulate them, or whether some of these companies now hold such power over the speech of individuals that users’ speech rights must come into play as well.
Historically, conservative thinkers have argued that “the best way to protect users’ speech rights is to give platforms a lot of speech rights,” Lakier said, while some on the left worried that individuals’ speech rights were getting short shrift. Now, a new breed of Trump-aligned Republicans has embraced the view that individuals may need speech protections from corporations, not just government. These include Texas Governor Greg Abbott, Florida Governor Ron DeSantis and Supreme Court Justice Clarence Thomas.
“It’s a live question,” Lakier said. Although she thinks the laws in Texas and Florida go too far in restricting platforms, she added, “I will say that as a progressive, I’m quite sympathetic to this twist on the speech rights of users. I think we should think about it a lot more than we have in the past.”
Cat Zakrzewski and Cristiano Lima contributed to this report.