Briefs filed by outside parties in Supreme Court cases usually make sober legal appeals in support of or opposition to the parties before the justices. But one amicus brief filed on Dec. 7 by the moderators of two Reddit communities was, uh, more colorful in its arguments.
The brief was submitted by the moderators of the r/law and r/SCOTUS subreddits in the case of NetChoice v. Paxton, which challenges the constitutionality of laws passed by Republicans in Florida and Texas that restrict the ability of social media platforms to moderate, remove or edit user content. To make their point against the laws, the moderators provided the justices with screenshots of content they’ve removed from their communities that “includes inappropriate remarks (and even threats) directed at members of this Court.”
“Corrupt justices should face justice on the national mall,” one user wrote in response to an undated article about the Supreme Court’s ethics woes.
“Yep set up some nice shiny guillotines!” another user responded.
“We’ve got the guillotine, you’d better run,” a user wrote about Chief Justice John Roberts’ statement that the justices “cannot and should not live in fear.”
“Promoting violence is the only rational response, which is why authorities don’t want you to do it,” a user said following protests against the conservative justices who voted to end the national right to an abortion.
“Wow didn’t know Sotomayor was a Nazi,” another user wrote about Justice Sonia Sotomayor’s praise for conservative Justice Clarence Thomas as “a man who cares deeply about the court as an institution.”
“Is that a woman?” a user wrote in response to the official portrait of Justice Ketanji Brown Jackson.
Following an article about Justice Neil Gorsuch not wearing a mask during arguments, a user wrote, “Yeah well Neil can Gorsuch a dick.”
The comments represented some of the “mildest content” that the moderators said they had removed as part of their effort to maintain a serious and non-divisive space to discuss legal issues and the Supreme Court. They noted that they also removed “the physical addresses of the Justices, their clerks, and court staff, as well as celebrations of the death of Justice [Ruth Bader] Ginsburg.”
Both subreddits maintain policies, enforced by the moderators in a volunteer capacity, that place limits on what speech will be tolerated in their respective communities. This is a common practice among subreddit communities run by independent moderators.
The laws passed by Republicans in Florida and Texas seek to limit “viewpoint discrimination” by restricting how social media platforms with at least 50 million active users can block, edit, remove, arrange or demonetize user-generated content and requiring sites to explain decisions to remove or otherwise alter user-generated content. These laws were passed to counter supposed censorship of conservative voices on social media platforms like Facebook, X (née Twitter) and YouTube.
Following the 2020 election and social media companies’ decision to remove former President Donald Trump from their platforms after Jan. 6, 2021, conservatives argued that the companies had gone too far in moderating content and blocking users who expressed conservative positions. Their complaints included the addition of warning labels to certain tweets containing lies about the election and alternative theories about vaccines, as well as the banning of users who violated content policies, like conspiracist Alex Jones, antisemite Nick Fuentes and Trump. After the laws were passed, NetChoice, a lobbying group for the tech industry, sued to challenge them as a violation of the platforms’ First Amendment speech rights.
Texas’ law does state that platforms can remove content containing threats of violence or that “directly incites” criminal activity. This exception may allow subreddit moderators to remove death threats against the justices, but it does not permit them to build an online community that produces the “substantive” and “constructive conversation” they state as their goal.
“R/SCOTUS and r/law cannot function if there are hecklers (on the internet, we call them trolls) drowning out substantive conversations with an unending stream of vulgar, racist, sexist, or just plain stupid, argle-bargle,” the brief states. “[We] choose instead to ban the trolls. This does not silence the trolls. The internet provides them with an unlimited number of alternative bridges to haunt and howl under.”
The moderators argue that if Texas’ law is left to stand, its provisions allowing users whose content has been removed, edited or blocked to sue individual moderators would make it impossible to police trolls and unpleasant content, like the posts about the justices.
“Even if the Honorable Attorney General declines to exercise his prosecutorial discretion, HB20 authorizes individual users like HateSpeechLuvr[, who post comments with racial slurs,] to sue SMPs directly (if they’re located in Texas),” the brief states, using the acronym for social media platform. “Amici have no way of knowing if HateSpeechLuvr or any other user who visits their subreddits is from Texas, and so every moderation decision would necessarily be impacted by the potential threat of litigation.”
The case came before the court after the Fifth Circuit Court of Appeals upheld key elements of the Texas law while, in a separate decision, the Eleventh Circuit Court of Appeals rejected similar provisions of Florida’s. That circuit split led the parties to appeal to the Supreme Court, which has yet to set a date for arguments.