Censorship case set for argument

Missouri v. Biden, or Murthy v. Missouri, the great social-media censorship case, now comes to oral argument before a divided Court.

The great censorship case, now known as Murthy v. Missouri, has been scheduled for oral argument. The United States Supreme Court, which granted review more than three months ago, will hear argument next month. At issue is the sweeping injunction that a Louisiana district court entered on July 4, 2023. The Court will decide – likely at the end of June – whether lower courts may enjoin the government from turning social media companies into State actors. But so far, no court has examined the role of the social media companies themselves. That question is especially important given the long history of one social media company that has consistently refused to engage in censorship.

History of the censorship case

The Supreme Court has scheduled oral argument for March 18, 2024, in a morning session.

This case began originally as Missouri v. Biden, case no. 3:22-cv-01213. The Attorneys General of Missouri and Louisiana filed it in Monroe, La., on May 5, 2022. (See CourtListener’s docket listings at the District Court and Appeals Court levels. See also two Supreme Court docket listings, for the application for stay, and for a petition for review.)

Then-Attorneys General Eric Schmitt of Missouri (now a Senator) and Jeff Landry of Louisiana (now Governor) filed the case originally. Since then, many private plaintiffs have joined the case, through amendment of the complaint and through consolidation. The most recent Amended Complaint is the Third, filed May 5, 2023 – the anniversary of the original Complaint. Joining the two Attorneys General were:

  • Three physicians who had lost their posting privileges on Twitter (now X) over their statements contradicting the COVID-19 Narrative, and
  • The Editor-in-Chief of The Gateway Pundit, and one of his colleagues, over their coverage of those moderation actions.

Since then, Robert F. Kennedy, Jr., and Children’s Health Defense won consolidation of their own case with the Missouri case.

The underpinning of the Missouri case

Paragraph 2 of the Complaint reads:


A private entity violates the First Amendment “if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint.” Biden v. Knight First Amendment Institute at Columbia Univ., 141 S. Ct. 1220, 1226 (2021) (Thomas, J., concurring). “The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly.” Id.

The Knight case arose out of then-President Trump’s blocking of critics from his Twitter account. The Second Circuit Court of Appeals ruled he may not do that, because his account was “a public forum.” Trump petitioned the Supreme Court for review.

But several things happened after that, in rapid-fire order. First, Trump “lost” the Election of 2020. Second came the January 6 Event. Third, and in consequence, Twitter banned Trump completely. When the case came to the Supreme Court, the Biden administration was already in place. So on April 5, 2021, the Court granted review, vacated the Second Circuit’s judgment, and sent the case back “with instructions to dismiss the case as moot.”

Separately, Justice Thomas wrote a twelve-page concurring opinion giving his best rationale for a comprehensive treatment of social-media platforms as common carriers or, alternatively, as places of public accommodation.

Contrary to popular belief, such notions have been current for centuries as part of English and American federal common law. Thus far, no complaint, response, or other brief has argued that common carriers or places of public accommodation should not exist as legal categories. Instead, the briefs allege that social-media platforms are not common carriers or places of public accommodation. Sadly, Justice Thomas had to agree – because Congress has passed no law designating social-media platforms as such.

Nevertheless, Thomas did identify a First Amendment issue:


[A]lthough a “private entity is not ordinarily constrained by the First Amendment,” Halleck, 587 U.S., at ___, ___ (slip op., at 6, 9), it is if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint. Ibid. Consider government threats. “People do not lightly disregard public officers’ thinly veiled threats to institute criminal proceedings against them if they do not come around.” Bantam Books, Inc. v. Sullivan, 372 U.S. 58, 68 (1963). The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly. See ibid.; Blum v. Yaretsky, 457 U.S. 991, 1004–1005 (1982). Under this doctrine, plaintiffs might have colorable claims against a digital platform if it took adverse action against them in response to government threats.

That is the case the Missouri plaintiffs make.

Questions before the Court

The defendants (with Biden’s Surgeon General leading) present three questions to the Court in response. Recast as assertions, the government is saying:

  1. Plaintiffs have no standing before the Court. (Court after court has found that they do have standing, but the government has never conceded the point.)
  2. Even if plaintiffs do have standing, the government did not – and perhaps could not – threaten, coerce, or induce the social platforms to do anything they wouldn’t have done absent the “challenged conduct.”
  3. The Big Injunction was too broad and would stop the government from disseminating its own viewpoint.

The Supreme Court has received no further briefs after the December 26 Election Officials’ brief. CNAV analyzed most of those briefs here. But the Knight First Amendment Institute filed a brief early in the briefing period. Justice Thomas’ concurrence in their case against President Trump makes their brief more important than any other.

Justice Thomas cited two specific cases to support his thesis: Bantam Books v. Sullivan and Blum v. Yaretsky. Both cases involve possible State action – Bantam regarding censorship in book publishing, and Blum regarding nursing-home transfers for purposes of government program economy. The Knight Institute wants Bantam to govern, in deciding whether the government’s “jawboning” of Twitter, Meta, Alphabet, et al. exceeded the government’s Constitutional authority.

Then they added this:


Finally, the Court should resolve this case narrowly, without expecting jawboning doctrine to address all of the challenges created by the centralization of private power over public discourse. The major social media platforms’ power to dictate what can be said and what will be heard online poses a serious threat to public discourse and, by extension, to our democracy. Jawboning doctrine can reduce the risk that the government will take advantage of this concentrated power by pressuring the platforms to suppress disfavored speech. But it would be a mistake for the Court to contort this doctrine to solve what is, in reality, a problem of excess concentration and lack of competition. As explained below, this problem should be addressed through legislative and judicial tools better suited to the task.

In other words, Knight suggests the real issue with social-media platforms is that they have cornered the market. That might prove extremely difficult to solve.

Is censorship permissible or not?

Of all the arguments the government makes, the second is arguably the strongest: that it has not coerced, but merely persuaded. But two kinds of persuasion are in view here: persuading the public to think as the government does, and persuading the platforms that certain speech is a harmfully disruptive influence.

“Miss Wheeler, you do not know your place here… I consider you a disruptive influence, and I can promise you this: you will be out of this hospital within twenty-four hours.” – From Coma, by Robin Cook, M.D. New York: Signet Books, 1977.

No one disputes that the government may at any time seek to persuade its citizens or subjects to believe as it does. One may dispute the specific things the government says. But in a Constitutional republic with democratic elections, the usual remedies against government authorities who lie include:

  • Judicial or impeachment proceedings against said officials on grounds of fraud, or
  • Voting them or their appointing or confirming superiors out in the next election.

They do not include prior restraint on government speech.

But the issue is whether Big Tech has banned users as “disruptive influences” because the government told it to. Interestingly, not one defendant in this case has ever said, “I/we never did such-a-thing.” Rather, defendants allege, “I/we had a perfect right to do what I/we did.” And why? Typically because they further allege that the targets of their censorship were and are:

  • Lying to the public about the harm of government measures, or the harmlessness of the target of those measures, or:
  • Telling a truth that would bring harm from irrational public reaction to that truth – or merely embarrass the authorities.

When is panic a good excuse for censorship?

“This mustn’t get out because it would cause a panic” has long been a staple of fiction involving impending disaster. See, for instance, Stephen King’s The Stand (New York: Doubleday and Co., 1978). Ironically, that novel begins with the accidental release of a biological weapon at least as devastating as COVID-19 was supposed to be (but wasn’t), and indeed worse. The government tries to deny the threat and suppress warnings about it.

Suddenly we’re back to Chief Justice Oliver Wendell Holmes saying that the First Amendment does not allow someone to yell, “FIRE!” in a crowded theater and cause a panic. But most people understand Holmes to be objecting to the raising of a false fire alarm. The government’s defense of censorship turns Holmes on his head. They seem to be saying, “If you smell smoke, keep your trap shut, because it’s none of your business!”


Suppressing true statements

Benjamin Wetmore at The Gateway Pundit concentrates on the second part. No one alleged that the government may not post. But the government was telling social media companies to delete the posts of others – and some of those posts shared information that turns out to be true. Topics in which the government has tried to suppress such embarrassing truths include:

  • COVID-19 was never as transmissible, or as deadly, as the government pretended. (We never saw meat wagons with rooftop bullhorns blaring, “Bring out your dead!”)
  • Coronavirus, the causative agent, originated at the Wuhan Institute of Virology, which released it, accidentally – or on purpose. (Perhaps a purposeful release, intended to infect the Western world, got out of hand and infected Chinese civilians as well.)
  • The mRNA and indeed all other vaccines against coronavirus were far more deadly than the virus itself.
  • Joseph R. Biden and his campaign stole the Election of 2020. The only down-ticket elections they stole were the elections of Senators Rafael Warnock and Jon Ossoff in the Georgia Runoff on January 5, 2021. CNAV continues to maintain that Rep. Nancy Pelosi (D-Calif.), then Speaker of the House, must have wanted to strangle whoever designed a cheating system that did no favors for some of her staunchest allies in the House, like Florida’s Donna Shalala, who failed of reelection.

Problems on both sides

All this is in keeping with what the country has learned about the three- and four-letter agencies involved in censorship. These agencies even came up with definitions of the three influences they were fighting:

  • Misinformation – the passing on of information one believes to be true, but which is false.
  • Disinformation – the passing on of information one knows to be false.
  • Malinformation – the passing on of a truth that society is not ready to hear.

James Madison, who wrote the First Amendment, would not have accepted any of those definitions as valid reasons to act as the government has done – and is still doing. Who decides what is true or false? Who decides what society is or is not ready to hear? Furthermore, the essence of a Constitutional republic is individual responsibility, in keeping with the authority that a vote represents.

But the plaintiffs in Missouri v. Biden have a problem, and the Knight brief brings that problem into stark relief. The Knight brief’s authors recognize the hazards of a small group of large – and cooperative – companies dictating what is acceptable. But they see no solution from a court that would bring them comfort. They would prefer to limit the reach of various platforms, so that no single platform can “corner the market.” No doubt they remember the “solution” to the great Telephone Anti-trust Case: to break up that company. And even without that breakup, the “monopoly” problem solved itself with the development of new methods of telecommunication.

Alternative: find a platform that doesn’t practice censorship no matter who so demands

How a court – or the Congress – is supposed to “break up” social media is impossible to imagine. Nevertheless, the distinction between persuasion and coercion remains the sticking point. Furthermore, several social-media platforms have sprung up to receive those whom the Big Tech family has expelled. The greatest distinction any of these new platforms can make is to:

  • Receive direct, peremptory government orders to take down content contrary to “accepted” narrative, and in reply, to:
  • Refuse to obey such orders, and let the world know of their refusal.

The Gab Empire has been refusing to obey for six years. They’ve literally built their own hosting infrastructure, after host after host expelled them. Likewise, Rumble has received orders to take down certain user videos – and has refused. Happily, their content standards are very simple. This might disappoint intellectual-property anarchist Lawrence Lessig, but they will not be accessories to copyright violation. Nor will they be accessories to the exploitation of children, or, for lack of a better term, to virtual prostitution. Beyond these, they forbid very little, and what they forbid would fit on a single modern screen. In contrast, YouTube’s Community Guidelines would fill an entire file folder drawer.

Likely outcome

The Supreme Court will not decide the case until the end of Term. How the Originalists (Alito, Gorsuch, and Thomas JJ), the Moderates (Roberts CJ and Barrett and Kavanaugh JJ), and the Progressives (Jackson, Kagan, and Sotomayor JJ) see fit to question the two legal teams, will provide the best clues to how the Court will eventually rule. (Do not expect another Great Leak! The Court is too smart to let that happen again.) Thomas is probably the key, for in Knight, he practically begged someone to file a test case like this one. But he’ll have to convince Kavanaugh and especially Barrett that “panic in the streets” is a risk worth taking for human liberty.

If Thomas cannot convince those two Justices – and he must convince both of them, because Chief Justice Roberts will always break a tie in favor of the Progressives (as Thomas knows all too well!) – then Gab and Rumble will start the greatest advertising campaign any company has ever run. “Don’t wait for the Court to save you,” they’ll say. “Come and join us!” And they’ll be right.


Terry A. Hurlbut has been a student of politics, philosophy, and science for more than 35 years. He is a graduate of Yale College and has served as a physician-level laboratory administrator in a 250-bed community hospital. He also is a serious student of the Bible, is conversant in its two primary original languages, and has followed the creation-science movement closely since 1993.
