On Friday, January 10, the Supreme Court heard oral arguments in TikTok v. Garland. The case involves the constitutionality of the Protecting Americans from Foreign Adversary Controlled Applications Act (PAFACA). Congress enacted the PAFACA in April 2024 due to national security concerns over the Chinese government’s control of ByteDance, TikTok’s parent company. Although TikTok is an American company, ByteDance is incorporated in the Cayman Islands and headquartered in China. Because the Chinese government can force Chinese entities like ByteDance to assist in intelligence-gathering operations, Congress feared TikTok’s 170 million users in America were at risk. Among other things, the PAFACA requires ByteDance to sell TikTok by January 19, 2025, or TikTok cannot operate in the United States.

In September 2024, the D.C. Circuit held that the PAFACA satisfied all levels of First Amendment scrutiny (without deciding whether the law was subject to intermediate or strict scrutiny). Given the looming January 19 deadline, TikTok asked the Supreme Court in December for an injunction pending review. The Supreme Court treated the emergency application as a petition for certiorari and set argument for January 10. Adding to the intrigue, President-elect Trump filed an amicus brief on January 3 asking the Court to stay the PAFACA’s January 19 divestment deadline so that, once he’s sworn in on January 20, he would have an opportunity to broker a deal that addresses the government’s national security interests and protects TikTok’s hundreds of millions of users in the United States.

At oral argument, the Justices seemed likely to uphold the PAFACA, but how they get there could be the most interesting part of the decision. 

What happens when foreign entities and adversaries entangle themselves with domestic entities? 

The Justices spent a great deal of time at the argument discussing the threshold question of whether the PAFACA triggers First Amendment scrutiny at all. The government argued below that the PAFACA avoided the First Amendment under Arcara v. Cloud Books, Inc., because it’s a generally applicable law unrelated to the expressive activity. In Arcara, the government closed down an adult bookstore because the store was being used for prostitution. The D.C. Circuit (mostly) rejected the Arcara comparison (Chief Judge Srinivasan insinuated in his concurrence that the government’s data security rationale might satisfy Arcara). Under Arcara, the First Amendment is implicated in “cases involving governmental regulation of conduct that has an expressive element,” or when a statute is directed at an activity without an expressive component but imposes “a disproportionate burden upon those engaged in protected First Amendment activities.” The D.C. Circuit determined that the PAFACA imposes a disproportionate burden on TikTok’s speech and singles out that expressive activity by indirectly subjecting TikTok to divestiture. 

At argument, the Justices primarily addressed the Arcara question through the lens of corporate ownership. That is, does Congress’s regulation of a foreign entity’s corporate structure implicate the First Amendment rights of its American subsidiary and users? The question is complicated because, as a general matter, foreign adversaries and foreign corporations don’t possess First Amendment rights. Yet American corporations like TikTok do. 

TikTok’s counsel, Noel Francisco, argued that foreign control makes no difference for purposes of the First Amendment: the burden still falls disproportionately on TikTok. Solicitor General Prelogar argued that the target of the regulation isn’t anything that occurs in the United States, but the product (i.e., the algorithm and source code) that comes from a foreign company controlled by a foreign adversary. Justice Gorsuch pointed out that the Chinese government has said it would rather shut down TikTok in America than permit ByteDance to divest. The government pointed to this as proof that the app is ultimately controlled by the Chinese Communist Party. TikTok’s counsel responded that those protestations concern control of intellectual property and the feasibility of divestiture.

Justice Kagan suggested that the burden on TikTok might only be incidental because TikTok can still use any algorithm it chooses as long as ByteDance complies with the law. In other words, any effect on TikTok occurs only as a result of a third party’s choices. Relatedly, Justice Jackson asked whether this was a burden on association, not speech. She noted that Congress has long prevented association with terrorists and foreign adversaries. In turn, counsel for the TikTok users argued that preventing association with ByteDance would implicate the rights of Americans to work with the likes of the BBC or Al Jazeera.

The Court’s openness to the idea that the law doesn’t impermissibly burden speech raises an interesting decision point. The Court could determine that the PAFACA doesn’t implicate the First Amendment because the law regulates ByteDance’s corporate structure, not any protected speech. That route would allow the Court to avoid deciding what level of scrutiny to apply and whether the government’s various rationales are permissible. But that holding would hand the government a much more significant victory on the threshold question of whether the First Amendment applies at all.

Does concern about content manipulation pass constitutional muster?

Assuming the law does trigger First Amendment scrutiny, the government offered two national security justifications: (1) the Chinese government’s efforts to collect Americans’ data, and (2) the risk of the Chinese government covertly manipulating content on TikTok. The D.C. Circuit held that each constitutes an independently compelling national security interest. Although the data security rationale is the stronger of the two, the Court’s questions focused particularly on content manipulation.

The Justices were clearly concerned about the implications of the government’s content manipulation rationale. First, they pressed the Solicitor General on how a concern about content manipulation isn’t an attempt to regulate the content on TikTok or the views of a particular speaker. The government argues that China could covertly manipulate content to sow discord among Americans or spread misinformation. The Solicitor General told the Court that the government’s purpose isn’t to suppress speech, but to close a backdoor vulnerability in the app. The Chinese government remains free to join TikTok and spread its message; it just can’t covertly manipulate what Americans see.

TikTok and the users have argued that this is plainly a content-based rationale that subjects the law to strict scrutiny. Justice Gorsuch pressed the government on whether its rationale would apply to a newspaper owned by a foreign government. Justice Kagan wondered whether, during the Cold War, the government could have regulated the Communist Party in the United States on the ground that it was being manipulated by the Soviet Union. Justice Alito asked what would happen if the Chinese government funded a movie in the United States. The Solicitor General distinguished those scenarios by arguing that the PAFACA isn’t a direct regulation and has, at most, an indirect effect on protected speech.

As to tailoring, several Justices asked why the government couldn’t simply require TikTok to disclose to users that content on the app may be manipulated by the Chinese government. The Solicitor General argued that disclosure wouldn’t suffice, analogizing the situation to a store that tells customers that one of the hundreds of items it sells will cause cancer. The Justices seemed quite skeptical of the content manipulation rationale.

There was very little discussion of the data security rationale, although the Petitioners argued that the law still failed the narrow tailoring analysis because Congress could address the concern by simply passing a law prohibiting TikTok from sharing U.S. user information with China. On this issue, the Justices seemed likely to defer to Congress’s judgment that divestiture is necessary. 

An interesting question arises if the Court is inclined to accept the data security rationale but not the content manipulation rationale: Does one impermissible rationale taint the entire law, or does the law simply need one permissible purpose to survive? The Justices didn’t signal either way. To the extent the Court wants to (again) avoid a tricky question, it could simply determine that the First Amendment doesn’t apply. 


Disclosure: The author filed an amicus curiae brief on behalf of the State of Montana and 21 other states supporting Respondent in TikTok v. Garland. 

Note from the Editor: The Federalist Society takes no positions on particular legal and public policy matters. Any expressions of opinion are those of the author. We welcome responses to the views presented here. To join the debate, please email us at [email protected].