The Federalist Society is pleased to announce its Student Blog Initiative, a project of the Practice Groups and the Student Division. An inaugural group of eight students will contribute to the Federalist Society's blog throughout this academic year. Student contributors accepted into the program are held to the same rigorous standards as the regular and guest contributors to the blog, which exists as a forum for experts to provide thoughtful, balanced commentary in an engaging, accessible manner. 
Each student in this select group drafts posts on legal, constitutional, and policy issues; receives feedback and suggested revisions from volunteer experts; and has the opportunity to share his or her work on the Federalist Society's widely viewed platforms.
The Federalist Society takes no positions on particular legal and public policy matters. Any expressions of opinion are those of the authors.


On November 12, Austria’s Supreme Court ordered Facebook to remove a post insulting a former politician, along with “equivalent” posts around the globe, because the post violated Austria’s defamation laws. As policymakers and stakeholders grapple with the future of platform liability and Section 230 in the United States, this ruling could have far-reaching consequences.

Case Background

The Glawischnig-Piesczek v. Facebook Ireland case stemmed from an April 2016 post sharing an article entitled “Greens: Minimum income for refugees should stay.” The post included the article’s title, a brief summary, and a photograph of Eva Glawischnig-Piesczek, then chair of Austria’s Green Party, along with commentary calling her a “lousy traitor of the people,” a “corrupt oaf,” and a member of a “fascist party.”

Claiming the post was defamatory, Glawischnig-Piesczek sued in an Austrian court, where she obtained an order for Facebook Ireland to delete the post and monitor for “identically worded” and “equivalent content.” Following the court order, Facebook Ireland disabled access to the initial post in Austria, but it remained potentially visible to Facebook users elsewhere.

On appeal, Austria’s Supreme Court referred the case to the European Court of Justice (ECJ), which found in October 2019 that the EU’s E-Commerce Directive—the EU’s intermediary liability law defining when social media platforms can be held liable for user-generated content—did not prohibit Austria from (1) requiring Facebook to monitor for “equivalent” content; and (2) ordering Facebook to remove the content worldwide. Last week, Austria’s Supreme Court did just that.

The ECJ’s reasoning that the E-Commerce Directive does not bar EU member states from imposing so-called monitoring requirements on platforms has been heavily criticized. The Directive generally prohibits member states from requiring platforms to proactively search for and take down content beyond a specific post. The ECJ found, however, that the Directive does not prohibit monitoring requirements tied to “a specific case,” so long as the platform can identify the equivalent content using “automated search tools and technologies” without conducting an “independent assessment” of that content.

Consequences: A Race to the Bottom on Global Speech Rules?

In Slate, Jennifer Daskal, the faculty director of the Tech, Law & Security Program at the American University Washington College of Law, noted that the ruling and its reasoning could “censor the internet worldwide.” She stated that platforms impose content-based restrictions consistent with local laws, such as Germany’s hate speech laws, Thailand’s ban on criticizing the monarch, and Singapore’s ban on “fake news,” in addition to the platform’s community standards and terms of service. According to Daskal,

Under the Austrian court precedent, courts in any such jurisdiction would be more or less free to apply their local laws to compel not just local, but global takedowns of posts or comments that violate the vagaries (and often highly speech-restrictive) of local law. . . . This creates a classic risk of a race to the bottom, with the most censor-prone nation setting global speech rules.

Others have expressed concerns about the technologies used to censor “equivalent content.” Daphne Keller of Stanford’s Center for Internet and Society has noted that defamation and speech about public figures are “widely considered too context-dependent for even judges to assess easily, and thus particularly ill-suited for enforcement by automated tools.” This is particularly relevant because in cases of unsettled international law, domestic courts often look to how other jurisdictions have addressed a particular issue and may treat those conclusions as sources of law. Essentially, decisions made by courts in other countries can have a precedential effect. Following the ECJ opinion, Keller said “[i]t’s open season for courts to mandate made-up technology without knowing what that technology will do. And those mandates can apply to the whole world.”

There is some evidence that Keller’s prediction is coming true. Weeks after the ECJ handed down its Glawischnig-Piesczek decision, the Delhi High Court in India issued global takedown orders—also in a defamation case—citing Glawischnig-Piesczek in its decision.

Moreover, since the ECJ’s opinion last year, China has adopted the Hong Kong national security law, which it claims applies to anyone, anywhere in the world. Chinese courts could cite the Glawischnig-Piesczek precedent and order a platform to take down speech deemed to violate the law, such as foreigners’ calls for Hong Kong independence or for sanctions on China—conduct that Article 38 of the law suggests can be prosecuted upon entry into Hong Kong or mainland China.

In addition to speech concerns, commentators have noted that such worldwide orders also raise sovereignty issues. Other countries may refuse to recognize decisions such as Glawischnig-Piesczek, but so long as a platform complies with such judgments, there is little other states can do—a problem that is especially significant when orders require content to be blocked preemptively.

As Facebook said following the ECJ’s ruling, the precedent “undermines the longstanding principle that one country does not have the right to impose its laws on speech on another country.” The company added that the ruling also raises concerns about “the role that internet companies should play in monitoring, interpreting and removing speech that might be illegal in any particular country.”

As Facebook’s comments suggest, the ruling puts social media platforms in a tight spot—comply with the court order and face backlash for censoring otherwise legal speech, or refuse to comply and risk being forced to leave a market. These issues will continue to be front and center in the United States and around the world, raising significant legal issues, questions about core values, and serious practical challenges for companies that host content online.