Section 230 of the Communications Decency Act stands at the center of technology debates. Its provisions create the liability regime for internet content, and its protections play a key role in the business models of social media and other dominant technology firms.
Yet, some courts have extended this protection beyond any reasonable understanding of the text or plain common sense. Justice Thomas’s recent statement respecting a denial of certiorari critiques several aspects of current Section 230 case law. The Trump administration shared these concerns.
As a salient example of overreach, courts have expanded Section 230(c)(1) from a small, targeted legal protection into a virtual get-out-of-jail-free card for Big Tech. The provision states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The statute’s text protects internet platforms, such as Facebook or Google, only from causes of action that have as an element “publishing” or “speaking” content from third parties, such as slander or criminal threat.
Some support this expansion, claiming that a broad reading of Section 230(c)(1) protects small online businesses, thereby encouraging innovation and limiting litigation costs. Similarly, some have argued that claims that do not contain publishing or speaking as elements, such as product liability or negligent design, nonetheless “collapse into speech claims.”
But, as Justice Thomas points out, lower courts have “departed from the most natural reading of the text by giving Internet companies immunity for their own content” decisions, expanding Section 230 beyond claims like defamation or criminal threat. Lower courts have invoked Section 230(c)(1) to reject claims based on the platforms’ own content decisions to discriminate against particular ethnic groups, facilitate terrorism, or create dangerous, life-threatening apps in violation of product safety laws.
In Lemmon v. Snap, a textual wind blows from the West, echoing Justice Thomas’s position. The suit involved Snapchat’s “Speed Filter,” a feature that allows drivers to record their real-life speed. The suit alleged that this feature encouraged and facilitated dangerous driving, leading to the plaintiffs’ sons’ tragic deaths.
A 9th Circuit panel of both Republican- and Democratic-nominated judges rejected Snap’s Section 230 defense in this case. The panel reasoned that negligent design involves a “duty to exercise due care in supplying products.” This claim, the court observed, differs markedly from the duties of publishers as defined in the CDA. Specifically, negligent design has no element of speaking or publishing third-party content, as Section 230(c)(1) requires.
Further, Lemmon clarifies the Section 230(c)(1) test. Earlier district court cases state that Section 230(c)(1) applies when “(1) Defendant is a provider or user of an interactive computer service; (2) the information for which Plaintiff seeks to hold Defendant liable is information provided by another information content provider. . . .”
Notice that the second prong invites dismissal of all actions involving, even tangentially, third-party information. Using this test, as mentioned above, courts have held that the platforms’ own content decisions are immune from contract, fraud, and civil rights actions simply because those claims involved third-party content.
Addressing that confusion, Lemmon clarified the second prong, stating that a defendant receives Section 230(c)(1) immunity only “if it is ‘(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.’” The 9th Circuit makes clear that Section 230(c)(1) applies only to causes of action that contain as elements publishing or speaking third-party information, such as defamation and criminal threat. Section 230(c)(1) does not protect the platforms’ own editorial decisions concerning third-party content.