Social Media Content Control

Telecommunications & Electronic Media and Free Speech Practice Groups Teleforum

Listen & Download

In two recently filed lawsuits, conservative organizations have complained that Google has restricted their access to readers. Gab, which provides its own site for conservative and alt-right voices, complained when Google refused to include Gab’s app in its app store. Per Gab, Google’s true reason was to stymie Gab’s competition with Google’s business partner Twitter, in violation of the antitrust laws. Prager University complained that Google and YouTube unlawfully censored its educational videos by restricting their availability to younger viewers. Prager asserts that its videos are fully appropriate for younger viewers and that Google/YouTube’s real objection is to their admittedly conservative point of view. PragerU’s counsel, former California governor Pete Wilson, asserts that this “is speech discrimination plain and simple, censorship based entirely on unspecified ideological objection to the message or on the perceived identity and political viewpoint of the speaker” and thus violates both the First Amendment and California law.

At the same time Google and Facebook assert they are free to run their private businesses as they deem appropriate. They also face intensive pressure from American politicians and foreign governments to moderate their platforms. Facebook and Google were called before a committee of the House of Representatives, which assailed them for doing too little about “fake news” on their sites. In Germany, Facebook was recently compelled to remove a post critical of Islamic migrants.

This Teleforum will consider the obligations, if any, that American law, including the antitrust laws and the First Amendment, places on popular social media outlets. It will consider whether they can, or can be required to, restrict online content that some deem objectionable.




Prof. Thomas C. Arthur, L. Q. C. Lamar Professor of Law, Emory University School of Law


Prof. Eric Goldman, Professor of Law, Santa Clara University School of Law, Co-Director, High Tech Law Institute & Supervisor, Privacy Law Certificate



Teleforum calls are open to all dues-paying members of the Federalist Society. To become a member, sign up here. As a member, you should receive email announcements of upcoming Teleforum calls, which contain the conference call phone number. If you are not receiving those email announcements, please contact us at 202-822-8138.

Event Transcript

Announcer:                        Welcome to The Federalist Society's Practice Group Podcast. The following podcast, hosted by The Federalist Society's Telecommunications and Electronic Media and Free Speech Practice Groups was recorded on Friday, January 19, 2018 during a live tele-forum conference call held exclusively for Federalist Society members.

Laura Flint:                          Welcome to The Federalist Society Tele-Forum conference call. This afternoon we'll be discussing social media content control. My name is Laura Flint. I'm the Deputy Director of Practice Groups here at The Federalist Society. As always, please note that all expressions of opinion are those of the experts on today's call.

                                                Today we are happy to have with us Professor Thomas C. Arthur, LQC Lamar Professor of Law at the Emory University School of Law, and Professor Eric Goldman, Professor of Law at Santa Clara University School of Law and Co-Director of The High Tech Law Institute.

                                                After opening remarks from Professor Arthur we'll go to audience question and answer. Thank you for speaking with us, the floor is yours.

Prof. Arthur:                       Okay. Well, welcome, everybody, and thanks for calling in and listening to us. This program was prompted by several lawsuits reflecting a perceived censorship by social media companies like Google and Facebook of groups on the right. We don't really know whether this is actually widespread; that's an empirical question, and all we have are anecdotes. But there is a perception among speakers on the right that these companies are in some way impeding their ability to reach their audience. I think it's important to put in perspective exactly what we're talking about.

                                                The complaint does not seem to be that they don't have access to the internet. What they're saying is that they are being impeded, or burdened, in reaching the high-audience platforms, things like Facebook, Twitter, and YouTube, where there's a big audience to be reached. And there are three recent lawsuits that suggest that.

                                                First of all, there was a suit filed, I believe last August or September, by Gab, which is a sort of version of Twitter that says it's more open to all speakers, and particularly conservative speakers, than other sites like Twitter. Gab filed a lawsuit against Google complaining that Google had not accepted its application for its app to be in the Google Play Store, which is the app source for Android phones. That case ultimately got settled, but Gab made an antitrust claim saying that Google, through its contractual relationship with Twitter, was trying to suppress a competing site. Its real complaint, though, seemed to be: this is being done because we're on the right side of the political spectrum, and you don't like us.

                                                The second case was Prager University, or PragerU as it's sometimes called, against Google and YouTube. PragerU complained that YouTube, which Google owns, was applying its age-based filtering, the so-called Restricted Mode intended for viewers under 18, in a way that cut off many of its educational videos. The thought was not that these were really unsuitable for people under 18, but that there was an objection to their ideological point of view.

                                                The third case, I believe, was Chuck Johnson versus Twitter. Twitter actually banned Mr. Johnson from its site, he claims, because his views were too far to the right. There's some evidence in the case, and Professor Goldman knows more about this than I do, but I believe, Eric, wasn't it that ultimately they just got tired of him trolling on their site? In any event, he was banned from the site.

                                                So, as I say, there are these cases, which indicate a perception that there might be some moderation, some filtering, that's ideologically oriented. At the same time, there is a lot of pressure on sites like Facebook and Google to do more monitoring, more editing as it were, of their content. Facebook was called before a committee of the House of Representatives, last fall I believe it was, where it was put under intense pressure to filter its news, to make sure it wasn't purveying what the Congresspeople thought was fake news. And indeed, Facebook has recently made a change to its news feed.

                                                Internationally, in Germany, the Wall Street Journal and the Economist have both reported that German law requires Facebook to ensure that hate speech messages are not on Facebook. According to the Wall Street Journal, a disproportionate number of Facebook's monitors are dedicated to censoring, as it were, the German Facebook pages. So there's at least a big issue here, and one worthy of discussion. And it's not really limited to conservative groups; it should apply, at least theoretically, to any unpopular speaker, on the left or the right or otherwise. The questions are whether, as an empirical matter, such speakers are being restricted or burdened in reaching audiences on high-audience sites like YouTube, Google, and Facebook, and, if so, whether the law does or should do anything about that.

                                                Now, assuming that there is some of this, or that there might be some of it in the future, is there, at least in United States law, a legal remedy? Let me just mention this at the beginning of my discussion of possible remedies: Section 230, as it's popularly known, 47 U.S.C. § 230, commonly but not accurately described as Section 230 of the Communications Decency Act, provides a pretty strong legal immunity for sites like Google and Facebook to, in effect, edit or moderate their sites. Eric is going to talk about that in some detail as we go on in this program. But I just want you to keep it in mind as I quickly discuss the two main theories that have come up in these lawsuits.

                                                The first one, and this is in the Gab case, is an antitrust claim. Frankly, antitrust claims aren't likely to be an important part of this discussion, so I don't want to spend a lot of time on them. But I do feel I need to tell you why, since the Gab case was based entirely on a Sherman Act claim. The antitrust theory is limited for several reasons. The first is that the plaintiff must be a competitor of the defendant for the claim even to begin. So in the Gab case, Gab had to say: we were being discriminated against so that Twitter could have an anticompetitive advantage over Gab, and we're both competitors.

                                                Secondly, even if they are competitors, the defendant must be a monopoly. There's a lot of loose talk now about sites like YouTube and Google and Facebook being monopolies, but that is not as obvious as some of the media suggest. If you've looked at today's Wall Street Journal, you'll see a giant article about the so-called antitrust case against these big media companies, and the cover story in the upcoming Economist is also going to be on that. But journalists are not judges, and I think it's useful to keep that in mind. Whether there really is a monopoly here depends on some complicated economic and legal issues about what the real market is and who really competes with whom. Especially: not just whether companies have large usage, but whether there is something that keeps people from entering that market and competing with them, what the Economist called barriers to entry.

                                                And that's not so obvious. To give a concrete example in the Gab case, there was nothing that kept Gab from competing with Twitter; it had its site out there, so it presumably could compete. Even beyond that, if there is a monopoly, does the Sherman Act cover favoring one of your sources, or giving an affiliate or yourself a competitive advantage, if you're not actually trying to create a new monopoly in that area? The law on that is a bit unclear. There are some cases saying that a monopolist who uses his monopoly to get a competitive advantage, even if he's not monopolizing a second area, would be liable. But there are also cases suggesting that that's not the law, and in particular the Trinko case from 2004 suggests that this is not likely to be a winner.

                                                So antitrust theories are difficult to make out and probably aren't going to apply in a lot of cases. And if you remember, I mentioned Section 230. Eric, I want to ask you to chime in here, because if I'm correct, Section 230 could also trump an antitrust claim in this area anyway.

Prof. Goldman:                 Yeah. Let me just mention a few words about Section 230, and then I think you're going to finish your thoughts, and I'll perhaps have some others. Section 230 was passed in 1996, and the way I summarize it in lay terms is that websites aren't liable for third-party content. There are three statutory exceptions, none of which I think will be relevant to our discussion today. So the general premise is that if there's liability for online content, that liability rests with the person who originates the content, not with a republisher or anyone else affiliated with the republisher. And that rule has been applied very broadly across a wide range of circumstances. It doesn't really matter how the claim is framed: if the ultimate objective is to hold a website liable for third-party content, then Section 230 protects it.

                                                There's a lesser-known provision in Section 230 that goes even further, saying that websites aren't liable for their filtering decisions. Now, there are some additional qualifications to that, but the deal behind Section 230 was very clear. If a site publishes third-party content, it's not liable for that content. If it removes third-party content, it's not liable for that removal decision either. So it really eliminates a lot of the legal theories that plaintiffs normally like to marshal to try to hold somebody accountable for someone else's words.

                                                We do have some cases that have applied Section 230 against antitrust claims. In fact, in a case that came out just recently this year, one of the allegations against the defendant was a Sherman Act claim, and it simply got swept up in the overall Section 230 discussion. The court didn't have anything special to say about the antitrust considerations. It essentially said: I see, plaintiff, that you are trying to hold this site accountable for third-party content. I don't care if you call it antitrust. I don't care if you call it some tort theory. You lose.

Prof. Arthur:                       Yeah. And that's pretty definitive. (laughs) So let's switch from antitrust to the more interesting area, which was raised by the PragerU case and the Chuck Johnson case: claims that in some way these big sites like Facebook, YouTube, and Google are the functional equivalent of the town square, or what in First Amendment terms has been called the traditional public forum. And there's some loose talk, quite honestly, that companies like Google have engaged in, particularly in earlier years: well, we're just here to provide a way for people to communicate, to reach each other and reach the web, and we don't really edit; it's not up to us to decide what the content is.

                                                There's not as much of that talk anymore. These claims tend to rest on one of two theories. The first is: well, you've created this sort of virtual town square, if I can call it that, and therefore you should be subject to the same kind of restrictions that the First Amendment places on the government, and particularly on local governments for their actual town squares, where they're not allowed to engage in viewpoint restriction of who can hold rallies and street parades and otherwise speak in the town square.

                                                The only actual case they have that's somewhat like that is Marsh against Alabama, from 1946, involving a so-called company town. It was private property, but for the people who worked for that company it was the functional equivalent of a town, and it had a square. The Supreme Court, in an opinion by Justice Black, said the company would be treated as if it were the government. There really hasn't been anything quite like that since.

                                                The closest thing I can think of is a case from 1979 called Pruneyard Shopping Center, in which the California courts held that a shopping center open to the public was required, under the California constitution, to permit a couple of students, high school students I believe they were, to put up a little card table in the mall and argue about questions like the Arab boycott of Israel. The owner of the shopping center didn't particularly object to this pro-Israel spot; he just didn't want political activity in his shopping center. He didn't say it interfered with his speech; it was just something he didn't want to have to deal with.

                                                And the California Supreme Court said: well, under the California constitution, you have to permit this. On appeal, the US Supreme Court affirmed, saying this didn't violate the First Amendment of the US Constitution. The shopping center had argued that the California ruling would actually violate its First Amendment rights by compelling it to speak, and there is a long line of cases saying that people can't be compelled to speak. There's the famous flag salute case, West Virginia Board of Education against Barnette, from the 1940s, and the famous "Live Free or Die" license plate case, Wooley against Maynard, from a later time, holding that you can't be required to express a message that isn't yours. But in this case, according to the Court, the shopping center owner didn't have a message one way or the other, and therefore those cases were distinguishable. That's about the best you can come up with for this kind of theory.

                                                Now, there's a history to this, and I think it might be useful to tell you what it is. There were a couple of professors, one of whom I knew: Jerome Barron, who in the 60s and 70s was the Dean of George Washington Law School, and his colleague Thomas Dienes at American University's law school, and together they wrote the nutshell on the First Amendment. Their theory was that the media, particularly the print media, had become so concentrated, with only one newspaper per city in modern times, that it was as concentrated, and as hard for the public to access, as the broadcast airwaves were. And they argued that just as the Supreme Court had allowed the FCC in the Red Lion case to regulate broadcast stations and require them to provide equal time to speakers, newspapers could be required to do the same thing.

                                                And Florida passed a statute on this theory. But in the famous Miami Herald versus Tornillo case of 1974, the Supreme Court pretty much said: while Professor Barron's theory about the need for affirmative support for speakers is very nice, we just can't do that, because editing newspapers is the job of newspaper editors, and the First Amendment absolutely protects it. So there really hasn't been a lot of support for the idea that you can compel people to let you speak on their platform.

                                                So when we get down to talking about these cases now, the argument is: are sites like Google and Facebook and YouTube more like newspapers, or more like this open shopping center, or, say, cable TV systems, or the broadcast media? And the Court has said pretty emphatically, in, was it the Reno versus ACLU case, Eric?

Prof. Goldman:                 Yeah, right. It's a Supreme Court case from 1997.

Prof. Arthur:                       Right. That no, they're more like newspapers. They're more like regular media. I believe in that case Justice Stevens said that each medium of communication is different, but this one is more like the print media and less like television and radio, which are treated differently because you have to have government licenses. And the Court pretty much said: if you're a site like Google, you can in effect edit your content. So that's kind of where we are.

                                                I think what we're left with, the interesting part of this, is the normative question. As I tell my students, whenever you look at areas like the First Amendment and what the Supreme Court does in them, you sort of have to be a legal realist. The precedents seem to suggest that, realistically, sites like Google, YouTube, Facebook, and all the rest pretty much have the same kind of editorial freedom that the Atlanta Journal-Constitution, or the New York Times, or the Washington Post, or any other print medium would have. Which means, pretty much, that they can "censor," or, to put it in a more benign way, they can edit the content that's on their sites if they want to.

                                                On the other hand, while the precedents seem to go that way, as I tell my students, in the area of constitutional law it's always a good idea to remember Justice Holmes's claim that the life of the law isn't always logic; sometimes it's experience, and the felt necessities of the time can produce some interesting results. So that slides us to the normative question, which I think is the nub of the matter as a policy question: should there be any kind of limits on the ability of a Facebook or a Twitter or a YouTube to edit content, just as a newspaper editor does? Or do they have some sort of obligation to act almost like a common carrier, in the way the equal time and right-of-reply provisions we've had in the broadcast media require?

                                                And I know, (laughs) I'm doing quite a monologue here. (laughs) Eric, let's hear what you think about this.

Prof. Goldman:                 Ah, thank you. That was actually a wonderful setup. I'm just going to pick off a couple of points, and then I'll turn it back to you. From my perspective, so much of the discussion turns on the analogy: what are the major internet companies, especially those that might be described as platforms? Are they technology providers or are they media companies? The question gets clouded a lot because the companies themselves talk out of both sides of their mouths: sometimes they'll say they're a media company, sometimes they'll say they're technology providers. To me there's really no debate about it. They are media companies, just like the traditional editorial publishers of days of yore. They exercise their editorial discretion about what content they're going to allow on their network, just as a newspaper decides which stories it's going to cover, how much space to allocate to them, what order to put them in, which ones go on the front page and which ones go on the back page.

                                                So if we accept that internet companies are media companies, the legal conclusions start to fall into place really quickly. At that point, the First Amendment protects the freedom of speech and freedom of the press of media companies. And as Professor Arthur mentioned, there's a long list of ways in which people have tried to make incursions into the discretion of media companies, and for the most part those fail. So if we accept that internet companies are media companies, they're going to be protected by the same package of rights that we've seen traditionally.

                                                We talked a little bit about Section 230, and it's such an important part of the discussion because of its power and breadth, but it is counterintuitive. The way so many of us learned law in the first year of law school is that if you have some control, you accept liability for what you control, and if you have no control, you might have reduced liability. Section 230 says something different. It says you can have absolute editorial control and still not be liable for the things you exercise editorial control over. And that just blows people's minds. How could that be?

                                                But think about Section 230 as an extension of the First Amendment, a supplement to it. It's Congress's way of saying: there are certain categories of speech that we value so highly, that we want so much to encourage, that we're going to go beyond what the First Amendment requires us to protect, and we're going to protect more space than that. And if you think about Section 230 as a speech-enhancing law that sits on top of the First Amendment, it [inaudible 00:26:23] that normative question: is that a good thing or not? And I'm happy to explain why I think it's a good thing. But really, that shifts the ground to the normative question, not the legal question. On the legal question, Congress has spoken, and it's gone beyond what the First Amendment requires.

                                                Section 230 is also a really interesting law because it actually acts as a way of spurring competition among internet companies. It keeps the door open for the next Facebook, or the next Google, or the next Twitter. Without Section 230, anyone who wanted to enter the market would have to build a very expensive infrastructure up front to make sure any liability risks were properly managed. They'd have to build a team of people making the kind of editorial judgments that traditional media providers have to make, and that would raise their costs. Or they would face crippling liability that the big companies can insure against and smooth out across a large base, but that small companies can't accept.

                                                So Section 230, by allowing sites to exercise the editorial discretion they want without making the up-front investments the big companies are making, actually keeps the door open for new, emerging companies. And for those who say there are really only three or four providers in the internet space, the giants, and everybody else doesn't matter: think about the companies that have emerged recently. Companies like Snapchat or Pinterest have come, I don't want to say out of nowhere, but they've risen really fast in just the last few years. And Section 230 is a big part of that story.

                                                The last thing I want to say in my opening remarks is that we want internet companies to clean up content. We have seen, over and over again in our experience with the internet, that any site that isn't actively managing its content will be overrun by the gamers, by the fraudsters, by garbage. There was a reference a decade ago to the idea of cyber-cesspools, places on the internet so filled with garbage that there really was no redeeming value to them. The way to avoid those kinds of outcomes, to keep the gamers, the fraudsters, and just the junk from overwhelming tools that we want to be valuable, is for sites to exercise their editorial discretion.

                                                And that actually is a feature of Section 230. It creates the space for sites to exercise that editorial discretion, to keep their sites from turning into garbage, without facing liability for whatever they miss, or for the judgment calls that other people might disagree with. So Section 230 is actually a brilliant solution to the problem of how we get internet companies to do the work we want them to do: we give them the incentive, not the obligation, to do it, because they know that if they don't exercise that discretion, they'll become worthless once they're overrun.

                                                And if you look at the Chuck Johnson lawsuit, for example, Twitter's position appears to be that it terminated him because he was a repeat troll who was reducing the level of discourse on Twitter in a way that harmed Twitter's interests and the interests of Twitter's users. And we want a site like Twitter to be able to shut down the trolls who might be taking away from everyone else's conversations.

                                                Let me take a break there. I'm sure I have more to say, but I'll turn it back to you, Professor Arthur.

Prof. Arthur:                       Yeah. (laughs) Just as Eric added a few things to what I had to say, let me add a couple to what he has said, and it's this. I think what's going on here is that there has been loose language by companies like Google: look, basically we just want to provide a forum, and we're not interested in editing. But the fact of the matter is they can't help but edit. And it's true that, consistent with their rhetoric, they don't edit in the way a newspaper does.

                                                In the famous Miami Herald versus Tornillo case that I mentioned earlier, which pretty much put an end to the Barron and Dienes theory that newspapers should be required to give rights of reply and equal time the way broadcast media had to, the main opinion emphasized that the essence of editors is that they edit. And it's true that in the earlier days of the internet, companies like Facebook and Google said they didn't want to edit, that they weren't there to edit.

                                                As Eric points out, you've got to edit. Because if you really want to say there's no editing at all, that you're just a common carrier for anybody who wants in, then you get the spam and the objectionable content. As I told my First Amendment students, everybody has something they would censor, even the most pro-First Amendment people, and I'm pretty rigid on that; I'm pretty pro-First Amendment. There are some things I wouldn't allow to be printed. The case for me was the guy who wanted to publish how to make an atomic bomb in a newspaper. I'm ready to censor that. Everybody has something.

                                                 So, the idea that you could absolutely not edit at all ... It's true that sites like Facebook don't edit in the same way we see a newspaper edit. That was mentioned in the Miami Herald case. But I think, as Eric points out, they do have to edit somewhat. This is the reality. Some of this other talk might have just been happy talk; they didn't really mean it when they sorta said, we're just here to let people talk to each other. Some of it may have just been naïve. Maybe it was sincere, but people hadn't really thought it through. As they've had to think it through, they've found that they have to do some editing. And the fact that they don't edit as much as a newspaper or a TV or radio station doesn't mean they're not editing. Editing is not binary, either you do it or you don't. It's more of a spectrum, more of a matter of degree. The fact that they choose not to edit as much as a newspaper doesn't mean that they don't have to edit somewhat.

                                                 And I think edit, moderate, use whatever word you want. As Eric has pointed out, if you don't want to be overwhelmed by spam, or by misleading ads, or by child pornography, a whole array of things, then even the most libertarian people ... And in the First Amendment area I'm one of the most libertarian people. There are some things that even I would say, no, you can't do that. And I think they just have to do that. That's the normative question: do we really wanna come back and say that they've gotta just let anything go and be completely unedited? I'm not sure that's realistic.

Prof. Goldman:                 If I can add to that. I think the acid test for the conservatives who feel like it's a drag that they're being targeted is to recognize that the same sites are under extraordinary pressure to try and clean up terrorist and extremist content, whatever that means. And much of that content is unquestionably protected by the First Amendment. So we kind of have to decide. Do we want the platforms, which I'm calling media companies, to clean up terrorist and extremist content, despite its First Amendment protection? If so, we have already made a bunch of normative judgments about the idea that First Amendment protected content should still not be published by these platforms.

                                                 And once we go down that path we kind of have to let the platforms have the discretion to decide where they're gonna draw the lines about First Amendment protected content. Or we take away their ability to do so. If we remove the discretion of the platforms to decide what content will be on their platforms, then the First Amendment protected terrorist and extremist content is gonna stay online as well.

Prof. Arthur:                       Yeah. I think that's a great point, and to maybe put this in a little bit more context ... Despite the claims in some of these lawsuits, it's not a fact that the plaintiff companies, Gab and PragerU, in any event, have been kept off the sites that they're complaining about. PragerU's educational videos admittedly were still on YouTube. The claim was that YouTube only put them behind its restricted-content filter, so that under-18-year-olds couldn't watch. So they weren't being absolutely censored. They could still reach a wide audience.

                                                 And in the Gab case, Gab itself was, you know, a site ... And I did this as a little experiment: I went to Google, and I googled Gab. And I found Gab's website. Now, I don't have an Android phone, so I couldn't check whether there's a Gab app in the app store. But it wasn't like they're being completely restricted and censored. A bit of a burden for sure, but it hasn't been a thoroughgoing censorship. I mean, even with Chuck Johnson, where he's not on Twitter, Twitter is not the only thing on the web.

                                                 And it's unclear to me ... I mean, it's an argument that people need to keep in mind, that the fact that you're not able to be on the platform with the widest viewership isn't that different from the fact that you don't automatically have a right to, you know, have your op-ed published in the New York Times, or broadcast on CBS. That's not considered to be censorship of the kind that we're talking about, government censorship.

Prof. Goldman:                 Uh, last thing I might toss in, I don't know how close we are to audience questions, but I think we're getting pretty close.

Prof. Arthur:                       Yeah.

Prof. Goldman:                 Is that the First Amendment may, and Section 230 definitely does, apply in circumstances where the defendant has engaged in discriminatory practices. And we have a wide range of examples of that. The case I teach in my Internet Law class is a case from 2003, Noah versus AOL. Noah was an AOL subscriber who alleged that AOL discriminated against him in various chat rooms ... I'm sorry, discriminated against Muslims, including him, in AOL chat rooms, in the way that they moderated content and treated various individuals. They let a bunch of anti-Muslim remarks be shared in these chat rooms that were offensive to Muslims. And then they allegedly retaliated against Muslims who complained.

                                                 And so, he brought a very standard anti-discrimination claim, saying AOL has discriminated against me on the basis of my race and my religion. And Section 230 said, we don't care. That is their discretion. If you're trying to hold AOL liable for the content of third parties, in this case these people saying really stupid things in the chat room, Section 230 applies. You lose. Now, political affiliation in the media context actually isn't a protected class at all. But even if it were, Section 230 would still apply.

                                                 And so, it really gets down to the power of Section 230, and the vision that we aren't going to hold the intermediaries liable for the third party content that they publish. And that if they feel the need to remove content, similarly, we're not gonna hold them liable for how they choose to remove content. That's their discretion. We want them to exercise that discretion. Because, in the end, we're gonna get a bunch of net social goods, a bunch of work that has social value, because of the fact that they're cleaning up the spammers, and the gamers, and the trolls.

Prof. Arthur:                       Yeah. And let me say one last thing, and let me take off my First Amendment professor hat and put back on my antitrust professor hat. One of the things that Section 230 does, and this is looking at it from an industrial organization/antitrust perspective, is that it removes a barrier to entry. Just as Eric said, small, new sites are not in as good a position as the big boys to protect themselves against liability, and 230 gives them a sort of protection. I don't want to call it a subsidy exactly, but they don't have to worry about liability the way ... So for Gab, for example, 230 might be a problem in its attempt to bring an antitrust suit against Google. But Section 230 also makes Gab itself free from liability for the third party content that's posted on its site. And that certainly cuts down its cost of doing business.

                                                Are we ready for questions?

Prof. Goldman:                 Uh, I am.

Laura Flint:                          Great. Let's go to audience questions. In a moment you'll hear a prompt indicating that the floor mode has been turned on. After that, to request the floor, enter star then the pound key.

                                                 When we get to your request, you will hear a prompt and then you may ask your question. We'll answer questions in the order in which they are received. Again, to ask a question, please enter star then the pound key on your telephone keypad. Let's go to our first audience question.

Gary Wheaton:                 Hello. Hi, this is Gary [Wheaton 00:41:30] again in New Hampshire. I'm gonna make a short, quick comment here, and then if you can kinda answer it somehow. It seems to me that what we need is some kind of, I don't know, [inaudible 00:41:45] rules. But we need some kind of rules. Because, I mean, you really are talking about antitrust here. You know, obviously if it's a private entity the First Amendment won't apply, unless the government is encouraging those private entities to, you know, censor content. Of course, we all understand that. But let's be real. I mean, Google, Facebook, all these things, these really are monopolies. We can argue that, of course, arguably. But when you have monopoly situations, then the government can just say, you know, you've got to let the content go.

                                                 Now, you guys are real good at saying there's garbage, and this and that and the other thing. But what is garbage, right? One man's garbage is another man's treasure. So obviously, again, these things are obvious rules. Political speech can't be censored. I mean, it's free speech. Now, is it garbage? Well, obviously Prager University is not garbage. Now, you say that's only an age restriction, because 18 years old, whatever. Let's be real again. They're censoring it. There's no pornography in Prager University's product. So (laughs) how can you logically say it's an age problem? You can say the content is adult. But how is that adult when it's political speech? I mean, our teachers, arguably, are liberal, and they indoctrinate young people, less than 18. So why can't Prager University indoctrinate less than 18?

                                                 I mean, this is what we're talking about here, so let's cut to the chase. The question would be, what kind of rules would you have then? You know, you guys are the experts. You use the term garbage. What is garbage? Obviously we can use some of those First Amendment tests, like imminent danger, imminent harm. Harm is really where the line is for me. If it's harming somebody, let's use that as a starting point for the rule. But if not, what rules would you guys propose, I guess is my question.

Prof. Goldman:                 Uh, Professor Arthur, I have some comments, so I don't know if you want to go first?

Prof. Arthur:                       Uh, not necessary. You go ahead, Eric, and I'll, I'll chime in.

Prof. Goldman:                 So, yeah, let's start with the antitrust issue. And I think that the Tornillo Miami Herald case is actually quite helpful here. Because, as Professor Arthur pointed out, at the time, in the mid-70s, there was one newspaper per metro area in virtually all the major metro areas. There was no question that the newspaper was a monopoly in its local market. And the court still said that the newspaper had the absolute right to deny access to somebody who wanted to espouse their political views. So, as a matter of First Amendment law, I think if we accept that internet companies are media companies, the monopolist label doesn't really change the discussion at all. It actually reinforces the fact that even in that circumstance they have their discretion.

                                                 Now, the phrasing that YouTube is censoring Prager University. I would just point out, as we know, that censorship can only be done by the government, or a state actor. Everything else is something else. And when we're talking about media companies, what someone might call censorship is the media company exercising its editorial discretion. It's deciding what's fit for its audience or not. That might be colloquially described as censorship, and I understand that if you don't know the difference between a state actor and a private actor, it all looks the same. But every time you hear the label censorship in a discussion of a media company, what you're really saying is they're exercising their editorial discretion. And when we phrase it that way we realize, actually, we want them to do that. That's what media companies do. That's why they're valuable. It's because they have that kind of judgment.

                                                 The last thing is this question about what is garbage. I don't actually think we need to answer that, because it ends up being an irrelevant question. Once we phrase the issue as media companies exercising editorial discretion, they'll define what's garbage by their own definition. And if what they define as garbage is something that we would have thought is the most valuable content, well, they're underserving us as an audience. That's going to be a problem in the marketplace. But ultimately we have to defer to their editorial discretion. They decide what they think is relevant to their audience. We don't get to tell them that they made the wrong call.

Prof. Arthur:                       That's pretty much my view, too. But let me add to that, let me do the antitrust side of this. I think Eric expressed pretty much the First Amendment side, which, if I could say it in a nutshell, is: if they have the ability to edit, they have the right to edit. And I would agree with the caller that PragerU videos don't seem to me to be inappropriate for folks under 18. But that's just a disagreement that I would have with the editorial decision that YouTube has made. Assuming they actually are making it. That's what's alleged in the complaint.

                                                 But I also disagree with some of the stuff the Atlanta Journal-Constitution, which is my local newspaper, puts on its pages. And it's the nature of First Amendment issues that you can edit, you know, you can say good things and you can say bad things. You can edit in a good way, or you can edit in a bad way. But it's a compared-to-what question. In the sorta marketplace of media, is the media company a better, safer editor of its content, or is the government? And history shows that the government is not really someone you should trust to edit. And it's hard to come up with an alternative.

                                                 Putting my antitrust hat on like I said I was gonna do (laughs), and actually getting to it. Antitrust doesn't automatically create a public utility type of regulation. If there are antitrust cases against Google, YouTube, Facebook, et cetera, ultimately the remedy is to break 'em up. It's not necessarily to make them into utilities. The antitrust law is not a cure for whatever ails you, despite a lot of loose language about it. Antitrust is not something that turns a company into a utility. In fact, my antitrust professor view is that that's a terrible thing to do. You're a lot better off ... If it turns out that people are unhappy with what they see on a Facebook or Google or whatever, that creates the market opportunity for somebody to provide an alternative.

                                                 Personally, I hate the way Google has moved in recent years to have these wretched ads at the beginning of all of the searches I do. I much preferred it before they had that. And if another search engine comes up that gives me what I used to get from Google I'll switch to it. And I can switch very, very easily. So, antitrust-wise, I really don't see antitrust as providing this sort of public utility type of provision that people seem to think it will.

Prof. Goldman:                 And Professor Arthur, if I can add just one more thing. I just want to come back to the theme of the question, and all the different ways that perhaps Chuck Johnson is getting screwed, or Prager University is getting screwed. Change the plaintiffs here. ISIS sues Twitter or YouTube or Facebook for removing its First Amendment protected, terrorist-promoting videos. And they make the same antitrust claims. And they claim that the sites are making improper editorial judgments about what's garbage or not. If Chuck Johnson's theories prevail, so do ISIS's.

Prof. Arthur:                       Yeah. And just to underline that, if what Eric is saying seems fanciful, the number one case on whether the First Amendment protects speech by extremist groups is Brandenburg versus Ohio, which was a case involving the Ku Klux Klan. And the court held that what they were doing was protected so long as it didn't actively incite imminent violence, and there was an actual threat of imminent violence. That's a very protective standard. And I agree with that opinion as far as the government's concerned. But I don't think that would mean that TV stations and newspapers should have to carry the Klan's ads if they don't want to. And I hope they don't want to, quite honestly.

Prof. Goldman:                 Are we ready for another question?

Prof. Arthur:                       I think we are, yeah.

Laura Flint:                          Yes. Great, let's go to the next caller.

Gene Berg:                         Hi, this is Gene [Berg 00:51:27]. Good afternoon. My question is about a, perhaps, not an antitrust approach, but more a contractual approach to this issue. Of course, each website, including Google or Facebook or whatnot, has terms and conditions, which constitute a contract, a commercial contract. And Prager University and other providers are commercial parties to this contract. Can you see that they, they being Google, would have an obligation to enforce the contract in a fair and equitable fashion? It's part of the good faith requirement of the contract.

                                                 If a site like Prager University invested money, invested funds, into developing their content, and they provide this content in reliance on this expectation of fair and equitable enforcement, wouldn't it be unfair to single them out compared to providers that offer, for example, left-wing content?

Prof. Goldman:                 Uh, Professor Arthur, I do, I do want to chime in on this one.

Prof. Arthur:                       Yeah. Go ahead.

Prof. Goldman:                 If you don't mind. Um, so, uh, yeah, absolutely. Uh, a contract could be a basis on which a, um ah, a person whose content was removed, uh, could seek redress. Except that the contracts are never gonna say anything that provides support for those claims. In fact, almost all the relevant contracts are going to say things like we can remove your content at any time in our discretion. And they're going to say things like you should not rely upon the availability of our service. And they're gonna say things like we limit our damages to the amount that you paid us. And so on and so forth.

                                                 So the contract arguments really don't advance the ball very much, because the contracts themselves are gonna be too thin a foundation to rely upon.

Prof. Arthur:                       And if they aren't already they sure as hell will be once the ... Once [inaudible 00:53:40] lawyers get ahold of them.

Prof. Goldman:                 No, they are already. And I will add that there's a possibility that Section 230 would protect the defendants even in the case of a contract claim for exercising their discretion. As we said, Section 230 says that a site is not liable for its decision to remove content. And there's an unclear interplay between a contract promise on the one hand, and Section 230, which says you can remove content without liability, on the other. So even if the contract argument works, there still could be a Section 230 defense. But the contract isn't gonna work.

Laura Flint:                          We have two more callers in line, so let's go to our next caller.

Speaker 7:                           Yes, I have a quick question regarding, beyond the current law right now, do you think that Constitutionally, if Congress, you know, got together, made this a bi-partisan effort, to either limit or even change the rules? In regards to, maybe for example, making Google and various companies a common carrier, or saying, for example, you can't use your editorial discretion for political views, or you can't favor ... I know from some of the antitrust arguments, you can't favor your own types of businesses, like say Gmail versus other email providers, in your search engine. Do you think that those sorts of statutes would be able to stand under Constitutional review?

Prof. Arthur:                       Well, I think with regard to the competition law, as I said before, it's not clear that antitrust law would prohibit favoring your own content, for example. But certainly Congress could pass a law that would impose some kind of common carrier obligations if they wanted to, for things that don't raise First Amendment issues. Antitrust law is almost always made by the courts, but as I tell my antitrust classes, there's nothing that says that Congress, if they're not happy with the way the courts are interpreting the antitrust law, can't modify it. They haven't on the substantive side literally since, well, since 1950, in the Merger Act. But they could.

                                                 But as a First Amendment issue, I think that's a tougher issue. If we are correct, and Eric and I I think are pretty much on the same page about this, if we are correct that these are properly classified as media companies, the First Amendment, as it's been interpreted for a long, long time, and again, the Miami Herald case is probably the leading case, makes it kinda tough for Congress to come back and impose these kinds of rules about what you can and can't have on your sites.

Prof. Goldman:                 I agree with that. I would point to the Reno versus ACLU decision in 1997, which said that there was no basis for Congress to qualify the level of protection that applies to the internet. And so Congress can't just declare that internet companies have become common carriers. They're gonna be subject to the full level of First Amendment scrutiny for doing so.

                                                 I think the idea about self-favoritism, and how that creates antitrust issues, is a more complicated one. And there have been a lot of questions about the self-promotion of other services. I see that as a different issue, and I don't think it really responds to the thing that brought us to this call, about whether there's discrimination against conservative viewpoints or not. So I would put that in a separate bucket, and that I see as a more complicated question.

Laura Flint:                          It looks like we have one final question. Let's go to this caller.

Molly:                                   [inaudible 00:57:55] Hello, my name is Molly. I'm from Chicago, and I have the following question. Today we were mostly considering the discussion from the perspective of freedom of speech violations. But I feel that there is another point of view that's usually neglected when people talk about freedom of speech on social media, and it's particularly a violation of consumer rights. Most precisely, misleading advertisement, in the sense that all these social media services advertise themselves as forums, while in fact they actually act as media venues. And their content is curated. It's not random. It's curated content that's carefully filtered. And in this respect they're actually misleading consumers to believe that they're there to read randomly produced free information when this isn't the case.

                                                 And I would like to hear your perspective on that. Particularly, I would like to bring your attention to the case in 2014 when Facebook actually conducted experiments on its consumers by manipulating the timelines on their profiles, selectively publishing content, just to experiment with their emotions and their reactions. Do you believe that it would make more sense to approach that issue simply from a consumer standpoint?

Prof. Goldman:                 Um, I know we're running out of time. Let me just make a couple of remarks. First of all, I think it is helpful to ask the question from the consumer perspective. But when we talk about that, we are talking about consumers of media products. And so we start with the premise that the providers of those media services have such enormous discretion that consumers then really have the choice: do I read or not read? And should I invest my media dollars in reading this product or some other product? So the consumer issues are all framed by the fact that there's such great discretion on the part of the media providers.

                                                But, your broader point I think is that, um, if the companies engage in false advertising, there can be a claim for false advertising that might be independent of all the other considerations that we discussed. And even Section 230 might not protect against false advertising. If a company describes itself as having a particular type of service, um, and even if they're a service that publishes third party content, if they've misdescribed what they're doing, then that is their own words that they should be held accountable for.

                                                 We've had some very confusing case law about the interplay between Section 230 and false advertising claims. And so I won't say that Section 230 is irrelevant to that discussion. But no doubt that if we can find an example where a company's engaged in false advertising, that should be evaluated on its own terms.

Prof. Arthur:                       And the only thing I would add, from the First Amendment standpoint, is that the Supreme Court's made it very clear that commercial speech, which includes advertising, speech for money, isn't protected as much as other types of speech. In particular, deceptive claims, or misleading claims, are not protected by the First Amendment. So, to the extent that they're holding themselves out as doing A, and in fact do non-A, that's a very different matter.

Prof. Goldman:                 The only thing I'll add is that, though the false advertising claims may escape some of the First Amendment and Section 230 restrictions, they're also gonna be tough to win in many cases. In many cases the companies aren't engaged in false advertising, when you run through the legal elements. Or there may be standing issues, or other limits on the ability to bring those claims. So, I don't think they're a panacea, but, I do think that they raise a, a different set of questions than the bulk of our conversation today.

Prof. Arthur:                       And, I guess the last word I would have on that, if ... Assuming that they are, the question is, what would the remedy be? And typically the remedy for something like this is disclosure. Uh, it may be a more frank disclosure of exactly what they're doing, and of course, that disclosure, unfortunately, is probably gonna be buried somewhere in the terms of service that we all click on automatically and don't read.

Laura Flint:                          Would either of you like to make some final remarks?

Prof. Goldman:                 Uh, actually I would, and I'll, I'll keep them brief. First of all, thank you for convening this forum. Thank you to all those who are listening and participating on this call. Um, uh, I'm glad we're having this discussion. I will say that, um, from my perspective, um, uh, I've been intrigued by how many, uh, "conservatives", and I use that in quotes only because I don't really know what that word means, are looking for ways to increase the regulation of the media industry. And from the libertarian, or quasi-libertarian perspective it just makes me scratch my head.

                                                 What, do we really want to go there? And at the end of this process, if all these arguments work, are we gonna be better off? Because there are gonna be swings in who's in the corridors of power. There are gonna be swings in prevailing social norms. So if we build a regulatory infrastructure to restrict the discretion of internet companies over what they can do, so that more people can tell them what to do, are we gonna get what we want at the end of the day? And I encourage all of you who are chomping at the bit for more successful lawsuits to really think about, how's this gonna blow up on us when the pendulum shifts?

Prof. Arthur:                       Yeah. I agree totally with what Eric just said. One of the things I tell my First Amendment class is, you've gotta understand that this is the kinda thing ... The First Amendment doesn't favor liberals or conservatives. It doesn't favor any particular view. It protects all the views. And you might desire to suppress the view you disagree with, but if you have that power, then ultimately that means other people will have the power to stifle your views. And I think history has shown that that's a power, when you give it to government, that really isn't healthy.

                                                 And the United States, with its First Amendment, is the freest country in the world with regard to expression. There are a lot of people pushing back from that. As the late Nat Hentoff of the Village Voice used to say, freedom of speech and freedom of the press are always under attack from either the left or the right. When I was a child, I still remember ... Some of the earliest things I remember my parents talking about in worried tones was the McCarthyism movement, where freedom of speech, freedom of expression, was being threatened from the right.

                                                 Now, I see freedom of speech, as Mr. Hentoff had said at the time, often seeming to be threatened more from the left. But it's always gonna be threatened by somebody, and left and right tend to be arbitrary categories anyway. There'll be people with a point of view that won't want to have opposite points of view expressed. And once we let the barrier down so that some of that can be done by the government, because we like it for our particular cause, ultimately it can come around and bite you in the rear end and be used against you.

                                                 I personally believe we're better off not having that power exercised by the government. Because, frankly, there's a long history of governments regulating speech, and it's not a happy history. I think that's my last word.

Prof. Goldman:                 It's a great word to end on.

Laura Flint:                          Our monitor.

Prof. Arthur:                       Thanks to everybody.

Laura Flint:                          Thank you. Um, on behalf of The Federalist Society, I want to thank our experts for the benefit of their valuable time and expertise today. Our next Teleforum call is on Monday, January, um, not 16th. Uh, January 23, and that will be on the 2017 [Mercatus 01:06:50] report, so be sure to tune in for that. We welcome listener feedback by email at Thank you all for joining us. We are adjourned.

Announcer:                        Thank you for listening. We hope you enjoyed this Practice Group podcast. For materials related to this podcast, and other Federalist Society multi-media, please visit The Federalist Society's website at