Recently, Big Tech companies have come under fire from all sides for their content moderation decisions. Some argue Big Tech companies are not doing enough to stem the spread of harmful content and misinformation. Others contend Big Tech companies' selective approaches to moderation belie partisan preferences — silencing only certain voices and threatening to undermine democratic values. The recent actions by Big Tech companies regarding President Trump have brought these concerns to a head.
Many different solutions have been proposed amid this uproar. Among these options, some have posited that the historical regulation of common carriers can provide a road map for appropriate and effective Big Tech regulation. Do the market positions of modern giants like Twitter and Facebook generate common-carrier obligations? Should they? Our experts will discuss these issues, exploring the relevant legal contours and the desirability of the proposed common carrier solution to curb Big Tech power, as well as other issues surrounding the debate.
- Richard Epstein, Laurence A. Tisch Professor of Law and Director, Classical Liberal Institute, New York University School of Law
- Joshua Wright, Executive Director, Global Antitrust Institute, Antonin Scalia Law School, George Mason University
- Moderator: Elyse Dorsey, Adjunct Professor, Antonin Scalia Law School, George Mason University
Visit our website – www.RegProject.org – to learn more, view all of our content, and connect with us on social media.
As always, the Federalist Society takes no position on particular legal or public policy issues; all expressions of opinion are those of the speaker.
[Music and Narration]
Nate Kaczmarek: Good afternoon. Welcome to this Regulatory Transparency Project webinar. Today's program will explore the question "Is Common Carrier the Solution to Social-Media Censorship?"
My name is Nate Kaczmarek. I am Vice President and Director of the Regulatory Transparency Project for The Federalist Society. As always, please note that all expressions of opinion are those of our guests this afternoon.
For our moderator today, we are pleased to welcome Elyse Dorsey. Elyse is an Adjunct Professor at the Antonin Scalia Law School at George Mason University. She previously served as Counsel to the Assistant Attorney General in the Antitrust Division at the Department of Justice and as an Attorney Advisor to FTC Commissioner Noah Phillips.
Before joining the FTC, Elyse was an Associate in the Washington, D.C. office of Wilson Sonsini, where she was a member of the antitrust practice.
If you'd like to learn more about all our guests today, you can visit our website, regproject.org, where we have complete bios for all of our speakers.
In a moment, I'll turn it over to Elyse. Once our panel has thoroughly discussed the topic, we'll go to audience Q&A, so please think of the challenging questions you'd like to ask them. Audience questions can be submitted via the Zoom chat function at the bottom of your screen, or you can use the raise-hand function, and we'll call on you directly.
With that, Elyse, Josh, Richard, thank you very much for being with us today. Elyse, I turn it over to you.
Elyse Dorsey: Thanks, Nate. Thanks so much for that introduction, and thanks and welcome to everyone joining us here today. I'd also like to thank our excellent discussants who are here to discuss some really exciting and really tricky legal issues.
We have Richard Epstein, who is a Professor at NYU, and I think needs no real introduction for this audience. And Josh Wright, who is a Professor at Antonin Scalia Law School and also a former Commissioner at the U.S. Federal Trade Commission.
I'm really excited today to have these experts to talk to us about some issues that have really started coming to the forefront of popular discussion over the last several weeks and months.
Increasingly, we're hearing complaints regarding how Big Tech companies are or are not moderating content on the internet. The allegations really run the gamut: from claims that the tech companies are not doing enough to stem the spread of misinformation, to claims that the tech companies are biasing discussions against certain political views, or shaping discussions in ways that are not representative of, or are biased against, certain individuals or viewpoints.
The complaints that we're hearing are really raising a lot of potential concerns and implicate several potential legal regimes and regulatory issues. Richard recently proposed, in a Wall Street Journal op-ed, that one potential solution to some of these issues might involve treating certain tech companies as common carriers. Richard, I will turn the floor over to you to lay out what you see as some of these potential issues and how that solution might work.
We'll let you go for a while, we'll let Josh jump in, we'll have a little back and forth, and then, as Nate mentioned, we'll open it up to some audience Q&A, so definitely keep those questions forefront and keep them coming.
Richard, the floor is yours.
Richard Epstein: Thank you so much. This is really a very odd situation. Josh and I had to decide whether we would do this as a panel discussion in one shape, form, or another or whether we would do it as a debate. The difficulty about doing it as a debate, at least on my end, is my position shifts regularly back and forth, up and down, and sideways, and I'm not sure, ultimately, where I will land.
But let me see if I could lay out some of the position and then get to the common carrier solution that Elyse had mentioned in the introduction.
As she noted, it turns out that the objections that are raised against, chiefly, I would say, three companies—Google, Facebook, and Twitter (I think Apple and Amazon are not really part of this thing, nor is Microsoft)—are that they do both too much and too little. On the one hand, it is claimed that they allow all kinds of salacious and vicious speech to go forward that is a threat to the social order. On the other hand, it is claimed with equal fervor that they manage to shut down people who ought to be heard.
The important thing to understand about this particular dual set of charges is both of them could be true simultaneously, which doesn't make our position any easier than it would otherwise be.
Faced with this, you then have to figure out which kind of position you're going to take in order to see whether you do anything at all with respect to these kinds of allegations. It turns out there are basically two extreme schools, and I put myself in the middle with the interview that I did with Tunku Varadarajan in the Wall Street Journal. I think it was on January 15 that we had that very fine discussion.
The first position one can take is that these companies are just private companies. They have speech rights along with all the people who use them. The essence of being a private company is that you have the right to include and, therefore, the right to exclude, and if somebody doesn't like the way these private companies operate, then the best thing to do is to simply ignore everything that is done.
It turns out that one situation says these companies have First Amendment rights too. Whatever they want to do is just fine by us, and it's just a complete mistake under these circumstances to engage in any form of regulation whatsoever. There may be some kind of private remedies associated with defamation and so forth, breach of contract, but don't get involved in it.
On the other hand, there's another view which says these organizations are so powerful, and they benefit from certain kinds of government immunities, mainly the immunity under Section 230 of the Communications Decency Act. What they say is, therefore, you should treat them as though they are public entities subject to the various kinds of restrictions, and that means, in effect, that you could regulate them from top to bottom in any way that you see fit in order to prevent these things from taking place.
It seems to me that one tries to find an interim position. And I'm not sure that it's correct, but I think it's well worthy of some kind of discussion. The situation is as follows. We have competitive industries on the one hand, and we have various kinds of common carrier monopoly industries on the other hand.
It has been for a very long time the history of rate regulation, not content regulations we're talking about here, but of rate regulation to say those kinds of businesses which are affected with the public interest have to charge fair, reasonable, and nondiscriminatory rates. The basic argument was that if you're a common carrier, whatever that phrase exactly means, there's no other place that you can go, so that since the alternatives for customers are going to be shut out entirely from the market, the rule of absolute exclusion that works in competitive markets should not work in these kinds of common carrier monopolistic markets.
In the rate area, that meant that what we did is we had a system of rate regulation, which was not easy to execute. On the one side, what was required was to keep the rates high enough so that you could cover your cost, and then, on the other hand, you had to keep them low enough so as to prevent the extraction of monopoly profits.
Fortunately, in this particular debate, the rate issue is no longer part of the overall equation, but the non-discriminatory point is very much a part of the description. The issue is can you, under these circumstances, say that you have to let people in and that you cannot discriminate against them?
The first thing that one would want to do is to say what would this non-discrimination norm look like? Generally, what you would want to say is it's going to be subject to a system in which you have equal access, and one would start to take a leaf from the standard First Amendment literature on this subject, saying that since these guys are now common carriers, they can be regulated so as to make sure that they cannot engage in any form of viewpoint discrimination.
Then, you would have to actually figure out what it is that viewpoint discrimination starts to mean. Well, fortunately, as everything goes, there are easy cases where you could see how troublesome this might be. For example, you take an issue of great importance: how is it that we're supposed to respond to the COVID crisis? What you do is you get some people who want to make recommendations that are inconsistent with those which are put forward by the CDC on the one hand or are put forward by the World Health Organization on the other hand.
Can you do this? Well, in my view, this is an old, classic situation where the kinds of disputes that you have are exactly the kinds of disputes in which you have to have a free, open, and robust debate—to use the kind of language that echoes Justice Brennan in his famous New York Times decision—and the last thing you would want is for somebody to say: if you don't agree with these particular organizations, we're going to throw you off.
The stakes here can be very, very high. There's a huge dispute about the effectiveness of masks. There's a huge dispute about the effectiveness of quarantine. There's a huge dispute about the question of whether or not HCQ, hydroxychloroquine, is the kind of medicine that can actually treat the COVID condition without sending people to the hospital. I think it would be just a terrible mistake to say that you want people to be able to censor these kinds of things so that those discussions cannot take place.
What has happened is the old common law category of deliberate falsehood now becomes a new category of misinformation which is put so broadly that, essentially, any time there's an honest debate, there's going to be somebody—left-wing usually, under these circumstances—who's entitled to say that this cannot go on.
The whole point about the viewpoint discrimination doctrine is that governments could not do this, and so, therefore, when you're a common carrier with this kind of semi-monopoly position (you're not the sole supplier, but you still have a fairly large share of the market), you cannot do this either.
When you look at the comments that were written about the piece that I did in the Wall Street Journal with Tunku, there were all sorts of people saying, "This man is crazy. You're not going to allow somebody to go on an airplane if they're carrying a bomb because of a nondiscrimination rule. Why are you going to allow people to throw verbal bombs into these circumstances?" There is a point here, but it's overstated. Nobody says, in effect, that you can keep people off a common carrier because you don't like their political views, even on the use of violence, in many cases.
In this situation, what happens is you could certainly have a very narrow exception so that people who engage in deliberate threats of violence against other individuals could be kept off without having to push yourself to the other end and saying that when there's a dispute over public policy or the desirability of the election results that Donald Trump wants to challenge and so forth that you could knock them out of the situation.
The argument is, under these circumstances, they have to be able to let that in. Am I happy with this solution? Of course not. Nobody should be happy with this solution. Ideally, what you would like to do is to have a solution in which other people could be forums so that, instead of having to regulate these carriers and these stations as common carriers, what you can do is introduce a competitive market.
Here the plot now starts to thicken. There have been a number of right-wing channels—Parler on the one hand, Gab on the other—which have tried to gain traction, but it turns out that a company like Apple now goes up the chain a little bit and says, "Oh, we're not going to let them sell or distribute their stuff through our App Store," and that, then, introduces the kind of monopoly element into this business that everybody else becomes dreadfully afraid of.
To make the situation still worse, it is not at all clear how fragmented the existing market turns out to be. That is, there's a standard antitrust rule which says, on the one hand, if it turns out that you have a number of companies in an industry and each of them makes a decision on independent grounds, unrelated to any cooperation with the others, then, in effect, that's perfectly legitimate.
But on the other hand, if it turns out that they collude, that's perfectly illegitimate because it then does create the kind of monopoly system for which some form of regulation ought to be thought to be appropriate.
Well, what establishes collusion? Well, an explicit agreement amongst the people that all of us will keep out a certain kind of right-wing parties would be, I think, a very serious kind of antitrust violation. But what about a tacit arrangement where they signal to each other by making public statements which are then responded to? There is a huge literature on the question as to whether or not this kind of price signaling in ordinary markets would count as a form of monopoly behavior, to which the answer is maybe. Usually not, but perhaps in some particular cases, too.
As this situation starts to careen, we are left with three unhappy alternatives. One of them is to just leave the status quo, where the objection is really great. Another is to treat these people as though they're government parties, which is also quite risky in many cases because then, in effect, you have to figure out who in the government is going to regulate them. And I'm not at all comfortable with the idea that either a Trump or a Biden administration is going to be able to handle that kind of problem.
The third is to try to treat this as some kind of an intermediate common carrier solution, to which we could then look in the hopes that if we do this, we can somehow or another make the thing a little bit easier.
I'm just going to end on the following note. One of the reasons why this problem turns out to be so difficult is that we have gotten a new style of adjudication and dispute resolution in the United States, which is essentially one that says, "Whenever we have a particular advantage, we push it to the max. We're never going to engage in any kind of voluntary form of self-restraint."
Self-restraint, essentially, is essential for the operation of virtually every social system of which I'm aware. If you're an employer and you have the right to hire and fire people at will—which I'm in favor of doing—it doesn't mean that there should not be some sub norms, not enforced legally but respected socially, which ease the damage by allowing a kind of informal for-cause regime to take place.
One discovers very quickly that a for-cause regime that works in a private setting cannot be easily transposed into a legal setting where it now becomes a standard if you don't meet the for-cause situation, you're going to be exposed to damages one way or another. That's exactly the kind of problem that we're facing in this case.
We have a situation where nobody seems to be prepared to back off and that what happens is the demands get higher and higher. What we're doing is living in a situation where, as was said by Elyse at the beginning, some people say, "We have to keep these terrible people off," and then other people say, "You can't let them keep these people off."
The common carrier is an effort to try to find moderation. I'm waiting for Josh to explain to me why it doesn't work, but what I'd be much more eager to find out is if he or I or anybody else could find the regime that would actually thread all of these particular obstacles and come up with a decent conclusion.
I see that my time is up. Josh, over to you.
Joshua Wright: Thanks, Richard. Let me start by saying it is always a pleasure to be on with Richard. And thank you, Elyse, for the introduction.
I think there are a number of places we agree. One, it's a tricky problem, and a tricky problem where I think the attempt to search for what Richard described as an intermediate solution, the demand for that is caused, at least in part, by the fact that the problem's vexing and nobody's offered anything up in the middle so far. So, I thank Richard for that.
I am going to at least try to play my role in describing what I think some of the problems are with the intermediate solution, and we'll see if I can't at least contribute something positive as well. I think the framing here is important. Figuring out what kind of problem is this is an important first step in diagnosing any kind of solution. Is it a competition problem? I think the common carrier apparatus that Richard describes as an intermediate solution, that's what it's built for. It's built for a particular type of competition problem.
Much of my argument is going to be laying out some reasons why I don't think we have that kind of competition problem, and so maybe need some other form of intermediate solution.
It strikes me that this idea that we've got content moderation decisions that leave everybody a little bit unhappy—is it too little? There's not enough here; there's too much there—comes straight from the antitrust and competition world, where you get all sorts of those complaints about all sorts of types of competition. Too much price competition. Too little on quality or innovation. Too much innovation. Not enough price competition. Because competition is messy. It's multi-dimensional, and the way that these firms compete is multi-dimensional as well.
Most of these products are free, as Richard points out, but the product design, the content moderation decisions, all of these bundles of multi-dimensional attributes where social media companies are competing against each other make analyzing any one of them in a vacuum a fairly difficult problem.
To cut to the chase of where I'm coming from: I think there's no doubt that this is a serious problem with high stakes, as Richard said, with errors on each side, whether too much or too little content moderation. The question for me really becomes who has the comparative advantage in reducing those errors or their cost: the government, or the market and private firms?
I'm going to come out in favor of the market as forcefully as I can. I can't help, and this is meant to be a little clever if not cute, but to say that I'm going to find myself agreeing with Richard, but Richard from 2015, when we were on a panel discussion on net neutrality, where he said, and this is quoting, "It's always a desperate mistake to allow hypothetical horror stories to set the intellectual stage for evaluating regulatory proposals."
I don't think that the hypothetical horror story is the content moderation. There are content moderation stories here that are real and not hypothetical. The hypothetical, for me, is the idea that we've got a bunch of entrenched incumbent monopolists running around here such that we need a common carrier regime. So, I'll spend my time really talking about that.
One hint for the audience that we may not have the sort of common carrier infrastructure monopolist setup that sets the stage for duties to deal—be they common law duties to deal, be they antitrust duties to deal, and I'll talk a little bit about each—is that we started off saying we're talking about at least three firms causing these problems. I'll go through some share numbers. It's hard to do social media shares. Are you counting user minutes? Are you counting traffic, or what have you?
That doesn't mean that these firms aren't very large or don't have some form of market power. Of course they're very large, and some of them indeed have some form of market power. But as Richard well knows, judging the monopoly power that a firm holds by looking at the size of the firm alone is a road fraught with all sorts of error.
I come from -- I'm a UCLA economist, taught by Harold Demsetz, who published a famous article called "Two Systems of Belief About Monopoly." One is the government and one is private. He says both can happen, but I'm worried much more about government-induced monopoly, and indeed, the common carrier framework is sort of set up around that. The government grants the exclusive right to the wharf or the bridge or the road or what have you, and you've got a good, durable, sometimes even perpetual, monopolist at the core infrastructure of a bunch of markets. And we said, okay, well, let's obligate that guy to deal, because it gives us competition in other places.
I think that that line of thinking runs into some problems in the social media setting, whether we're thinking about a common-law duty to deal or an antitrust duty to deal. Instead, I think we get a situation that looks a little bit more like imposing neutrality obligations on ISPs as was the net neutrality debate. I think it's much closer to that situation and so fraught with a lot of the same types of problems.
I'll talk a little bit towards the end about what the alternative looks like if we are not letting these firms sort of compete through content moderation. Consumer tastes on content moderation are changing all the time. I think they are certainly changing right now. I think they're changing throughout this election cycle and continue to, and I expect competition among social media platforms to respond to consumer preferences as they change. If they're making poor content moderation decisions, consumers will switch around.
There are a number of these firms. They don't need government permission to enter. Critics of the idea that there is competition in the space sort of shout network effects and walk away as if it's the end of the discussion. But if you look at the numbers when we're talking about these firms -- and, again, I'm not pretending to do a full-blown antitrust market definition analysis here, but something like 20 percent of Americans use Twitter. About 10 percent are daily active users.
When Parler started up, immediately millions and millions of users switched. Now, they ran into a problem, and we could talk about the antitrust bit later. I don't think anybody's alleged a Section 1 antitrust violation yet, so count that as hypothetical horror story number two. But I don't think we've got anybody even alleging tacit collusion or explicit collusion in that setup.
Clubhouse, overnight, six million users. Think about the size of these firms. Snapchat -- I gave you the numbers for Twitter. Snapchat's over 90 million users. That sort of gives a sense of the size of Twitter relative to the rest of the ecosystem.
My own view is -- and I'll be the first to say plenty of these firms are large, but it worries me when I hear arguments that the size is the basis upon which to determine that a company is affected with the public interest. Affected with the public interest in the common-law sense never meant it's really important. It never meant it's really important because it's large and people like it. We allow companies to compete and be successful and get a lot of users out on their own, and that distinction, I think, has always been important. It lies at the heart of our modern antitrust regime.
There is indeed, as Richard mentioned, an antitrust duty to deal that lives in this space too. It requires both a showing of monopoly power and a showing that the firm refusing to deal—the refusal, or kicking somebody off a platform, or whatever content moderation decision upset people—had no business justification. The antitrust rule is not one where you can hold aside the monopoly power debate. Suppose we say all three of these guys, or nine of these guys, or whoever, are all monopolists. I don't think that's right, but let's say we say so. The antitrust duty to deal requires more than that.
The antitrust duty to deal requires a decision made without business justification. I don't think that's the world we're living in. I think content moderation decisions are hard and firms make good decisions and bad decisions, and they internalize the cost of those decisions, and that's the way that markets work. I think, over time -- I'm certainly not a defender of all of these decisions. I've said openly I think many of these platforms -- I've become concerned about censorship of conservative views in particular, and I think that that's a fair concern.
What worries me is reaching for the view (and in my view, it's a little bit of a reach) that size alone justifies this, without a government grant of monopoly or something else that makes the monopoly durable, in a market where there's a lot of innovation and dynamism. The same sorts of concerns that led both Richard and me to reject neutrality provisions in the net neutrality debate, I think, operate here as well.
In my view, it would be a much better course to let these firms compete. New entrants will enter. If they collude to keep somebody out, they'll violate the Sherman Act and that will be a real problem. In my view, the best solution available is the one Richard and I both proposed for net neutrality, which is do nothing. Do nothing in the short run.
I'm open to solutions that look like -- this happened in the net neutrality debate as well, too. If social media firms want to make material promises about content moderation and they break them, the FTC has authority to go after these firms under their standard consumer protection and deception authority. Indeed, they may get more active along those lines.
There's tricky First Amendment issues there too, I understand, but that approach, to me, when I get to the punchline question of who has the comparative advantage in solving content moderation mistakes or reducing the social cost associated with them, it is more likely going to be firms competing against each other in a highly dynamic space. We have examples of firms getting millions of users in a really short period of time. We don't have anybody that looks like they have got anywhere near a share that would trigger what we would normally think about entrenched durable monopoly power. We just don't.
I'm much more concerned with the idea of the alternative. I'll give an example of what a government solution to this looks like. Josh Hawley has a bill. He always does. But Josh Hawley has a bill that would've put these decisions in the hands of a bunch of federal trade commissioners, and I was one of those. I'm the last guy you want voting on what's okay on the internet and what's not. You get a room full of five people who have to raise their hands and say, "I belong to one political party or another," in the hopes of getting political neutrality. It's nonsensical on its face to hand that decision to a bunch of political appointees and bureaucrats.
That scares me. I know that scares Richard too. I know he has many of the same concerns, which is exactly why he's reaching for a sort of middle-ground solution, and I very much appreciate that and understand where it comes from because I'm left with the same uneasiness with the corner solutions.
But sometimes the corner solutions are better than the middle, and my view, in this case, is if these markets are appropriate for a conclusion that they are affected with the public interest, then, goodness, I don't know which markets are not. These markets are important to be sure, but there are certainly markets that are less dynamic, with more market power and durable monopoly power, because they are granted by governments.
It worries me that a collateral effect of this sort of solution is that we get a you-didn't-build-that approach to regulation. All these private companies affect the public interest in some way, and then we're off and running, and we've done something to damage what I think really is a crown jewel of the American economy. Unlike some jurisdictions around the world (and in antitrust particularly this is true), we let people compete. We let them compete. We let them fail. If they become monopolists, we let them charge the monopoly price. We don't let them abuse it. And that's where this debate sits, in that tougher space, to be sure.
I will conclude with the thought that I find myself attracted here to the corner solution. The corner solution here being a little bit more of a feature and less of a bug. I'll stop there. Richard, hopefully I said something to provoke a response. I don't know, but I hope I did.
Richard Epstein: Absolutely provoked a response, and it's exactly the response that I hoped that you would provoke. I'm much more worried about criticisms of my position coming from your corner than coming from the other corner.
I want to reiterate that I have not changed my view on net neutrality from 2015 and so forth. Those cases don't involve content or viewpoint discrimination but simply involve economic decisions about rates of return and so forth. In that context, I think the market solution clearly dominates, and government regulation is going to be downright mischievous.
The difficulty that we have here is, in fact, one of timing, in a strong sense of the word. The phrase you used constantly was the correct one: that, in time, this thing will sort itself out with new entry into the market.
The great question one has to ask is just how much time is enough for this to take place? If you look at the last election cycle, it was clear that there were immediately strong centers of disagreement; that some of the stuff that was kept off the market was an implicit form of defamation, an issue which you didn't raise and which bothers me.
Saying that somebody like myself, who believes that HCQ is a preferable drug to the kinds of treatments hawked by Anthony Fauci, is so far out of the range that we're not even allowed to speak on that stuff: that kind of issue, I think, becomes very, very troublesome, and it's what provokes this sort of response.
I agree with you. If I thought the system was one that had really strong entry characteristics, I would drop this proposal in a nanosecond. One of the things that's so troublesome, as you mentioned, is that you can find networks that will get very large concentrations of viewers very early on, and then you find that there are certain kinds of upstream operations that keep these guys from getting a foothold inside the marketplace.
I think Parler went for a long time, and it may still be, without any kind of internet home, because it hasn't been able to find a place in which it could operate. I'm not saying that I would subscribe to it, or care what it says, and so forth. Those are not the issues. The issue seems to be that this stuff has become a little bit more tenacious.
The other thing I think is when you start dealing with stuff in which it does skew the political debate. The question is do you need a monopoly to do this? I'm going to answer with the following kind of way. Of course, it bothers me as much as it bothers you. This is a situation where we share our anxieties, as it were.
When you started to talk about the common carrier situation, it was not, from the very beginning, only about legal monopolies. It applied to natural monopolies, like the only harbor in town, where there could be only one firm to load and unload ships. That was the original case in which this doctrine was mentioned.
But as we start to deal with the antitrust law, the way you would describe that is a situation in which the Herfindahl Index was one. It turns out, as Josh knows and as I know, that you're worried about situations of increased concentration even though you don't end up at one firm. I think the kind of equilibrium position we have today is that four-to-three mergers are probably okay; three-to-two mergers are not; five-to-four mergers are almost always not something that you're worried about, because the only concern in these cases is the difference between the competitive price on the one hand and the monopoly price on the other, and that gets small enough that, after a while, the administrative costs override it.
But in this particular field, it turns out it's just a little bit more complicated. You don't have a monopoly, but you've got four or five major carriers in this business, and their combined shares kind of look like you're in Herfindahl land with a little bit of trouble.
And then you have, also, the kind of collusion feature that's involved in this case, so that if all of these companies got together with the same views and kept everybody out, I don't think the new-entry character would work. That would be the case if you could establish that Google, Apple, Twitter, Facebook, and so forth basically had a common policy, organized, essentially, in some kind of covert way.
What bothers me terribly about my own solution—because I think I'm supposed to speak out of both sides of my mouth on this—is I just don't know who to put in charge of this kind of thing, which is why I end it in the way in which I do. The thought that somehow or another it's going to be the Biden administration which is going to come to the rescue of competitive markets when, in fact, its instincts are going to go exactly in the opposite direction means that I suspect that most of the proposals that would come forward are proposals that I would strongly oppose, and I would align myself, in the end, with respect to Josh on this thing. Which is why it was that I said some kind of self-governance would be there.
Now, let me give you what I thought may be, and I'm cautious about this, a common response. One of the things that Facebook did, and I think they tended to try to play this a little bit more sensibly than the other companies, is to say: if you get a serious issue, like keeping political candidates off a major source in the middle of an election, we're going to have this independent board, and we'll try to create our own quasi-judicial system.
It's not perfect, and in fact, if it is conscientious, it might be a better solution than anything else that I'd want, so let me now endorse, in this weird discussion, Josh's position. It may well be that the situation that you need here is not one of full entry. That may be very difficult to achieve. But what you need to do is to have some sort of institutional arrangements that soften the blow and give you some confidence that the politicos are not making these particular choices.
If we don't do that, the calls for regulation are going to get worse, and I fear that we will do something very dangerous under these circumstances. With the common carrier kind of rule and the nondiscrimination rule on misinformation, what you're trying to do is soften that. But Josh is right. I'm saying it again. You can soften it, but are you going to have an administrative remedy? A judicial remedy? An arbitral remedy? These are all immense difficulties that you have to work out.
At this particular point, I put this forward as a possible meeting ground around which you could see constructive proposals. Do I think that they're going to take place in the current political environment? Well, I'll just make one sad political comment. The age of unity between the red and the blue in this administration lasted for one hour. That is, you had an inaugural address after which you had a completely partisan Democratic party announcing that it was going to go it alone.
My view is if that's the attitude that's going to be taken towards the various kinds of regulation, then, in effect, I would agree with Josh, and I would want to say no thank you. What you're trying to do is to figure out how you could get moderate political institutions to institute moderate kinds of arrangements, and if it turns out that that is impossible, and we have to slide to one corner or to the other, I'm going to slide to Josh's corner.
But I would hope that we could try to get some kind of debate in the interim where people would be candidly aware of the pros and cons, in an effort to put in place some kind of institutional arrangements which would stop the horror stories on the other side: major outlets systematically barred from reaching customers because they put forward views or advance candidates which are not to the platforms' liking. The platforms are strong enough for that to be a concern.
The point is that strong enough to be a concern is not necessarily strong enough to define a solution. As somebody who has spent all too much time in administrative law, I know that huge body of incoherent doctrine often drives a wedge between a serious problem on the one hand and a sensible solution on the other.
I will now let Josh push me again to the corner. I will hold my peace until the questions.
Joshua Wright: I will be brief so we can get to questions. You will always be welcome in my corner, Richard, when and if you decide to come over.
I think we agree that the slower the firms we are talking about are to respond to some of these problems in their own way—and I think you're seeing different approaches from different companies, as you would expect in a market where there's some competition—the more likely we are to get a political response that hands this over to an agency of some sort or creates some other form of political or legal apparatus to do the decision making.
I worry about that in the same way that you do, and I hope for more attempts at marketplace responses by those firms, and I expect that those will continue to come. Will they come fast enough? Faster than the politics moves? My view on how quickly political movement happens has changed over the last couple of years, so I don't have a strong bet about how fast we will see those responses, but I am very worried about imposing duties to deal via antitrust or otherwise.
I think it was Scalia in Trinko, the antitrust decision, who said we shouldn't impose a duty to deal that we can't reasonably supervise. I think that is the kind of duty to deal we're talking about here, whether the "we" is the FTC or some Article III judge or an administrative agency or what have you.
I think that leaves me in the corner hoping that we see a response quickly and competitively to deal with some of these concerns, from what amounts to about half of the market, at least. I am hopeful for that. In the back of my mind, floating around, is the other concern: I worry very much, as most students of the history of regulation do, that attempts to find solutions to a competition problem via a regulatory apparatus often make the competition problem worse, not better, by entrenching these interests.
You see a lot of that debate around 230, and you saw a lot of it around net neutrality and consumer protection regulation for social media as well, and I think those concerns are well founded. The economies of scale associated with being able to handle some of those regulatory costs are more likely to be found in already-large, established firms.
So, I worry a great deal that most of the attempts to find intermediate solutions, as long as they involve some sort of barrier to entry or duty to deal apparatus, end up making things a little bit worse on the competition side rather than better. I think the history of regulatory attempts with duties to deal, certainly in this country, tell that story and are one of the reasons I find myself not totally comfortable with my corner but comfortable enough to stay.
I will stop there.
Richard Epstein: I admire your comfort. Let me just agree with you on one point. Again, on the straight economics stuff. The Trinko decision, I thought, was one of Justice Scalia's better efforts to see how badly things could go. The effort of Judge Koh in the Qualcomm case to basically treat that exception as though it no longer existed would've been catastrophic insofar as it would've resulted in forcing you to deal with your direct competitors as opposed to complementary products and setting prices in a world market.
What Josh is talking about is no joke, and that's an issue here. The question is whether you could find a way in this particular area to sort of moderate it because, unlike the other cases, the issue of truth and falsity here is extremely important. What you have with these carriers is that they're taking substantive positions and claiming, in effect, that they're neutral arbiters when they're not. That results in a kind of defamation for which 230 tends to be a bar.
There's just a lot of issues that aren't quite isomorphic with the economic cases. It doesn't mean that the solutions are easy, and, as I said in the Wall Street Journal piece, if you could establish firm entry, which is what Josh prefers, then I think, in effect, that's clearly the dominant solution.
We should take questions, Elyse?
Elyse Dorsey: Yes, absolutely. Thank you both so much for this really interesting kickoff. I think you've identified quite a lot of legal hurdles and upcoming inflection points.
One of the questions we have in the Q&A, to start with—I'm going to paraphrase a little bit—is basically: could a common carrier strategy deal with other types of issues we're seeing come up, like harassment towards women or other minority groups? Some of the discussion has focused on how it could potentially address political speech, but could it address these as well?
Richard Epstein: That was actually a historical issue. When they started with the railroads, they did create women's cars for women, children, and their guardians, and the explanation was very clear: they had these cigar-chomping, tobacco-spitting guys terrorizing these people. A common carrier would be under a duty to provide reasonable and safe transportation, so this was generally regarded as an appropriate accommodation, and it would certainly be true here.
The great problem you have is exactly how you treat this speech. Remember, there's no direct physical intervention in these cases, but I certainly think that if there were a direct call to violence to a mob standing outside, you probably could shut it down, just as you could shut it down if they were not a common carrier but were standing there. This is an issue which is going to be involved in the huge debate we have over the Trump impeachment, and you can see just how dangerous these charges of insurrection are when you actually compare them to the record.
I don't know, Josh, is that your view or not?
Joshua Wright: I don't really have a strong view here, to be honest. I am sure you know the history of common carrier regulation better than I do.
Richard Epstein: It's one of my occupational hazards, yes.
Joshua Wright: Not the only area that you know better than I do, but the closer you get me to monopoly and antitrust, the more comfortable I get. I do think I certainly agree that, like these other problems on the social media platforms, whether it is misinformation or bias or harassment of women or the like, I do think—and I'll sort of go back to where we started—I do think the overall question becomes who's got the comparative advantage in addressing those problems?
That's not to deny any of them are problems. There are plenty of problems on Twitter or Facebook or any of these places, but the question becomes who's got the comparative advantage in solving them? And can we solve them fast enough? I do think that it is very much the case that for some of these issues, they've been vexing for long enough that people are left looking for someone else to solve them because the firms aren't.
Richard Epstein: Let me mention one thing about the discrimination cases, at least. These were accommodations introduced by the railroads, and the question is: would that violate the nondiscrimination rule? The upshot is that, just as with every other norm, there are justifications for deviations from it.
To give you a modern illustration, we have an age discrimination norm and a sickness discrimination norm. Is that something which says that Mr. Cuomo should be allowed to force sick people back into nursing homes and take away their rights? No. This is a case in which the peril to health was so great that that particular nondiscrimination norm ought to be overridden.
Remember, that was government forcing it upon you, as opposed to the other situation, where it was self-imposed by a private party. Indeed, even Plessy v. Ferguson, another common carrier case, was collusive litigation, because the railroad didn't want the racial discrimination. It was the state that imposed it, which is very different from what it was in the women's case.
Elyse Dorsey: Great. Sean Ross, I think I see you have your hand up. We can get our tech person to unmute you and you can go ahead and ask.
Sean Ross[sp]: Hello. Yes. I want to pick up on Josh's invocation of net neutrality and bring in Amazon Web Services, which is essentially, in its broadest sense, a service to access the market, the market being the internet. I want to bear in mind what Milton Friedman says: that economic liberty can secure civil liberties.
It's a miracle to me that Twitter isn't worth one-quarter of what it was six months ago. Microblogging is an extremely brittle business because there are no walls around your customers. They can leave in an hour. Twitter only has 36 million U.S. users. Parler was grabbing a million a day. We're talking about weeks away from disaster.
Parler did allege a Section 1 violation in its lawsuit talking about how AWS was inordinately interested in whether Trump was going to sign up for Parler. That could've been the nail in the coffin. Twitter was worth $10 billion in 2016. It's worth about $50 billion now, and a lot of people signed up just to see what Trump was tweeting and then started using Twitter for other reasons.
My question is, look, would we have this conversation if Parler was still up and Trump was still tweeting? Would anybody care really about what Twitter's content moderation was? Why don't we just have a net neutrality rule for --
Richard Epstein: Josh said no. I think we would not care. But remember, Parler was shut down in this strange way because of the App Store. Josh?
Joshua Wright: Yeah, let me take a couple of pieces of that. I find myself in the odd position that I do agree if Parler was up and Trump was tweeting, people would care a lot less about this.
But also, the Section 1 claim there—I'm going to use a technical antitrust legal term—is total nonsense. The Section 1 claim was not the one Richard's talking about, where you've got competitors agreeing. The contract or conspiracy alleged to violate Section 1 was the infrastructure arrangement between Parler and AWS. It was a vertical deal. They were supplying services.
It was not -- I agree -- I think Richard and I both agree. If we saw an agreement between Google, Facebook, Amazon, and Apple to simultaneously boycott, that's a Section 1 case that is a real problem. We could come up with the hypothetical horror story, I think; if any of my law students are watching, it probably will be next year's final exam.
But that's not what that case was. That case said the Section 1 agreement wasn't even an agreement between rivals; it was just the service contract between AWS and Parler. They said, well, that thing is a Section 1 conspiracy. No, it's not. And two, the defense—they'll litigate this thing and we'll see what happens—the defense was that Parler violated the contract. I don't know what the contract says, other than what we'll read in discovery, but this is not the stuff from which real antitrust cases are made. If that thing doesn't get dismissed at the Twombly stage, I'll buy you a beer.
Richard Epstein: I think he's right about that. The other point, of course, to remember is that the contract remedy may well, in fact, be viable. One of the things that happens with respect to all of these things, including the stuff that you see when you're talking about standard essential patents and so forth, is that if you decide to yank one of these things, it's not a Sherman Act violation, I think, under recent law, but it may be a breach of contract given the kinds of complicated obligations that you have.
But that reinforces -- I'm on both sides of this thing, so now I'm with Josh again. To the extent that you've actually demonstrated a contractual remedy that's trying to address the holdout problem in a constructive fashion, it's a case for saying keep the antitrust laws out.
Elyse Dorsey: We also have a couple more questions in the Q&A section relating back to Section 230. It seems like they're along the lines of: maybe there's a tweak to Section 230 that might help get at some of this.
One of the suggestions is potentially only providing immunity for moderation decisions that are in line with the First Amendment, or perhaps removing the catchall "otherwise objectionable" provision in Section 230. Any thoughts on tweaks there? Should that be part of the conversation?
Richard Epstein: Yeah. What that does, by the way, is make the market power issue almost irrelevant. But the basic structure you have is that 230 is dealing with the modern lending library. The old lending libraries would get a lot of books in and lend them out, and some of them would contain defamatory content. The rule was that the author could be held liable, but the lending library could not.
These websites say, "We're just a lending library. This stuff comes up." Then, of course, the changes happen. What happened is that somebody gives them notice of the fact that the thing is defamatory. In a library, you'd have to take it off the shelf. Then, what they do is they say, "I'm not even going to look at this particular book. Anything that Mr. Trump writes is going to be defamatory, so we can take that off the shelves right now."
At that point, I'm not even sure that the current immunities that you get under 230 should apply because these guys start looking like editors. Now, we're back into Josh's unhappy world where some court's going to have to decide whether or not these content things are defamatory or not.
I think you don't even need to change the statute. I think when you start getting these programmatic statements, it's so crazy. One of the things you could say about Donald Trump is: we're going to shut you down when you start to say "storm the White House"—I'm making that up—in one way or another. But to say you're going to shut him down for life, even after he's out of office, seems wildly disproportionate, and the question, then, would be whether there's a private right of action under some system of defamation, given the way this thing has been done, for which the immunity isn't there. I'll leave it to Josh to answer the defamation question.
Joshua Wright: Twitter is where I go to find bad takes on 230, so it's been a constant supply of enjoyment, and I've learned from watching that on Twitter that I don't pretend to have expertise in areas where I don't, and 230 is one of them.
Richard Epstein: Join the crowd.
Joshua Wright: Yeah. 230 is certainly one of them. Obviously, I think that's an issue where there's going to be a lot of discussion and potential legislative action and what not, but I have not studied 230 closely enough to know what tweaks would be helpful or harmful.
Like most instances, I'm worried that a lot of the complaints about 230 are really about the First Amendment, and I'm not sure what they're going to fix, but I sure do know a lot of the tweaks could make things worse. I don't have much to say about 230 other than that.
Richard Epstein: Where are we now, Elyse?
Elyse Dorsey: All right. We have a couple minutes left. We have one more hand raised, Matt Preston. If you want to ask a quick final question, go ahead.
Matt Preston: Hi, there. Thank you both for your time. I'm wondering, Mr. Wright—you were discussing proving collusion as the litmus test for whether we should be worrying about viewpoint discrimination, and I apologize if that's a mischaracterization of what you were saying.
I'm thinking of historical abolitionists like Frederick Douglass, who had to move out of the U.S. to find their voice and had to come up with alternative mediums to be able to get their voice out. Many people with those minority viewpoints were silenced during that time. Here we are now, 230 years later, pretty much still trying to fix the lack of equity through the Black Lives Matter movement and stuff. We have, like Judge Epstein mentioned --
Richard Epstein: Judge?
Matt Preston: Oh, sorry. Mr. Epstein.
Richard Epstein: I love it.
Matt Preston: The quasi-judicial mechanism. Facebook created their supreme court, as some people call it, arguably determining what should and should not qualify as freedom of speech, if you assume that they have market power here.
I guess what I'm wondering about, empirically, is the funneling of content from behind closed doors—directing certain content to their users—which I think is where a lot of the real concern comes from. There seems to be a tension in the different ways to regulate.
I guess I'm wondering what you think of an alternative where social media platforms can opt in to one of two regimes: maybe a default common carrier status, with no liability and no ability to ban users; or, in the alternative, 230 regulation, where Congress loosens the bounds of liability but then leaves it to an agency or the legislature to define under what circumstances the social media platforms are allowed to regulate others. That way, we don't end up in more Parler situations, and minority viewpoints that arguably could lead to a better society on very debatable topics have a way to voice their ideas.
Richard Epstein: I'll take one crack at it, and then Josh can go. This is just another version of what Josh defined as the durability question. If you think these guys are kind of rigid in some way or another, and the real nightmare scenario is that, yes, we have four guys now and we'll have a different four guys tomorrow, but they'll still have the same political orientation, then you're going to be tempted to do this.
I think -- Josh, I would ask you, too -- if you decide to move to getting rid of 230 and going to the defamation model, you're going to have a lot of the same kinds of judgments that lead you and Josh and me to be very uneasy about the antitrust regime. It's exactly the same kind of question of what misinformation means, broadly conceived.
Josh, your views? Do you agree, disagree?
Joshua Wright: I agree. I do think part of this turns on the durability problem. From my perspective, a lot of what I see in public debate, in conservative circles in particular, plays a little bit fast and loose with that part of the problem. It quickly becomes it's all our public square. We didn't build the public square. Private companies did. And that distinction has always mattered and continues to matter.
I agree that it's not an absolute distinction, but it's an important one. And it's an important one for thinking about durability of monopoly power, thinking hard about whether there are barriers to entry and what they look like. I do not think the discussion is well served by folks shouting that the companies are now our public square because they're big and important, which they are. All of a sudden, we sort of reflexively hit a duty to deal.
To the extent that we take the argument that any company that's commercially successful relies on the roads to get there, or is built on an internet in which the government had a hand, all of this becomes a sufficient condition for vesting the economy with a public interest, which means duties to deal everywhere. That's sort of fun and hyperbolic, a little bit of a caricature, but not much, in terms of some of this conversation.
My view is we have tools to think hard and rigorously about what real monopoly power is. We use them in antitrust all the time. We get good results sometimes. We get bad results sometimes. But we have a toolkit to think about this stuff, and I think it belongs in the conversation. I think where that toolkit takes us in this conversation is you've got lots of features that lead me to think, with my economist hat on, that you're going to have innovation and dynamism and response to consumer preferences that are changing really fast.
Again, politically, fast enough to fend off some of this stuff? I don't know.
Richard Epstein: He doesn't know why.
Joshua Wright: They don't know either. If I don't know, my money—to end back in my corner—is on firms with a profit motive responding to this stuff in a way that makes more sense than letting a bunch of FTC commissioners do it.
Richard Epstein: Yes. What he's doing is he's basically taking Aaron Director's great contribution, which is laissez-faire means the presumption against government regulation, and Josh thinks it hasn't been overcome. I think it may or may not have been overcome, but I'm certainly much closer to Josh than to many of the strident critics on the other side.
I think our time is up, isn't it, Nathan?
Nate Kaczmarek: Yes, it is. As expected, an insightful and enjoyable discussion. We're really grateful to Elyse, Josh, and Richard for your comments today.
Just a quick note for our audience. If you do want some Section 230 discussion, we've got a program, a webinar, this Friday coming up at noon. The event is titled "Shapers of Cyber Speech—Silicon Valley and American Discourse." That event will feature Stewart Baker, Neil Chilson, and Billy Easley.
Again, we welcome feedback by email at email@example.com. Thanks for a wonderful program. Have a great day.