Social Media Oversight: The Debate Over Regulation and Antitrust Enforcement on Tech Titans


Russian interference in the 2016 election, Facebook and Cambridge Analytica, and claims of political bias in banning users and restricting content have all led to calls for regulation and antitrust enforcement against the preeminent social media platforms.  Tech titan executives are making regular trips to Capitol Hill to explain the actions of their companies.  Are Facebook, Twitter, and Google in need of greater government oversight?  If so, what type of regulation is warranted?  Our panel of experts will answer these questions and offer their views on what we can expect next in the tug of war between the politicians and the internet giants.

Featuring:

Neil Chilson, Senior Research Fellow for Technology and Innovation, Charles Koch Institute

Prof. Thomas W. Hazlett, H.H. Macaulay Endowed Professor of Economics, Clemson College of Business

Prof. Jamil N. Jaffer, Adjunct Professor, NSI Founder, and Director, National Security Law & Policy Program, Antonin Scalia Law School, George Mason University

Paul Rosenzweig, Principal, Red Branch Law & Consulting PLLC

Megan Stifel, Nonresident Senior Fellow, Cyber Statecraft Initiative, Atlantic Council

Moderator: Matthew R. A. Heiman, Visiting Fellow, National Security Institute, Antonin Scalia Law School, George Mason University

Teleforum calls are open to all dues-paying members of the Federalist Society. To become a member, sign up here. As a member, you should receive email announcements of upcoming Teleforum calls which contain the conference call phone number. If you are not receiving those email announcements, please contact us at 202-822-8138.

Event Transcript

Operator:  Welcome to The Federalist Society's Practice Group Podcast. The following podcast, hosted by The Federalist Society's Regulatory Transparency Project and the Corporations, Securities, & Antitrust and Telecommunications & Electronic Media Practice Groups, was recorded on Monday, September 17, 2018 during a live teleforum conference call held exclusively for Federalist Society members.        

 

Wesley Hodges:  Welcome to The Federalist Society's teleforum conference call. This afternoon's topic is Social Media Oversight: The Debate Over Regulation and Antitrust Enforcement on Tech Titans and is hosted by our Regulatory Transparency Project and our practice groups, specifically the Corporations, Securities, & Antitrust and Telecommunications & Electronic Media Practice Groups. My name is Wesley Hodges, and I'm the Associate Director of Practice Groups at The Federalist Society. 

 

      As always, please note that all expressions of opinion are those of the experts on today's call.

 

      Today we are very fortunate to have with us an accomplished panel of experts and moderating them today is Mr. Matthew R. A. Heiman, who is a Visiting Fellow at the National Security Institute at the Antonin Scalia Law School at George Mason University. After our speakers give their remarks today, we will move to an audience Q&A, so please keep in mind what questions you have for this subject or for one or several of our speakers. Thank you very much for speaking with us. Matthew, the floor is yours.

 

Matthew R. A. Heiman:  Thanks, Wesley. And I'm going to say thanks in advance to our speakers, whom I'm going to announce in just a moment. I wanted to do a quick plug for the Regulatory Transparency Project, and in particular the Cyber & Privacy Working Group; all of our speakers today are members of that group. As the name would suggest, our work is really focused on regulation and proposed regulation that touches upon the cyber economy, as we think about it, and privacy issues. And if you're interested in the work that we've done to date, you can go to RegProject.org. That's R-E-G Project.org, and then you can see the various working groups, and if you want to look at what we've done, click on Cyber & Privacy. We've got a white paper there that was published last year. We've got a number of shorter papers on topics related to privacy, and I believe you can also see links to the various podcasts that we've done over the last year and a half or so on these topics. So we'd encourage you to check that out if this is of interest.

 

      And so with that, I want to quickly introduce our panelists. I'm going to give very short introductions, but if you want to know more about their backgrounds, you can go to RegProject.org and look them up one by one and see their full bios. But in order of -- in alphabetical order we have Neil Chilson, who's a Senior Research Fellow at the Charles Koch Institute. Prior to that, Neil held a couple of roles at the Federal Trade Commission, including Chief Technologist. Neil, welcome to the call.

 

Neil Chilson:  Glad to be here.

 

Matthew R. A. Heiman:  We have Professor Tom Hazlett. He's the H.H. Macaulay Endowed Chair in Economics at Clemson University. Prior to being in academia, Tom was the Chief Economist at the Federal Communications Commission. Tom, welcome to the call.

 

Prof. Thomas W. Hazlett:  Thanks for having me.

 

Matthew R. A. Heiman:  We've got Paul Rosenzweig. Paul is the founder of Red Branch Consulting. He's also a Senior Advisor at the Chertoff Group. He also teaches at George Washington University, and he was the Deputy Assistant Secretary for Policy at the Department of Homeland Security. Paul, thanks for being with us.

 

Paul Rosenzweig:  Glad to be here.

 

Matthew R. A. Heiman:  And last but certainly not least, Megan Stifel, who is a Nonresident Senior Fellow at the Cyber Statecraft Initiative at the Atlantic Council. She's also the founder of Silicon Harbor Consultants. And in public service, Megan did a stint at the NSC, where she headed up cyber issues, and also held a couple of roles at the National Security Division and elsewhere within the Department of Justice. Megan, welcome to the call.

 

Megan Stifel:  Glad to be here. Thanks for having me.

 

Matthew R. A. Heiman:  And then, finally, is Jamil Jaffer, so I am going out of alphabetical order. Jamil is with us. He's the Founder of the National Security Institute at George Mason University's Antonin Scalia Law School. He's a Visiting Fellow at the Hoover Institution. He's also the Vice President at IronNet, which is a cybersecurity consultancy. In the public sector, he was Chief Counsel at the Senate Foreign Relations Committee. He also held positions at the Department of Justice and the White House. Jamil, welcome to the call.

 

Prof. Jamil N. Jaffer:  Thanks, Matthew. Good to be on.

 

Matthew R. A. Heiman:  So that's our lineup. Why don't we begin with the call as it's been titled? There have been calls of various kinds for regulation of social media platforms, and whether we're talking about the kerfuffle with Cambridge Analytica and Facebook, or Russian interference with elections, or partisans on the left or right complaining that Facebook, Twitter, YouTube, whomever, is not being fair to their point of view and is banning them or suspending them or curtailing content, the answer seems, repeatedly, to be that there should be further regulation of these tech platforms because they're taking over our lives. They are so central to our lives that we can't trust the individual management of these platforms to just operate them without some level of regulation, including some calls for antitrust regulation, which could take the form of breakups.

 

      And so I wanted to get from the panel their views on A) should there be some regulation and B) if there should be, what forms should it take? So why don’t we go in reverse alphabetical order, which means we'll start with Megan. Megan, what's your view? Is there something that needs to be done? We've got the executives trooping up to Capitol Hill seemingly every week to talk about what they are or are not doing. Is there something more that the government should be doing to make these platforms fair or more open, less prone to manipulation by enemy actors that want to disrupt elections?

 

Megan Stifel:  Thanks, Matthew. As I think most of my colleagues would say, I don't think there's a quick yes or no answer to the question. I think ultimately what we're seeing at this point is, at a minimum, the appropriate response, which is that we are examining what the current situation is: gathering facts, looking at whether existing authority is sufficient, and interviewing and having folks testify. Certainly, I think more of that needs to go on. Of course, it can't go on forever, because we do have elections upcoming, where we need to be thinking about the future of where we need to head in this space.

 

      But I think what's also important is not to rush to hit the easy button and say, "Okay, let's regulate," because I think, as you highlighted in your opening, thinking about using a regulatory tool in this space is a challenge. We, at the extreme, run the risk of becoming exactly like the countries about which we are most concerned, those being authoritarian states who think that speech online is subject to regulation because, should it threaten the regime, it's therefore inconsistent with the existence of the state. And that's certainly something that our country's First Amendment protects us from in most instances.

 

      I think further examination is certainly necessary, and I would wrap up by saying I think it's clear that the companies that have been testifying have quickly identified actions that they could take to try and address some of the concerns that have been raised over the past year. And I think it's unfortunate that it's taken this long for these companies to recognize that they do have a role to play, whether it's formally a duty of care or informally a duty of care—I think some of my other colleagues can speak to that a little bit better—but certainly we need to be thinking more closely about where we expect, as a society, these large players to extend their business operations and how that impacts us as a society.

 

Matthew R. A. Heiman:  Thanks, Megan. Paul, do these companies have a duty of care? Should we be thinking about them that way? Is there something else that we should be doing from a regulatory perspective, whether it means a heavier hand by the FTC or other actors?

 

Paul Rosenzweig:  Well, let's start, I think, by acknowledging that this is a classic "wicked" problem. We certainly don't want the government in the business of content moderation, in the business of saying who gets to say what on the web. That's kind of -- we would win the battle against electoral interference, for example, but we'd lose the war by giving up free speech. So the question is whether or not there are ways that the government can encourage the private-sector actors who've set up these new fora for discussion to exercise their responsibility to moderate those forums, to some degree, in a way that is consistent with First Amendment values and is also consistent with the need to do something protective in the domain so that it doesn't continue to be a free-fire zone where election-related content on social media goes all hither and yon.

 

      There isn't a lot of empty space there that actually is suitable for government efforts. The few options I can think of mostly involve attempting to identify more rigorously those who are speaking inside the domain who should not legitimately be permitted to speak. I'm thinking mostly of artificial bots. I would think that nothing in that regard—no regulation and/or no standard setting—would trench upon First Amendment principles. So I would look to something like that. Maybe excluding foreigners from discussions of election activity would be another, but we're not going to be successful if we decide that anything other than limited, narrow shots is available, because we just don't want either the government or Facebook moderating the content of what happens on the network.

 

Matthew R. A. Heiman: Thanks, Paul. And, Jamil, do you take a similar point of view? Is it something where we could say, "Well, bots don't have First Amendment rights and they're not entitled to a voice, but others, you know, humans do?" Is there any sort of regulation that you think is appropriate, or would you rather see further private ordering by the private sector to allow it to work out these issues?

 

Prof. Jamil N. Jaffer:  Well, it strikes me that typically when we think of regulation, we think of regulation being appropriate in a case where the market's not succeeding, where the market is failing. But one of the things you have to assume about a market, and for a market to succeed, is full information. And we know for a fact that in the marketplace around these issues, the content regulation space, there's not full information, right? The companies aren't necessarily aware of everything the government's aware of in terms of who's taking these actions, the nation states behind them, the intent behind them, and the like. And so I think before we jump to regulation—and that regulation can come in a variety of ways, whether it's the FTC or the SEC or antitrust enforcement actions by the Justice Department or the FTC itself—there's a variety of things that might be done.

 

      But before we do any of those things, we have to give the market a chance to function, and the market can only function with full information. So the government has a responsibility, I think, to share with the companies information about what it knows about what's happening in this space, and to the extent that information is classified, it should find mechanisms for sharing that information with the companies in a classified setting. And ultimately, the companies have a responsibility, I think, to share information with the government also about what they're seeing and what they're detecting, in order to allow the nation to use its own authorities to take action as it sees fit.

 

      So all that is a long way of saying that, to be sure, there are real problems in this space, and they're broad problems, faced not just by the big companies but by smaller companies as well. And so, before we jump to the extremely strong medicine of antitrust regulation, antitrust enforcement—you know, breaking up companies that have been wildly innovative and have moved the ball forward tremendously in terms of technological innovation in our country—before going after the engine of our modern economy out in Silicon Valley with antitrust orders, it strikes me that the first step ought to be the government trying to work with industry to get it right in the first instance. And that's true of speech regulation and the like too; it's not just limited to antitrust enforcement. Before we start passing laws about what can and can't be said online and going down that road, it seems to me that working with industry to solve the problem through sharing of information is a first critical step.

 

Matthew R. A. Heiman:  Thanks, Jamil. And as we talk about markets or imperfect markets, we are fortunate to have an economist on the line with us, Professor Hazlett. Tom, is it a case where the market is imperfect and there are some things the government could do to share further information with these companies to help make it a more efficient market? What's your take?

 

Prof. Thomas W. Hazlett:  Well, of course the market's imperfect, but so is life. The question is not whether or not there's a perfect market with completely full information; it's whether or not there's a plan to actually improve upon the situation with some new set of rules. And nobody has such a plan right now. We have to be very careful. We've had a lot of bad plans that we've actually put into this space—the public square of electronic communications—and we have a lot of those results in. And so we have to be very careful about thinking that we can, for example, impose a fairness doctrine. And Paul, I think, was dead on when he said we don't want content regulation. You know, the record is so poor on that in terms of suppressing free speech and, in fact, chilling open discussion of public affairs. And it's not just the fairness doctrine; it's the whole panoply of broadcast television regulation through licensing and through government oversight, going back literally to the 1927 Radio Act. We've seen all kinds of free speech suppressed on both sides, or all sides I should say.

 

      Now, in terms of competition and this sort of antitrust, public utility idea that the tech titans need to be broken up—again, we have laws in place. We've certainly had laws in place since 1890 that make it a violation of the Sherman Act to monopolize or even attempt to monopolize a market. The fact is that those laws are in place, we have antitrust regulators at multiple federal agencies and in the states as well, and we have civil commercial litigation that can take place. And in fact, some cases have attacked some of the tech titans, including Apple, and certainly in Europe we've had cases against Google. A lot of these cases are not good cases. And in fact, the sort of standard case that some people are romantically recalling is U.S. v. Microsoft, 20 years ago in 1998. That was a case against Microsoft for, in essence, predatory conduct, trying to monopolize operating system software in the old PC market—and I say the old PC market because things have really gone in a different direction, and not thanks to the case the government brought. There's been a real opening of that market through competition, the entry of mobile operators, and the importance of search engines. And, in fact, the attack on Microsoft proved not to be so worthwhile and was a real diversion from the competitive forces that were going to, in fact, improve consumer welfare.

 

      Overall, we really want to encourage competition, and certainly the antitrust laws are there to assist. But the burden of proof should be that the case—the remedy—is going to improve the consumer's position: lowering prices, improving quality, expanding output in the market. The cases that are actually being suggested are furthered on the grounds that they're going to stop companies like Amazon from lowering prices. That's very anti-consumer. It's going in the wrong direction.

 

Matthew R. A. Heiman:  Thanks, Tom. And, Neil, you came from one of the primary actors in the regulatory landscape when it comes to the cyber economy, the Federal Trade Commission. What's your take? Is there something more that the FTC can or should be doing, and if so, where would it be?

 

Neil Chilson:  Well, thanks for having me on. It's a great question. I would say that the key thing we need to do here is disentangle some of what the problems are. You know, in the lineup here we talked about Cambridge Analytica, which is more or less a data security issue. We talked about bias, we talked about elections—these are all really different problems, and we should try to find tools that actually work to solve those particular problems. For example, using antitrust as a way to address something like bias online is a really indirect tool, probably not well-suited for that, and in fact it probably could cause a lot of harm, even if you assume that bias is a problem. What we're learning is that content moderation on these platforms is very difficult. It takes a lot of manpower to even start to broach the problem, and you're still going to have a lot of problems in making distinctions online when you're trying to do this. And so it may be the kind of thing that only a big company could do—certainly at the scale that Facebook and Google are trying to do it. But it raises a whole bunch of issues.

 

      So that's just one example of the fact that whatever tools we might apply to deal with these problems, we need to make sure they're actually effective in doing that. I say that because I'm somewhat concerned that a lot of these news stories are more the hook for people expressing some other policy preferences that they have, and they're using this opportunity to move those forward. So from the left, I think, in many cases we have a general discomfort with big companies. We have people who are privacy hawks who don't want companies to use data. And I think on the right we have people who resent the idea that the leadership of these companies or their workforce might have a different political orientation and may have supported different candidates in the past. And I think those underlying motivations help explain why there's such a mismatch between the problems that we're seeing and the tools that people are suggesting we use to solve them.

 

      And the FTC does have a pretty good record in the data security space. It's brought more than 50 data security enforcement actions, including cases against some of the major players online, such as Facebook and Google. And those cases weren't antitrust cases; they were consumer protection cases. But they got at some of the same concerns that I hear swirling around in the big complaints about these big platforms.

 

      So I do think that there are tools to be used to address some of these problems. I think that we need to be very specific about which problem we're trying to address. And we should be really careful to not let our motivations -- to be clear on what our motivations are when we're trying to apply these tools to these big companies.

 

Matthew R. A. Heiman:  So let me—thanks, Neil—let me play devil's advocate, and this will be sort of a jump-ball question for whoever wants to take it. We've got a massive industry that, as Jamil has pointed out, has really been the engine of economic growth over the last 10, 20, 30 years. We've got an industry that is, compared to any other industry of its size, whether you're talking about the energy industry or the transportation industry, very lightly regulated. We've got a public that's very concerned about the use of data, the interference in the election process by foreign state actors, and what the industry is doing about it. And then you've got politicians that are, for either principled reasons or political reasons, pretty agitated about what's happened on these large platforms. If we think about how this plays out, is there something that should be done to head off an even more dramatic -- you know, if the next story is even more dramatic—there's some terrific misuse of data, or the Chinese get involved in disrupting the midterms—is there something pragmatic that could be done to both lessen the risk going forward of bad acts by the tech companies, while at the same time heading off some really bad legislation? In other words, is it sustainable to just say, "Let the private sector sort itself out because that's the best answer, and anyone that calls for regulation is all wet"?

 

Paul Rosenzweig:  Well, this is Paul. I'll jump in. I think that's exactly the problem, which is that for far too long, the tech community has pretty much had its head in the sand about regulation, and it's only now coming out. They've started doing better. The testimony the other day in the Senate, in which both Facebook and Twitter seemed to acknowledge their role and realize that they had to find better ways to do it, is a good step. They haven't really stepped up the game that much, though there was a report today about some new steps Facebook was taking. If it were me and I were trying to do good things and forestall regulation, I would be convening a tech community-wide standard-setting group that would try to come up with some baseline best practices that everybody could or should implement—not just talking amongst themselves and acting unilaterally, but incorporating input from some of their critics as well as some of the privacy advocates who fear that they may go too far, so that they at least have a kind of broad-based program on which to build. That's how I would do it.

 

Matthew R. A. Heiman:  Anyone want to take a swing at that?

 

Neil Chilson:  Yeah.

 

Megan Stifel:  Well, I don't know if I'm swinging at it, but I'm swinging to support it. This is Megan—I agree with Paul. I think it's clear, even from Jack Dorsey's testimony the week before last, where he indicated that there is some small group of these companies getting together to think about how they might collectively try to address this. So at a minimum, if I were them, I would absolutely be trying to figure out, not technically, obviously, but sort of at a policy level, what actions could be taken that might become public and that they could then all basically attest to. I recognize that that then opens them up to potential challenges. But highlighting their good efforts is something—I hate to say going on the charm offensive, but greater transparency is something I think a lot of us seem to collectively support.

 

Matthew R. A. Heiman:  Neil, or --

 

Neil Chilson:  Yeah.

 

Matthew R. A. Heiman:  Go ahead.

 

Neil Chilson:  Yeah, I'd love to jump in, with a word of caution. I think these are smart political moves on behalf of the big platforms, who are taking a lot of heat right now. But I worry: there is a connection between the fact that the tech industry has been one of the most dynamic, innovative, and beneficial sectors of the U.S. economy and the fact that it is very lightly regulated. Those are connected, and for good reason. I'm a bit concerned that this political pressure will lead the big companies to embrace a level of regulation that they can easily handle but which will perhaps foreclose dynamism in the very sector that has been extremely beneficial to the U.S. economy and where we are a world leader.

 

      So while it may be a smart political move by some of these big companies, and certainly transparency and a charm offensive are wise things, if it leads to regulation that only the big companies can handle, that is not a good outcome.

 

Matthew R. A. Heiman:  And, Tom, as sort of our economist and market expert, I'm wondering if you could talk about, or if you've got a view on, the very risk that Neil just picked up on, which is that the tech companies say, "We'll accept a certain amount of regulation," which essentially creates a moat around them. I'm just wondering if you've got a view on that.

 

Prof. Thomas W. Hazlett:  Well, sure. I mean, it's a real issue and obviously a problem we've seen in a lot of places, including broadcast television regulation, where we've used regulation to truncate entry into the market, create market power for incumbents, and really stifle innovation. And we've done it on the premise of regulating in the public interest in news, information, and public affairs programming. That's largely the basis for that whole licensing scheme. And people are talking about something, if not exactly the same, then similar, and even public utility regulation, which in some sense goes farther on the regulatory side.

 

      Indeed, when companies get together, nobody, I think, will comment negatively on greater transparency. This is what you want to see market competition produce: firms that are more upfront, better labelers of content, and also forthcoming on privacy issues or security threats within their networks. But it's interesting, too, that if the firms in the market get together and actually do work out best practices as an industry, that may well require some kind of antitrust immunity. That is to say, regulators may have to come in and allow the market to negotiate that out.

     

      Now, that may be exactly what should happen, but we have to understand those tradeoffs. Competition will give us something, and a lot of it's good. But sometimes decentralization does not take care of everything. Sometimes you do want some standards—labeling standards, security standards—and there may be a place for some kind of regulatory involvement, because that, in terms of antitrust immunity, would be, in essence, what you're talking about.

 

Matthew R. A. Heiman:  And, Jamil, I know with IronNet you work with everyone from some of the big players in the market to what I'd call the mid-size actors in the cyber economy to the more entrepreneurial folks. Do they have an appetite for some sort of standard setting, even if it's private? Or are they concerned that they'd get boxed out of the market if we get too regimented in either private standard setting or, obviously, a regulated one?

 

Prof. Jamil N. Jaffer:  Well, look, I think certainly there's an opportunity for industry to come together and identify standards and opportunities. There's always a concern that that will result in the exclusion of smaller or mid-sized players if the big players come together to create standards that don't allow them the flexibility, first, to get in, and second, to innovate from there.

 

      That being said, all of that sort of approach is dramatically preferable to a government-instituted regulatory process, for a variety of reasons. One, we all know all too well that the government itself is slow-moving, to say the least, and that means that when you have a highly innovative, rapidly changing industry and you impose regulations upon it, whether through the regulatory process or the legislative process, you ensure that those rules will be irrelevant by the time the ink is dry. And then they're hard to change: these laws, and particularly regulations, are fairly sticky, and changes take place at the margins, if ever. The big changes tend to come all at once, at maybe 15-, 20-, or 30-year intervals.

 

      And so when you're dealing with a highly innovative space like technology, I think that government regulation is a bad idea and that industry-based regulation makes a lot more sense, at least on the issues that we're talking about. But at the same time, I do think it's a fair point, and I do think there are concerns among the smaller players that if the bigger players have too much of a say in setting those standards, they get boxed out and innovative, better technology doesn't get adopted. Whether that's [inaudible 31.32] in any given circumstance might be an open question.

 

Matthew R. A. Heiman:  Fair enough. So let me ask one more question, and then we'll open it up to our callers. Let's imagine that we reassembled this group of luminaries in 18 months. Do we think anything massively different will have occurred? Will any significant regulation have passed? Will there be a sort of private-sector, good-housekeeping stamp of approval as to how a social media platform should operate? What do we think the future looks like in this space? Is it private ordering? Is it regulation of a big-bang type? Is it smaller regulation around the edges—perhaps people talking about elections have to say, "This message was paid for by the Russian Federation," or whomever? What do we think the future looks like if we look out maybe 18 months? Anyone want to take that?

 

Paul Rosenzweig:  Well, this is Paul, again. I'm always happy to be the first to jump in where angels fear to tread. The one piece that I think we haven't mentioned at all, which would be my primary prediction, is that European regulators will take the lead in content regulation in ways that we will find very dissatisfying. It will be both about bots and also about hate speech, and probably about things like the right to be forgotten as well. And 18 months from now we'll be further down that road, and the U.S. government will not have taken any aggressive steps to oppose that. And so it will become a bit of the reality.

 

Matthew R. A. Heiman:  So perhaps further balkanization of the internet as we know it. Is that what you think a potential outcome is, Paul?

 

Paul Rosenzweig:  Either that or, without U.S. government support, the tech companies caving to European demands.

 

Matthew R. A. Heiman:  Yeah. Anyone else want to chime in on either that or where you think other things might be in 18 months?

 

Neil Chilson:  This is Neil. I think that is a real concern. I think we are right now set up for a nice natural experiment between the U.S. and the EU—not so much on the big tech players, because as global companies they tend to already be complying with GDPR, for example, and would be the ones who would negotiate those European deals—but more, I bet, on the small companies. And I think we'll get to see over the next 18 months how small companies and innovation in Europe compare, going forward, to the U.S. I think the evidence there is pretty good on the U.S. side already, and I think it'll get stronger over the next 18 months.

     

      As far as regulatory or legislative action, I think there will continue to be pushes on privacy, perhaps data security legislation. I don't know how well those things will move. Data security is an easier problem in many ways, at least from a legislative standpoint because people tend to agree on what the world should look like. They tend to agree on what the outcome should look like anyway; they may not agree on how to get there. On privacy, people just fundamentally disagree on what the world should look like, on what companies should be allowed to gather, and how they should be allowed to use information. So I think that's a much harder legislative problem. So I don't know what will happen there.

 

      I do want to point out that while this is a big deal in D.C. and it's a big deal in the press, there is some recent polling that suggests that consumers don't have the same level of concern, or certainly don't want the same sort of tools used, that have been very popularly discussed in D.C. NetChoice just did a new survey, and I wrote a blog post on it, which maybe we can share in the show notes or something like that, that talks through how consumers are perceiving some of these issues. The one takeaway I would highlight is the interesting finding that consumers don't think antitrust should be used to break up the platforms. In fact, most of them think that the platforms respond to competition, and many more of them are concerned about, for example, the pharmaceutical industry and anti-competitive effects than they are about the online platforms. And I think that's in part because, as the poll shows, many of them—43 percent of the people polled—had actually quit using a social platform. So despite the fact that we often talk about these being essential, consumers can make choices in this space. They are making choices, and I think that means there are market signals to shift these companies in the direction that benefits consumers. It's a really interesting poll. I don't think it's flawless, but it does suggest that maybe consumers don't share the same appetite for heavy-handed regulation that some of the advocates and regulators do.

 

Matthew R. A. Heiman:  Thanks, Neil. Tom, Megan, Jamil, any thoughts on what the next 18 months might hold?

 

Prof. Thomas W. Hazlett:  Well, there's certainly a lot of headline risk for these major companies, and that means that you really can get your stock hammered if you step in it, step in the wrong direction, and customers do a double-take and, in fact, investors do a double-take. And so you've seen Facebook have a couple of episodes this year with headline risk. And so the companies, in a competitive sense, are trying to solve these problems. But we have some good suggestions here that there may be some policy innovations that would assist that process, and I think that we will move forward—lurching forward. One of the interesting things is that obviously the European Union has really jumped toward much more aggressive regulation, particularly because the major social media platforms are American. And it's established through research and observation that the European Union tends to be much tougher on antitrust actions against American companies than it is against European companies. And they've gone after these social platforms, and, in fact, they've instituted fairly far-reaching regulation on privacy issues. So if you want to look at what's possibly coming, we certainly can look to Europe.

 

Matthew R. A. Heiman:  And it seems that, at least within the U.S., if there's any jurisdiction that's following the European approach, it may well be California, given what I'd call their "GDPR-light" legislation that recently went through, as well as the proposed legislation on security around the internet of things. Jamil, any thoughts?

 

Prof. Jamil N. Jaffer:  No, I mean, I think all of what's been said is exactly right, and there is substantial risk for some of these companies going forward. A lot will turn on how things go over the next -- you know, particularly through November, and if there's [inaudible 39.05] election. I think that's exactly right. I think we'll see and know more here shortly, but I think the risk remains high. And the comment that I think Tom made earlier, or maybe Paul, was that these companies are not in the business, or have not been in the business, of being as engaged as they might be with, you know, the D.C. swamp as it were, and I would dare say they are probably regretting that now. And to be fair, they don't have friends on all sides because they've chosen various sides in these debates. And so I think the industry does have substantial risk and needs to be prepared for what that risk means going forward, because there just aren't a lot of people out there pulling for them when it comes to some of these areas. And so I think that they are substantially exposed.

 

Matthew R. A. Heiman:  Megan, you get the final word on the question. Next 18 months, if you're looking into your crystal ball, what are you seeing?

 

Megan Stifel:  I don't know that I have much more to add. Certainly, we'll have the outcome of our midterm elections and sort of the after-action review of how things went, both from the state side as well as from these companies, and how they feel their interim steps did or did not limit the effectiveness and reach of some of the tools that are being deployed against many voters. That, I think, together with obviously what happens with Europe, will give us some good indication. Personally, I would put it at under a 50 percent likelihood that we're going to see some kind of significant legislation in this space. I expect that we'll continue to see some kind of ongoing dialogue between the Hill and the Valley, particularly if the midterms change the House, but I don't think that we have enough yet to say that there's a clear indication that we're going to see some massive or even minor legislation.

 

Matthew R. A. Heiman:  Thanks, Megan. Wes, I think we're at a point in the call where we'd be happy to open it up to anyone that's got a question for us.

 

Wesley Hodges:  It looks like we do have two questions in the queue right away, so let's go ahead and move to our first caller.

 

Matthew R. A. Heiman:  Great.

 

Janet Randle:  Yes, my name is Janet Randle. I'm a litigator in Dallas, Texas, and this is an area I'm very interested in. I'd like the panel to comment on this: in listening to the testimony of Mark Zuckerberg before Congress, I was truly amazed at the level of ignorance that was displayed by Congress regarding these technologies, and I also was pretty surprised at Mark Zuckerberg's answers to many questions. And I was wondering if the panel has any ideas on how we can develop forums—for example, I was thinking of how, a long time ago, if you remember, Arthur Miller at Harvard had that forum on the Constitution, and he would bring in people from all aspects of the constitutional issue he was discussing—if we could start doing something like that to start educating members of Congress as well as the public and the technology entrepreneurs who are developing these programs.

 

Matthew R. A. Heiman:  Great question. So, panel, it clearly seems from the public discussion—I think the caller's right—that the level of knowledge of technology and policy, and for that matter of rulemaking, is probably not where it ought to be on all sides. Does anyone foresee any opportunities to raise the level of discussion either amongst the regulators or the entrepreneurs?

 

Prof. Thomas W. Hazlett:  Well, sure. In part, that's what we're trying to do. It's certainly the Lord's work to have public discussion and try to tap into various areas of expertise and to engage with policymakers and advocates of various policies. So I think it's a great idea, and I think there obviously has to be more of it, because when you listen to and watch the congressional debate, you do blush. It's remarkable that policy would be made in that environment.

 

Paul Rosenzweig:  Here's one concrete suggestion. Back in the late '80s and early '90s, Congress had an Office of Technology Assessment. They closed it, mostly for budgetary reasons and in a fight, historically, over a technology assessment of climate change. We should reopen it. Congress should have a devoted branch of people who understand technology and help educate members about it. In fact, the last funding bill directed the Congressional Research Service to make a study of whether or not it should be brought back. I'd vote yes.

 

Matthew R. A. Heiman:  Any other thoughts on the caller's question?

 

Neil Chilson:  I would just note that at the Charles Koch Institute, we're very interested in understanding how and why cultures embrace innovation, and part of that is understanding how public perceptions of technology affect that. So we are very interested in that question, and we are always looking for good projects that help us better understand it. I know there's a lot of talk about the OTA and bringing that back. I have not made up my own mind on whether or not I think that is a great idea. Having been a chief technologist at a federal agency, I can tell you that it's the rare technologist who can break out of the technocratic mindset. It's the rare engineer who doesn't try to engineer society the way that he might build a computer. And so I think there are some challenges with that sort of institution. But I think if built correctly, perhaps it could be useful.

 

Matthew R. A. Heiman:  Thanks, Neil. I know, Jamil, at the National Security Institute you've been doing some work in terms of trying to educate technologists about the regulatory process and regulators and lawyers about technology. Do you have any thoughts on how to bridge the gap between what we see among the tech entrepreneurs and then the regulators?

 

Prof. Jamil N. Jaffer:  Well, certainly, I think that getting more technologists, by which I mean coders, engineers, data scientists, involved in the policy process, getting them up to speed in thinking about and understanding how these things are developed and getting their input can only be valuable to the regulatory and legislative community as they look at these issues. And I think what they'll learn is that a cautious hand is probably the best hand to play in this arena.

 

      On the flip side, I think that for regulators, policy makers, and lawyers, there's a need to educate them about technology, how it works, and the impact of the decisions they make on these things. And so we're also running efforts like you described: on the lawyer side, an effort we call "coding for lawyers"—it's not really about writing code but more about explaining how technology works—and on the technology side, one known as "decompiling government." We're hopeful to get more folks involved in that conversation; education and getting people up to speed on how things work is really important. So to the extent that folks on this call know of or are interested in these issues, they should definitely reach out to us at the National Security Institute, and we're happy—a little plug there for us—to help educate folks in this space.

 

Matthew R. A. Heiman:  Great. Just in the interest of time, Wes, maybe we'll get to the second caller in the queue?

 

Wesley Hodges:  Sounds good.

 

Diane Katz:  Hi, this is Diane Katz at the Heritage Foundation. Thank you for your comments. Several times during the discussion you touched on the risks of regulation to the companies, but I haven't heard very vigorous opposition from the companies, at least on the Hill, to regulation. And so I wondered if you could touch on the possibility that, in fact, they may not oppose regulation too much, whether it's because they can harmonize policies with the EU or because it raises barriers to entry for potential competitors. There are lots of reasons that at least the Big Five would perhaps welcome some regulation.

 

Matthew R. A. Heiman:  Great question. Are the policy directors at FANG secretly happy to have a little bit of regulation? This gets to the theme I tried to flesh out a little bit earlier in the discussion, that regulation can create a moat. It certainly did for the automobile industry. It does for all industries. So Apple, Alphabet, and others may be kind of winking and nodding to each other thinking, "You know, a little regulation wouldn't be bad for us in terms of the competitive space." I don't know if anyone's got thoughts on that.

 

Prof. Thomas W. Hazlett:  Well, sure. I mean -- this is Tom Hazlett -- it depends on what kind of regulation it is, of course, and there's a natural coalition of interests between some policymakers and some of the incumbents, the larger firms in the industry. Now, they may want to craft something to respond to certain public demands, but the companies can also certainly think about better versus worse regulation. And of course the danger in that is that the better regulation, from the standpoint of the firms, is going to mean less competition and blocked innovation from outside the incumbents.

 

      So, yeah, my reference was to the problems with regulation in terms of consumer welfare. It's not a hit on the companies; it's the fact that even in things like -- I mean, the problem with the Microsoft case was not that Microsoft got harmed so much as that competition in the marketplace was not advanced. And that should be the metric for what we're trying to do.

 

Neil Chilson:  Yeah, this is Neil. I would just caution against lumping all of those companies together. Their business models are radically different. The types of regulation that might be good for an Apple would not necessarily be good for a Google, because they do very different things. And an Amazon, again, added in, is completely different. So even if they've been relatively silent on certain aspects of regulation, perhaps because they're trying to gain a competitive advantage, I think there is still some distance between the various companies on what would work for their business models.

 

Prof. Thomas W. Hazlett:  Fair point. Fair point.

 

Matthew R. A. Heiman:  Anyone else on that theme? Wes, do we have any more calls in the queue?

 

Wesley Hodges:  We do have one more.

 

Stuart Gerson:  This is Stuart Gerson. It's an interesting talk. I mean, it is clear there are some regulations that the industry has embraced with alacrity—CSA and the Safety Act, though they don't do everything that they need to do, were certainly warmly embraced. And one gets the sense, especially given what the Chamber of Commerce has had to say in the last week, that there would be a great desire to have uniform data breach legislation, if only to reduce the vagaries of litigation among the states for national companies. But isn't the fundamental antitrust problem that's being faced what the more liberal elements are trying to do with antitrust law, which is change it to the European model to protect competitors rather than competition? A friend of mine recently addressed a panel at the EU, suggesting in this regard that they were attempting to regulate an industry that they don't have and that there ought to be a lesson in that. Do you observe that at all?

 

Prof. Thomas W. Hazlett:  Absolutely. I mean, there's actually a paean now to Louis Brandeis and his view on antitrust, which was openly hostile to consumer interests. In fact, he attacked consumers for buying the cheapest products. And of course, Amazon is criticized for having low prices and waiting so long to make profits. I mean, in a competitive context, this is what you want. Now, there is an argument about predatory conduct, but when you're making the argument while prices are low, and before there's any price increase and before there's any evidence that there will be a price increase, that's a highly leveraged argument, and it inevitably becomes very hostile to consumer interests. And consumers are benefited by low prices and better quality, and the competition that we've seen has actually gotten into trouble by being too effective in that dimension. So that's something we have to be very careful of as policy commentators: pointing out that hostility to consumers is probably not the way you want to go on antitrust.

 

Neil Chilson:  Yeah, and I'll just add to that that this argument around antitrust—especially when I hear it from those on the right—that it should be more vigorous and should serve maybe political ends or other ends beyond consumer welfare, I find extremely dispiriting, because antitrust law is probably one of the very few segments of federal law where economic arguments are taken extremely seriously and where we have moved in the right direction over the last, you know, 30 or 40 years. It's the progressives, the Brandeis arguments of the past, that were trounced over that time, and to hear people on the right echo some of those arguments is somewhat disappointing to me.

 

Matthew R. A. Heiman:  Wes, any other calls in the queue?

 

Wesley Hodges:  Seeing no immediate takers, I turn the mic back to you, Matthew.

 

Matthew R. A. Heiman:  Thanks, Wes. I think this is probably a good point on which to end the discussion. But before we do, I want to thank the panelists: Neil Chilson, Tom Hazlett, Jamil Jaffer, Paul Rosenzweig, and Megan Stifel. Obviously, a lot of ground was covered in this call. It's impossible to do it in depth, but watch this space. Our group continues to think about these issues. We'll continue having teleforum calls like this on these issues as well as publishing papers. So, again, if you're interested in learning more, particularly in the cyber and privacy space, go to RegProject.org, click on our tile, and you can see all the work we're doing, and watch the space for more to come. So thanks to the team, and with that, Wes, I'll turn it over to you to close this out.

 

Wesley Hodges:  All right. Well, thank you, Matthew. On behalf of The Federalist Society, I'd like to thank all of our experts for the benefit of their valuable time and expertise. We welcome listener feedback by email at [email protected]. Thank you all for joining. This call is now adjourned. 

 

Operator:  Thank you for listening. We hope you enjoyed this practice group podcast. For materials related to this podcast and other Federalist Society multimedia, please visit The Federalist Society's website at fedsoc.org/multimedia.