Transatlantic Debate: Evaluating the EU-US Data Privacy Framework

Event Video

Listen & Download

In October 2022, President Biden issued an executive order regarding the European Union-U.S. Data Privacy Framework. The Framework allows for data flows between the EU and the U.S., and it was established after the European Court of Justice struck down a prior agreement known as the EU-U.S. Privacy Shield. The executive order addresses U.S. collection of signals intelligence, which has been a source of concern for EU regulators and privacy advocates. It limits signals intelligence collection to defined national security objectives, requires that the privacy and civil liberties of all persons be considered regardless of nationality, and requires that collection be proportionate. In addition, the executive order calls for a multi-layered review process that will allow individuals to lodge complaints regarding the collection of signals intelligence.

Our experts will discuss whether the Framework addresses the concerns of privacy advocates in the EU and the U.S., and they will consider the implications of the review process for U.S. intelligence collection.  This program will also explore whether the EU and U.S. can reach a durable privacy agreement given the tension between EU privacy preferences and U.S. national security needs.


  • Max Schrems, Founder, NOYB
  • Stewart Baker, Of Counsel, Steptoe & Johnson LLP
  • Moderator: Matthew R. A. Heiman, General Counsel & Corporate Secretary, Waystar Health; Senior Fellow and Director of Planning, National Security Institute


As always, the Federalist Society takes no position on particular legal or public policy issues; all expressions of opinion are those of the speaker.

Event Transcript



Jack Capizzi:  Hello and welcome to today's Federalist Society virtual event. Today, June 15, 2023, we are pleased to present "Transatlantic Debate: Evaluating the EU-US Data Privacy Framework."


      My name is Jack Capizzi, and I'm an Assistant Director of Practice Groups at The Federalist Society.


      As always, please note that all expressions of opinion are those of the experts on today’s call. After the discussion, we will turn to you, the audience, for questions. If you do have a question at any point during the program, please type it into the Q&A function at the bottom of your screen, and then we'll handle those as we can towards the end of today's program.


      With that, I'll introduce our moderator for today's program. Today we are moderated by Matthew Heiman, who is General Counsel and Corporate Secretary of Waystar Health. He's also the Chair of The Federalist Society's International and National Security Practice Group.


      With that, thank you all for being with us. Matthew, I'll turn it over to you.


Matthew Heiman:  Thanks so much, Jack. I'm delighted to moderate this conversation today because we have two genuine experts to talk about today's topic. And I'm going to make very brief introductions of them. And I'll go in alphabetical order.


      Stewart Baker is our first speaker. Stewart practices law with the Steptoe & Johnson firm in Washington, D.C. Stewart has had significant experience in the public sector as a senior member of a couple of administrations. He was an Assistant Secretary at the Department of Homeland Security during President George W. Bush's administration. Before that, he was the GC at the National Security Agency. And Stewart's practice, as those posts would suggest, focuses on national security topics, cybersecurity, technology issues. He's also the host of the Cyberlaw Podcast.


      So, Stewart, thank you so much for being here.


Stewart Baker:  It's a pleasure.


Matthew Heiman:  And then our second speaker: We're delighted to have Max Schrems. Max is an activist, lawyer, and author whose campaigns have focused on privacy issues in Europe, particularly focused on infringements by companies like Facebook. And he's been very active with litigation before EU tribunals and courts. And in addition to that, he's the founder of the None of Your Business nonprofit organization that focuses on privacy issues. He's based in Vienna, Austria.


      And, Max, we're delighted to have you.


Max Schrems:  Thanks a lot.


Matthew Heiman:  Great. Well, why don't we jump right in? This topic can, in some respects, be, I think, overwhelming for someone who's not an expert in the space. And given that our audience likely includes nonexperts who are trying to learn: there's a long history of what I'll call "data negotiations" between the EU and the United States. And if we go through the history of it, it began with something called Safe Harbor. Then it became Privacy Shield. And the latest version is referred to as the Framework.


      I'm just wondering if maybe, Max, we could start with you, and then, Stewart, I'll ask you to comment too. Could you just sort of walk us through the evolution of this—I'll call it a dialogue or discussion—that's been going on between the EU and the U.S. for approximately the last 25 years?


Max Schrems:  Yeah. A very short version: the EU Member States had data protection laws ever since the '70s, but they were very inconsistent. So in the '90s, the European Union issued a data protection directive. That is a law by the EU that has to be implemented into national law. Part of that was that there's going to be a free flow of data within the EU Member States to foster the Common Market and not have any boundaries on data transfers anymore. But that also meant that, basically, there was an export prohibition on personal data, so any data that relates to an individual.


      And, obviously, there needs to be some flexibility for data transfers, so there are options if it's really necessary to transfer data abroad. So if you send an email abroad or you need to book a hotel abroad or something like that, that's already built into the law as an exemption. And then there is an option for outsourcing situations, so when you really just feel it's more practical, cheaper, whatever, to process your data abroad.


      And these are either done with so-called "adequacy decisions." That means that another country has the same law as Europe. A typical example would be Switzerland. They basically have the same data protection laws as the EU, so we basically recognize each other as being equal, and then there is no problem with data transfers.


      And then the other option is contractual systems, so where basically an entity in a third country, in simple terms, signs a contract saying, "Contractually, I'm going to follow EU law when I process data, even though there is no national data protection law in my country necessarily."


      Now, that's the overall situation. The U.S. was in that second bucket because there is no omnibus privacy law in the U.S., but there was very intense pressure already at the end of the '90s to have a "special deal," so to say. That was Safe Harbor.


      I was 13 years old when that thing was passed, and already then, some university professors said, "There are some questions whether this is really so adequate and works so well." What then happened is that Snowden came out with disclosures about, especially, FISA 702. And that was the first time we looked at the surveillance laws in the U.S. a little bit closer. And then it turned out that there's very little protection for non-U.S. citizens. And that led to the litigation. We had the first round at the Court of Justice—the Court of Justice is kind of the Supreme Court of the European Union—and they invalidated that deal, saying that it would violate the Charter of Fundamental Rights, which is the European version of the Bill of Rights. There are Articles 7 and 8 for privacy and data protection, and 47 for access to courts. And they basically said it violated all of these.


      That led to a second deal, which was hammered out between the European Commission—which is the executive part of the European Union—and the U.S. government, and that was called Privacy Shield. Now, it sounds like a new deal, but, basically, most of the text was literally the same. They just copied it over, put a new name on it, changed little parts within the actual text, but didn't really do more than that.


      What's important: oftentimes, all of this is called a "deal," or a "negotiation," or an "agreement." In reality, what it is is an executive decision by the European Union. They can declare another country to be adequate. It's not a bilateral international treaty or anything like that. So it's a one-sided situation that, in real terms, is hammered out between the two countries as they go back and forth. But it's not a legal instrument between the two countries. It's one country recognizing the other one.


      And that was struck down a second time around, not overly surprisingly, because there was not a big difference from the situation before. Then we had one and a half years where the European Union negotiators said there's really no option for a new deal because the U.S. is not moving all too much. That usually comes out of the situation that the law in the U.S. is very hard to change. They usually try to work with executive orders. I don't have to explain to this audience what those are and why that is.


      And then there was a meeting between Biden and von der Leyen. And all the stuff that the lawyers hadn't figured out in one and a half years, these two figured out in five minutes. And, suddenly, we had a new deal, this new Framework, and that is based on the idea of putting an executive order there that limits the surveillance in the U.S. So, basically, the idea is FISA 702 allows more, but the executive order would limit it a bit. And the idea is that that would be enough to convince the Court of Justice on the European side that such a deal would be able to go forward.


      One thing that's a big gap there, usually, is that the executive order, the new one, is very similar to PPD-28, which is the old Obama executive order. And I think on the European side, there's not too much comparison between the two, because PPD-28 was already before the Court of Justice, and they analyzed that and said it's not enough. And now, a bit of what politics tries to do is to say, "Oh, there's an executive order with limitations. Oh, great," instead of saying, "What is the delta? What is the difference between the old one and the new one? And how much did it go forward?"


      So my short summary of the new deal and the executive order is that it's a bit like the Court of Justice on the European side asked for a proper fence, and now we're talking about a fence that's an inch higher than the one before. But I'm very much questioning whether that is going to convince the judges after all. That's, I hope, a somewhat fair and accurate summary of what happened, even though that's, I think, 25 years of litigation that I tried to summarize here.


Matthew Heiman:  Yeah. No. Congratulations to you, Max, for trying to summarize 25 years of litigation in about eight minutes. I know there's a lot more detail. And I appreciate you hitting the headlines.


      Stewart, I'd welcome your thoughts on how we got here. And then we can get into the details of what's in front of us now, which is, obviously, the Framework.


Stewart Baker:  Right. So I'll start even further back. I actually lived through parts of this and negotiated over the adequacy of U.S. law with respect to passenger name records. So I've been in the trenches on this and have followed it since the beginning. I do think it's worth stepping back and thinking a little bit about what a remarkable thing it is for Europeans to say, "If you want to have our data, we have to decide that your law is adequate."


      There is a whiff of colonialism in any determination by a European state that somebody else's law is adequate. And it provokes, at a minimum, some resentment and some jokes. When I negotiated these deals and we got a successful outcome, I gave the entire team underwear that had on the front a little stamp that said, "European Union Certified Adequate," which was a way of mocking the European Union's notion that they should sit in judgment on the adequacy of laws of the rest of the world.


      It is nonetheless -- the justification is—as Max said—we're sending our data someplace, and we will no longer have control of it. We want to make sure that it is treated with appropriate respect and protection for the people whose data is involved here.


      And so, it's possible to reach an accommodation. But I do think that one of the real concerns throughout this exercise has been a failure in Europe to recognize that they are acting in a fashion that is highly judgmental about other people's laws and that they ought to be cautious about saying, "We know better than you what your law should be." And that is -- that theme goes through most of the problems that the U.S. has had with Europe, and particularly with the Court of Justice.


      What happened, in my view, with respect to data protection and exports is this was originally a provision designed by the European Commission to make sure that when commercial data was exported, the rules that governed how it could be used commercially would remain the same more or less and that you couldn't ship it to a data haven and then start spamming everybody whose address you had gotten in Germany.


      But there's no clear distinction, on the adequacy of the law, between the adequacy of the private law that governs private parties and the adequacy of the public law that governs government access to the data. And so, as Max said, with the Snowden leaks, suddenly there was enormous concern about what U.S. intelligence could do with data and an effort to use the adequacy provisions to discipline U.S. intelligence. And, indeed, the European Court of Justice was so eager to respond to the Snowden revelations that they accused the United States of being inadequate on the basis of a Guardian article about how U.S. law worked that was wrong. They've never corrected that, as far as I know. They just had to reach a decision.


Max Schrems:  To add to that, we even had a representative from the U.S.—I think it was under the Obama administration—with top security clearance. I think we had witnesses all over the place and six weeks of hearings, arguing all the facts of U.S. law. So --




Stewart Baker:  And yet -- and yet, the court --


Max Schrems:  -- correct that for sure.


Stewart Baker:  And yet, the court got it wrong. They got it wrong. And they cited the Guardian article. So I think that tells you that there was a determination to reach this issue on the European Court of Justice.


      The second thing I think is worth saying about this that frames the whole thing is that the European Court of Justice, sitting in Luxembourg, has construed the treaty that formed the European Union, the Charter of Rights, in a fashion that doesn't apply to anybody except the United States. They have made up a bunch of rules, and they have made up these rules for how to do intelligence in the complete absence of practical experience. And they almost bragged that none of these rules would apply to European governments because they didn't have the competence to address them. And they did not borrow from existing laws where people had actually asked the question, "Can we make this work as a practical matter?" And that enthusiasm for making up rules -- it's a kind of --


Max Schrems:  If I may --


Stewart Baker:  Yes. Go ahead.


Max Schrems:  -- it gets a bit much.


Stewart Baker:  All right. Go for it.


Max Schrems:  These rules are not made up. It's the Charter of Fundamental Rights, under Articles 7 and 8, which all the Member States have agreed to. It's part of the constitutional fabric of the EU. Like the Bill of Rights, it wasn't made up by someone. It's literally there. So, and --


Stewart Baker:  So yeah. I -- can I stop you there? Because I agree with you --


Max Schrems:  And one short second.


Stewart Baker:  Okay.


Max Schrems:  Pardon. We did have a lot of litigation about surveillance in Europe as well. The typical thing is data retention, which is a bit like the 215 programs in the U.S., where the Court of Justice also found them to be unconstitutional or against the treaties in EU law, so that is not only on the U.S. There is probably ten to 20 cases in similar fashion about the European Union and European Member States.


Stewart Baker:  So then, I think in very limited circumstances that is true, although it is a fact that the countries of Europe have, in many cases, said, "European Court, you do not know what you're talking about. We're not going to apply your data retention rule." I think the German courts said that. France has gotten its Conseil d'Etat to come to the same conclusion.


      There is -- the European Court of Justice has aggressively interpreted the framework, the Charter of Rights. And I think it's worth pointing out that most of the things that they are relying on -- they have said, essentially, two things about the latest -- about the deal with the United States: "We think you need a law"—and Max is saying that means a law passed by Congress, I believe—"that sets out all of the restrictions on your intelligence agencies. And we think you need to be able to send people to a court to get redress, to get particular rights recognized, determinations about whether your rights were violated."


      Some of the arguments that we're going to hear here are, when the court says, "You need a law," it raises the question: Do you need a statute passed by a particular legislature, or can you use the whole bundle of ways in which jurisdictions make law: executive orders, regulations, and the like?


      I think if you look at the Charter of the European Union, it provides that you have to have a framework. It does not require that all of it be statutory law. And so, to the extent the court is going to say, "No. I want a written law," it's going beyond what the words of the [CROSSTALK 17:42] said.


Max Schrems:  That's not -- I think that was never the discussion, to be honest.


Stewart Baker:  Okay.


Max Schrems:  That is a sidestep, and there is a differentiation. And to be honest, it's always a bit hard because the Court of Justice and the whole system—how the EU is set up—is, I know, a bit foreign to U.S. lawyers, so I'll try to interpret -- explain a couple of these things.


      So the Charter does require that any limitation of a fundamental right is set out by law. Now, "law" has its own meaning in EU law. It's different from what law means in Austrian law or German law. Any EU law is always interpreted only in the light of EU law. Now, we do have Common law jurisdictions, like Ireland, or—as it used to be—the U.K., and Malta and Cyprus, that have Common law as well. So that is also considered "law" from that perspective; however, the more you interfere with a fundamental right, the more the law has to be precise and accurate.


      The biggest problem, currently, with the executive orders is that they do not generate third-party rights. So, basically, if someone violates that, there is no court right now that I can rely on and say, "You violated Section 351 of that executive order, and now I have a case to go against the NSA in that specific court." That is the --


Stewart Baker:  I think we redress --


Max Schrems:  -- call that came out of the discussion so far.


Stewart Baker:  I think redress is going to be an important part of the discussion.


Max Schrems:  Yeah.


Stewart Baker:  I just want to understand: You don't really think that it matters too much whether this is an executive order or a law by the -- Congress?


Max Schrems:  So, in the abstract, as to any third country: fundamentally, you can transfer data to other countries. And it's really a matter of whether another country is seen as adequate, meaning essentially the same as the European Union. And that is something that the U.S. wants to have. It's not like a handout or something. And there are not a lot of countries that have that. So if that is the path that the U.S. government wants, we have to go through the different steps that are simply required in the law.


Stewart Baker:  But I just -- I'd ask you one question. Is the -- is it --





Stewart Baker:  If there's a statute -- does it have to be a statute, or can it be an executive order?


Max Schrems:  Now, the law has to deal with the situation in 190 jurisdictions in the world. It's not specific to the U.S. If you then take the U.S. reality, an executive order would, I think in my personal view, work if it would confer third-party rights. If it would be directly --


Stewart Baker:  Okay.


Max Schrems:  -- or work just like a normal statutory law. Now, my information—I'm not the expert on U.S. law here—is that right now, the way an executive order usually works is that it can limit the government in that situation, but you cannot go to district court in D.C. --


Stewart Baker:  Right.


Max Schrems:  -- and sue the NSA because they violated a section of executive order -- I forgot the --




Stewart Baker:  Let's jump to the redress question.


Max Schrems:  We don't look at the name of it but at the quality of the legal instrument.


Stewart Baker:  Yep.


Max Schrems:  And then that's the quality of the legal instrument, and that makes your determination what the --


Stewart Baker:  What this solution -- what the U.S. has proposed is they say, "We can give you a tribunal that is impartial and that is objective and independent. It will not be U.S. courts. We can't constitutionally provide access to the courts for claims that simply say, 'There's been a violation.' There has to be a much more concrete harm than that. That's U.S. constitutional law."


Max Schrems:  Yep.


Stewart Baker:  So we can't do that. So we have created a two-step process in which first your complaint is reviewed by the civil liberties officer and then by an independent court established by the Justice Department with guarantees of impartiality and objectivity.


      So, to my mind, it's pretty clear that nothing in the Charter requires that all of these determinations of redress be made by courts. There's a lot of references to courts in the Charter, and I don't see it referenced in the context.


Max Schrems:  But they require -- you're correct in the sense that, in the EU, there's something other than a court. So we usually operate under -- or largely, for a long time, have operated under the Charter of Fundamental Rights and the Convention, which is a separate system that comes out of the Council of Europe. And there the word that is used, generally, is "tribunal." And the understanding is that a tribunal can be less than what, in the U.S., is considered a court. So you can, for example, have a judge that only sits for ten years, not with life tenure. There is a little bit more room to maneuver for a tribunal than for a court. That's the way we usually have it, also, under Austrian constitutional law.


      Now, the interesting part is that that tribunal still has to be effective, and you have to have an effective procedure. Now, there are tribunals in the European Union that have that executive kind of situation where they're embedded in the executive but still are an independent decision-making body if I want to call it that way. And that could be a tribunal, and that could work to a certain extent.


      The problem that we have right now is largely the procedural steps. So if you want to raise something within your system, you need to know that you had a negative situation, whatever it is. Let's say you were denied a visa to go to the U.S., or you were put on the security list for the next flight. Now, as a citizen that is concerned about that situation, I have to prove to my local Data Protection Authority in Europe -- so each Member State in the EU has a Data Protection Authority. Germany has 17 because they're federal. And I have to go to that authority and prove that this decision or that processing happened based on a data transfer of a specific company under a specific legal instrument after the new deal came into force, which is impossible, in the first step, to prove. Or, at least, I have to make it likely.


      So that's already on the first level. And we tried it under the old deal, which had the same requirements. Already, the authorities in Europe rejected our claims because we couldn't really show that, for example, a rejection of a visa or being on the security list of a flight was actually based on such data that was transferred under these instruments. So that's already the first step, where in reality, you don't have access to it.


      If you, for whatever random reason, have all that information—because someone, I don't know, leaked it or something—there's really, typically, no other way that you get it, because there is no information about surveillance, usually. There is no Freedom of Information Act that applies here. The government is very closed up about what it does there. So if, in theory, you have that, you basically go to the civil liberties officer, and they will give you an answer that is literally spelled out in the executive order. They will tell you that they neither confirm nor deny that there was any surveillance, and that if there was surveillance, it was either legal, or it was not legal and they remedied the situation. And that's --


Stewart Baker:  Isn't this what --


Max Schrems:  And ever --




Stewart Baker:  Let me stop you there.


Max Schrems:  One second. This answer is exactly blueprinted in the executive order. So you have a court system where you have the judgment before you even brought your case. And if you have --


Stewart Baker:  No, you have two choices. You have two choices. And the facts, of course, could produce different results. So I think it's worth the --


Max Schrems:  If you read the quote of what they have to answer—it's in quotation marks what this civil liberties officer has to answer.


Stewart Baker:  Yes.


Max Schrems:  They can—from my understanding—not deviate from that specific wording that they have to answer to you.


      Now, if you're unhappy about it, you can have an appeal. And because you're not going to know what happened in the procedure, you're not allowed to talk to them. You're not represented. You don't hear anything. You can write "I appeal" because that's all you're going to be able to do. Then it goes to this tribunal or court or however you want to call it in detail, and you will get literally the same answer a second time around, which is also in quotation marks in the executive order.


      Now, just for the sake of shortcutting the whole discussion, the Court of Justice has to look at that on the European side and assess whether that is a proper court and a proper court procedure under Article 47 of the Charter, which is the same article that they have to apply to Hungary, to Poland, to all the craziness that goes on there. And they would have to come to the conclusion that the appointment of judges in Poland is problematic under 47 and a violation of fundamental rights, but that this system is actually a perfect court system. That's going to be a very, very hard argument to bring.


Stewart Baker:  Let me stop you now, Max, because you've gone on quite a ways with this. I think it's fair to take a step back and ask what we're talking about here.


      We are talking about intelligence programs—intelligence programs that are vital to the security of the United States and, frankly, much of Europe. And they have to be -- you cannot just say, "Oh, let me tell you how it worked in your case."


      If Vladimir Putin takes advantage of this and says, "My data has gone to the United States. I'd like to see what you have on me," we're going to say no, and no one will think that's a good idea to provide him with that information. So to construct a system that does intelligence under law without wrecking the intelligence part of it—and that's clearly important today—we can't provide extensive information.


      And, in fact, at some point, with any system that has classified information of this sort, you have to rely on the internal workings of the system after the complaint has been filed. And so, there's going to be an investigation. There's going to be an objective determination by two decision-making levels, but they will not be able to provide a lot of data. This is exactly what the Germans do. This is exactly what the French do. This is what the British have done. They say, "We can tell you that we looked at your question. We looked at it hard, and we've taken appropriate action where action was necessary. And that's all we can tell you."


      Actually, the U.S. has said it's going to appoint representatives to be amici—to advance the interest of the complaining party—but they can't tell the complaining party all of the details of the intelligence.


Max Schrems:  And that's -- I think "all" is the important word. So that's all accepted under EU law as well. We have the same issues with -- we also have wiretapping if there is a reason for it.


      Now, the interesting part is that you usually have, ex ante—like, before the event—some kind of judicial determination of whether there's really probable cause, for example. And that's an interesting part, too, to bridge our discussion a bit. The interesting thing is that 702, as it operates right now, would be unconstitutional under the Fourth Amendment for U.S. citizens. So we actually agree on both sides of the Atlantic that how 702 operates --


Stewart Baker:  No, I'm sorry. I don't agree. I'm on this side.


Max Schrems:  Well, you could run 702 surveillance on a U.S. citizen without any problem, and you would never need to go. Because my understanding of 702 is very simple. It divides U.S. data from --




Stewart Baker:  Okay. You're right that it requires that the target be non-American --


Max Schrems:  Exactly. Because otherwise --


Stewart Baker:  -- because the legal rules that govern that --


Max Schrems:  -- what happens?


Stewart Baker:  Yeah. And your objection has been, from time to time, that you think that this is a more -- that in Europe everybody gets the benefit of these rules, although I didn't hear you rushing to Vladimir Putin's defense. So but maybe --


Max Schrems:  That is -- but to be accurate, we do have, in the EU, a system of human rights. That is true, and that is a historic situation. We also had, basically, citizens' rights up to the Second World War.


      Same thing in Austria. Our old constitutional rights are citizens' rights. Only citizens have the right to demonstrate in Austria; foreigners wouldn't. Now, that was switched after the Second World War. We now have either the Charter or the Convention, which have human rights. So any Russian is free to demonstrate on an Austrian street and has exactly that fundamental right, just like an Austrian does.


      Now, that's --


Stewart Baker:  But you see how -- you see how that makes it even more complicated to say --


Max Schrems:  Exactly.


Stewart Baker:   -- human rights law requires access or correction or --


Max Schrems:  Exactly.


Stewart Baker:  -- really any information about how the program is working.


Max Schrems:  Exactly. Now, there's two parts. First of all, we have a human rights system, which I think most countries globally have moved towards. The U.S., as you know, has a very old constitution that just comes from another time in history.


      But we could -- so I think, to try to find common denominators, what's interesting is that, at least in my understanding, on both sides of the Atlantic, a surveillance system as it is right now with 702, if done on your own citizens, would violate the Fourth Amendment.


Stewart Baker:  That's correct.


Max Schrems:  That's correct? That's great to hear.


      The same thing is true from a European perspective. If you apply 702 on the European citizens, you come to the same conclusion. So it's very hard for the Court of Justice to say, A) "There is unconstitutional surveillance in another country, but at the same time, we see it as adequate." It would be very hard for the U.S. to get a way around as well, which we see with the worldwide discussion --




Stewart Baker:  Which is why the -- which is why the executive order and all of the other provisions --


Max Schrems:  Exactly.


Stewart Baker:  -- that are special rights for Europeans.


Max Schrems:  Exactly. Then it will be interesting if the executive order would get to a level that, for example, would be compliant with the Fourth Amendment in the U.S. I would doubt it personally. I don't know if you have a different view, but you see the problem here, that if we have data transfers, and if we send data across the globe—and that is fundamentally, in the long run, going to be the very interesting part—how do we get among democratic or western countries—where we have a globalized internet, and we want to have data flows—into a situation where we can have data go back and forth without worrying that once the data is outside of our borders, our citizens are under total surveillance? And that [CROSSTALK 31:13] very interesting part.


Stewart Baker:  Matthew, go ahead.


Matthew Heiman:  I'd like to ask a question about that. And I'll start with Max, and maybe, Stewart, you could respond.


      So what you said, Max, I thought was interesting, which is data flows across national borders or outside the EU to other parts of the world. And I'm wondering: How do you think about data -- so, obviously, the focus of this long-running debate has been a U.S.-EU-focused discussion.


      I'm wondering how you think about data flows that are going from the EU to non-democratic countries such as China. And just out of curiosity, I went on the NOYB website, and I typed "China" into your search box, and I didn't see any reference to China. And I'm just wondering: How do you think about a company like ByteDance that has TikTok and they [CROSSTALK 32:04] Europeans to China?


Max Schrems:  Yeah. So the reality is we look at kind of apps and software that people use on a daily basis. And to be honest, ByteDance is the very first example where we ever had a data transfer to China. The reality is that 99 percent of the cases where you have data transfers are to the U.S. That is the reason, from a relevance perspective, the U.S. is the one country that is really relevant.


      And now there's another differentiation. China has never asked the EU to be deemed adequate. They will probably -- the Commission would probably laugh themselves off their chairs if China would come around and say, "Hey, we want to have a deal like the U.S. too." So there is a differentiation here that, first of all, we don't have --




Stewart Baker:  But wait. Wait. Stop right there, Max, because --


Max Schrems:  -- [inaudible 32:47] to that extent, and secondly, there is not an adequacy decision with China or Russia or all the other examples that you used.


Stewart Baker:  That just means it's far more illegal to be sending data to China than to the United States because the U.S. has at least gone through this process and it --


Max Schrems:  That is absolutely true.


Stewart Baker:  -- got us functioning under an adequacy rule.


Max Schrems:  Absolutely true.


Stewart Baker:  And so, it seems to me that any data flows to China ought to be something that is of concern, especially to somebody who thinks that the Fourth Amendment ought to apply globally. So, to you: why is it they've never even bothered to raise this as a case?


Max Schrems:  Because they factually -- so, again, you have to understand how EU law works to get the -- to understand the answer, maybe.


      So EU law works as -- EU law has different situations for data transfers. There is a whole bunch of different options to transfer data. Now, if it's really necessary to transfer data—so I book a hotel in Beijing. That is always --




Stewart Baker:  You don't think those are the only kinds of data transfers that are occurring?


Max Schrems:  No. No.


Stewart Baker:  WeChat moves massive amounts of data.


Max Schrems:  We have to be able to -- I have to be able to -- I have to give you the three or four examples to [inaudible 33:54] --


Stewart Baker:  No, no. Come on. You're filibustering now.




Matthew Heiman:  Wait a minute. Stewart, let's just let him answer the question.


Max Schrems:  So, really -- so there are situations where it's really necessary to transfer data. That is possible. But any country in the world—North Korea if you want—that's fine. No one ever questions that.


      Then, there is a second situation where you really have an adequacy determination of a country, which means that country is basically the same as the EU when it comes to privacy. That was always a steep argument for the U.S. anyways, given that there is no omnibus privacy law, given the surveillance situation. But that's a separate thing. That is the highest thing you can get. That's what the U.S. wants, which, literally, ten countries outside of the EU have gotten so far.


      And what's in the middle of it is these contractual arrangements, which you can use right now with the U.S. where there's no adequacy decision currently. But you have to assess them against the national law, and that is the situation why there's hardly any data transfers to China. I can tell you from my practice of five years of litigation, I came across a data transfer to China once in 800 cases—once. But I can tell you, in these 800 cases, I'd get a data transfer to the U.S. in probably 700 of them. So the reality is -- and it's really not a broader issue --




Stewart Baker:  Are you saying -- are you saying you're confident that TikTok and WeChat and Temu and Shein, every one of those companies is keeping all this data out of China?


Max Schrems:  No. That's exactly what I'm telling you. We have these situations, but they are hardly existent. When I turn on my Android phone, all my data goes in that moment to Google and to the U.S., before I've even installed TikTok or whatever. So the reality is that, from a European perspective, data transfers to the U.S. are the vast bulk of the issue that we have.


Stewart Baker:  And is that --




Max Schrems:  And I know that the U.S. automatically goes to China. Whenever there is an international discussion, the first thing that comes up is China.


Stewart Baker:  Yeah, yeah. And you have [inaudible 35:52] --




Stewart Baker:  -- [inaudible 35:53] not published an investigation of that.


Max Schrems:  -- [inaudible 35:55] on our phones. On our phones, we don't have Chinese apps in the bulk of it. There is TikTok.


Stewart Baker:  So you -- this is the narcissism of small differences. You have a fight over whether this is exactly the right remedy for the -- that you would -- being offered by the United States and maybe there should be more communication. And that's got to stop all the data flows to the United States. And then the massive human rights violations in China, which are undoubtedly aided by data transfers, you say, "Well, I don't see them. They're not --




Max Schrems:  I don't say I don't [inaudible 36:31] --


Stewart Baker:  -- [inaudible 36:31] of mind for me. And there's no Snowden who's come forward to get our attention. Frankly, you should be ashamed of yourself.


Max Schrems:  I don't know if that is the level that you want to have this conversation on. I personally don't. But if you want to have an evidence-based and factual discussion, you can see that we factually don't have bulk data transfers to China, but we do have them to the U.S.; that we do not have a discussion about an adequacy decision for China. It's absolutely off the table, but the U.S. wants to have adequacy. And those are two very different sets of situations. If you want to conflate them to make your point, quite honestly, that's something I would be ashamed of.


Stewart Baker:  You could bring a lawsuit some -- you could bring a lawsuit tomorrow in Austrian courts against data transfers using WeChat or TikTok if you had a plaintiff. But you have to install the apps. And you haven't done it.


Max Schrems:  So I work here for free. We have a budget of 1 million euros a year and ten lawyers. I have to make conscious decisions about what we do or not. And if something is a niche issue, we're not going to litigate it. If it becomes a larger issue, we will. And I'd be the first one to go after it. But the Chinese adequacy decision is not existent. You're talking about something that simply doesn't exist. It would be the easiest thing to shoot down ever, but it simply does not exist. We do not have an adequacy decision with China.




Matthew Heiman:  Time out. I want to ask one other question before we turn it over to our audience for feedback.


      And that is, Max, could you help us to better understand—Stewart made mention of it in his comments, and I think you've noted it, too, in your work—this sort of historic dichotomy that exists between the European Court of Justice's view on data privacy issues, and whether or not the U.S. system is adequate to protect European citizens' rights, versus how European nation states go about this work and think about these issues. And I'm just wondering if you've got a perspective on that you could share with us.


Max Schrems:  Yeah. That's actually quite an interesting topic. And there is hypocrisy in the EU law at that specific point.


      Now, as it was with the U.S. Constitution 100 years ago—and you guys probably know better than me—that did not apply to the states, as far as I'm informed. That was then gradually expanded so that these rights would also apply to the states, like the Fourth Amendment or whatever.


      We do have a very similar situation in the EU treaties: the EU Member States—when they signed the treaties to join the EU—exempted what they call "national security" from EU law, and that is the national security of the Member States. Now, there is a discussion of how far that goes. For example, is that only what, say, the German secret services do internally, or would that also include the German secret services asking a private entity to provide them data? There is a good argument to say the private entity already falls under EU law, and you would have to apply these things. National security is also very narrow. It's really about statehood and the statehood imploding.


      For example, criminal situations—like, as I mentioned before, data retention—are under EU law. And there is case law in that where the Court of Justice says you can. Now, that is a really interesting part, especially if you look at Brexit: the U.K. actually—for its national security—was exempt from EU law under that provision. But once they left the EU, automatically it's not the national security of a member state anymore. It's then a third country, and that provision doesn't apply anymore. So we're in a very interesting situation where we can have litigation about third countries before the Court of Justice but not about a member state's own national security, which is just the reality of how the treaties are [inaudible 40:30].




Stewart Baker:  Let me address that too.


Max Schrems:  One last sentence. What fills in here is that at least the Convention, which is the Strasbourg Court, still applies. But what's really interesting from a legal perspective is that the Luxembourg court, or the EU court, has a very high standard for privacy, while Strasbourg, the Convention court, has a very, very low standard. They have pretty much permitted almost any mass surveillance. And that is really interesting because it plays out exactly at this little point, especially for Germany, France, and Sweden, which are the only three countries in the EU that really have vast surveillance knowledge or capabilities [inaudible 41:08].




Stewart Baker:  I think that's -- I think that's exactly the problem: that the court sitting in Luxembourg, the Court of Justice, doesn't have authority, and is proud of not having authority, to set rules for anybody except people who come before it with litigation that NOYB has chosen to bring against the adequacy determination.


      And the place I would start here is when you're trying to regulate surveillance for national security purposes, you are on very dangerous ground. Of course, it's important to regulate it, but it's also important not to lose your capabilities, not to expose those capabilities. And people—countries—go through experiences as they try to find the right balance between protection and control of surveillance, protection of rights, and control of surveillance. And they learn by doing, by their experiences.


      We learned a lot, and it was very painful, at 9/11, when we had thought that it was a great idea to have a wall between law enforcement and intelligence, and it cost us 3,000 lives, and that was a disaster. That means that that principle—which otherwise kind of sounds pretty good; if you were sitting in Luxembourg, it would sound like a good idea—will never be acceptable to the U.S.


      Most countries have been through this. They have arrived at more or less the same place. And if you look at the OECD—which brought together 39 democracies to talk about what the limits on surveillance ought to be—they came up with limits that are very different from what the court has proposed, the court in Luxembourg has proposed, and very different from what Max Schrems is proposing. But they are all saying, "Yes, your legal regime is going to be your legal regime. But as long as it's basically similar to others, we can live with it." And they acknowledge that you can't give people full redress with correction in the context of a national security decision.


      So it seems to me that the problem here is not that the Luxembourg court is the good court that's holding firm, but that everybody disagrees with them. And then --


Max Schrems:  To be honest --


Stewart Baker:  -- if everybody had bailed out except us because --


Max Schrems:  I don't know who everybody is in this case.


Stewart Baker:  Well, how about Germany? How about France? How about OECD --




Stewart Baker:  -- 39 countries.


Max Schrems:  How about the [inaudible 43:55] parliament that passed all these laws with a 90 percent majority, including all the conservatives and so on? It's much more nuanced and differentiated, to be honest, than everybody on one side but the Court of Justice.


Matthew Heiman:  I appreciate the back and forth and the rigorous -- the vigorous discussion.


      We've got a couple of questions from our audience that I'd like to put to our panelists.


      We've got two questions that I think I can sort of combine into one. One audience member is asking, "Is it possible to achieve adequacy—meaning U.S. adequacy—without a federal U.S. privacy law? Is there anything the U.S. government can do to achieve an arrangement that would hold up at the ECJ?"


      And then another person asks—and I think these are interwoven questions—"What do Max and Stewart's perfect U.S.-EU data transfer arrangements look like?" Presumably, there's some common ground around the desirability of data sharing being easier between democratic countries that are governed by the rule of law, and, I think, there's really no debate: the U.S. and EU Nation States are democratic countries, and they're governed by the rule of law.


      And so, is there a common ground that makes sense for all to live under, notwithstanding the debate we've had today?


      Maybe I'll start with Max -- if you could give a brief answer. And then, Stewart, if you could give a brief answer --


Stewart Baker:  Yep.


Matthew Heiman:  -- because there's maybe one more question we need to get to.


Max Schrems:  I'll try my best to be brief.


      So I think on the federal privacy law, that's not an issue because, basically, there's a vacuum for the commercial sector. That is something you can fill with a contractual situation. That is what Privacy Shield and Safe Harbor did.


      And to be honest, we were at the Court of Justice and also supported all these parts that work. We were really in the middle ground in this litigation, which, as a privacy activist, you hardly are in.


      The part where it really gets tricky without any legal change, to my understanding of the U.S. structure, is the 702 surveillance part. And there, leading over to the perfect solution, I think if, among democratic countries, we come to certain standards, we go along the lines of, for example, a probable cause system where you have judicial approval before data is actually tapped into. That is a typical situation where you say, "Okay. That is something we can all agree on." Or if it's thereafter—because, also to rebut the idea that you have to have everything transparent—usually there is also a proportionality principle in the EU.


      So you say, "Okay. Twenty years later, on a minor thing, is it really impossible to tell people what happened? Or is it, in that specific situation where you have an ongoing threat, really not possible to say?" Which is fair in that moment. And there, we probably need a bit more than this black-and-white situation of this court right now to get to a solution that, from the Court of Justice perspective, will probably pass.


      I hope that's useful as an answer.


Matthew Heiman:  Yeah.


      Stewart, thoughts about common ground?


Stewart Baker:  Yep. I agree with Max that there's a lot of discussion in the U.S. about how we can get a privacy law and that will solve the problem. It won't. We have solved the problem of regulating private companies that move the data, at least for adequacy purposes. So this is just a debate about whether the restrictions that the U.S. has proposed and the remedies the U.S. has proposed are adequate.


      I do think, Max, I should say -- I think you really overstated the notion that you have to go in and prove that your rights were violated in a particular way in order to get a Data Protection Authority to act. At least, under the U.S. proposal, you just have to say, "I think my rights were violated." And that is enough to trigger the review, at least, in the U.S. If you had problems with your Data Protection Authority, I think that's a problem with the adequacy of European law.


Max Schrems:  That part, just to be accurate, on who can actually go to the DPA, is part of the deal from the U.S. side. That's part of the executive order. It's not part of EU law. EU law, I never have a problem with -- because we don't have Article III standing in the EU --


Stewart Baker:  But no one's [CROSSTALK 48:01] --


Max Schrems:  -- [CROSSTALK 48:01] doesn't even exist.


Stewart Baker:  No one says the Data Protection Authority has to find a violation or has to have a detailed --


Max Schrems:  They do have to be able to -- so you cannot communicate directly to the civil liberties person. I forgot the exact name. You have to go to your National Data Protection Authority. They can then raise it with the U.S. authorities.


      So, as an individual citizen in the EU, you cannot directly communicate. You have to go through the DPA. And the DPA has to assert certain elements for the U.S. to accept that referral. And those are set out in the executive order. This is not EU law. That is part of the executive order that the EU authority has to comply with to even be able to use that remedy, which I --


Stewart Baker:  They only have to say there's a credible allegation of a violation. They don't have to say --


Max Schrems:  Exactly. How are you going to come up with a credible violation --


Stewart Baker:  You can. If you --




Max Schrems:  If you simply didn't get your visa, or you were put on a security list -- I was put on a security list a couple of times on flights. You just get your SSSS on your boarding pass, and you have no clue whatsoever. And your airline doesn't have any clue why that happened. And then you have to prove that that somehow happened under one of the data transfers.


Stewart Baker:  I think you would be truly --




Max Schrems:  I have done it. I have done -- I've personally done this submission and got rejected.




Stewart Baker:  Well, you've got rejected by your Data Protection Authority, right?


Max Schrems:  Exactly.


Stewart Baker:  Okay. So, look. I can't solve your Data Protection Authority problems, even though you seem to want to solve all of mine. And I think that that objection is an objection based in the adequacy of data protection law in Europe.


Max Schrems:  Again, it's really hard to have a conversation if it's not fact-based. This is part of the executive order. I bet you a million euros --


Stewart Baker:  Yes. Of course. The data protection --




Max Schrems:  -- part of the executive order. It has nothing to do with EU law. But if you just try to generate some hate against EU law, be my guest. It's just not very accurate, what you're doing here.


Stewart Baker:  No, no. But I think you're trying to blame the U.S. for the decisions of your Data Protection Authority.




Max Schrems:  I'm just saying the reality is -- I'm just basically telling you the realities of the system and why it's not working. If you don't want to hear the answer to it, that's fair, but then we may want to move on to the next [inaudible 50:08].


Stewart Baker:  I think the system is not working the way you would like it. I think we can agree. But to say there's inadequacy in saying, "We would like to hear from somebody who actually has a position of responsibility that this is an allegation worth reviewing" -- it's not an unreasonable thing.


Matthew Heiman:  Let's move on. There's another question in the Q & A section. And, Stewart, this one's directed to you. And it says, "Why doesn't the U.S. take a similar approach to China"—I assume as the EU is taking to the U.S.—"and simply not negotiate or request permission from the EU? It seems like the EU wants to hold the U.S. to a higher standard than even some of its EU Member States."


Stewart Baker:  This is true. And I think the reason the U.S. has worked so hard to find common ground here is because the consequences for a lot of U.S. companies—and a lot of European companies as well—would be disastrous if there actually were an effort to cut off exports of data. There's just an enormous flow of data across the Atlantic. And if Europe were to say, "That can't happen, and we're going to impose billion-dollar penalties for moving that data," it would be disastrous for the economy.


      And so, the U.S. is being a little bit blackmailed into accepting these restrictions, notwithstanding that they are not the same restrictions that apply to European governments. So that's, essentially, why we're here and why we keep coming back.


      But, yes, if the court tells us to do something that violates our Constitution, as I suspect Max is trying to get us to do --


Max Schrems:  Violate the [inaudible 52:01].


Stewart Baker:  -- then we're probably not going to do it, and then there will be a crisis, and we'll have to take the Chinese approach I suspect.


Matthew Heiman:  And just -- Max, just coming off of that response from Stewart, I'm wondering—and I'm just looking for some common ground here—would you agree with at least the statement that, if the EU and the U.S. are unable to broker an accord on this issue, that it would be highly detrimental to European businesses as well as U.S. businesses? I assume you agree.


Max Schrems:  Yeah. I think that's a problem on both sides. What we do see with U.S. businesses is that they largely host data in Europe by now anyways, for technical reasons, latency, and so on; usually you can already choose that your data is hosted in Frankfurt or somewhere in Scandinavia, oftentimes, because it's colder. That already exists.


      The bigger issue that we have right now for these companies, or for these situations, is that there is still service access from the U.S. to have a 24-hour service situation. So one company that I talked to, they say they have 12 hours in Dublin, 12 hours in California, and they can, basically, access stuff all over the globe.


      Now, if you would cut that access, that would already, for a lot of the processing, enable companies, American companies, to keep that data in Europe and be able to credibly say that they would not have possession, custody, or control of that data.


      That is what Microsoft did in Germany. They basically outsourced the data holding to a German company saying, "We still provide the software. We still sell the product, but we actually don't have access to the user data anymore. And that way we can tell the NSA that we don't have the possibility there."


      What I found interesting is that the question before on China wasn't really answered the other way around, because we do have the same discussion: "Well, why are we not able to provide our network services for 5G in the U.S., in Canada, in Europe, just as well, for exactly the same concerns?" And it's interesting because, when it comes to these discussions, the U.S. has a very similar view to Europe's. They say, "Okay. If we cannot guarantee that these technical elements are actually not spying on us, we're not allowed -- we don't allow them on our market." Which is the most normal thing to do as a Nation State. And it's just interesting that there is, on the other side, kind of this very different view if Europe does that, and the situation is --


Stewart Baker:  Well, could I stop you there?


Max Schrems:  -- because we are very much -- that there is a dependency that is very different that we do have most services [CROSSTALK 54:24] the U.S.


Stewart Baker:  Max, let me stop you on that.


      Max, let me stop you on the point of the U.S. has similar rules. They're not exactly the same, but they certainly have the same concern about espionage. And they have restricted what kinds of data may be -- what kinds of equipment may be used in 5G networks. But the equipment that they allow to be used all comes from Scandinavia. It is Nokia. It is Ericsson equipment because the U.S. has recognized that Finland and Sweden are broadly aligned with the United States in terms of their approach to democracy and surveillance.


      What's unusual is this demand by Europe that the U.S. adopt every single thing that the European Court of Justice wants.


Max Schrems:  It's not every -- it's not every single thing because the Charter is the baseline of fundamental rights. That's the minimum standard. And that is what it simply will require from any international treaty. You can [inaudible 55:37] it any other international treaty under that. And I'm sure you can do that under U.S. law as well if you found some international agreement or law that, for whatever reason, violates the Bill of Rights. You have the same situation that, actually, any judge in the U.S. can declare that law unlawful. That is simply the setting we're operating in. We're not going to change that to comfort the executive order.


      One very short thing. What's super interesting is the new executive order, by the way, does exactly the same thing. It will only come into force if the EU grants the same rights to American citizens, which makes sense. But it is kind of a copy of the exact part that you criticized on the European side right now.


Stewart Baker:  Well, we've learned something.


Max Schrems:  [CROSSTALK 56:19] the executive order as well.


Matthew Heiman:  Let me ask one last question because we got an interesting one that's obviously timely given -- all of the flurry of activity around AI. And someone's asking a question around probably asking you all to look in your respective crystal balls and comment on how do these -- how will data privacy laws affect Large Language Models?


      And the question is put this way: "If I file a complaint in my home country that my Reddit user data was used to train a Large Language Model, and LLMs don't store my personal data but rather the statistical inter-relations, would that imply that my data was used to train a commercial model that has no way of removing my data?" And then he says, "So if there's a single complaint around this Large Language Model, would that require the shutting down of that LLM so that that piece of information can be stripped out of that Large Language Model?"


      I'm just wondering if, Max, you've given thought to the intersection between data privacy laws and the way it works today and how LLMs work?


Max Schrems:  Two things at the beginning. The GDPR itself, the European privacy law, applies to personal data. And it's kind of a raw data law. It tells you if you can use data or not, but not how you use it. So whether you use it with an AI or the most old-fashioned algorithm you could find is not very regulated in the law.


      The European Union is just passing an AI Act right now—it's just in the making—that will deal with these things specifically, but that's not law yet.


      For your question specifically, if the data is personal, which means you can trace it back to that individual person, which in a statistical model you usually can't. But let's assume for the sake of the question that you can really figure out that Max Schrems is interested in—I don't know—this thing online. And you can actually get it back to that individual person, you're --


Matthew Heiman:  You're interested in purchasing Stewart Baker's book.


Max Schrems:  Exactly. If that is in the data, then that would be considered personal data, and you would need a legal basis for that. There are six legal bases in the GDPR -- the one usually everybody knows is consent. There are five others. What they would usually claim is legitimate interest, to say that they needed that for training it. And currently, there is no case law on how far legitimate interest goes in that sphere. Usually, legitimate interest is more defensive, in the sense of security. For example, CCTV cameras, all of that is accepted. If it's more for making money, more or less, legitimate interest is very limited. So that could be interesting.


      The other thing that may come in is that there are exemptions in national law for research. And a lot of this kind of stuff could fall into the whole research bucket. And that's the point where, as a lawyer, you would have to say, "I need kind of more facts and details to actually give you a final answer."


Matthew Heiman:  Yeah. Yeah.


Max Schrems:  But that's kind of the elements of where you would be in. If that part is really illegal, you could shut down the whole thing, yes.


Matthew Heiman:  We're just about out of time.


      Stewart, comments? Just a quick comment on the intersection between data privacy law and AI and LLMs?


Stewart Baker:  I think that it is true that training a Large Language Model is processing data. If there's personal data in there, you're processing that personal data. I think it would be very difficult, not impossible, but difficult to survive a challenge to having processed personal data for purposes of training a Large Language Model that will be used in a wide variety of ways even if you can't get the data back out.


      And then that raises the question: can you fix that by saying, "Oh, okay. I did not mean to include Max Schrems' data. I'll take it out"? No, I don't think you can. I think, as a practical matter, it would be impossible to fix that problem once the training has occurred. So GDPR probably raises serious questions about the legitimacy of every Large Language Model.


Matthew Heiman:  All right. Well, we're at time. I thank both Stewart Baker and Max Schrems for a vigorous, energetic, and informative discussion. And I will put an invitation out there. Max, Stewart, we'd love to have you come back again at some point to discuss this because I know, even in 61 minutes, we barely scratched the surface of this debate and these issues. But I thank you both for joining us.


Stewart Baker:  Thank you. It was great.


Max Schrems:  You're welcome.


Jack Capizzi:  Thank you all. And, yeah, thank you, Matthew.


      Just on behalf of The Federalist Society, I want to say thanks to Max for joining us all the way from Vienna and for sharing his time with us.


      As always, we do welcome listener feedback by email at [email protected]. Please keep an eye on our website and your emails as well for any announcements about upcoming webinars and live events.


      With that, thanks again, and we are adjourned.