2023 Annual Mike Lewis Memorial Teleforum: Big Data and the Law of War

Event Video

Listen & Download

Big Data is one of the most important resources in the world, yet the rules for its protection are just beginning to develop.  The danger comes into focus when one considers the possibility of a nation-state cyber operation attacking Big Data and having a major detrimental impact on the functioning of another nation-state.  Consider, for example, a cyber attack corrupting, stealing, or destroying the records of important financial institutions, causing widespread confusion and panic.  Would such an attack warrant a kinetic, lethal response, with bullets and bombs?

This issue implicates the UN Charter, the Law of War, International Humanitarian Law, jus in bello and jus ad bellum, attempts to formulate rules in the Tallinn Manual, conflicting priorities among nations, and pure geopolitics. Professor Paul Stephan of the University of Virginia Law School and John Eisenberg, Former Deputy Counsel to the President and NSC Legal Advisor, joined us to explore the issue. 

Mike Lewis was a naval aviator, and then a renowned law professor, widely admired by other scholars and practitioners. He was a great friend of the Federalist Society, appearing at dozens of lawyer and student chapter events, as well as the 2014 National Convention. He was also a member of the Executive Committee of the Society's International & National Security Law Practice Group. Each year, the Practice Group holds a Teleforum in his honor.


  • John Eisenberg, Former Assistant to the President and Deputy Counsel to the President, Former NSC Legal Advisor
  • Prof. Paul Stephan, John C. Jeffries, Jr., Distinguished Professor of Law, University of Virginia School of Law
  • [Moderator] Vince Vitkowsky, Partner, Gfeller Laurie LLP



As always, the Federalist Society takes no position on particular legal or public policy issues; all expressions of opinion are those of the speaker.

Event Transcript



Jack Capizzi:  Hello, and welcome to today's Federalist Society virtual event.  Today, September 5, 2023, we are honored to present the "2023 Mike Lewis Memorial Teleforum on Big Data and the Law of War."  My name is Jack Capizzi, and I'm an Assistant Director of Practice Groups at The Federalist Society. As always, please note that all expressions of opinion are those of the experts on today's call.


After our speakers have given their remarks, we will turn to answer questions from the audience. If you have a question at any point throughout today's program, please type it into the Q&A function at the bottom of your screen, and we will handle the questions as we can, towards the end of the program. With that, thank you all very much for being with us today. Vince, I'll hand it over to you.


Vincent Vitkowsky:  Thank you, Jack. Hello all. Thanks for sharing part of your day with us. I'm a lawyer in private practice and former Chair of the Society's International and National Security Law Practice Group, which is sponsoring this event, the Annual Mike Lewis Memorial Teleforum. Mike Lewis was a great friend of the Society. He began his national security activities in a fully kinetic way, as a naval aviator: first in his class at Top Gun, and then in Operations Desert Storm and Desert Shield. 


      Mike went on to become a leading international law professor focusing on the law of armed conflict and counterterrorism.  He was beloved by his students. And he was liked by all, even the many scholars who disagreed with him from the other side of the spectrum. He really was an amazing person, an incredible blend of talent, energy, charisma, sincerity, and generosity of spirit. He died far too young, of cancer. Mike spoke at countless events for the Society and was a member of the Executive Committee of this group. So, each year, we hold a Teleforum in his memory.


      This year, we're fortunate to have two distinguished experts address an important unresolved question in the law of armed conflict. Maybe it doesn't even belong there. An unresolved question: whether certain cyber-attacks might justify a response with bullets and bombs. Paul Stephan has [inaudible 00:02:39] 1979, focusing on national security. He served as Counselor on International Law to the Legal Adviser of the State Department, and as Special Counsel to the General Counsel of the Department of Defense.  His 2023 book -- is it out now?  It's out now, Paul?


Prof. Paul Stephan:  Yep.


Vincent Vitkowsky:  The World Crisis and International Law: The Knowledge Economy and the Battle for the Future addresses emerging threats, including cyber security. John Eisenberg is a national security and white-collar attorney with deep experience in government and private practice. I should mention, he is serving under duress today. He is not feeling well, a bit feverish, and, therefore, dressed in a way to absorb sweat. But you would typically find him in a suit and tie.


      John has served as Legal Advisor to the National Security Council, Assistant to the President, and Deputy Counsel to the President. Earlier in his career, he served in senior positions at the Department of Justice, including Associate Deputy Attorney General and Deputy Assistant Attorney General in the Office of Legal Counsel, again focusing on national security issues. So, with that, let's get going. Paul, the subject of this Teleforum was inspired by articles you recently wrote called "Big Data and the Law of War."  So, let's start by asking you, what do you mean by "big data"?


Prof. Paul Stephan:  Thank you, Vince. And I want to express how honored I am to be asked to speak at this forum and, particularly, in tandem with John. John was my boss's boss the last time I was in the government. So I feel privileged to be here. Big data is the accumulation and organization of vast data sets that are used for some purpose or interrogated in some way. Artificial intelligence, which is a source of a lot of excitement and even moral panic right now, depends on big data to work. Artificial intelligence is basically what you get when you interrogate big data with algorithms.


      Internationally, the technology owes a lot to the United States, which originally developed it. China, I think, in some ways has something of a lead over the United States, for two reasons. One, it's quite unconstrained about issues of privacy or personal liberty in the accumulation of data. And, secondly, the line between the private and the public sector that's very important in the United States is very fuzzy, at best, in China.


Our allies in Europe, by and large, with some exceptions, are hostile to the entire concept. They're so far behind that they're not trying to catch up, but rather are adopting new legislation, and using old legislation, to block at least the private businesses that are the principal proprietors of big data sets in the United States.


Vincent Vitkowsky:  Just to set the stage: we have many knowledgeable listeners, but "law of war" is kind of a loose term. So how do you mean it? What do you mean it to consist of? And what effect would --?


Prof. Paul Stephan:  So, within the field of people who do this stuff, there's a lot of terminology. And there are some distinctions and definitions that may not be completely clear to outsiders. So, on the one hand, we have the law of armed conflict, which is understood to apply to the regulation of kinetic activity within an armed conflict, the Latin jus in bello. And the law of war encompasses that, but also the law as to the legality of the initiation of armed conflict and responses to armed conflict, the jus ad bellum.


So, when I talk about the law of war, I'm talking about the line between active measures that are permitted, generally, under international law, versus measures that go over the line and are treated as impermissible uses of force that interfere with the sovereignty of another country, as well as the use of force that justifies self-defense under the U.N. Charter. And also, of course, once we're in an armed conflict: what constraints are on us, and particularly what might be treated as civilian, as opposed to military, for purposes of the limits of the law of armed conflict. You're not supposed to target civilian objects, for example. 


Vincent Vitkowsky:  John, turning specifically to cyber conflict, again stage setting, what's the Tallinn Manual, and what approaches does it take to the question of targeting big data?


John Eisenberg:  Sure.  First, let me also state that I'm really honored to speak here and to speak with Paul. And while I might have been Paul's boss's boss in the last administration, I think you'll see who deserves top billing here. And that isn't me.  But, anyhow, in 2007, Estonia was, I'd like to say, "attacked." But that might be wrapped up in too much other language, here. It fell victim to large-scale and pretty serious malicious cyber operations.


Partially in response to that, NATO sped up the establishment of its — I want to get this right — the NATO Cooperative Cyber Defence Centre of Excellence, which I will just call COE, in Tallinn, Estonia, of all places. The COE, in turn, quickly commissioned a study of cyber warfare. And that was to be conducted by a team of, I think, 18, anyhow, a group of international legal experts, joined by some technical advisors and actually some observers which included U.S. Cyber Command, to come up with what the rules might be for such a thing. 


Among other things, they looked into the use of force by states, and also cyber operations in the context of an armed conflict. The result was the Tallinn Manual, version 1.0, which was published in 2013. Pretty much immediately thereafter, the COE commissioned a study of additional topics, mostly the leftover topics in international law that hadn't been covered in version 1. That became Tallinn 2.0, which was published in, I think, 2017, and is the current version of the Tallinn Manual. Basically, the Tallinn Manual is as near as these people could come to a statement of the rules for cyber operations, in and out of armed conflict. 


And, given how difficult it is to come up with these things kind of in a vacuum, they did a pretty good job. I say, "in a vacuum," because very few states — we'll come to this later, in more detail — very few states will say what their views are on the matter.  And, unlike regular military operations, most cyber operations are concealed. And so, the public may never know about them. And what they do know about them is often not exactly correct. And, actually, very few states have the technical ability to know about most of them. So it's all very secretive, compared to other military operations. So that takes us to -- sorry.


Vincent Vitkowsky:  John, let me circle back. So, what do we have?  We have a bunch of law professors somehow distilling from the air what the law of cyber conflict should be?


John Eisenberg:  Mostly, what they have is what we'll get to in the next part, when we get to a specific example -- they have treaties. And there is customary international law practice. While the treaties and the practice may not be focused on cyber operations, that doesn't mean they're not applicable. You can always do what lawyers do, and generalize from one set of topics to another, and just reason by analogy. And that's a lot of what was done. And then, some of the treaties may have been more on point, as if they had been written in a world that had cyber operations in it.


      So, when it comes to the exact question of whether data, itself, is an object, what the group of experts found most compelling, and what the majority of the group found dispositive, was essentially the history around Article 52 of Additional Protocol 1 of the Geneva Conventions, and the commentary thereto.  First, as an initial matter, the word "object" in the dictionary has a meaning that's closely tied to tangible objects. I think it says something like "material things, that," blah, blah, blah. Anyhow, that definition, I think you can make a pretty good argument, does tie to tangible things, and not just everything in the world.


      But then, also, the commentary to the additional protocols — the 1987 commentary — states that objects are "visible and tangible." So the majority of the experts concluded, based primarily, and almost exclusively, on those things, that data is not an object. It's not tangible; therefore, it's not an object. There was a substantial group of dissenting experts in the legal experts group that came to the opposite conclusion, finding that rule underinclusive, because you could imagine all sorts of purely civilian data sets the destruction or manipulation of which could have extremely serious effects on people, even if they have no immediate effects on the outside world -- nothing that would bring them within the rule as the majority stated it.


      And they thought that Article 48 of the Additional Protocol, which basically says that the general population is to be protected against the effects of hostilities, also suggests that they should take a broader view of what an object is. So that was the dissenting view. In my view, on the plain text of just the treaty issue, the majority has a pretty good argument. It's hard to get around the fact that "object" does have this pretty much number one meaning, which is confined to material things. And the whole concept is based on Article 52 of the Additional Protocol, which has the same word, and the "legislative history" that says "visible and tangible." You have to be visible and tangible to be an object.


Vincent Vitkowsky:  Object is totally different from property, right?


John Eisenberg:  Yeah, right.


Vincent Vitkowsky:  Property -- does that concept have any relevance to this? Because one of our readers asked, "Would you consider data as property?"


John Eisenberg:  Yeah. It is property, of a form. But no, I think this rides almost entirely on the fact that it uses the word "object" in Article 52. And so, there may be intellectual property. No, I don't think so. I mean, the relevant rules, even in Tallinn, come from Article 52, here. 


Vincent Vitkowsky:  So, let me catch up here. So, when we talk about big data, we talk about it in the abstract.  Is it fair to assume that something like financial records fit into the category of big data, for the purposes of this discussion?


John Eisenberg:  Yeah, I think so. It's a large data set. I don't know. I'd ask Paul that again. I don't know if big data has to mean something that is being used for AI purposes, in which case, financial records, while they probably are being used for that reason, they may not be, for all I know. They don't necessarily have to be. They're certainly data. 


Vincent Vitkowsky:  Paul, do you have a thought on that?


Prof. Paul Stephan:  Yeah. So, big data is something that stands alone from the way it's interrogated. Historically, go back a few centuries, and it's big data that supports actuarial tables that allowed the insurance industry to be invented. So data is simply a recorded observation, and big data is a lot of them.


John Eisenberg:  Just a lot of them?


Prof. Paul Stephan:  And it is less than property. I teach property. Property covers a lot of stuff — say, your likeness and image — that we wouldn't really consider data. Intellectual property, we really wouldn't consider data. But data is usable knowledge that's stored in a form where it can be retrieved and interrogated. AI is a form of interrogation, but not the only form of interrogation.


John Eisenberg:  To be big data, it doesn't have to be interrogated at all. It just has to be data, and big.


Prof. Paul Stephan:  Yeah.


Vincent Vitkowsky:  We know that the experts are of two views about whether data — and, again, just to lay out the chain — if data is an object, then targeting it may justify a kinetic response. A broad oversimplification, but can we go with that, Paul? So, the experts are of two views. What about the people who actually have to make decisions: governments and government leaders? So, John, let me, again, go to you. Because I'm just guessing that this is not always entirely an abstract question for you.  


John Eisenberg:  Obviously, I can't comment on anything that I would have done or learned in my last job, except for more experiential parts of it. I think it's likely that all sorts of people in the decision-making structure have no idea of these things at all. They would learn of them in meetings at the White House, probably in what's called the Principals Committee, or even in the National Security Council, which is, essentially, the Principals Committee plus the president.


      The people that would have the strong views are somewhat lower down -- people who deal with this stuff sort of exclusively, as part of their jobs. But their view, I don't think, counts. It wouldn't go to what the United States believes -- if a person in the Legal Adviser's office of the State Department believes that big data is an object, it doesn't mean the United States believes it. It's going to take somebody far higher up to believe that. And the question may never have come to their attention at all; they may never have thought about it. And they would think about it only in a context where it became really essential for the United States to decide whether to treat big data as an object. And that would come in two forms.


It would either come in the form of "our big data has been messed with, and we have some serious consequences, and we want to know what kinds of responses are lawful under international law." And then we'd have to know whether big data constitutes an object and whether we have suffered an armed attack. And, conversely, it would be important to know the same thing when we were considering attacking someone else's big data, at least outside of the context of an armed conflict. Inside the context of an armed conflict, it may be a little bit different of a question. 


But those are the contexts in which people high enough up in the United States government for their decisions to count would have even thought about the question. That has good effects for the United States, in a way, because you'd think the United States probably doesn't want to have to commit to a view of whether big data is an object until it absolutely has to. One reason is it doesn't want to announce a position publicly, because that's giving away an awful lot of information and tying itself down. And second, and probably much more importantly, you never know what side of this argument you're going to be on beforehand. And so, before you commit to whether it is or isn't an object, you'd like to know whether it's a good thing for it to be an object or not. 


Vincent Vitkowsky:  Now, not all governments are that shy, right? At least a couple have put forth position papers.


John Eisenberg:  Yeah. There's one thing I wanted to flag before we go to that, which is that there are two questions. I answered the question as a treaty question because I think that's largely the way the legal experts for the Tallinn Manual approached it, essentially, because the word "object" comes from Additional Protocol 1 of the Geneva Conventions. The other way it comes up, though, is as customary international law. And it's perfectly conceivable for those two things to have different answers, although, ultimately, you'd think what states are doing is the more important thing.


      But, to the second question, what's the answer with respect to CIL? I think that's really difficult because, as mentioned, cyber is so secretive and concealable that it's very hard to know what the state practice is. The only people who know what the state practice is are probably those that have access to intelligence files and operational files. And so, it becomes very difficult for a group of experts to determine what the customary international law is. They just don't have access to the information.


      But, yes, you were saying there are states that have actually pre-announced their views on this. And I think France probably is the big one. And it said that civilian -- I'm going to read something from their position -- "France, considering civilian content of data to be protected objects, and cyber operations directed at data may constitute an attack for purposes of triggering law of armed conflict without generally qualifying as an armed attack for Article 51 purposes." That's a little bit strange of a position, to split the question into whether it triggers armed conflict and whether it triggers Article 51.  You'd think that if it was sufficient to trigger the rules of armed conflict, then it would also be sufficient to trigger a state's right of self-defense. But France sort of split those two things apart. 


      Germany, through a position paper, has stated that, "A computer network and cyber infrastructure, or even data stocks, are examples of objects." So they appear to take the view that data is an example of an object, unless "data stocks" is supposed to have some other meaning that I'm not aware of. Romania has taken the "preliminary view" that cyber operations against data do trigger the application of IHL — International Humanitarian Law — and, further, that "cyber-attacks can only be directed against those data that represent military objectives according to IHL and cannot be directed against those data that represent civilian objects, which must be protected under the principle of distinction," i.e., under the law of armed conflict.


      The Danish Military Manual states, on the other hand, that data, in general, do not constitute an object.  Chile's position appears to be about the same. And Israel's Deputy Attorney General has stated for Israel that, as it currently stands, only tangible things can constitute an object. So, other than, potentially, Israel and France, those are not the biggest players in the world of cyber operations. And even those states split almost down the middle on what data is. And France takes both sides. So we can consider it equal that way.


Vincent Vitkowsky:  Let me just pause for a second and ask about a related subject. The phrase "attribution" is used to mean a designation of a particular cyber actor as the entity causing the attack, whether it's a state or a non-state actor. How does attribution fit into all of this? I'm still with you, John.


John Eisenberg:  One way that I think it can fit in, to affect the analysis here -- let's compare it to regular military operations. Traditional military operations can be kept, and frequently are kept, secret until they're triggered, until they go off. We may not say that we're going to send a special operations team over to this particular place and blow something up. But once they get there and the shooting starts, or the thing blows up, almost never can you conceal it at that point. Because the whole point of it is to take some major action in the actual world.


And, in fact, we often announce it from the podium at the White House or the Department of Defense — I guess, depending on how it goes — that we took some measure.  We might still keep certain parts of it secret, but the overall fact that the United States has done something in the world is announced. And that's very different from how cyber can, at least, go down. People may never detect that it ever happened. If they detect that it happened, they may never know who did it. And they're certainly rarely going to know who did it with certainty. 


That being the case, we sometimes don't even know, was it a state?  Or was it a non-state? And if it wasn't a state, then we can't really figure this into the analysis of CIL. It won't become a good example of what nations do between each other, and then, therefore, figure into the analysis of customary international law. So, to me, that's the biggest factor in trying to come up with, is big data an object? Is that we just may never know the character of a particular cyber operation. 


Vincent Vitkowsky:  There are some. There have been attacks that have been attributed by each of the Five Eyes intelligence agencies to a given country, in response to which we did nothing. That was a prior, prior administration. And I guess that's what we're struggling with. Paul, let me ask you to comment broadly about a number of the subjects that we've covered. Perhaps the role of, and problems with, developing customary international law in this context. What would be, in your mind, the key arguments for and against responding with physical force to an attack on big data, if that attack doesn't produce physical loss or have any impact on physical systems?


Prof. Paul Stephan:  Just to talk about the legal instrument background for a second, because I think that's relevant. So, with respect to the law governing armed conflict, we do have a number of treaties, as well as customary international law. A lot of states, most states with the capability to project force, have manuals that they publish, although they are often careful to say, "Our manuals are for internal use only, and are not statements about international law." But, in general, I would say there's a lot of stuff out there about the law of armed conflict, the law of how to conduct armed conflict. 


      With respect to the law governing the initiation of armed conflict, including forceful responses to attacks by others, it's basically the U.N. Charter. And there are some regional treaties that go a bit further. But most of the conflicts we see are not between parties to a regional treaty. And the thing about the U.N. Charter is that there is a rich and contentious body of interpretation of what its provisions mean. And sometimes states find themselves on both sides of that. So, that's the background.


      What that means is you can say things about the law of armed conflict with a certain level of confidence. And you can make analogies better. At least, you have more materials to make analogies with, with respect to armed conflict, than you do with respect to the law justifying the use of force, the starting of wars. So that's point number one. And then, point number two is the consequences are different. I'm a consequentialist. I want to think about the law in terms of what happens if you violate it or ignore it. And in the law of armed conflict, ultimately, you answer to criminal law, which often is state based.


We have this backstop of international criminal law. But, in a country like the United States, the Uniform Code of Military Justice is there to impose restraints on our own decision-makers. And we have the label "war criminal," which we can attach to adversaries who have done bad things. And we have different mechanisms. Typically, they involve a reckoning at the end of a conflict. We're not a party to the International Criminal Court. No state that seriously projects force is a party to that tribunal.


With respect to arguments about when you're allowed to use force against someone who's gone from adversary to enemy, there is no tribunal. There's an attempt to amend the International Criminal Court's statute to take that on. But it's very limited, and really doesn't apply to any country that uses or projects force. So, the issue is, I think, given these consequences: the law of when you can use force is the only body of law that justifies the use of force. That is to say, the use of force, blowing things up and killing people, is a big deal. It's different from putting people in jail. It's different from charging them money through sanctions or other kinds of economic penalties.


So you ought to think of it as a big deal. And then the issue becomes: are there maybe constraints on when we start wars that might be different from what we do once we're in a conflict? Partly, you don't know, when you start something, how it's going to end up. There's a lot of uncertainty. When you're in a conflict, there's certainly the fog of war, but at least some of the circumstances are known. So, we're seeing this among countries -- also, I hasten to add, it's not U.S. doctrine to suggest that the general background principles, the Geneva Conventions and the like, apply to the poisoning or disabling of big data in a context where there are no kinetic consequences.


It's the difference between corrupting bank records to get people angry versus taking down an air traffic control system and crashing planes. That's one way of thinking of the distinction. With respect to starting wars, one of our concerns is that there are asymmetric bad actors out there who have cyber capacity that's outsized relative to their capacity to project force. And Iran would be an example.


They don't have nukes yet, although they're trying to acquire them. North Korea, by contrast, has nukes, but doesn't really have conventional force that it can threaten anyone but South Korea with. But both of those countries have first-class cyber capacity. And Russia is certainly a world leader in cyber capacity. And the events of the last 18 months suggest that its ability to project force is not as great as they certainly hoped it would be.


John Eisenberg:  Other than nuclear, anyhow.


Prof. Paul Stephan:  Except for nuclear, exactly. So, one of the issues is when a high-cyber, low-kinetic country does something damaging to a country that is very vulnerable to cyber-attacks. And the U.S. is certainly the most vulnerable country in the world, because we are more dependent on our cyber capacities than any other country. And we have great kinetic capacity. Should countries like the United States have a lower threshold for treating cyber operations as uses of force where they are very painful, but don't involve physical destruction or deaths of people?


Or do we want to be careful about maintaining those guardrails? Partly, this is about do we want to constrain -- we certainly want our decision-makers, our political masters, to have all the tools they need to protect the nation's security. But sometimes, legal constraints can focus their attention somewhat on some of the downstream consequences. So the argument for treating cyber assets as objects — even when there are no physical world consequences — is they are significant. In the modern world, they're more and more important. Emotional and economic pain follows from their destruction.


And, therefore, this distinction between physicality and cyber is so 20th century. We've evolved above that. And the pushback argument -- and I've been careful in my writing. I'm not endorsing any arguments. I'm a teacher. I'm trying to get people to ask these questions. I'm not obligated to answer them. But I think it's a legitimate question, given the consequences of responding to a cyber attack with violence, as opposed to with other cyber operations. Do we want some guardrails there? Israel is the only country that I'm aware of that has used violence in response to cyber-attacks, but only against adversaries where they're already in an armed conflict. So it's a different situation, it seems to me.


John Eisenberg:  And did they do it because they felt that that was law, or because they just felt that it was a much more acceptable thing to do it within the confines of an existing armed conflict?


Prof. Paul Stephan:  Yeah. So, I think they felt the law of armed conflict allowed them to do that. And it's debatable whether their responses were consistent, if the law of armed conflict were to apply. Were the responses consistent with the principle of distinction and the principle of proportionality? And I think the lawyers for the IDF would say, "First of all, we did comply with distinction and proportionality, given that our target, which was Hezbollah, is not a state entity." And so, these distinctions are harder to maintain when it's not a state adversary -- a state-supported adversary, but not a state adversary.


And then, I think they would say argument B, the fallback argument, is we don't think the law of armed conflict limits cyber operations in the way that it limits kinetic operations. But, again, that was the law of armed conflict, as opposed to should we go to war? There are voices out there. A sort of off-the-wall example, but one of the issues with respect to our sanctions regime against Russia is whether the president has existing authority to confiscate, as opposed to freeze, Russian assets.


And a very famous law professor -- I'm going to be very careful what I say here -- a very famous law professor, retired from Harvard Law School with a famous treatise, argued that the Patriot Act, which was adopted in response to 9/11, says that if there is an attack like 9/11, we can confiscate, rather than freeze. And, although that was meant to cover a non-state actor, Al Qaeda, it could cover a state actor as well.


And this famous law professor said, "Well, of course, since we have labeled" -- this is what John started his remarks with -- "since we've labeled some of Russia's actions as cyber-attacks, they're attacks. That's the language in the Patriot Act. Therefore, the president has the authority to start confiscating goods right away." But we've been very careful, deliberate, to say Russia has not engaged in any attack on the United States for purposes of the law of war, either ad bellum or, much less, in bello. And we're not in a state of armed conflict with Russia. Given all the consequences of being in an armed conflict with Russia, I hope we stay that way until events drive us in another direction, and I don't --


John Eisenberg:  And I know we're arguably a co-belligerent with an enemy of theirs.


Prof. Paul Stephan:  Yeah, yeah, yeah. Absolutely true. But we've been very careful not to say that. And, interestingly, so have the Russians. So, the Russians might be entitled to make that argument. But they've been very careful not to make that argument. And I think that's because, whatever you think of the Russians -- I, myself, have done a lot of litigating against them -- I think the idea of where we might end up, in terms of escalation and nuclear capacity, does concentrate the mind a bit.


Vincent Vitkowsky:  We did not discuss this in our prep, so we're hearing it for the first time.


Prof. Paul Stephan:  I'm trying to turn on the lights in my room.


Vincent Vitkowsky:  Okay. So, you're hearing it for the first time. And I just can't help it. Going from lives to dollars and cents -- in my day job, I spend a lot of time helping insurance companies draft war or state-sponsored cyber operation exclusions. And it's a very fluid situation right now. It's a work in progress.


But the trend seems to be toward an exclusion that applies when there's a state-sponsored cyber attack that causes a major detrimental impact on the functioning of a state by disrupting the availability, integrity, or delivery of an essential service. And we're defining "essential service" to include financial services and associated financial market infrastructure. So it's a major detrimental impact on an essential service, including finance. What do you think of that, is my question?


Prof. Paul Stephan:  Vince, I think this is where being a consequentialist matters. So, the stakes, in that contractual language and any disputes based on that language, are money.


Vincent Vitkowsky:  Right.


Prof. Paul Stephan:  And the stakes in the law of war may be about money, but often -- particularly if we're talking about the law authorizing the use of force -- it's about unleashing the dogs of war. And you might take a different approach to a legal test where the consequences are so different. So, it's a great point. But we also, I think, shouldn't automatically fall down the "rose is a rose is a rose" path. Sometimes something can be treated as associated with armed conflict for insurance purposes, and we treat it completely differently when we're talking about how we actually exercise our kinetic capacities.


John Eisenberg:  And if what you care about is the treaty-based version of the "is data an object" question, it may be that it's not an object, regardless of what the consequences are, as long as they're not something in the real kinetic world, for the reasons you discussed earlier.


Vincent Vitkowsky:  As you've indicated, we just don't know how it's going to develop. But one can envision, in some future world, a devastating attack on our financial infrastructure that causes people to go crazy, riots in the streets, destruction. I guess that's a different [inaudible 00:44:00].


John Eisenberg:  You faded out for a second. For me, anyhow.


Vincent Vitkowsky:  What's that?


John Eisenberg:  You faded out for a moment, for me. I don't know if it was just me, though.


Prof. Paul Stephan:  Yeah, I didn't hear everything you said either, Vince.  I'm sorry.


Vincent Vitkowsky:  Okay. [Inaudible 00:44:17], but you can imagine a situation where the scope of the attack on financial records is so substantial that it causes riots in the street. Is it, at least, theoretically possible that that becomes something that warrants a kinetic response?


Prof. Paul Stephan:  The British have said that. And my old boss Paul Ney, in his gloss on the U.S. take on this, noted that the British said that, without expressly endorsing that idea. So, the idea is out there. And I actually think, for strategic reasons, we don't want it clearly resolved.


John Eisenberg:  Absolutely.


Prof. Paul Stephan:  We don't want to say to Iran or North Korea, "Go ahead. Trash our banking system. We'll be really angry. But, of course, we're already really angry at you. But we won't use kinetic responses." We don't want to be pinned down on that. But, at the same time, maybe we want, as lawyers, to be able to tell our principals, "You've got to worry about this. We can't tell you that it's written in stone. But it is a legal concern that you have to take account of, however you respond to it."


Vincent Vitkowsky:  There are several questions from our audience. Before I start going with them, I wanted to ask each of you whether there's anything you'd like to mention or emphasize that you have not so far.


Prof. Paul Stephan:  I'm good.


John Eisenberg:  I'm okay.


Vincent Vitkowsky:  Okay. Let me see what I can do out here. I think the question is for 2016, not 2026. "In 2016, Hillary Clinton said the U.S. government should treat cyber-attacks like any other attack: 'As president, I will make it clear that the U.S. will treat cyber attacks just like any other attack. We will be ready with serious political, economic, and military responses.'" The question is, would it have been legal for a president to make this declaration?


Prof. Paul Stephan:  So, John, if you don't mind, I'll jump in first and then you can fix the damage that I caused.


John Eisenberg:  Absolutely. 


Prof. Paul Stephan:  So, she sounds a lot like that famous Harvard law professor that I just mentioned. And I have my reservations about it. And this is a political speech. She's making a political claim. And she's speaking in broad, unqualified language, as people running for president tend to do. So, I'm happy to cut her some slack, up to a point. But the kindest thing I can say about the core idea is that it's pandering. And I would say, just to be very careful, it's legally problematic.


John Eisenberg:  As you said earlier, though, there's a lot of ambiguity here. And I think if the president, under the right circumstances -- and I'm speaking without having done all of the analysis that would need to be done, which is stupid of me. So maybe I should just be quiet. But there's probably enough ambiguity that the United States could get to something that made that a true statement.


For one thing, we're not bound by Article 52 of Additional Protocol I, so I don't think we're bound by the Tallinn Manual's main argument for saying that data is not an object. So, the question is, can we get there under customary international law (CIL)? And, actually, the other question is, do we even need to get there under CIL, since CIL doesn't bind the president, at least in my view?


I think you could probably get there, at least under what she said, which was still pretty vague. Treat cyber like any other, sure. So, kinetic operations all have kinetic effect. And everyone agrees that if a cyber operation has kinetic effect, then probably you could treat it the same way as well. So, maybe it's a null statement.


Prof. Paul Stephan:  Yeah. And when she says, "military responses," given that NSA is certainly part of our military resources -- a response confined to NSA, I wouldn't question that claim.


John Eisenberg:  No, we don't.


Prof. Paul Stephan:  The only issue is when we pass the torch from NSA to our other kinetic capacities -- then I would be concerned.


John Eisenberg:  Got it. 


Vincent Vitkowsky:  A comment and a question: the listener writes, "I think we get too wrapped up in how law applies in the cyber world. Wouldn't it be more practical to consider cyber assets as property, and then just apply the general principles of law?"


Prof. Paul Stephan:  Again, if I can jump in first. So my work-in-progress Law Review article is about applying property to data. So I think that's a great idea. I think there are very rich possibilities there. And I could go on for two hours on that point, which you absolutely don't want me to do. But I would just say this, that the law of property doesn't really give us a lot of guidance, in terms of thinking about the connection between cyber activity and kinetic activity. I think property law does a lot of work. I love property law. But I don't think it answers those questions.


John Eisenberg:  I already said what I thought earlier.


Vincent Vitkowsky:  Interesting. I don't have any familiarity with the facts here. But one of the questions says that Klaus Schwab, head of the World Economic Forum, is telegraphing a major cyber-attack with COVID-like characteristics. Do you know anything about that, guys?


Prof. Paul Stephan:  No. But I can answer the question of what the World Economic Forum is. It is the host of the Davos Forums. If you're concerned about global warming, they're one of the world's major contributors to global warming because of all the corporate jets that go to Davos. But I know nothing about this supposed cyber attack with COVID-like characteristics. I just don't know.


John Eisenberg:  What would it mean for a cyber attack to have COVID-like characteristics?


Vincent Vitkowsky:  Yeah, I don't know.  It would defy my -- there's a link provided here, but I would blow my technical mind trying to check it out in the middle of this, so I'm going to need to pass.  Here's one. And you can answer it in different ways. The question is, "Do we have a U.S. cyber academy?"


John Eisenberg:  Well, we certainly don't have one like the other service academies. We must have something like that somewhere, though -- advanced cyber training. I don't know if that's just within the NSA, or what.


Prof. Paul Stephan:  You know about this more than I do, John. But there is the annual event that Cyber Command hosts.


John Eisenberg:  Right.


Prof. Paul Stephan:  And I think that's connected to a training program. I don't know if it's tied to the war college or somewhere else. I don't know if they do stuff at Fort Meade, specifically. I just don't know.


John Eisenberg:  Yeah, me neither.


Vincent Vitkowsky:  So, I think we're coming to the end of the hour. Any closing thoughts or remarks?


Prof. Paul Stephan:  Well, I might jump in and answer Larry White's question about Estonia, if I may.


Vincent Vitkowsky:  Yeah.


Prof. Paul Stephan:  That was so old-school. That was, what, 2004 - '06? Something like that. And about all we had back then was denial-of-service attacks, which is what it was. And I don't think anyone today would regard a denial-of-service attack as an attack -- unless the service being denied was something immediately vital for the maintenance of human life. Like, denial of service wouldn't take down air traffic control unless you were severing the contact between the airplanes and the control.


Prof. Paul Stephan:  It's just like jamming. And jamming, for all time, has been considered not [inaudible 00:53:55]


Prof. Paul Stephan:  Yeah. And, beyond that, I'm very grateful that you thought the couple of articles I've written about this are worth talking about. Again, it's an honor to be part of this conversation. And I think the main point to take away is that these are early days. There's not a whole lot of hard law. There's really no hard law. And all of our principal adversaries, Russia and China, take the position that there's nothing until we have treaties. And they've been using the U.N. as a forum to argue for treaties, which we're not coming anywhere near to concluding.


But I do think the way is open in the future for developing habits of practice that, at most, may be soft law, but might work as guardrails. The analogy I use is that, at least historically, during the Cold War, there was an understanding that we didn't go onto our adversaries' territory to kill anybody, but that we would kill our adversaries if they were on our territory. I mean spies and civilians involved in active operations.


And, A, that norm seems to have been diluted in the post-Cold War period. See what Russia has been doing. But, B, I think it's an example of how we can come to a way of understanding, without it being formalized. We might, for example, develop an understanding that cyber-attacks on essential infrastructure are out of bounds. We might be able to persuade our Chinese adversaries to sit on the North Koreans in their ransomware operations, which are principally fundraising, rather than strategic, and have the Chinese agree that state-sponsored ransomware is not a good idea. And we could get there, again, without putting anything down in writing, but still have something that has soft-law characteristics. I'd like us to move in that direction.


Vincent Vitkowsky:  But we really have a gap, don't we? We have treaties, which are subject to interpretation, and don't really cover it. We have the whole traditional notion of customary international law being something universally accepted and demonstrated by state practice over long periods of time. It just doesn't really apply. We have lots of people doing things quickly and secretively. And norms and guidelines or guardrails are great. Haven't we broken every guardrail ever set in the history of the world? There used to be the understanding that people wouldn't engage in aerial combat, until they did. So, where do we find the inspiration, or the concept, or the way forward?


John Eisenberg:  Well, I think, unfortunately, I'm not sure that the relevant powers view it as a gap. They may view ambiguity as something that they want. And, therefore, it's not a gap, in a meaningful sense. So, that would make it much more difficult, even to come up with the sorts of things Paul is suggesting.


Prof. Paul Stephan:  And, I would just add, on an optimistic note, as a legal educator, if this stuff is clear and written down, you don't need a good lawyer. The whole role of good lawyering is to deal with these highly ambiguous but very important situations where maybe you don't always want to have a green light permanently on. But you do want to inspire our policymakers to have thoughtful conversations.


Vincent Vitkowsky:  Well, thanks very much for your time and thoughts, and to our listeners for your time and attention. It's been a terrific discussion. I'll turn it over to Jack to wind us out.


Jack Capizzi:  Thank you, Vince. I certainly want to echo your comments. It's been a great discussion. And just to say, on behalf of The Federalist Society, thank you to Paul and John for joining us today, and to Vince for organizing this year's version of this memorial Teleforum. As always, please keep an eye on our website and your emails for announcements on upcoming webinars. Today, at 2:00 p.m. Eastern, we have a litigation update on the case of Jackson v. Raffensperger, if you're interested. But, other than that, thank you all very much for your time. With that, we are adjourned.