Going Dark Phenomenon

Criminal Law & Procedure Practice Group and Regulatory Transparency Project Teleforum

Listen & Download

What is the effect of encryption on government and law enforcement access to digital evidence? Default device encryption inhibits the government's ability to obtain access to electronic data even in circumstances that satisfy Fourth Amendment warrant requirements. Lawmakers and policymakers have struggled over whether this lack of access is a bad thing. How do we balance our collective security interest in allowing law enforcement to access the contents of electronic devices against our individual privacy interest in securing our data from illegal or unreasonable access? 


Jamil Jaffer, Founder and Director, National Security Law & Policy Program 

Greg Brower, Shareholder, Brownstein Hyatt Farber Schreck 

Kenn Kern, Assistant District Attorney, New York County District Attorney's Office 

Moderator: Hon. Michele Christiansen Forster, Associate Presiding Judge, Utah Court of Appeals 

Teleforum calls are open to all dues paying members of the Federalist Society. To become a member, sign up on the website. As a member, you should receive email announcements of upcoming Teleforum calls which contain the conference call phone number. If you are not receiving those email announcements, please contact us at 202-822-8138.

Event Transcript

Operator:  Welcome to The Federalist Society's Practice Group Podcast. The following podcast, hosted by The Federalist Society's Criminal Law & Procedure Practice Group, was recorded on Friday, February 1, 2019 during a live teleforum conference call held exclusively for Federalist Society members.           


Micah Wallen:  Welcome to The Federalist Society's teleforum conference call. This afternoon's topic is titled, "The Going Dark Phenomenon." My name is Micah Wallen, and I'm the Assistant Director of Practice Groups at The Federalist Society.


As always, please note that all expressions of opinion are those of the experts on today's call.


Today it is my pleasure to introduce our moderator, Judge Michele Christiansen Forster, who is an Associate Presiding Judge on the Utah Court of Appeals. Judge Michele will be introducing our panelists today.


After our speakers give their remarks, we will then go to audience Q&A. Thank you for sharing with us today. Judge, the floor is yours.


Hon. Michele Christiansen Forster:  Thank you, Micah. Thank you for having us on this teleforum. Here in 2019, we are still faced with this issue: U.S. intelligence and law enforcement communities are finding a landscape that is going dark due to new forms of encryption now being utilized in mainstream consumer products and services by the companies that offer them. Companies are increasingly adopting technological architectures that inhibit the government's ability to obtain access to communications, even in circumstances that may satisfy the Fourth Amendment warrant requirement. Government officials are concerned because without access to electronic communications, we may not be able to prevent terrorist attacks or investigate and prosecute criminal activity. However, is the solution to force companies to provide access to user communications and data? Critics of this approach fear that guaranteed access will compromise the security and privacy of users worldwide, while also hurting the economic viability of U.S. companies.


So where are we, and what do we do in 2019? To talk about this issue, we are joined today by a great, expert panel. Our panelists include Greg Brower, Jamil Jaffer, and Kenn Kern. I'm going to briefly introduce them, and then we can begin our discussion.


Greg Brower is currently a Shareholder at Brownstein Hyatt Farber Schreck. In his practice, he focuses on criminal and civil litigation, as well as regulatory and enforcement actions, corporate investigations, cybersecurity matters, and federal and state governmental relations.


Most recently, he served as the Assistant Director for the Office of Congressional Affairs at the Federal Bureau of Investigation, serving as the FBI’s chief liaison to Congress on a wide range of critical oversight and investigative matters. He previously served as the FBI’s Deputy General Counsel and as the U.S. Attorney for the District of Nevada. He has also served as both General Counsel and Inspector General for the U.S. Government Publishing Office. Greg is currently an adjunct professor of law at the William S. Boyd School of Law at the University of Nevada, where he has taught courses in national security law and trial advocacy.


Also, we are joined by Jamil Jaffer, who is the Founder and Executive Director of the National Security Institute and an Adjunct Professor of Law and Director of the National Security Law & Policy Program at the Scalia Law School at George Mason University. Jamil is also Vice President for Strategy & Partnerships at IronNet Cybersecurity, a firm founded by General Keith Alexander, the former Director of the National Security Agency and founding Commander of U.S. Cyber Command.


He also serves on the Center for a New American Security's Artificial Intelligence and National Security Task Force, and on the Board of Advisors of the Foundation for the Defense of Democracies' Cyber-Enabled Economic Warfare initiative. That is a mouthful.


Prior to his current positions, he served on Capitol Hill in a variety of roles, including on the leadership team of the Senate Foreign Relations Committee as Chief Counsel and Senior Advisor, and as Senior Counsel to the House Intelligence Committee. He also served as a Law Clerk to Justice Neil Gorsuch of the U.S. Supreme Court and Judge Edith Jones of the U.S. Court of Appeals for the Fifth Circuit.


And last, but not least, Kenn Kern is the Chief Information Officer and the Special Assistant for International Relations at the New York County District Attorney’s Office. Kenn joined that Office and, in 2011, was appointed Deputy Chief of the Cybercrime and Identity Theft Bureau, prosecuting and supervising a wide variety of complex financial, identity theft, and cyber fraud cases.


In his current role, he serves as the point person in assessing technological needs, developing and maintaining the Office’s technological environment, managing network and application servers, coordinating the Office's cybersecurity framework and protocols, and partnering with state, national, and international agencies and offices on technology and cybersecurity matters. He is also the Office's liaison to multiple international law enforcement agencies. For the last eight years, he has coordinated the Office's Financial Crimes and Cybersecurity Symposium.


With that introduction, we can begin our discussion. My first question is directed to Mr. Jaffer: if you wouldn't mind, could you talk about the origins of this "going dark" phenomenon and how we've gotten to where we are today?


Prof. Jamil N. Jaffer:  Sure. Well, Judge, thanks for the introduction, and thanks for the opportunity to sort of lead off this conversation. The "going dark" debate goes back to the encryption wars of the 1990s and the question of whether the emerging internet companies could export encryption overseas, sell it, and make it a part of their core products. As we all know today, strong encryption protects all of our financial transactions online. It's in your web browser when you go to websites. Almost every credible website today uses HTTPS, which means HTTP Secure, which involves embedded encryption in those communications. But that would not have been possible had the U.S. government not loosened its export control regime to permit the export, and the use in commercial products, of so-called strong encryption. That fight pitted, at times, the national security community, which wanted to ensure that strong encryption was available primarily to government users and was not exported abroad, against the internet community, which wanted strong encryption to protect its data and the data of consumers.


The internet community won that battle back then. There was also the question of whether there should be built into [inaudible 6:48] strong encryption what people thought of as "back doors". There was discussion during the Clinton administration of the Clipper Chip – a chip that would give the government lawful access to encrypted data if strong encryption was, in fact, exported and sold. There was a fight over that, and ultimately the technology community, which was opposed to the idea of weakening strong encryption in that way through the use of a lawful access device, a Clipper Chip or the like, where the government would have all the keys, won that battle, too, in the first round back in the 1990s. It ended firmly in favor of those who wanted to distribute the technology broadly and who were not interested in the government having any form of access to that data, in particular the idea of the government holding the keys or having a back door into this encryption technology.


The modern iteration of this debate is what's known as the "going dark" debate, and the idea that as encryption has become ubiquitous, it's not just used to protect financial transactions and the privacy of users; it can also be used to protect the communications of terrorists, criminals, and the like. And, in fact, we've seen an increasing trend across the intelligence community of bad actors using encryption, the most alarming of which is the effort of Al-Qaeda, and in particular ISIS, to recruit Americans in the United States and Europeans in Europe to work for them. As soon as they begin the recruitment process, and after they've successfully accomplished it, they move to encrypted channels, like WhatsApp, Signal, and the like, in order to continue communicating. What happens then is that the FBI and the intelligence community lose insight into those communications, even if they had it early on.


And so one of the conversations that's been taking place up on Capitol Hill and the like is do we need to have lawful access to that type of data? Even if the government doesn't hold encryption keys or doesn't have a back door to strong encryption, is there a way to find a way to obtain lawful access when either a foreign intelligence surveillance order or a warrant is issued by a federal court?


And the clearest example of that, and I'll leave off here, Judge, is the example of the San Bernardino shooter, where that individual had an iPhone. It was actually his work iPhone, owned by San Bernardino County. The government obtained it after he and his wife had murdered 14 people and injured 22 others. They went to Apple and said, "Please open up this iPhone. We want to know what's on it. We want to know if there are other people planning attacks or if there was more data about who he was involved with." And Apple said, "Look, after the Snowden disclosures, we implemented strong encryption at rest on our iPhones, and in fact, we threw away the key. We don't have access to it. We can't give you access to it." The FBI sought to have them build some technology to get in. That never ended up going to a final decision because ultimately the FBI was able to obtain access through another method.


But that raised the question, very clearly, of what happens when we're building strong encryption into the product, which is a good thing from a consumer privacy perspective and for the protection of our financial data. But at the same time, what happens if the government has a court order, or in this case the consent of the owner, but can't get onto the device, and there may be terrorist attacks [inaudible 9:40]? So that's the background of this issue. Judge, back to you.


Hon. Michele Christiansen Forster:  Thank you so much. And I'll turn it over to Mr. Kern now. And maybe, Mr. Kern, you can explain a little bit more about what end-to-end encryption is. Is that different than normal encryption that we find on our iPhone, and how is that different than web-based communications that are private or software that anonymizes an individual's communications? Maybe you can give us a little background on that.


Kenn Kern:  Be happy to, Judge. Thank you so much for organizing the forum; it's a pleasure to be a part of it. From a definitional standpoint, let me start with end-to-end encryption and what that means. It essentially comes down to this: only the communicating users, the sender and the recipient, can read the messages. This protects the communication from being accessed in transit, even by the communication provider, so that's Verizon or AT&T. Many messaging applications have this feature, such as WhatsApp and Signal. Some of these applications have enabled it by default, while others give users the option to enable end-to-end encryption. And that's currently where Facebook Messenger is.
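[As a rough mental model of the end-to-end property described here, the following toy sketch may help. It is purely illustrative and not real cryptography (the hash-based XOR stream and all names are the editor's assumptions); the point is only that the carrier relaying the message sees ciphertext, while only the endpoints holding the shared key can read it.]

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key || nonce || counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple:
    # A fresh nonce per message keeps keystreams from repeating.
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, keystream(key, nonce, len(ciphertext))))

# Sender and recipient share a key; the carrier relaying (nonce, ct) does not.
shared_key = secrets.token_bytes(32)
nonce, ct = encrypt(shared_key, b"meet at noon")
assert decrypt(shared_key, nonce, ct) == b"meet at noon"
assert ct != b"meet at noon"  # the relay sees only ciphertext
```

[In real messaging apps, the shared key is negotiated with an authenticated key exchange and the cipher is a vetted construction, but the trust boundary is the same: the provider never holds the key.]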


For device encryption, we're talking about data at rest: data that's stored on a device in a manner such that a password, or a fingerprint, or an eye scan is required in order to decrypt it. Often we hear about metadata and what that stands for, and I'll just share that it's really data about data: things like times and senders.
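[To make the data-at-rest idea concrete, here is a minimal, illustrative sketch of how a passcode can be stretched into a disk-encryption key using PBKDF2 from Python's standard library. The specific passcode, salt handling, and iteration count are assumptions for illustration; real devices also bind the key to hardware so that guesses cannot be made offsite.]

```python
import hashlib
import secrets

# A short passcode alone is weak, so devices stretch it into an encryption key.
passcode = "483926"
salt = secrets.token_bytes(16)   # stored on the device, unique per device
key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
assert len(key) == 32  # a 256-bit key suitable for full-disk encryption

# Without the passcode, an attacker must brute-force; the iteration count
# (here 200,000) makes each guess deliberately expensive to compute.
wrong = hashlib.pbkdf2_hmac("sha256", b"000000", salt, 200_000)
assert wrong != key
```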


And as Jamil alluded to, the public's awareness of this issue really dates to September 2014, when Apple made a statement that it was changing its operating system. At that time, it decided to embrace full-disk encryption across all of its devices and made clear on its website (this is about eight months after Snowden) that from that point onward, with iOS 8 and later versions, Apple was not going to perform data extractions in response to government search warrants, because it, as the maker of the device, no longer had the ability to get to that data.


And that was really the transformative moment for the data-at-rest issue because, prior to that, law enforcement agents around the world had a protocol in place where they could take a device, go to a magistrate, go to their judge, secure a search warrant, package that, send it out to Cupertino, California, and an extraction would happen at the Apple campus. Then the data would be sent back to the law enforcement agency or whoever had sent for that information.


From September 2014 onward, Apple could no longer do that, because it had chosen to remove itself from that ability by adopting full-disk encryption. So I'll turn it back to you, Judge.


Hon. Michele Christiansen Forster:  All right. Thank you. And both Kenn and Jamil, you mentioned the Apple versus the FBI case in 2015, and maybe Greg, you can give us a little more information about that case and what was really at issue. Is the issue truly access or is it more of a business decision on the part of Apple?


Greg Brower:  Thanks to you and to everybody who's out there listening. A very timely topic, and I appreciate the chance to participate today. And a very, very good question. So let me just maybe make, I won't call it a correction because it's just a semantics thing, and I think we all know, Judge, what you meant, but just to kind of set the table for this discussion. It was not, of course, the FBI versus Apple; it was the people of the United States of America versus Apple. And I say that, again, Judge, not to correct a semantic shortcut on your part, but because this is really not about what the FBI thinks about how this should work; it's a matter of the interests of the people of the United States and what the Constitution requires and doesn't require, frankly. So let me just try to summarize what happened with the Apple litigation back in 2016, because I think there has been a lot of confusion about it since that time.


So, essentially, as was pointed out earlier by Jamil, during the law enforcement investigation of the San Bernardino terrorism incident back in 2015, it was discovered that the shooter had been issued an iPhone, an iPhone 5c, by his employer, the County of San Bernardino. The phone, when it was recovered, was locked, and it was programmed in such a way that all data associated with the phone would be deleted upon 10 failed password attempts. And so, given the way the phone was designed and programmed, if you will, the FBI was not able to access the data, the information on the phone.
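[The auto-erase behavior described here can be modeled in miniature. This toy class, with entirely hypothetical names, tracks consecutive failed passcode attempts and discards the data after ten failures; on a real device the erasure is effectively instant because only the encryption key needs to be destroyed.]

```python
MAX_ATTEMPTS = 10

class Device:
    """Toy model of a phone that erases itself after repeated failed unlocks."""

    def __init__(self, passcode: str, data: str):
        self._passcode = passcode
        self._data = data
        self._failures = 0
        self.wiped = False

    def unlock(self, guess: str):
        if self.wiped:
            return None
        if guess == self._passcode:
            self._failures = 0
            return self._data
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            # Erase the data; in practice, discarding the encryption key suffices.
            self._data = None
            self.wiped = True
        return None
```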


And so after exhaustive efforts by the Bureau failed, it asked Apple for assistance, and Apple declined to help the FBI open the phone. That problem was brought to the lawyers at DOJ. One thing led to another, and the government then asked a court, a magistrate judge in the Central District of California, to issue an order compelling Apple to create software to assist the FBI in accessing the data on the phone. That led to an interesting effort on the part of the government, unprecedented at that time, with respect to the government's use of what's called the All Writs Act, an old -- and when I say old, I mean old -- 1789 statute; this was the first time the All Writs Act had been used for this purpose. That effort by the government was successful. The magistrate judge did issue the requested order, and then Apple challenged the order, pointing out that data security risks could result from compliance with the order, and also arguing that the government, at least in Apple's opinion, had not exhausted all of its efforts, all of its tools, to access the phone without Apple's assistance.


So as Jamil referenced earlier, before the ultimate hearing on this issue, the FBI, as it continued to try by any means possible to access the phone, was able to do so using a third-party vendor. It publicly announced that, and the government then withdrew its request, which essentially mooted the court's order.


Subsequently, in trying to explain how this went down, then-Director Comey stated that the tool that was ultimately used to successfully unlock the phone would only work on that particular model and older models, and moreover, it cost in excess of $1 million for the Bureau to gain access through that third party's tool.


The government's success in accessing the phone did avoid an ultimate decision by the court on whether the All Writs Act was the appropriate vehicle and whether Apple could be compelled. And so that fight was averted. But clearly the problem persists, which is, in part, why we're talking about it today. And I would submit, and I'm sure we'll hear some more from Kenn on this, the problem is only getting worse, getting more serious, as time goes on. We are still in need of a solution, and that's why this issue has not gone away. And I will stop there.


Hon. Michele Christiansen Forster:  Perfect. Thank you so much, Greg, and thank you for the clarification. I did misspeak, and I should know that, given that I used to be an Assistant U.S. Attorney. I should recognize who the prosecutor is. And thank you, Greg, for the segue. It's easy for me to now turn to Kenn and ask -- and maybe, Kenn, you can follow up on the concept that this is not just a federal problem, right? This is a problem that is affecting all of our law enforcement agencies.


Kenn Kern:  That's exactly right. And Greg put it exactly correctly. This is a global law enforcement and intelligence problem. Let me start by focusing on the United States. Now, United States v. Apple received the most attention, and that was a federal criminal investigation. But the vast majority of law enforcement activity in prosecutions, investigations, and exonerations happens at the state and local level. District Attorney Vance oversees the Manhattan District Attorney's Office, one of 62 counties in New York State. So there are thousands of state and local law enforcement offices all dealing with the exact same problem, which is mainly a data-at-rest one, mainly a device-based problem involving Apple or Samsung devices, in which, after a search warrant is secured, these law enforcement offices are no longer able to access the evidence they have been given the green light to pursue following review by a neutral magistrate judge.


Let's just talk a little bit about what the numbers look like. At my office alone since 2014, since that landmark moment, really, when Apple announced its operating system change -- so since 2014 my office, just dealing with iPhones, has had about 2,000 lawfully obtained iPhones that are completely inaccessible to us. And these represent a variety of horrific cases. Ten percent of those are homicides or attempted homicides. About nine percent are sex crimes. About nine percent are violent cases involving assaults and robberies. So when we think about our oath and our mission to secure evidence and determine the necessary path forward, it's a brick in our hands. These phones are essentially bricks. We're no longer able to do anything with them. And I'd really like to point out that, as people know, the law enforcement mission is not focused on getting convictions. It's figuring out the truth.


So when we have taken a look at cases just in our office, what we have learned is that there are many times in which, once we were able to get into a device, whether that was using a third party or some other methodology much later down the road, we learned something important: there are about 17 cases, 17 matters, in which an individual was arrested who should not have been arrested. These are cases of exoneration, cases in which, but for getting into that device, we would not know that the individual was someone who should not have been arrested. And people think, "Well, why wouldn’t the defendant just tell you? Why wouldn't the defense counsel give you the password?" That's because I think there's a certain level of distrust that exists in the criminal justice system.


And so these are very complicated issues, but one thing is clear: my colleagues at the state and local level are awash in devices that are inaccessible to them, and they're trying to figure out a path forward. One interesting path forward is a whole new market that has developed in which third-party vendors, like the one the FBI utilized, have all of a sudden come to law enforcement saying, "Hey, we can try to assist you with this." And we can talk a little bit about some of the concerns that exist because of this new method.


Hon. Michele Christiansen Forster:  I don't know if Professor Jaffer or Mr. Brower, if you want to jump in at this point with some thoughts about that? Also, just a question for the entire panel: aren't there other alternative means -- maybe not to access the data at rest, but other vehicles, other information that could be accessed by law enforcement? Are there alternative means of information gathering?


Prof. Jamil N. Jaffer:  Sure, Judge. Well, you know, one argument that's been made a lot in the public space is that there's a lot of information out there about people. You don't necessarily need access to phone content to find out a lot about what somebody is doing. There are other sources of information, and as a result, as information about individuals and their lives becomes more online and more public, perhaps you need less access to data.


Of course, the response to that is, "That may be true, but somebody who's trying to conceal their activities, like a terrorist or a criminal or a bad actor, is unlikely to have that information out there, and the people who are actually engaged in the activities we're most concerned about are the ones who are going to make the most active effort to keep that stuff away from the prying eyes of law enforcement." And so that is an interesting part of the debate. But that is one way people have suggested for solving the problem: "Well, you have access to lots of other publicly available information or databases like [inaudible 24.42] or the like. You don’t really need access to people's phones and their most private information."


One of the other common conversations happening in this space, or at least in the public space, is the idea that because the iPhone, or your Android device, has so much data on it, because your pocket computer, essentially what would have been a supercomputer back in the 1980s, is now just sitting in your pocket and carries terabytes and terabytes of data, essentially your entire life, the government, even with a lawful court order, shouldn't have access to it because it's just too much information. It's too personal. It's too private.


And, of course, the challenge with that argument is that historically, the way our Framers thought about the Fourth Amendment was not that you couldn't have access to Ben Franklin's underwear drawer where he stored his most private diary, or read his private diary; it was simply that you could if you had probable cause and you went to a judge, a neutral third party, and they granted you an order. And we in the modern era have become more inclined, or at least there's some discussion about becoming more inclined, to think that even when you have that order, there may be a certain level of information or a certain set of private information that the government should never, ever be able to have access to. That is increasingly part of the debate, and an interesting question that, perhaps, we ought to think about.


Kenn Kern:  And this is Kenn. I'll just add a few words on that. Jamil's exactly right. That the criminal marketplace is -- in many ways it reflects where people live in their digital space today. Most of us have one, if not two, devices that we use for our business and our personal life, and that we utilize these for almost all of the transactions that exist. And criminals are literally no different. They use the devices in order to execute their plans and to move forward. So when an individual is engaging in identity theft schemes, they're not necessarily going to a laptop or some other space. They're using their phones because that's their common means of communication.


And so what's interesting about this is the marketplace that has come into a much more prominent space: the third-party vendor community that has now come to law enforcement, openly, and said, "We want to assist you in figuring out ways forward." So if, as a prosecutor, I cannot use the Cloud, or the Cloud only gives me a certain amount of information; if my witness interviews only get me so far; if, in the end, the phone is the central place my evidence points the case to, and a neutral judge has agreed with that path and determination and made her decision to grant authorization, then either I choose to go forward with the third-party vendor community and start to talk with them, or I'm left with my brick that I'm unable to access. These options carry with them really challenging questions for us as a society. Do we want law enforcement to be spending public resources in this manner? Do we want to create a system by which we are no longer directly cooperating and working with Apple, with Apple monitoring the extraction of that data, as it was pre-September 2014?


Instead, it's happening at another company, offsite. A lot of challenges here, a lot to think through. And I have to say there are not a whole lot of conversations happening in legislatures around the world. There are some popping up that I know we'll talk about, but this is a challenge that is not going away, especially in the U.S.


Greg Brower:  Yeah, Judge, if I could -- this is Greg Brower. If I could follow up briefly. I think Jamil and Kenn teed up perfectly what I wanted to emphasize and maybe reiterate. And that is that -- look, I can no longer speak for the FBI or for DOJ, but I can confidently say that the FBI and DOJ both believe that encryption is incredibly important and critical to data security, not just for the private sector but for their own systems as well. So it's not a criticism of data security per se that the government is trying to articulate here. What the government is trying to articulate, and what we should all be mindful of in the context of this debate, is that the advent of this so-called "warrant-proof encryption" capability is a serious problem. And I guess the way I've always looked at this, and I'll try to explain it at least from my perspective, is the following.


We all would agree, I think, that there has never been an absolute right to privacy under our Constitution. That's just something that we as lawyers certainly understand, and I think most individuals, when they think about it, also understand. And in fact, no matter how sacred we hold the privacy of our homes, our cars, and our personal and business records, we generally don't blink at all at daily examples of the government invading that privacy when a judge has ordered it by way of a warrant. It happens every day.


We saw it happen, recently, in a high-profile investigation with a target's home in Florida being the subject of a very aggressive search warrant. And while there's been a lot of debate this week about whether the FBI used too much force -- I'll leave that debate aside for now -- nobody really questions the fact that the government, again, with a warrant, can break somebody's door down if necessary, take all their stuff, and even arrest that person before they've been convicted of anything.


And so we have to step back and remember that we've all agreed that the Constitution doesn't ensure that that's not going to happen when a judge has said it should happen. So in my view, it's like saying that a home that doesn't have a door or a lock is fair game for a government search, but a home that has a really hard-to-breach security perimeter is not okay. And I think that's largely what this debate centers around: the idea that the government shouldn't have access if gaining access is hard or requires assistance from some third party.


Now, one more observation and then I'll stop. I get -- I've been part of plenty of these debates and so I understand that what industry will say is that, "Well, it's not just about accessing that one phone in the context of one warrant; it's about the idea that a vulnerability, or a so-called backdoor, is created that will, yes, allow that one warrant to be effectuated, but will then result in systemic security vulnerabilities that create bigger problems for individual privacy and for the industry." I think we're all very, very sensitive to that. I guess what I would say, and what I know DOJ would say, is there's got to be a way to resolve that difference. There's got to be a way to both allow lawful warrants to be executed successfully and to allow industry and individuals to have sound security with respect to their privacy. There's got to be a way to compromise those two things.


Hon. Michele Christiansen Forster:  Thank you for those comments. And I will just throw it out to the panel, does finding that solution require some sort of legislative solution? Or is this really focused on industry and the courts?


Greg Brower:  Let me start briefly, and then I certainly want to hear what everybody else has to say. I believe strongly, and I would submit that DOJ agrees, that this has to be a legislative solution because it has to be the people who decide what the solution is. Jim Comey said all the time, "Look, I'm here to tell you, Congress, and the American people, that we have a problem. But how we solve the problem is entirely up to you – the people, through their representatives in Congress." If, with full knowledge of the seriousness of the law enforcement problem that this encryption challenge presents, the people and the Congress decide on balance that nothing should change, then that is the people's decision. But it was always Jim Comey's position that "I'm not going to allow the status quo to remain simply because the people don’t understand the law enforcement perspective."


And so it has to be, in my view, a legislative solution. There have been attempts in the past. There was the so-called Burr-Feinstein Bill from a couple of years ago that didn't really go anywhere. There's been talk in 2018 by Senator Feinstein again, and also some discussion initiated by Senator Grassley who's interested. And so there doesn't seem to be a legislative solution that's close to becoming reality, but I would submit that that's absolutely where the debate has to be and only Congress can solve this.


Kenn Kern:  This is Kenn. I couldn't agree with Greg more, despite the fact that we have 50 states that are all grappling with this issue through state-by-state approaches. There's no logical way to do that. This has to be a national, legislative solution. And this is a complex issue. This is not an easy one. So it's a challenge for a national legislative body to try to corral both law enforcement and its equities and the tech sector and try to find a path forward. It's very complex.


And if you look globally, we're starting to see some legislation that is popping up, like in Australia. That is a country that now has a legislative framework that the tech sector is not happy with, but that seeks to address this conundrum of inaccessibility and warrant-proof devices, and data in motion that is inaccessible to our intelligence community in an age of all the challenges associated with terrorism.


So it's got to be national. Ideally, it would be some form of international as well, but national is the minimum.


Hon. Michele Christiansen Forster:  And Professor Jaffer, do you have any further thoughts on that?


Prof. Jamil N. Jaffer:  I mean, I think that what everybody has said is exactly right. The real challenge in this space comes when you have these fundamental disconnects between -- sort of arguments on one side and the other about what privacy means in the modern era. So if you're going to get to a reasonable outcome, I think everyone's right that the legislative process is the only solution.


The concern with the legislative process, of course, is the legislative process doesn't act until there's an immediate need. It's hard to get the consensus on these very difficult questions until there's a real crisis. And when you're legislating in light of a crisis, you tend to get very bad outcomes. And so I have to tell my friends in the privacy and civil liberties community if you don't like the Patriot Act and you're concerned about the liberty implications of the Patriot Act, well, then you ought to try and solve this encryption problem now because you know it's an issue. You know it's going to be an issue. You don't want to wait until there's a kidnapped child, a compelling case, or a mass terror attack, or the like because then the outcome, both from a technology perspective and the outcome from a privacy and civil liberties perspective, is going to be worse than if you made the deal now.


The flip side is that nobody's inclined to make a deal right now because, having won Crypto Wars 1.0, the privacy and civil liberties community says, "Well, we're going to win it again, and right now we're winning because there's no legislation, and there's no mandate, and Apple's iPhone situation got resolved without the government getting some lawful method of access." You might also ask yourself in that circumstance, though, "Was the outcome a good one?" So, yes, the government got its access to the data, and Apple didn't have to help them. And so Apple got to feel like it was doing right by its customers. Its customers still felt protected, and the FBI got the data it needed.


But, of course, the fact is that the FBI now had a hack, from a foreign company apparently, by the way, that it can use to get into anybody's iPhone without resort to a court, without resort to Apple. And now there's a hack for that iPhone—admittedly an old version, an old iOS version with the old phone and the old version of the software—but a known vulnerability, at least known to one company and the FBI, and maybe some other actors that that company sold it to. Apple has no insight into it. The government's unlikely to give Apple access, having been given the stiff arm by Apple when it asked for their help.


And so those Apple users, at least of that phone and that software, are worse off because there's a known hack that Apple can't fix because it doesn't know what it is. And so the incentives almost become perverse and odd, and so you have the situation where there's no incentive to move from the Legislature. There's no incentive to move from the companies. And the people and interests that are worse off are the individuals, and our privacy, and our civil liberties. And so you're sort of left scratching your head saying, "Well, they were trying to protect privacy, and they actually made it worse."


Kenn Kern:  And, Judge, if I could just add something to that. In that current structure, in which you not only have the known vulnerability but now have a version in the marketplace, there are entities throughout the country that are able to find the financial resources to get into a device to further their mission of pursuing justice as it relates to victims of crime. And there are other counties that will never have the resources to be able to pursue that.


And so you now have a structure in which there are haves and have nots as it relates to digital evidence. And while the price of getting into devices has certainly fallen from the over $1 million price tag, we're still talking about a disparity that is, I think, something that obviously in the criminal justice space, everyone is working in good faith trying to reduce the amount of disparities as it relates to justice. And here we have a new, technological disparity.


Hon. Michele Christiansen Forster:  It is the quandary. I think it's probably time to start wrapping up. I want to leave some time if there are any questions or comments. I think the last question I have for each of you is: is there a solution in the sense of -- we'll have to brainstorm, I guess, and come together, but is something like storage of an encryption key with a third party a viable solution to this debate?


Prof. Jamil N. Jaffer:  Well, you know, one thing that people have suggested that hasn't gotten a lot of traction as yet but I think may be a viable middle ground is you can imagine a world in which there are encryption keys, and they're stored with multiple third parties, and they're distributed. So you can imagine a world in which those keys are only for particular sessions. They're very individualized. They're very specific to a set of communications and a specific set of communicants, and they're distributed among multiple providers. And the keys themselves are re-encrypted.


And so they do have a scenario where if the government gets a court order, they have to take that court order and go to one or two or three providers -- it could be a system where the keys are split among 10 providers, but you only need access to three providers in order to get the key to open it. And the keys themselves are encrypted so they're secure, and they're only applied to a small set of communications. And so you have to get multiple keys for multiple communications. That'll make law enforcement's life harder, certainly, in the sense that you'd have to repeat this process over and over again in order to get a bunch of communications of the particular individual, but it's highly privacy protected.
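The arrangement Professor Jaffer describes—a per-session key split among 10 providers, any 3 of which suffice to reconstruct it—matches the structure of a threshold (k-of-n) secret-sharing scheme such as Shamir's. A minimal, illustrative sketch in Python (the parameters and names here are hypothetical, chosen to mirror the 10-provider, 3-threshold example in the discussion; a real deployment would also encrypt each share, as he notes):

```python
import random

# A large prime defining the finite field GF(PRIME); all arithmetic is mod PRIME.
PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 128-bit session key

def split_secret(secret, n, k):
    """Split `secret` into n shares; any k of them reconstruct it."""
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):  # evaluate the polynomial via Horner's method
            y = (y * x + c) % PRIME
        shares.append((x, y))      # provider x holds the point (x, y)
    return shares

def recover_secret(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse of den (Fermat).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# A fresh key for one specific set of communications, split across 10 providers.
session_key = random.randrange(PRIME)
shares = split_secret(session_key, n=10, k=3)
assert recover_secret(shares[:3]) == session_key  # any 3 providers suffice
```

Because fewer than k shares reveal nothing about the key, no single provider—or even two—can open the communications alone, which is what lets the scheme pair court-ordered access with distributed trust.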


There is, to be sure, some limited intrusion, some limited diminishing of, quote/unquote, "pure security" or sort of completely perfect security, but the diminution is minimal, and it's cabined, and it's subject to getting court orders, and then going to a third party who will verify the court order is accurate.


So there are ways you can imagine where, yes, there will be some diminution in perfect security—which doesn't exist, by the way—but is still highly privacy protected and gives government the ability to have lawful access. So there are ways to imagine it. The problem is today nobody really wants to even go down that road because, as I mentioned earlier, the privacy and civil liberties community views itself as having won the debate and as continuing to win the debate. It's an ephemeral win, though, because the day the balloon goes up and there's a major terrorist attack, or a kidnapped child, or the like, in a particularly compelling case, it all goes out the door and we end up worse off from a technology and privacy perspective.


Kenn Kern:  This is Kenn. I'll just add that there are technologists who have been proposing ideas and trying to find a middle ground. Many have been working with the National Academy of Sciences, which has produced reports on this. Many of these suggestions, I think from our vantage point, have simply been generally rejected by the tech community. And the conundrum is that each side is going to have to give some; otherwise, we're left in this posture where law enforcement and intelligence agencies are not able to execute what a magistrate has given them the authority to do. We're increasing public risk. We're simply sitting on our hands looking at each other. And as was pointed out, what's likely going to happen is there has to be a terrible event which draws the world back into this picture, which is the last thing that anybody wants.


So we remain hopeful, and I think law enforcement is not in any way against encryption. We are all open to the possibility of finding a path forward. And we'll see how it goes in 2019.


Greg Brower:  Judge, I would simply and quickly just double down on what Jamil and Kenn both said. I think that my discussions with tech folks, back both when I was in government and since then, tell me that there is a potential solution if all parties are willing to work together on this. It's not a technologically impossible problem. I have to just emphasize what Jamil has said a couple of times, and that is that I think that the technology company stakeholders would be well advised to put aside their pre-conceived biases against showing any signs of so-called weakness on this issue, and instead come to the table, quietly if necessary, to work on a solution so as to avoid the potential for a more draconian solution forced by some crisis down the road. I am bullish on the potential for a solution that really is workable.


Hon. Michele Christiansen Forster:  Very good. I appreciate all of your thoughts and the dimension and the intelligence that you bring to this debate. And I think with that, Micah, I think we're ready to go to questions or comments if there are any.


Micah Wallen:  Sounds wonderful. Thank you so much. Some questions are already lined up, so without further ado, we'll now go to our first question.


Caller 1:  First, I want to push back on a couple things, then I have a short question. And I'm not normally very nitpicky, but Mr. Brower, thank you for your service, but it sort of rubbed me the wrong way, your whole correction about the United States of America v. Apple. That's not even the actual name of the case. The name of the case was In the Matter of the Search of an Apple iPhone Seized During the Execution of a Search Warrant etc. and so forth, the normal sort of details you need for an asset-based litigation. So I don't normally push back on that, but your initial nitpick kind of rubbed me the wrong way.


Along the same lines, the $1 million figure is extremely misleading, considering it was a project that had been in development long before the start of that case. It was a vendor through a Remote Operations Unit of the FBI that had already been developing this solution, and there had not been a lot of communication between them and the Cryptology and Electronics Analysis Unit. That was one of the reasons they did not know about the solution until later on and had to withdraw that court order.


I guess my main question is: are the only two options really to either have a backdoor that the government, or some sort of third party holding the keys, has access to, or a company developing software in the moment like the FBI wanted Apple to do? Are those really the only two solutions: a backdoor or companies making software on the basis of court orders when necessary? Or does anyone see any other types of solutions in their own experience?


Greg Brower:  This is Greg Brower. Let me address that first -- so to answer your last question first, no, I don't think those are the only two options. And I would, perhaps, defer to others to discuss the details of other options that are increasingly being talked about publicly.


But let me just go back to your first comment first, and I don't want to continue the nitpicking. But let's all be clear about this: when the Department of Justice is litigating in federal court, it is litigating on behalf of the United States of America and its people. You might talk about the real party in interest being the FBI in a particular matter like this, but it's really the Department of Justice on behalf of all of us, who are trying to vindicate the interests of all of us in a litigation like that. So let me just be clear on what I meant by that. But I would defer to others on the tech issues.


Hon. Michele Christiansen Forster:  Kenn or Jamil, do you have any further thoughts on that?


Prof. Jamil N. Jaffer:  I mean, yeah, it's certainly not the only set of options. I mean, there's dozens of potential ways in which companies might work with the government to come up with a methodology that allows them to have the ability to access the information ahead of time. It doesn't require a backdoor. It doesn't require software made in the moment. It does require, though, a willingness of companies, and/or the Legislature, and/or somebody to make it happen because right now nobody's doing it. Right now, in fact, the trend is the opposite direction: to seal off as many things as possible to ensure -- to protect privacy and security. And that's a good thing in a sense that enhanced privacy and security are good things, but they're not the only equity involved, right? And so one might expect that we are okay with some slight diminution in privacy or security in order to effectuate the traditional needs of law enforcement and/or the national security community.


And by the way, that is the deal the Fourth Amendment struck. When our Framers wrote the Fourth Amendment, it wasn't protect privacy in all circumstances. It wasn't protect security in all circumstances. The compromise was you go to a federal judge, show probable cause with particularity, you get an order, and then you get access. Today we seem to believe that that's not the right balance. The right balance is privacy only. And the government can go pout, and if that means we lose some amount of security or some amount of law enforcement authorities, well, then that's just too bad because my privacy and my Apple iPhone and my device in my pocket, my laptop, matters more than the larger issues that the government and law enforcement are interested in. And that's, I think, one of the challenges in these debates.


Kenn Kern:  Yeah, this is Kenn. I don't have anything else to add, other than I appreciate the questioners second question, and I would direct him to the National Academy of Science reports for some specifics on it.


Micah Wallen:  All right. Let's go to our next question. And for the call, we have three questions left in the queue. I believe that's all we will have time to get to. But for the three of you, just try and keep your question as brief as possible so that we can try and get to everyone.


Ken Cuccinelli:  Well, thank you all for the very interesting discussion. This is Ken Cuccinelli. I used to be the Attorney General in Virginia, so I've been on the law enforcement side on this. But I'm also a civil libertarian and wanted to push back on one thing and then ask you a forward-looking question. One of you mentioned this has to be national, not state by state. I'm a little shocked that I'm hearing that on a call set up by The Federalist Society. That Congress should tell the Fairfax County Police or the Virginia State Police what their rules of engagement are for their state is very shocking for me to hear here. I think you should seriously reconsider that.


But second of all, since the Katz ruling—and this is where my question lies—the Katz concurrence—it wasn't even a decision as you know that moved us effectively to this subjective measure of a reasonable expectation of privacy—combined with what I think may accurately be argued to be an expectation that we do have complete, 100 percent privacy. Now, that's not everybody, but I do think millennials that have grown up with the cell phone as the 207th bone in their body and that kind of thing, that subjective basis for deciding privacy seems to get very much in the way here. If we abandon that, I think we could return, obviously, to an objective measure and get to the trespass approach where it is subject to a legislative solution. But if the Fourth Amendment is going to continue to incorporate subjective expectations of privacy, then I don't know -- I think that's major handcuffing of the potential to even solve the problem you all put forth. And I'd like your thoughts on that. Does the Court have a role in backing off to allow the Legislature to fill the space?


Prof. Jamil N. Jaffer:  Okay, that's a great question, and you raise two really important points. This is Jamil. Number one, I think you're exactly right that Katz is at the heart of this problem – this whole subjective/objective analysis of the right -- the alleged right to privacy that people find in the various parts of the Constitution. But you're also right to say that it doesn't have to be a federal decision. We have, typically, at the federal level looked at these questions when it comes to surveillance laws, as you know, Title III, ECPA, all those things are federal law.


But nobody's trying to force -- nobody's trying to box states in. I think the states are certainly able to, and should come up and be innovators in the laboratories for these experiments. The problem is there's no consensus anywhere in any legislature right now to act, to protect law enforcement or national security interests. In fact, as we said, there is a trend. It's in the opposite direction in some of the states like California where you see increasing efforts to implement privacy rights like the European GDPR into state statute. So I think that's one piece of it.


But to come back to your other question about what the Court's role is, you're right. The Court could, and perhaps should, step back. You see in Justice Gorsuch's concurrence in the most recent case, Carpenter, some discussion of how you might think about some of these rights in the context of -- as a property right. You saw some of that in the return to the Olmstead case and the majority opinion in Jones. That being said, the trend across the Court has been to retain the Katz framework, even with Justice Alito's concurrence in Jones and the rest of the opinions in Carpenter. Nobody seems ready to abandon that yet. And yet, you're right. There's no real basis in the text of the Constitution for this subjective/objective dance.


Micah Wallen:  And Judge Christiansen, do you think we should try and fit another question in, or do you think it'd be better to wrap up?


Hon. Michele Christiansen Forster:  I think it's probably best to wrap up. We don't want to keep people too long. And this has been an informative and interesting conversation. And hopefully we'll leave people wanting more – more engagement on this issue. So I don't have any closing remarks. I don't know if any of the panelists do.


Kenn Kern:  Other than to say thank you for the opportunity to continue this dialogue with the hope that there is a path forward. And with that, on behalf of my office, we'd be happy to be part of any conversation to that end.


Micah Wallen:  All right. Well, on behalf of The Federalist Society, I want to thank all of our experts for the benefit of their valuable time and expertise today. We welcome listener feedback by email at [email protected]. Thank you all for joining us. We are adjourned.


Operator:  Thank you for listening. We hope you enjoyed this practice group podcast. For materials related to this podcast and other Federalist Society multimedia, please visit The Federalist Society's website at fedsoc.org/multimedia.