Oversight Hearing on FISA Surveillance Programs

Committee on the Judiciary

United States Senate

July 31, 2013

Stewart A. Baker*

 

Mr. Chairman, Ranking Member Grassley, members of the Committee, it is an honor to testify before you on such a vitally important topic. The testimony that I give today will reflect my decades of experience in the areas of intelligence, law, and national security. I have practiced national security law as general counsel to the National Security Agency, as general counsel to the Robb-Silberman commission that assessed U.S. intelligence capabilities and failures on weapons of mass destruction, as assistant secretary for policy at the Department of Homeland Security, and in the private practice of law.

To be blunt, one of the reasons I’m here is that I fear we may repeat some of the mistakes we made as a country in the years before September 11, 2001. In those years, a Democratic President serving his second term seemed to inspire deepening suspicion of government and a rebirth of enthusiasm for civil liberties not just on the left but also on the right. The Cato Institute criticized the Clinton administration’s support of warrantless national security searches and expanded government wiretap authority as “dereliction of duty,” saying, “[i]f constitutional report cards were handed out to presidents, Bill Clinton would certainly receive an F—an appalling grade for any president—let alone a former professor of constitutional law.”1 The criticism rubbed off on the FISA Court, whose chief judge felt obliged to give public interviews and speeches defending against the claim that the court was rubber-stamping the Clinton administration’s intercept requests.2

This is where I should insert a joke about the movie “Groundhog Day.” But I don’t feel like joking, because I know how this movie ends. Faced with civil liberties criticism from across the ideological spectrum, the FISA Court imposed aggressive new restrictions on the government’s use of FISA information. As part of its “minimization procedures” for FISA taps, the court required a “wall” between law enforcement and intelligence. And by early 2001, it was enforcing that wall with unprecedented fervor. That was when the court’s chief judge harshly disciplined an FBI supervisor for not strictly observing the wall and demanded an investigation that seemed to put the well-regarded agent at risk of a perjury prosecution. A chorus of civil liberties critics and a determined FISA Court were sending the FBI a single clear message: the wall must be observed at all costs.

And so, when a law enforcement task force of the FBI found out in August of 2001 that al Qaeda had sent two dangerous operatives to the United States, it did . . . nothing. It was told to stand down; it could not go looking for the two al Qaeda operatives because it was on the wrong side of the wall. I believe that FBI task force would have found the hijackers—who weren’t hiding—and that the attacks could have been stopped if not for a combination of bad judgment by the FISA Court (whose minimization rules were later thrown out on appeal) and a climate in which national security concerns were discounted by civil liberties advocates on both sides of the aisle.

I realize that this story is not widely told, not in the mainstream media and not on the Internet, perhaps because it’s not an especially welcome story. But it is true; the parts of my book that describe it are well-grounded in recently declassified government reports.3

More importantly, I lived it. And I never want to live through that particular Groundhog Day again. That’s why I’m here.

I am afraid that hyped and distorted press reports orchestrated by Edward Snowden and his allies may cause us—or other nations—to construct new restraints on our intelligence gathering, restraints that will leave us vulnerable to another security disaster.

I. Intelligence Gathering Under Law

The problem we are discussing today has roots in a uniquely American and fairly recent experiment—writing detailed legal rules to govern the conduct of foreign intelligence. This is new, even for a country that puts great faith in law.

The Americans who fought World War II had a different view; they thought that intelligence couldn’t be conducted under any but the most general legal constraints. This may have been a reaction to a failure of law in the run-up to World War II, when U.S. codebreakers were forbidden to intercept Japan’s coded radio communications because Section 605 of the Federal Communications Act made such intercepts illegal.  Finally, in 1939, Gen. George C. Marshall told Navy intelligence officers to ignore the law.4 The military successes that followed made the officers look like heroes, not felons.

That view held for nearly forty years, but it broke down in the wake of Watergate, when Congress took a close look at the intelligence community, found abuses, and in 1978 adopted the first detailed legal regulation of intelligence gathering in history—the Foreign Intelligence Surveillance Act. No other nation has ever tried to regulate intelligence so publicly and so precisely in law.

Thirty-five years later, though, we’re still finding problems with this experiment. One of them is that law changes slowly while technology changes quickly. That usually means Congress has to change the law frequently to keep up. But in the context of intelligence, it’s often hard to explain why the law needs to be changed, let alone to write meaningful limits on collection without telling our intelligence targets a lot about our collection techniques. A freewheeling and prolonged debate—and does Congress have any other kind?—will give them enough time and knowledge to move their communications away from technologies we’ve mastered and into technologies that thwart us. The result won’t be intelligence under law; it will be law without intelligence.

Much of what we’ve read in the newspapers lately about the NSA and FISA is the product of this tension. Our intelligence capabilities—and our intelligence gaps—are mostly new since 1978, forcing the government, including Congress, to find ways to update the law without revealing how we gather intelligence.

* * *

II. What Next?

Setting aside the half-truths and the hype, what does the current surveillance flap tell us about the fundamental question we’ve faced since 1978—how to gather intelligence under law?

Regulating Technology—What Works and What Doesn’t

First, since American intelligence has always been at its best in using new technologies, intelligence law will always be falling out of date, and the more specific its requirements, the sooner it will be outmoded.

Second, we aren’t good at regulating government uses of technology. That shortcoming is especially risky in the context of intelligence, where the government often pushes the technological envelope. The privacy advocates who tend to dominate the early debates about government and technology suffer from a sort of ideological technophobia, at least as far as government is concerned. Even groups that claim to embrace the future want government to cling to the past. And the laws they help pass reflect that failing.

To take an old example, in the 1970s, well before the personal computer and the Internet, privacy campaigners persuaded the country that the FBI’s newspaper clipping files about U.S. citizens were a threat to privacy. Sure, the information was public, they acknowledged, but gathering it all in one file was viewed as sinister. And maybe it was; it certainly gave J. Edgar Hoover access to embarrassing information that had been long forgotten everywhere else. So in the wake of Watergate, the attorney general banned the practice in the absence of some investigative predicate.

The ban wasn’t reconsidered for twenty-five years. And so, in 2001, when search engines had made it possible for anyone to assemble a clips file about anyone in seconds, the one institution in the country that could not print out the results of its Internet searches about Americans was the FBI. This was bad for our security, and it didn’t protect anyone’s privacy either.

Now we’re hearing calls to regulate how the government uses big data in security and law enforcement investigations. This is about as likely to protect our privacy as reinstating the ban on clips files. We can pass laws turning the federal government into an Amish village, but big data is here to stay, and it will be used by everyone else. Every year, data gets cheaper to collect and cheaper to analyze. You can be sure that corporate America is taking advantage of this remorseless trend. The same is true of the cyberspies in China’s People’s Liberation Army.

If we’re going to protect privacy, we won’t succeed by standing in front of big data shouting “Stop!” Instead, we need to find privacy tools—even big data privacy tools—that take advantage of technological advances. The best way to do that, in my view, was sketched a decade ago by the Markle Foundation Task Force on National Security, which called on the government to use new technologies to better monitor government employees who have access to sensitive information.5 We need systems that audit for data misuse, that flag questionable searches, and that require employees to explain why they are seeking unusual data access. That’s far more likely to provide effective protection against misuse of private data than trying to keep cheap data out of government hands. The federal government has in fact made progress in this area; that’s one reason the minimization and targeting rules could be as detailed as they are. But it clearly needs to do better. A proper system for auditing access to restricted data would not just improve privacy enforcement; it likely would have flagged both Bradley Manning and Edward Snowden for their unusual network browsing habits.
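By way of illustration only, the kind of audit rule I have in mind might look something like the following minimal sketch, which flags accesses that fall outside an employee’s normal pattern or that lack a stated justification. The record fields, names, and threshold here are hypothetical, offered to show the shape of the idea rather than to describe any actual government system.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class AccessEvent:
    user: str           # employee identifier (hypothetical field)
    resource: str       # dataset or document accessed
    justification: str  # free-text reason supplied at access time

def flag_unusual_access(events, baseline_resources, max_new_resources=5):
    """Flag users whose accesses stray from their historical baseline.

    `baseline_resources` maps each user to the set of resources they
    normally touch; accesses outside that set count as "new". A user is
    flagged for human review when a new access has no justification, or
    when the number of new resources exceeds `max_new_resources`.
    """
    new_by_user = defaultdict(set)
    flags = []
    for ev in events:
        if ev.resource not in baseline_resources.get(ev.user, set()):
            new_by_user[ev.user].add(ev.resource)
            if not ev.justification.strip():
                flags.append((ev.user, f"unjustified access to {ev.resource}"))
    for user, resources in new_by_user.items():
        if len(resources) > max_new_resources:
            flags.append((user, f"{len(resources)} resources outside baseline"))
    return flags

if __name__ == "__main__":
    baseline = {"analyst1": {"reports/regionA"}}
    events = [
        AccessEvent("analyst1", "reports/regionA", "quarterly review"),
        AccessEvent("analyst1", "hr/personnel-files", ""),
    ]
    for user, reason in flag_unusual_access(events, baseline):
        print(f"REVIEW: {user} - {reason}")
```

The point of the sketch is simply that the flag goes to a human reviewer; the employee is asked to explain, not automatically punished, which is what makes this kind of auditing a privacy tool rather than another surveillance program.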

* * *

Thirty-five years of trying to write detailed laws for intelligence gathering have revealed just how hard that exercise is—and why so few nations have tried to do it. In closing, let me offer some quick thoughts on two proposals that would “fix” FISA by doubling down on this approach.

One idea is to declassify FISA Court opinions. Another is to appoint outside lawyers with security clearances who can argue against the government. The problem with these proposals is that they’re not likely to persuade the FISA doubters that the law protects their rights. But they are likely to put sources and methods at greater risk.

Declassification of the FISA Court opinions already happens, but only when the opinion can be edited so that the public version does not compromise sources and methods. The problem is that most opinions make law only by applying legal principles to particular facts. In the FISA context, those facts are almost always highly classified, so it’s hard to explain the decision without getting very close to disclosing sources and methods. To see what I mean, I suggest this simple experiment. Let’s ask the proponents of declassification to write an unclassified opinion approving the current Section 215 program—without giving away details about how the program works. I suspect that the result will be at best cryptic; it will do little to inspire public trust but much to spur speculation and risk to sources and methods.

What about appointing counsel in FISA matters? Well, we don’t appoint counsel to protect the rights of Mafia chieftains or drug dealers. Wiretap orders and search warrants aimed at them are reviewed by judges without any advocacy on behalf of the suspect. Why in the world would we offer more protection to al Qaeda?

I understand the argument that appointing counsel will provide a check on the government, whose orders may never see the light of day or be challenged in a criminal prosecution. But the process is already full of such checks. The judges of the FISA Court have cleared law clerks who surely see themselves as counterweights to the government’s lawyers. The government’s lawyers themselves come not from the intelligence community but from a Justice Department office that sees itself as a check on the intelligence community and feels obligated to give the FISA Court facts and arguments that it would not offer in an adversary hearing. There may be a dozen offices that think their job is to act as a check on the intelligence community’s use of FISA: inspectors general, technical compliance officers, general counsel, intelligence community staffers, and more. To that army of second-guessers, are we really going to add yet another lawyer, this time appointed from outside the government?

For starters, we won’t really be appointing a single lawyer. There certainly are outside lawyers with clearances. I’m one. But senior partners don’t work alone, and there are very few nongovernment cite-checkers and associates and typists with clearances. Either we’ll have to let intercept orders sit for months while we try to clear a law firm’s worth of staff—along with their computer systems, BlackBerrys, and filing systems—or we’ll end up creating an office to support the advocates.

And who will fill that office? I’ve been appointed to argue cases, even one in the Supreme Court, and I can attest that deciding what arguments to make has real policy implications. Do you swing for the fences and risk a strikeout, or do you go for a bunt single that counts as a win but might change the law only a little? These are decisions on which most lawyers must consult their clients, or, if they work for governments, their political superiors. But the lawyers we appoint in the FISA Court will have no superiors and effectively no clients.

To update the old saw, a lawyer who represents himself has an ideologue for a client. In questioning the wisdom of special prosecutors, Justice Scalia noted the risk of turning over prosecutorial authority to high-powered private lawyers willing to take a large pay cut and set aside their other work for an indeterminate time just to be able to investigate a particular president or other official. Well, who would want to turn over the secrets of our most sensitive surveillance programs, and the ability to suggest policy for those programs, to high-powered lawyers willing to take a large pay cut and set aside their other work for an indeterminate period just to be able to argue that the programs are unreasonable, overreaching, and unconstitutional?

Neither of these ideas will, in my view, add a jot to public trust in the intelligence gathering process. But they will certainly add much to the risk that intelligence sources and methods will be compromised. For that reason, we should approach them with the greatest caution.

 

Endnotes

1 Timothy Lynch, Dereliction of Duty: The Constitutional Record of President Clinton, Cato Policy Analysis No. 271 (March 31, 1997), http://www.cato.org/pubs/pas/pa-271.html.

2 Hon. Royce C. Lamberth, Presiding Judge of the Foreign Intelligence Surveillance Court, Address Before the American Bar Ass’n Standing Comm. on Law and Nat’l Sec. (April 4, 1997), in 19 American Bar Ass’n Nat’l Sec. L. Rep. 2, May 1997, at 1-2.

3 Stewart Baker, Skating on Stilts 66-69 (2010).

4 David Kahn, The Codebreakers: The Comprehensive History of Secret Communication from Ancient Times to the Internet 12 (2d ed. 1996).

5  The Task Force’s first report called for the federal government to adopt

robust permissioning structures and audit trails that will help enforce appropriate guidelines. These critical elements could employ a wide variety of authentication, certification, verification, and encryption technologies. Role-based permissions can be implemented and verified through the use of certificates, for example, while encryption can be used to protect communications and data transfers. … Auditing tools that track how, when, and by whom information is accessed or used ensure accountability for network users. These two safeguards—permissioning and auditing—will free participants to take initiatives within the parameters of our country’s legal, cultural, and societal norms.

Markle Foundation Task Force, Protecting America’s Freedom in the Information Age 17 (October 2002), http://www.markle.org/sites/default/files/nstf_full.pdf.

 

*Stewart A. Baker is a partner in the Washington office of Steptoe & Johnson LLP.  He was previously the Department of Homeland Security’s first Assistant Secretary for Policy.  His memoir of his time at DHS is entitled Skating on Stilts: Why We Aren’t Stopping Tomorrow’s Terrorism.

This article has been adapted from Mr. Baker’s testimony before the U.S. Senate Judiciary Committee on July 31, 2013. His complete testimony is available at http://www.judiciary.senate.gov/pdf/7-31-13BakerTestimony.pdf.