An illustration picture shows a projection of text on the face of a woman in Berlin, June 12, 2013.
Pawel Kopczynski / Courtesy Reuters

There is a hilarious scene in the 1991 movie “L.A. Story” in which the protagonist, Harris (Steve Martin, who also wrote the screenplay), is mugged on a busy street as he withdraws cash from an ATM. The cavalier crook happens to be next in an adjacent line of robbers, and he introduces himself as “your robber today”; Harris hands over some of the freshly delivered bills, as if paying a child an allowance. The exchange is as casual as one between strangers swapping business cards in a coffee shop. Martin could not possibly have anticipated in 1991 how that scene would play out two decades later, with the ubiquitous ATM replaced by an electronic infrastructure that delivers not just cash but information more valuable than money. To a large and disturbing extent, we’re all Harris now.

A similarly surreal scene unfolded recently during the stage-play of congressional outrage about the alleged exposure of electronic patient health records at the Department of Veterans Affairs (VA), which was triggered by the public testimony of a former official who claimed that specific nation-state hackers had had “unchallenged and unfettered access” to VA systems. The problem is not that there are newly discovered security holes in the information infrastructure at VA. Rather, it is that Congress, acting as Harris, has effectively conceded the high ground.

In the specific case of VA, there is no evidence that any patient record was exfiltrated. Of course, it is conceivable that a foreign agent made a digital copy of sensitive health information and secreted it out through the network undetected. If that happened -- and at this point there is only speculation and accusation -- it would represent a serious and scaled breach of personal data at the nation’s largest integrated hospital network. The root issue is that network operators and information officers can almost never prove that a data file was not electronically pilfered. The old aphorism “absence of evidence is not evidence of absence” is apt, though “no evidence” is typically and frustratingly the best that experts can claim. Under the spotlights, their behavior may appear to be evasive. But their serious, factual approach -- devoid of excitement and drenched in technical detail -- is exactly the right way to digest the problem.

Too often, Americans confront the inherent asymmetry of cybercrime by unrealistically expecting their network defense mechanisms to protect them against every known or possible attack: every virus, every type of malicious software, every persistent threat. Keeping track of what is already known to exist in the network wilds is itself a major and expensive undertaking. Sharing that information with operators is typically manual, on a case-by-case or threat-by-threat basis. The difference between safety and disaster can be a human operator at a keyboard. And once an infection is discovered, the amount of work it takes to rid a device of suspect code is large, and the outcome uncertain.

So should everyone just throw their hands up and look forward to the next theater of the absurd? Hardly. There are plenty of steps that Americans can collectively take without Congress enacting new cybersecurity laws, and some new interventions that they should consider if their concern about national security and patient safety is serious and lasting.


Americans have to accept as canonical fact that U.S. networks are breached. Sure, there may be exceptions to that rule -- some closed circuits that are tightly monitored and deeply inspected for literally every bit that goes in and out. But the design criteria, never mind the policy response, should be based on the assumption of tolerance, not hygiene. Our bodies have evolved to cope with foreign invaders; nature assumes that they are going to come. Some we can live with, but others are more lethal. So too should the United States learn to isolate different cybersecurity problems and focus on what matters and what is feasible, such as protecting data.

Attackers enjoy a permanent strategic advantage. Although a network’s immune system has to remember every attack and vulnerability that has ever been uncovered, adversaries have the much easier job of poking around for new weaknesses in an ever-increasing IT infrastructure. Moreover, the few tools that exist today do not guarantee perfect protection from every conceivable intrusion. In the real world, a security architect’s goal is not “no failures” but “no undetected failures,” including, especially, the exfiltration of data.

The white hats are also operationally handicapped by the inevitable lag between the discovery of a cyber contagion and the deployment of the software cure, or “patch.” Many companies are diligent and responsive to intrusions, and make patches available quickly for their systems and applications -- sometimes on the same day. But installing the patch in a production machine can be like changing the tires on a moving car. In the best case, it is a delicate operation -- and only possible if you plan ahead. Alas, preparing for predictable trouble is not something that the United States is especially good at, and it probably does require, literally, an act of Congress.


The first of the two most important lessons from the national dialogue about data and network security is the need to rethink personal accountability in the federal workforce. There are safety-of-operation issues that require quick, decisive action by senior officials, and they should be empowered to take it. The second is the dire need for procurement reform that emphasizes openly architected, standards-based, and modular IT solutions instead of the public sector’s naive penchant for closed, tightly integrated, and custom solutions that only a couple of vendors can service and secure. Surprisingly, open source and open standards are not just more economical; they are also more secure.

The bad news is that the cybersecurity solutions won’t be flashed on a freeway sign, as they were for Harris in “L.A. Story.” The good news is that they don’t need to be. We already know what to do.

DANIEL E. GEER, JR. is the chief information security officer for In-Q-Tel. PETER L. LEVIN was recently the chief technology officer at the Department of Veterans Affairs, and is now president and CEO of Amida Technology Solutions.