Zen and the Art of Cybersecurity


In the hothouse of Congress, members have been sweating over the need to do something – anything – about “cybersecurity.” They were under pressure from the administration, the intelligence services, and the tech industry. But the latest news is that the Republican majority will be turning, in the few days left before the recess, from the contentious highways bill to a bill to defund Planned Parenthood, likely pushing the previously catastrophically urgent cybersecurity crisis off to the fall. So Congress, like my seven-year-olds in school assembly, can take a few deep breaths and imagine that they can smell a flower.

The truth is, there never was a “cybersecurity crisis.” Companies are already legally allowed to share information on hacking attempts with the government, and they usually do. This debate is not really about making US companies or the US government more secure; it’s about putting more of your information – information you have voluntarily shared with US companies – into the government’s hands, without those companies being liable, under their own privacy policies, for sharing personally identifiable information. All proposals on the table in Congress would immunize companies from suit in this way. In this sense, it would be perfectly all right for Congress to do nothing.

Nevertheless, there is a cybersecurity problem that is worth trying to solve. The government is not a good custodian of our data. Its networks are often poorly secured and vulnerable to outside intrusion. In the surveillance arena, there are now over five million people with security clearances, who are in a position to leak sensitive information. Cultivating a more disciplined approach to network protection and data retention would seem to be a good idea. That’s where the principle above comes in.

In this spirit, let’s calmly reflect on what a bill dealing with this real problem would look like.

The first-order problem with cybersecurity relates to the NSA and FBI’s efforts to systematically subvert the encryption underlying inter-computer communications. Software vulnerabilities are continually discovered and, in an open-source world, experts would compete to promptly disclose and resolve them. In our world, by contrast, the NSA discovers, purchases and hoards “zero-day vulnerabilities”, and has no obligation to disclose them. Though other surveillance agencies of course likely do the same, nothing compares to the budget or worldwide reach of our homeland surveillance agencies (Go USA!).

It would therefore seem wise for the NSA to be required to responsibly disclose vulnerabilities that could adversely affect American cybersecurity within 30 days of acquisition. This would mean that if they discover or purchase a zero-day vulnerability, they would have to report it to the product vendor within 30 days and to the public 90 days after that. In practice, the NSA would still have between 30 and 120 days to exploit the vulnerability, and often longer, since people tend to delay patching their systems. Vulnerabilities found in commercial off-the-shelf hardware, software or firmware should be reported immediately. Vulnerabilities discovered in the cryptographic systems of foreign powers, or agents thereof, with whom we are not at war, should likewise be disclosed within 90 days; vulnerabilities in the systems of foreign powers with whom we are at war should be disclosed only at war’s end.
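To make the arithmetic concrete, here is a minimal sketch of how such a disclosure clock would run, with the 30-day vendor window and 90-day public window from the proposal above. The function and variable names are my own illustration, not anything from an actual bill.

```python
from datetime import date, timedelta

# Windows from the proposal sketched above: 30 days to notify the
# vendor, then 90 more before public disclosure. Purely illustrative.
VENDOR_WINDOW = timedelta(days=30)
PUBLIC_WINDOW = timedelta(days=90)

def disclosure_deadlines(acquired):
    """Return (vendor_deadline, public_deadline) for a vulnerability
    acquired on the given date."""
    vendor_deadline = acquired + VENDOR_WINDOW
    public_deadline = vendor_deadline + PUBLIC_WINDOW
    return vendor_deadline, public_deadline

vendor, public = disclosure_deadlines(date(2015, 8, 1))
print(vendor)  # 2015-08-31
print(public)  # 2015-11-29
```

The exploitation window the agency retains is everything between acquisition and the public deadline – at most 120 days on paper, longer in practice wherever patching lags.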

The NSA itself, and the other intelligence and military agencies, are not themselves secure. It may seem strange for an activist who deeply supports the Snowden leaks to consider this to be a problem, but one of the most astounding elements of that leak was that the NSA had voluntarily opened itself up to contractors like Snowden – employed not directly by the NSA but by the private firm Booz Allen Hamilton – and was using them as sysadmins, with access to all the administrative authorities sysadmins need to do their jobs. This was such an egregious violation of good security that just for that – independently of their violations of the Constitution – the NSA deserves every scrap of reputational damage it has suffered. It should be obvious by now to the agencies and the military that contractors should be legally barred from administrative access to their networks; that the number of people with security clearances should be sharply reduced; and that the volume of classified information should be reduced to a level that the total number of actual human people with security clearances is capable, in practice, of analyzing.

The second-order problem is that the information security practices of other government networks – as demonstrated by the OPM hack – are also embarrassingly weak. All federal agencies should adopt multi-factor authentication for access to all computer systems and networks. They should work on ways to encrypt the data they hold, both at rest and in transit, and should train federal employees in good information security practices.
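One of those recommendations – multi-factor authentication – is concrete enough to sketch. The time-based one-time password scheme of RFC 6238 (the one behind most authenticator apps) is a common second factor, and it fits in a few lines of standard-library Python. The base32 secret below is the RFC's published test key, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Derive the current time-based one-time password (RFC 6238)
    from a shared base32-encoded secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of 30-second steps since the Unix epoch.
    now = time.time() if for_time is None else for_time
    counter = int(now // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC test-vector secret ("12345678901234567890" in base32):
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, for_time=59))  # 287082, matching the RFC test vector
```

The point is not that agencies should roll their own MFA – they shouldn't – but that the mechanism is cheap, standardized, and has no excuse for being absent from federal systems.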

At the same time, there needs to be more openness to security research. Federal agencies should invite hackers to participate in “bug bounty” programs and test their systems, hard, without fear of prosecution. The CFAA should be tightened so as to require actual fraud or damage for a CFAA conviction. There should be grants for code review to open source projects that comprise critical security infrastructure.

On the data collection side, the principle in the picture above is a good start (let’s call it the Meinrath Principle, because it was told to me by Sascha Meinrath, director of X-Labs). Certain types of data collection are, indeed, required by the US Constitution, like the decennial census. But infinitely more data is gathered without thought or care, and retained for no good or clear reason. If the government retains that data, and has it lying around, it becomes irresistible to argue that, say, SEC and IRS data should be integrated, or EPA and mine safety data, or FBI and CIA and NSA data; why not? There needs to be thought, followed by enforceable rules, about what kinds of data integration are acceptable, and what kinds are not. Whether, say, the FTC is well equipped to do that right now is highly uncertain, but the thoughtful practice of data courtesy can’t simply be left to the private data skills of individual federal employees. Data collection initiatives should have to justify, to some ombudsman, why they need to collect specific information and how long, at minimum, it must be retained to serve the purpose for which it is collected, and that entity should conduct regular audits of compliance with those protocols.
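The ombudsman process described above can be sketched as a data structure: every collection program files a purpose and a minimal retention period up front, and a periodic audit flags holdings past their limit. Everything here – agency names, fields, dates – is a hypothetical illustration, not any agency's actual schema.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class CollectionProgram:
    agency: str
    purpose: str          # justification filed with the ombudsman
    collected_on: date
    retention: timedelta  # minimal time needed to serve the purpose

    def overdue(self, today):
        """True once the data is held past its stated retention limit."""
        return today > self.collected_on + self.retention

def audit(programs, today):
    """Return programs holding data past their stated retention limit."""
    return [p for p in programs if p.overdue(today)]

# Made-up example programs for illustration only.
programs = [
    CollectionProgram("EPA", "air-quality monitoring",
                      date(2015, 1, 1), timedelta(days=365)),
    CollectionProgram("IRS", "annual filing processing",
                      date(2015, 4, 15), timedelta(days=90)),
]
for p in audit(programs, date(2015, 8, 1)):
    print(p.agency, "past retention limit")  # flags only the IRS program
```

The design choice worth noticing is that the retention limit is declared at collection time, not negotiated afterward – which is exactly what makes the later "why not integrate it all?" argument resistible.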

This is the kind of discussion we should be having over “cybersecurity”. But we’re not: instead, it’s all “OMG WE’RE GETTING PWNED BY CHINA EVERYBODY PANIC.” Breathe, people. We have, not forever, but time.
