The latest Snowden document is the US intelligence “black budget.” There’s a lot of information in the few pages the Washington Post decided to publish, including an introduction by Director of National Intelligence James Clapper. In it, he drops a tantalizing hint: “Also, we are investing in groundbreaking cryptanalytic capabilities to defeat adversarial cryptography and exploit internet traffic.”
Honestly, I’m skeptical. Whatever the NSA has up its top-secret sleeves, the mathematics of cryptography will still be the most secure part of any encryption system. I worry a lot more about poorly designed cryptographic products, software bugs, bad passwords, companies that collaborate with the NSA to leak all or part of the keys, and insecure computers and networks. Those are where the real vulnerabilities are, and where the NSA spends the bulk of its efforts.
Last week, President Obama met with the five-member review board he recently appointed to examine the National Security Agency’s (NSA) controversial electronic surveillance program. The board is part of the president’s effort to build confidence in the surveillance program and its respect for privacy rights.
But when Obama speaks about the program, he leaves the impression that its existing privacy protections are sufficient, if only we knew enough to appreciate them. That hardly instills confidence. If the president is serious about fixing the enormous overreach of U.S. surveillance that Edward Snowden helped to highlight, he should take these steps:
First, recognize Fourth Amendment protection for our metadata. More than 30 years ago, in a different technological era, the Supreme Court ruled that, unlike the content of our phone conversations, we have no privacy rights in the numbers we call. The rationale was that we share those numbers with the phone company. The intrusion mattered little at the time because if the police wanted to reconstruct someone’s circle of contacts, they had to undertake the enormously time-consuming process of manually linking phone number to phone number.
I sense a little bit of history repeating, and not the good kind. About ten years ago, I was deeply involved in the field of conflict early warning and response. Eventually, I realized that the systems we were designing and implementing excluded at-risk communities, even though the rhetoric led me to believe they were designed to protect them. The truth is that these information systems were purely extractive and ultimately did little more than fill the pockets of the academics hired as consultants to develop them.
The prevailing belief among these academics was (and still is) that large datasets and advanced quantitative methodologies can predict the escalation of political tensions and thus prevent violence. To be sure, “these systems have been developed in advanced environments where the intention is to gather data so as to predict events in distant places. This leads to a division of labor between those who ‘predict’ and those ‘predicted’ upon” (cited in Meier 2008, PDF).
Those who predict assume their sophisticated remote sensing systems will enable them to forecast and thus prevent impending conflict. Those predicted upon don’t even know these systems exist. The sum result? Conflict early warning systems have failed miserably at forecasting anything, let alone catalyzing preventive action or empowering local communities to get out of harm’s way. Conflict prevention is inherently political, and “political will is not an icon on your computer screen” (Cited in Meier 2013).
Defeat of the insurgency and terrorism in Iraq requires not only a military approach but also a political component. Although the “surge” may stabilize parts of Iraq and reduce the level of violence while the additional troops remain in place, long-term stability requires a more holistic approach.
Frank Kitson, a retired British military officer whose writings influenced British operations in Northern Ireland, argues that the “main characteristic that distinguishes campaigns of insurgency from other forms of war is that they are primarily concerned with the struggle for men’s minds.”[1] To defeat the insurgency, coalition forces must persuade the Iraqi population to reject extremism and deny safe haven to those fighting the new Iraqi political order. This will require dialogue, inducements, and the proportionate use of force to win the battle for “hearts and minds.”
Effective engagement with key segments of the Iraqi population requires, in turn, a comprehensive information operations campaign. To date, it is this component that is most lacking in coalition strategy. The coalition has failed to counter enemy propaganda either by responding rapidly with effective counter messages or by proactively challenging the messages, methods, and ideology that the insurgents and extremists promote and exploit.
While terminology may vary—some officials refer to information operations as strategic communications, influence operations, psychological operations, perception management, or just propaganda—the intent to influence the hearts and minds of target audiences through the effective use of information remains constant.
In Iraq, while the coalition fumbles its information operations, the insurgents and militia groups are adept at releasing timely messages to undermine support for the Iraqi government and bolster their own perceived potency. They are quick to exploit coalition failures and excesses; they respond rapidly to defend their own actions, often by shifting blame to the authorities; and they hijack coalition successes to argue that change only occurs as a result of their violence. The slow speed of the U.S. military’s clearance process—typically it takes three to five days to approve even a simple information operations product such as a leaflet or billboard—creates an information vacuum that Iraqis fill with conspiracy theories and gossip often reflecting the exaggerations or outright lies of insurgents and extremists.
The press stories that follow a document release like this often do not bother to do this. They look, instead, for a key—or the key—fact, around which the news story then develops. In this case, unsurprisingly, the key fact is that the NSA gathered tens of thousands of e-mail communications by Americans before the FISA Court declared its actions unconstitutional. As the Washington Post puts it in its lead:
For several years, the National Security Agency unlawfully gathered tens of thousands of e-mails and other electronic communications between Americans as part of a now-revised collection method, according to a 2011 secret court opinion.
An article on ProPublica titled “NSA Says It Can’t Search Its Own Emails” brings up an interesting glitch in the NSA’s surveillance technology. In spite of having the capability to sort through big data with a supercomputer, when it comes to searching the email of its own more than 30,000 employees, the agency is at a loss. The article explains:
“There’s no central method to search an email at this time with the way our records are set up, unfortunately,” NSA Freedom of Information Act officer Cindy Blacker told me last week. The system is “a little antiquated and archaic,” she added… It’s actually common for large corporations to do bulk searches of their employees’ email as part of internal investigations or legal discovery.
The article also brings up the point that federal agencies often lack the funding they need to handle public records requests. But if any agency should have the capability to keep tabs on its employees, it is the one charged with surveillance of the nation. Lacking that ability limits NSA staff to searching emails one individual at a time instead of searching by keyword or in bulk. This is very interesting in light of recent events; no further comment.
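For contrast, the kind of bulk keyword search the article describes as routine in corporate e-discovery is conceptually simple. A minimal sketch, with the mailbox layout and messages invented purely for illustration:

```python
# Hypothetical sketch of a bulk keyword search across many users'
# mailboxes, the kind of e-discovery query large corporations run
# routinely. All data here is invented for illustration.

def bulk_search(mailboxes, keyword):
    """Return (user, message) pairs whose text contains the keyword."""
    kw = keyword.lower()
    hits = []
    for user, messages in mailboxes.items():
        for msg in messages:
            if kw in msg.lower():
                hits.append((user, msg))
    return hits

# Toy corpus: a dict mapping each user to a list of message bodies.
mailboxes = {
    "alice": ["Quarterly audit scheduled for May.", "Lunch on Friday?"],
    "bob":   ["Please review the audit findings."],
}

for user, msg in bulk_search(mailboxes, "audit"):
    print(user, "->", msg)
```

A real deployment would query an indexed mail archive rather than scan raw text, but the point stands: searching everyone at once for a keyword is a solved problem outside the NSA.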
Tripwire, together with the Ponemon Institute, has released the results of an extensive study on the state of risk-based security management.
The study examined the key risk-based security metrics IT security managers used most frequently to gauge the effectiveness of their organizations’ overall security efforts.
Top metrics included: time taken to patch, policy violations, uninfected endpoints, data breaches, reduction in the cost of security, end-user training, and reduction in unplanned system downtime.
The survey respondents included 749 US and 571 UK professionals in the following areas: IT security, IT operations, IT risk management, business operations, compliance/internal audit and enterprise risk management.
In the compliance arena, leading metrics included mean time-to-patch (49 percent), policy violations (33 percent), and reduction in audit findings and repeat findings (27 percent).
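Mean time-to-patch, the most-cited metric here, is simply the average interval between a vulnerability’s disclosure and its remediation. A minimal sketch, with the patch records and dates invented for illustration:

```python
from datetime import date

# Hypothetical patch records: (vulnerability disclosed, patch deployed).
# All dates are invented for illustration.
patch_records = [
    (date(2013, 1, 5),  date(2013, 1, 12)),   # 7 days
    (date(2013, 2, 1),  date(2013, 2, 15)),   # 14 days
    (date(2013, 3, 10), date(2013, 3, 13)),   # 3 days
]

def mean_time_to_patch(records):
    """Average number of days between disclosure and deployment."""
    total_days = sum((deployed - disclosed).days
                     for disclosed, deployed in records)
    return total_days / len(records)

print(mean_time_to_patch(patch_records))  # prints 8.0
```

In practice the hard part is not the arithmetic but the bookkeeping: an organization needs reliable timestamps for when each vulnerability became known and when each affected system was actually patched.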
The study also found that only 19 percent of respondents viewed the number of records or files detected as compliance infractions as an effective metric, and only 16 percent identified reduction in expired certificates — including SSL and SSH keys — as one.