Thomas Briggs: Comments on the US Secret Super Cloud — Dead Since 1995 — With Two Thoughts from Robert Steele

Cloud, Corruption, Government, Idiocy, Ineptitude, IO Impotency
Thomas Leo Briggs

Thomas Leo Briggs is a retired CIA operations officer with 3 years of military experience in the US Army military police, 3 years as a Special Agent in the Drug Enforcement Administration, and 26 years in the CIA. He tried to use computer capabilities to support HUMINT operations in a variety of ways throughout his last 18 years as an operations officer. He is also the author of Cash on Delivery: CIA Special Operations During the Secret War in Laos (Rosebank Press, 2009).

– – – – – – – –

COMMENTING ON:  Richard Wright: Secret Intelligence Cloud, Charlie Allen’s Warning, Comment by Robert Steele

Intelligence Agencies Move Towards Single Super-Cloud by Henry Kenyon, AOL Defense, 17 December 2012

So, what we have here, according to Mr. Kenyon, is an effort to develop a pan-agency set of computer servers so that the analysts of all intelligence community (IC) agencies may share data and resources.  One reported hope is that such a system will break down existing boundaries between agencies and change their insular cultures.

The first thing a reader notices is that the alleged motivations for this super-cloud are lower costs and higher efficiency.  Second, the CIA already operates a cloud slightly separate from an NSA cloud that comprises five other intelligence agencies and the FBI.  Is that like being slightly pregnant?  Does that truly provide lower costs, higher efficiency, and shared resources and data?  Wouldn’t one expect to find different data and resources on each cloud, even if some overlap?  Moreover, the NSA cloud incorporates the smaller organization-wide clouds of its partner agencies, and the National Reconnaissance Office has its own plan to build its own cloud.  None of that makes for the lowest costs and highest efficiencies – nor for one super-cloud.

Multiple little clouds all hanging out in the ether with lines to – the super-cloud?

Perhaps all you really have is the emergence of individual IC member organization clouds, bringing themselves lower costs, higher efficiencies and intra-organizational sharing of data and resources, but not a true pan-agency super-cloud, at least, not yet.

To be sure, the super-cloud does not need to be contained in one set of computer servers.  But if the super-cloud is to be multiple sub-clouds controlled by different agencies, where then are the benefits of lower costs and higher efficiency?

Moreover, Mr. Kenyon’s article does not mention security.  Besides organizational differences in culture, the great bugaboo is security.  How can an agency with its own culture, reporting to and getting promotions from its own bosses, trust all the other agencies to protect its data and control access to its resources?  Maximizing the security of the data is an enormous obstacle to the lowest costs and the highest efficiencies.  If the black-hat cracker world knows there is a super-cloud with multiple nodes of access, you can be sure there will be attempted exploits against it.

That’s why there are all those little separate clouds.

Another reason to have a super-cloud is interoperability.

Looking back, the DoD was the first to recognize the need for interoperability, in the early 1990s.  I wrote about that in my book, Cash on Delivery: CIA Special Operations During the Secret War in Laos, in its last chapter, titled “Speaking Truth to Power – Lessons Learned”:

“One of the recent criticisms of the CIA and the intelligence community is that it is incapable of communicating effectively among all its members. When was that criticism first brought forth? At the end of the First Gulf War, General Norman Schwarzkopf declared that too many of the messaging systems in use in the defense and intelligence communities were incompatible. From that “lesson learned” came the Defense Department (DoD) requirement to build the Defense Messaging System (DMS). Supposedly, the entire intelligence community was to adopt the DMS or build new messaging systems compatible with the DMS. The DoD took the lead and built the DMS but allowed other agencies to keep their own messaging systems. The DMS established protocols for other agencies to use to convert their proprietary messages into and out of DMS compatible messages so another agency’s outgoing messages could be received by the DMS and incoming DMS compatible messages could be received at other agencies. Once an agency could write its messages in its own messaging system and then the message could be converted into something compatible with the DMS and then receive DMS compatible messages – the entire military and intelligence community would have interoperability, a post Desert Storm buzzword. The DoD awarded a contract to build the DMS in May 1995 and implementation began at ninety-eight sites in October 1997. Contrast that with the CIA that, as of March 2005, was not yet DMS compatible. That means that the CIA was incapable, during the Second Gulf War, of sending DMS compatible messages to the warfighters in Iraq or Afghanistan – more than 10 years since DoD awarded the DMS contract and more years than that since General Schwarzkopf made his plea for interoperability.”

Thus, there has been talk of interoperability ever since First Gulf War days and the Defense Messaging System (DMS).  At the time I wrote my book I did not know of any successful effort to create an interoperable DMS with full IC participation.
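The gateway approach described in the passage above, in which each agency keeps its proprietary messaging system and converts messages into and out of a common DMS-compatible format, can be sketched roughly as follows. All field names and the interchange envelope here are invented for illustration; they are not the actual DMS/X.400 schema:

```python
# Hypothetical sketch of a messaging gateway: each agency keeps its own
# internal cable format and converts to/from a shared interchange envelope
# that every participating agency can read. All names are invented.

from dataclasses import dataclass

@dataclass
class InterchangeMessage:
    """Common envelope readable by every participating agency."""
    originator: str        # organizational address, not an individual
    recipients: list       # organizational addressees
    classification: str
    body: str

def agency_a_to_interchange(cable: dict) -> InterchangeMessage:
    """Convert Agency A's proprietary cable fields to the common envelope."""
    return InterchangeMessage(
        originator=cable["from_station"],
        recipients=cable["action_addressees"],
        classification=cable["class_line"],
        body=cable["text"],
    )

def interchange_to_agency_a(msg: InterchangeMessage) -> dict:
    """Convert an incoming common-envelope message back to Agency A's format."""
    return {
        "from_station": msg.originator,
        "action_addressees": msg.recipients,
        "class_line": msg.classification,
        "text": msg.body,
    }

# A cable that survives conversion out and back unchanged demonstrates the
# minimum property interoperability requires of such a gateway.
cable = {
    "from_station": "STATION/EXAMPLE",
    "action_addressees": ["HQ/EXAMPLE"],
    "class_line": "UNCLASSIFIED",
    "text": "Sample report text.",
}
assert interchange_to_agency_a(agency_a_to_interchange(cable)) == cable
```

The design point is that no agency has to abandon its internal system; interoperability lives entirely in the conversion layer, which is also why an agency that never builds its converter (as the CIA had not by March 2005) stays unreachable.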

The initial sentence of Mr. Kenyon’s AOL Defense article might best be changed to: “The intelligence community has been developing a single cloud computing network capability since 1995.”

Something to keep in mind is that there are at least two issues here.  One is the interoperability of messaging systems; the other is the interoperability of data sharing.  If you store messages in the super-cloud, you begin to build up data.

First, messaging systems best serve the immediate tactical use of intelligence.  Suppose one agency has taken down a terrorist cell and captured both terrorists and their materials, e.g. maps, documents, computer files, photos, etc.  From that action there is a need to share textual intelligence reports along with the maps, documents, photos, and computer files.  The means for passing this variety of material could be individual messaging systems, e.g. email, or organizational messaging systems, e.g. cable messages.

Since email permits attachments as well as text, it might be a good choice.  But you have to know the names and email addresses of the intended recipients, you would have to send the emails over an encrypted system, and intelligence agencies are legally required to store and retrieve certain types of messages.  A message containing the terrorist cell intelligence would be one that must be retrievable.

Organizational messages, i.e. cables, are part of secure encrypted messaging systems, are addressed to organizational entities that usually remain the same when personnel leave, and should be part of an established legally required store and retrieve process.

The Defense Messaging System was an attempt to provide an interoperable organizational messaging system for IC agencies.  Mr. Kenyon’s article does not address whether organizational or individual messaging containing needed data is part of the super-cloud data, or is intended to be.  In truth, Mr. Kenyon can only be as accurate or inclusive as his sources permit him to be.

But, what about needed data that is never placed in an organization’s messages and needs to be transferred into the super-cloud?

The super-cloud is different.  Separate IC agencies are amassing data in their organizations and their separate clouds, and the data owners have no idea which analysts in which agencies might need that data.  A super-cloud could hold data in one place where other IC agency analysts could find it.  Assuming that all analysts of the IC agencies have computer work stations with powerful data mining tools on them that can work effectively with the data in the super-cloud, how will access be handled?

I have no idea, but my guess is that no one else does either.  If I were the CIO of one agency, and I understood the analytical requirements of my users, understood all the data in my cloud, and had the authority to solve data access issues, I might be able to work it all out – in my separate cloud.  However, what we have seen is that instead of one super-cloud there are now multiple sub-clouds: by my count, CIA, NSA (with its six sub-sub-clouds) and, soon, NRO.

How will all these clouds, sub-clouds and sub-sub-clouds be melded into one, efficient, lower cost super-cloud with perfectly secure access controls to protect everyone’s data?

There is a DNI CIO, who may or may not have the authority to direct solutions.  Probably not, which is why each agency has its own configuration management board (CMB).  How long will it take, and will it ever be worked out, before a secure, lower-cost, more efficient super-cloud is finally built and operational?

Or will we end up with a compromise?  Reuben Lucius “Rube” Goldberg, whose cartoons depicted intricate, very complicated, and impractical contraptions that accomplished little or nothing, would best illustrate what may become of the super-cloud.

Robert David STEELE Vivas

ROBERT STEELE:  I would really love to be an Inspector General for Intelligence, but I fear that the system could not handle intense intelligence with integrity even if I promised to be an educational IG (my predilection: people should not be punished for being badly managed).  Everything that Tom lays out above is consistent with everything that I know and everything that I hear from colleagues, who are just shaking their heads and seeing how many more years they can rack up before the system implodes.  At some point someone, somewhere should be holding the IC “leaders” accountable for failing to do much of anything other than push paper.  In 1986 I was part of the second attempt to get CIA into the digital era, hand-picked by Bill Donnelly over the rabid objections of Ted Price, and I remember very well the absolute refusal of the Directorate of Operations to put anything online within the CIA where analysts could get to it, especially the tons of OPINTEL.  That is also when I discovered that the clerks in the basement of CIA were GS-1 to GS-3 people with rotten attitudes who refused to “INDEX ALL NAMES” and instead would do one or two and then pass the cable to the out box.  [In an actual counterintelligence project I managed, the clerks indexed 200 names over the six years I was overseas; the annuitant I assigned to do a complete scrub of the hard copy files ended up indexing 2,000 from the hard copy of those same six years.  This is the kind of corruption that is pervasive at all levels of the US secret world as well as, I imagine, the not-so-secret world.]

In 1986, recognizing the reasonable paranoia of the Directorate of Operations (despite its inability to police its own ranks), I invented the concept of “reverse search.”  Instead of making all data open to all searches, I created a concept for distributing all searches to all data managers, who could run a full-access search on their own database and then send back one of three responses:

01 HIT, SENSITIVE, Tell Us Why You Want to Know

02 HIT, ANSWER, Provide DRAFT of Proposed Final Utilization

03 NO HIT (which could also be a false no-hit, assuming a Blue Border or whatever)
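A minimal sketch of this “reverse search” flow, under the assumption that each data custodian searches locally and returns only a response code, never the underlying data. All class, function, and record names here are invented for illustration; the source describes only the concept, not an implementation:

```python
# Hypothetical sketch of "reverse search": one query is distributed to every
# data manager, who runs a full-access search on its own database and sends
# back only a response code. All names below are invented.

from enum import Enum

class Response(Enum):
    HIT_SENSITIVE = "01 HIT, SENSITIVE, tell us why you want to know"
    HIT_ANSWER = "02 HIT, ANSWER, provide draft of proposed final utilization"
    NO_HIT = "03 NO HIT (possibly a deliberate false no-hit)"

class DataManager:
    """One agency's custodian; only it ever touches its own records."""
    def __init__(self, records: set, sensitive: set):
        self.records = records
        self.sensitive = sensitive

    def reverse_search(self, term: str) -> Response:
        if term not in self.records:
            return Response.NO_HIT
        if term in self.sensitive:
            return Response.HIT_SENSITIVE
        return Response.HIT_ANSWER

def distribute_search(term: str, managers: dict) -> dict:
    """Fan one search out to all data managers; collect only response codes."""
    return {name: mgr.reverse_search(term) for name, mgr in managers.items()}

# Two hypothetical agencies: only agency_a holds "ALPHA", and holds it
# as sensitive, so a searcher learns a hit exists but gets no data back.
managers = {
    "agency_a": DataManager(records={"ALPHA", "BRAVO"}, sensitive={"ALPHA"}),
    "agency_b": DataManager(records={"BRAVO"}, sensitive=set()),
}
results = distribute_search("ALPHA", managers)
```

The inversion is the point: rather than pooling all data into one searchable cloud, the searches travel to the data, so each custodian retains full control over what, if anything, leaves its enclave.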

The National Counterterrorism Center (NCTC), with its 80 incoherent databases and the arcane search languages unique to each of those 80+ databases, is a case in point.  And remember that we have over 2,500 Sensitive Compartmented Information (SCI) and Special Access Program (SAP) databases, most in terrible disarray and irresponsibly managed, because no one is training their custodians in real-world current technology and no one is holding them accountable for actually producing useful intelligence (decision support; tons of useless raw information does NOT count).

If it were up to me, I would cancel the entire cloud initiative, redirect those contracts toward creating an unclassified cloud centered around the Open Source Agency (OSA), and turn the NCTC into what it should be, a help desk for everybody — operators, analysts, consumers, even — heavens — coalition partners — on ALL topics.  NCTC as a classified help desk and the OSA as the primary vehicle for meeting the needs of all who do not get classified anything now is a brilliant solution long overdue for implementation.

I do so want to see the USG learn how to make ethical evidence-based decisions.  Hope springs eternal.  St.

See Also:

21st Century Intelligence Core References 2.2

2012 Robert Steele: The Human Factor & The Human Environment: Concepts & Doctrine? Implications for Human & Open Source Intelligence 2.0

Robert Steele: How Dutch Intelligence Survived & Prospered Using Open Source Human Intelligence as a Foundation for Ethical Evidence-Based Decisions

Thomas Briggs: Reflections on OSINT in Support of HUMINT
