Dr. Stephen Marrin is a Lecturer in the Centre for Intelligence and Security Studies at Brunel University in London. He previously served as an analyst with the Central Intelligence Agency and US Government Accountability Office. Dr. Marrin has written about many different aspects of intelligence analysis, including new analyst training at CIA's Sherman Kent School, the similarities and differences between intelligence analysis and medical diagnosis, and the professionalization of intelligence analysis. In 2004 the National Journal profiled him as one of the ten leading US experts on intelligence reform.
Abstract: Each of the criteria most frequently used to evaluate the quality of intelligence analysis has limitations and problems. When accuracy and surprise are employed as absolute standards, their use reflects unrealistic expectations of perfection and omniscience. Scholars have adjusted by exploring the use of a relative standard consisting of the ratio of success to failure, most frequently illustrated using the batting average analogy from baseball. Unfortunately, even this relative standard is flawed in that there is no way to determine either what the batting average is or what it should be. Finally, a standard based on the decision makers' perspective is sometimes used to evaluate the analytic product's relevance and utility. But this metric, too, has significant limitations. In the end, there is no consensus as to which is the best criterion to use in evaluating analytic quality, reflecting the lack of consensus as to what the actual purpose of intelligence analysis is or should be.
Evaluating the quality of intelligence analysis is not a simple matter. Frequently quality is defined not by its presence but rather by its absence. When what are popularly known as intelligence failures occur, sometimes attention focuses on flaws in intelligence analysis as a contributing factor to that failure.
On the afternoon of 9 April 1948, angry mobs suddenly and swiftly reduced the main streets of Bogotá to a smoking ruin. Radio broadcasts, at times with unmistakable Communist content, called for the overthrow of the Colombian government and of “Yankee Imperialism.” Many rioters wore red arm bands; some waved banners emblazoned with the hammer-and-sickle. A mob gutted the main floor of the Capitolio Nacional, disrupting the deliberations of the Ninth International Conference of American States and forcing Secretary of State Marshall and the other delegates to take cover. The army regained control of the city over the next day or two, but not before several thousand Colombians had been killed. It was the bogotazo.
Some, notably Dr. Loch Johnson, de facto dean of the intelligence scholars in the English language, have explored both definitions and concepts for a theory of intelligence. Others, such as Jack Davis, have done much in the area of analytic tradecraft or the “art” of intelligence analysis (to match the “art” in clandestine operations and covert action).
Now the time has come to develop a science of intelligence. The first casualty must of necessity be the obsession with secret sources and methods, secret agencies, and secret clients. Intelligence is about decision-support, plain and simple, and the new science of intelligence will be developed along the lines of the services science developed by Dr. Jim Spohrer of IBM, and others. Dr. Spohrer provided the following in an email exchange today:
(1) “you can have a science of anything, if a community agrees it is important”
(2) “innovations that are based on sciences, not just management and engineering practice, can be advanced more systematically”
(3) “industry cares about innovation acceleration, can academia deliver a science? does the engineering and management exist in practice”
(4) “academia said we can establish a research area to help build the science under the engineering and management practice.”
A round-table is being formed and a new article will result.
While the automated search produces the relevant results, Jack Davis is the Sherman Kent of our time and deserves a cleaner, quicker result. Here is the human-in-the-loop distillation of this great man’s contributions as they appear on this web site and the two web sites in Sweden where all our stuff is safely preserved.
Misappropriated without Attribution–Free Online, March 4, 2010
One Star for lack of ethics on the part of the publisher. Beyond five stars for content, free online as with all of Jack Davis’s stuff. Upgraded to 3 stars for proper pricing (after Amazon’s cut, publisher only makes roughly 3 dollars per book, which is totally fair).
This product was misappropriated from Jack Davis, dean of the intelligence analysis scholar-practitioners. While materials created within the US Government by US Government employees are generally not copyrighted because the taxpayer funded their creation, they are a) available free online; and b) generally considered off-limits to sleaze-bag publishers that troll for stuff (this happens to all of us, in my case with my monographs for the Strategic Studies Institute (SSI), all free online).
It’s nice that Jack’s work is respected and made available on Amazon, a truly global service.
It is very troubling that Jack Davis, who just asked me to find out who did this, has not been contacted by the publisher and offered both courtesy copies of his own work, and some modest recognition.
Although this search brings up two relevant hits, Jack Davis, de facto dean of intelligence analytic tradecraft, is one of perhaps ten people at CIA that we absolutely hold in the highest regard. Here are all of the links, the first one being the specific report you are looking for. Searching on the web for “analytic tradecraft” is also interesting. Here it is better to search for <jack davis>.
The original conceptual depictions of “competing influences” on individual decision-makers were first developed by Dr. Greg Treverton teaching the Intelligence Policy Seminar at the John F. Kennedy School of Government, and Jack Davis, dean of the U.S. Intelligence Community scholar-analysts. The “eight tribes” (previously seven) are original to Robert Steele. Steele’s adaptation of Davis-Treverton first appeared as Figure 17 on page 53 of ON INTELLIGENCE: Spies and Secrecy in an Open World (AFCEA, 2000).
The future of Open Source Intelligence (OSINT) is Multinational, Multifunctional, Multidisciplinary, Multidomain Information-Sharing & Sense-Making (M4IS2).
The following, subject to the approval of Executive and Congressional leadership, are suggested heuristics (rules of thumb):
Rule 1: All Open Source Information (OSIF) goes directly to the high side (multinational top secret) the instant it is received at any level by any civilian or military element responsive to the global OSINT grid. This includes all of the contextual agency and mission-specific information from the civilian elements previously stove-piped or discarded, not only within the US, but ultimately within all 90+ participating nations.
Rule 2: In return for Rule 1, the US IC agrees that the Department of State (and within DoD, Civil Affairs) is the proponent outside the wire, and the sharing of all OSIF originating outside the US IC is at the discretion of State/Civil Affairs without secret world caveat or constraint. OSIF collected by US IC elements is NOT included in this warrant.
For over three decades, Jack Davis has been the heir to Sherman Kent and the mentor to all those who would strive to be the world’s most effective all-source intelligence analysts. As a Central Intelligence Agency analyst and educator, he combines intellect, integrity, insight, and an insatiable appetite for interaction with all manner of individuals regardless of rank and disposition. He is the most able pioneer of “analytic tradecraft,” the best proponent for the value of human analysis over technical processing, and one of those very special individuals who helped define the end of 20th Century centralized analysis and the beginning of 21st Century distributed multinational multiagency analysis.
Jack Davis remains the de facto Dean for Analytic Tradecraft of the US Intelligence Community.
There are six (6) pages in this work that held my attention: pages 11-12 (Table 2.2, Analytic Concerns, by Frequency of Mention); page 14 (Figure 3.1, A Pyramid of Analytic Tasks); page 20 (Table 3.1, Wide Range of Analytical Tools and Skills Required); page 34 (Figure 5.1, Intelligence Analysis and Information Types); and page 35 (Table 5.1, Changing Tradecraft Characteristics). Print them off from the free PDF copy online (search for title).
My first review allotted two stars; on a second complete reading I decided that was a tad harsh because I *did* go through it twice, so I now raise it to three stars, largely because pages 11-12 were interesting enough to warrant an hour of my time (see below). This work reinvents the wheel from 1986, 1988, 1992, etcetera, but the primary author is clearly ignorant of all that has happened before, and the senior author did not bother to bring him up to speed (I know Greg Treverton knows this stuff).
Among many other flaws, this light once-over failed to include even the most cursory review of either the literature or unclassified agency publications (not even the party-line rag, Studies in Intelligence). Any book on this topic that is clueless about Jack Davis and his collected memoranda on analytic tradecraft, or Diane Webb and her utterly brilliant definition of Computer Aided Tools for the Analysis of Science and Technology (CATALYST), is not worthy of being read by an all-source professional. I would also have expected Ruth Davis and Carol Dumaine to be mentioned here, but the lack of attribution is clearly a lack of awareness that I find very disturbing.
I looked over the bibliography carefully, and it confirmed my evaluation. This is another indication that RAND (a “think tank”) is getting very lazy and losing its analytic edge. In this day and age of online bibliography citation, the paucity of serious references in this work is troubling (I wax diplomatic).
Here are ten books–only one of mine (and all seven of mine are free online as well as at Amazon):
On the latter, look for “New Rules for the New Craft of Intelligence” that is free online as a separate document. Both Davis and Webb can be found online because I put them there in PDF form.
The one thing in this book that was useful, but badly presented, was the table of analyst concerns across nine issues that did not include tangible resources, multinational sense-making, or access to NSA OSINT.
Below is my “remix” of the table to put it into more useful form:
54% Quality of Intelligence
54% Tools of intelligence/analysis
43% Intra-Community collaboration and data sharing
41% Collection Issues
32% Targeting Analysis
Above are the categories with totals (the first initial on each line below keys it to a category above). The top four validate the DNI’s priorities and clearly need work.
32% T Targeting Analysis is important
30% V Redefine intelligence
30% Q Analysis too captive to current
30% To Directed R&D for analytic technology needed
27% T Targeting needs prioritization
27% S Analyst training important and insufficient
22% V Uniqueness
22% E PDB problematic as metric
22% To “Tools” of intelligence analysis are poor
22% To “Tools” limit analysis and limited by culture
The line items above are for me very significant. We still do priority-based collection rather than gap-driven collection, something I raised on the FIRCAP and with Rick Shackleford in 1992. Our analysts (most of them with less than five years in service) are clearly concerned about the misdirection of both collection and analysis, and a lack of tools--this 22 years after Diane Webb identified the 18 needed functionalities and the Advanced Information Processing and Analysis Steering Group (AIPASG) found over 20 different *compartmented* projects, all with their own sweetheart vendor, trying to create “the” all-source fusion workstation.
19% C S&T underused, needs understanding
16% E Critical and needs improvement
14% E Assess performance qualitatively
14% Q Quality of analysis is a concern
14% Q Intelligence focus too narrow
14% S Language, culture, regional are big weaknesses
11% A Leadership
11% L Must be improved
11% Q Problem centric vice regional
11% Q Global coverage is important
11% C Open source critical, need new sources
11% I Lack of leadership and critical mass impair IC-wide
11% I IC information technology infrastructure needed
11% I Non-traditional source agencies need more input
8% V Unclear goals prevail
8% T Targeting analysis needs attention
8% C Collection strategies/methods outdated
8% S Concern over lack of staff or surge capability
8% S Intelligence Community-wide curriculum desirable
8% I Should NOT pursue virtual wired network
8% I Security is a concern for virtual and sharing
5% E Evaluation not critical
5% Q Depth versus breadth an issue
5% Q Greater client context needed
5% C Law enforcement has high potential
5% S Analytic corps is highly trained better than ever
5% S Career track needs building
5% I Stovepiping is a problem, need more X-community
5% I Should pursue virtual organization and wired network
3% V Newsworthy not intelligence
3% L Radical transformation needed
3% E Metrics are not needed
3% E Evaluation is negative
3% E Audits are difficult
3% Q Long term shortfalls overstated
3% Q Global coverage too difficult
3% T Targeting can be left to collectors
3% C All source materially lacking
3% C Need to guard against evidence addiction
3% C Need to take into account “feedback”
3% S Should train stovepipe analysts not IC analysts
3% S Language and culture a strength
For the rest, not now, but three at the bottom trouble me: the analysts do not have an appreciation for feedback; they do not understand how lacking they are in sources; and they do not know enough to realize that radical transformation is needed.
On balance, I found this book annoying, but two pages ultimately provocative.
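The remix of the concerns table is, in effect, a group-by over (percent, code, concern) triples. A minimal Python sketch using a sample of the line items above; the meaning of the letter codes as category keys is my inference from the table, not something the RAND study states:

```python
from collections import defaultdict

# Sample of the remixed line items: (percent of analysts mentioning,
# category code, concern text). Codes are copied from the table above.
concerns = [
    (32, "T",  "Targeting Analysis is important"),
    (30, "V",  "Redefine intelligence"),
    (30, "Q",  "Analysis too captive to current"),
    (30, "To", "Directed R&D for analytic technology needed"),
    (27, "T",  "Targeting needs prioritization"),
    (27, "S",  "Analyst training important and insufficient"),
    (22, "E",  "PDB problematic as metric"),
    (22, "To", "'Tools' of intelligence analysis are poor"),
    (22, "To", "'Tools' limit analysis and limited by culture"),
]

# Group line items by category code, keeping (percent, text) pairs.
by_code = defaultdict(list)
for pct, code, text in concerns:
    by_code[code].append((pct, text))

# Report each category's item count and its most-mentioned concern.
for code in sorted(by_code):
    items = sorted(by_code[code], reverse=True)
    top_pct, top_text = items[0]
    print(f"{code}: {len(items)} items, top {top_pct}% - {top_text}")
```

Extending the `concerns` list to all of the line items above would reproduce the full remix grouping in one pass.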