MEB remains, in our view, the greatest US-based information broker of all time, along with Reva Basch, who is now fully retired. Below is her 13 November 2013 presentation to the Special Libraries Association.
Welcome to the Internet of Things (IoT). The IoT currently has many definitions, but most describe a world in the not-too-distant future where most objects are computerized and seamlessly integrated into our information network, creating “smart” grids, homes, and environments.
Since the end of the Cold War, the United States has engaged in neither coherent nor strategically minded communication with overseas populations.
The New Public Diplomacy Imperative examines the strategic necessity of strong American public diplomacy. Policy analyst Matthew Wallin surveys many of the issues in contemporary public diplomacy and recommends best practices for policymakers and public diplomacy practitioners.
This white paper provides case studies, insights, and guidance to strengthen the effectiveness of American messaging overseas.
Through the prism of operations in Afghanistan, the author examines how the U.S. Government’s Strategic Communication (SC) and, in particular, the Department of Defense’s (DoD) Information Operations (IO) and Military Information Support to Operations (MISO) programs have contributed to U.S. strategic and foreign policy objectives. It assesses whether current practice, which is largely predicated on ideas of positively shaping audiences’ perceptions of and attitudes toward the United States, is actually fit for purpose. Indeed, it finds that the United States has for many years been encouraged by large contractors to approach communications objectives through techniques heavily influenced by civilian advertising and marketing, which attempt to change hostile attitudes toward the United States and its foreign policy in the belief that this will subsequently reduce hostile behavior. While an attitudinal approach may work in convincing U.S. citizens to buy consumer products, it does not easily translate to the conflict- and crisis-riven societies to which it has been routinely applied since September 11, 2001.
Amazon has aspirations beyond being the world’s largest retailer. The online retail giant also aspires to be a mega force in computing, says The New York Times Bits Blog in “Amazon Bares Its Computers.” Amazon has announced that it is taking Amazon Web Services beyond simple cloud computing to include specialized computers, data storage systems, networking systems, optical transmission systems, and power substations. The overall goal is to make computing cheaper and more efficient.
Amazon rarely discusses its AWS plans, so the recent discussion of how it plans to spend one billion dollars annually comes as big news.
Amazon is preparing to boost its web services by hiring power engineers to work on substations and remove power redundancies in cloud computing. It purchases hardware directly to reduce costs and has created original statistical methods to limit damage from catastrophic failures. Amazon also owns its own optical fiber systems as it takes AWS global.
Amazon is hardly keeping its information under wraps this time, though. It is sharing its advances via open source in a direct challenge to Google, Facebook, and Microsoft. Microsoft will never share its secrets, and Google shares some of its toys but keeps the bigger stuff locked away. What about Facebook?
The article explains:
“The notable outrider among the giant computers is Facebook, which isn’t selling its own system. Instead, Facebook is focused on pure cost-cutting, and spearheads the Open Compute Project, a kind of open-source, cloud-computing architecture. Open Compute is far enough along that companies like Hewlett-Packard, which came late to cloud computing, use aspects of it in their public clouds.”
Amazon is not directly asserting it is better than its competitors, but its openness and cost-cutting procedures certainly make it look better in the consumers’ eyes.
“In evaluating open-source documents, collectors and analysts must be careful to determine the origin of the document and the possibilities of inherent biases contained within the document.”
– FM2-22.3: Human Intelligence Collector Operations, p. I-10
“Source and information evaluation is identified as being a critical element of the analytical process and production of intelligence products. However there is concern that in reality evaluation is being carried out in a cursory fashion involving limited intellectual rigour. Poor evaluation is also thought to be a causal factor in the failure of intelligence.”
– John Joseph and Jeff Corkill, “Information Evaluation: How One Group of Intelligence Analysts Go About the Task”
These two quotes illustrate a long-running problem that has plagued commercial cyber security reporting: there are very few unclassified OSINT standards for source evaluation, and even fewer for cyber threat intelligence, at least none that I could find while researching this article.
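One unclassified baseline does exist in doctrine, however: the Admiralty-style grading commonly associated with Army human intelligence practice, in which a source reliability rating (A through F) is paired with an information credibility rating (1 through 6). As a minimal sketch of how a cyber threat intelligence feed might carry those grades alongside each claim, consider the following; the class structure, field names, and sample report are illustrative assumptions of mine, not drawn from FM 2-22.3, from any vendor feed, or from the papers cited here.

```python
# Minimal sketch: attaching Admiralty-style source/information grades to a
# cyber threat report. The scales (source reliability A-F, information
# credibility 1-6) follow the widely used Admiralty system; everything else
# (class, field names, sample data) is an illustrative assumption.

from dataclasses import dataclass

SOURCE_RELIABILITY = {
    "A": "Reliable",
    "B": "Usually reliable",
    "C": "Fairly reliable",
    "D": "Not usually reliable",
    "E": "Unreliable",
    "F": "Reliability cannot be judged",
}

INFO_CREDIBILITY = {
    1: "Confirmed by other sources",
    2: "Probably true",
    3: "Possibly true",
    4: "Doubtful",
    5: "Improbable",
    6: "Truth cannot be judged",
}

@dataclass
class ThreatReport:
    """A single open-source claim plus the evaluation grade it was given."""
    source: str        # where the claim came from, e.g. a vendor blog post
    claim: str         # the assertion being reported
    reliability: str   # source reliability, "A" through "F"
    credibility: int   # information credibility, 1 through 6

    def grade(self) -> str:
        """Combined Admiralty grade, e.g. 'C3'."""
        return f"{self.reliability}{self.credibility}"

    def describe(self) -> str:
        return (
            f"{self.source}: \"{self.claim}\" -> {self.grade()} "
            f"({SOURCE_RELIABILITY[self.reliability]}; "
            f"{INFO_CREDIBILITY[self.credibility]})"
        )

if __name__ == "__main__":
    # Hypothetical example: an attribution claim from a single vendor write-up.
    report = ThreatReport(
        source="Vendor incident write-up",
        claim="Campaign X is operated by Group Y",
        reliability="C",
        credibility=3,
    )
    print(report.describe())
    # -> Vendor incident write-up: "Campaign X is operated by Group Y" -> C3
    #    (Fairly reliable; Possibly true)
```

The point of carrying a combined grade such as C3 with every claim is that a reader downstream can see at a glance how much weight the reporting deserves, which is precisely the rigour Joseph and Corkill argue is so often skipped.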
The field of cyber intelligence is fairly new. Fortunately, thanks to the Software Engineering Institute at Carnegie Mellon and the work of Jay McAllister and Troy Townsend, we can take a credible look at the state of the practice in this field:
“Overall, the key findings indicate that organizations use a diverse array of approaches to perform cyber intelligence. They do not adhere to any universal standard for establishing and running a cyber intelligence program, gathering data, or training analysts to interpret the data and communicate findings and performance measures to leadership.”
– McAllister and Townsend, The Cyber Intelligence Tradecraft Project
The one thing that isn't covered in their report is the issue of source validation and how it contributes to the validity or value of the intelligence data received. However, they did write a follow-up white paper with Troy Mattern entitled “Implementation Framework – Collection Management (.pdf)”.