97% Owned presents serious research and verifiable evidence on our economic and financial system. It is the first documentary to tackle this issue from a UK perspective, and it explains the inner workings of central banks and the money-creation process.
WASHINGTON: When the Presidential Daily Briefing occurs, a top intelligence official traditionally hands the president a folder with a sheaf of papers inside. The president may read what's inside or have it presented by the intelligence official. Then comes question time, when the chief executive and commander in chief can ask how reliable a source is or question the assumptions of an analysis he's just read.
But that will change. The president and his top officials want and will get a single mobile device allowing them to access highly classified and unclassified data wherever they are. The early fruits of the intelligence community's efforts to do that are visible in the photo above. It shows President Obama in the Oval Office on January 31 using a technically neutered tablet as part of the Presidential Daily Briefing.
. . . . . .
A single device is the Holy Grail for the intelligence community and senior government officials, but it will be some time before it happens, the colonel said. In the near term, the White House hopes to issue two devices: one for classified and another for unclassified communications. It is coordinating with the Defense Department and the National Security Agency to ensure access to secure defense communications networks and intelligence-grade cryptographic algorithms.
In The Chronicle, William Pannapacker writes about the importance of receiving digital humanities training, which he summarizes in a tweet: no dh, no interview. At the end of the piece he backs away from this provocation, writing “even though I've been excited about the digital humanities since my first visit to the summer institute, I want to urge job candidates: Don't become a DH'er out of fear that you won't get a position if you don't.” I would certainly agree with that, though it always comes back to this matter of definition. Even under the narrowest definitions of DH, the field is beginning to spin out a range of sub-specializations. Pannapacker compares the current interest in DH to the focus on “theory” in the nineties, but mostly as a cautionary tale. Indeed, DH has had an ambivalent (at best) relationship with theory, which makes sense insofar as the two are competing methods: they might become complementary (and may already be complementary in some scholars' work), but they are largely seen as incongruous at this point. Of course, the primary difference between DH and other humanities methods is the infrastructure required to support the endeavor. As Pannapacker points out:
We discovered a brief film clip of one of Tesla's demonstrations. It is embedded in the trailer for Dr. Steven Greer's movie-in-progress Sirius, beginning at 53 seconds in and lasting until 57 seconds. By rapidly double-clicking the pause button you can watch it in slow motion. No explosive phenomena are apparent to me, but the structure does seem to be made of metal. See if it doesn't remind you of anything.
Click start, then use the mouse to move to the 50-53 second mark and see the Tesla energy pulverize rocks — very similar to what happened to the World Trade Center towers when combined with controlled demolitions. http://www.sirius.neverendinglight.com/
I have been writing and blogging about “information forensics” for a while now and thus relished Nieman Reports' must-read study on “Truth in the Age of Social Media.” My applied research has specifically been on the use of social media to support humanitarian crisis response (see the multiple links at the end of this blog post). More specifically, my focus has been on crowdsourcing and automating ways to quantify veracity in the social media space. One of the Research & Development projects I am spearheading at the Qatar Computing Research Institute (QCRI) specifically focuses on this hybrid approach. I plan to blog about this research in the near future but for now wanted to share some of the gems in this superb 72-page report.
In the opening piece of the report, Craig Silverman writes that “never before in the history of journalism—or society—have more people and organizations been engaged in fact checking and verification. Never has it been so easy to expose an error, check a fact, crowdsource and bring technology to bear in service of verification.” While social media is new, traditional journalistic skills and values are still highly relevant to verification challenges in the social media space. In fact, some argue that “the business of verifying and debunking content from the public relies far more on journalistic hunches than snazzy technology.”
I disagree. This is not an either/or challenge. Social computing can help everyone, not just journalists, develop and test hunches. Indeed, it is imperative that these tools be within reach of the general public, since a “public with the ability to spot a hoax website, verify a tweet, detect a faked photo, and evaluate sources of information is a more informed public. A public more resistant to untruths and so-called rumor bombs.” This public resistance to untruths can itself be monitored and modeled to quantify veracity, as this study shows.
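To make the idea of modeling public resistance to untruths a bit more concrete, here is a minimal, purely illustrative sketch of one way crowd reactions could be aggregated into a veracity score. Everything in it is an assumption for illustration: the "corroborates"/"disputes" labels, the credibility weights, and the scoring rule are hypothetical and are not drawn from the Nieman report or the QCRI project.

```python
# Hypothetical sketch: quantify a claim's veracity from crowd reactions.
# Assumes each reaction is pre-labeled "corroborates" or "disputes" and
# carries a source-credibility weight in [0, 1]. Labels, weights, and the
# scoring rule are illustrative assumptions, not a published method.

def veracity_score(reactions):
    """Return a score in [0, 1]: the credibility-weighted share of
    corroborating reactions.

    reactions: list of (label, credibility) tuples, where label is
    "corroborates" or "disputes" and credibility is a float in [0, 1].
    """
    support = sum(w for label, w in reactions if label == "corroborates")
    dispute = sum(w for label, w in reactions if label == "disputes")
    total = support + dispute
    if total == 0:
        return 0.5  # no signal at all: treat the claim as unknown
    return support / total

# Example: three fairly credible corroborations against one
# low-credibility dispute yield a high score.
crowd = [("corroborates", 0.9), ("corroborates", 0.8),
         ("corroborates", 0.7), ("disputes", 0.2)]
score = veracity_score(crowd)
```

A real system would of course have to infer the labels and credibility weights themselves (the hard part that hybrid human-machine approaches target), but even this toy version shows how "public resistance" becomes a measurable quantity.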
Full post less two graphics below the line. Original post.
Phi Beta Iota: The cost, totalling $12 billion, is as good a figure as a deceptive bureaucracy can provide. Our own estimate, based on other sources over time, is that it is closer to $15-20 billion, and this is without considering the cost of lost productivity or of lost critical access to multiple databases (the National Counterterrorism Center, for example, should be included in any calculation of the cost of idiocy, along with half or more of the cost of the Department of Homeland Security and half the cost of the Pentagon). Then of course there is the complex cost of dereliction of duty across all the Cabinet functional areas. Good people trapped in a bad system that is totally lacking in both intelligence and integrity.
It is fashionable now to talk about data as the new oil (or dirt), and to proclaim breathlessly that the ever-increasing masses of data allow for ever more wondrous things to be done, including my personal favorite, situational awareness.
However, no one is yet serious about holistic analytics (which also implies a holistic collection management strategy and a clear definition of both what is to be collected and what is to be done with anomalous data encountered in passing). Neither is anyone serious about True Cost Economics, Man-Machine Translation, Global Near-Real-Time Crowd-Sourcing (for observations, translations, and culturally-grounded interpretations) or M4IS2 (Multinational, Multiagency, Multidisciplinary, Multidomain Information-Sharing and Sense-Making).
I cannot help but recall my briefing to the National Research Council in 1994, when I was asked to comment on the US Army's multi-billion dollar communications plan for the future. I pointed out the obvious: the US Army was assuming that all data would be generated from within the US Army or other US Government systems, and was making no provision for ingesting and digesting data from the 99% of the data sources outside the US Army. Of course they blew me off then, and they still do not get it today, 22 years later.