… the quite predictable case of data overload, and why the most serious Fourth Amendment breach relates to the statistical problem of "data culling" algorithms generating large numbers of false positives, thereby undermining the principle of probable cause.
Too Much Useless Data, Warns Former NSA Coder
by Jason Ditz, December 26, 2013
William Binney, a former NSA coder behind some of the surveillance program’s algorithms, is warning that the agency’s interest in mass surveillance is coming at a grave cost in efficiency.
While the agency sees value in taking in any data it can get "just in case," sorting through the stockpile of unrelated data soaks up so many resources that whatever relevant data it does hold gets less scrutiny.
Binney’s comments mirror warnings in some of the Snowden documents, which show the NSA is also concerned that its data collection programs are far outpacing its ability to process that data.
Indeed, in March some NSA analysts asked for permission to collect less data under some of the programs, saying they were gathering a great deal of data of "relatively small intelligence value."
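The false-positive concern raised in the lede is essentially the base-rate fallacy: when a screening algorithm is applied to an overwhelmingly innocent population, even high accuracy produces mostly false alarms, which is why probable cause cannot rest on a flag alone. A minimal sketch of the arithmetic, using purely illustrative numbers that are assumptions and not figures from this article:

```python
# Base-rate sketch: why "data culling" algorithms swamp analysts with
# false positives. Every number below is an illustrative assumption.

population = 300_000_000      # people swept into collection (assumed)
true_targets = 3_000          # actual persons of interest (assumed)
sensitivity = 0.99            # flags 99% of real targets (assumed)
false_positive_rate = 0.001   # flags 0.1% of innocents (assumed)

true_alarms = true_targets * sensitivity
false_alarms = (population - true_targets) * false_positive_rate

# Precision: probability that a flagged person is actually a target.
precision = true_alarms / (true_alarms + false_alarms)

print(f"false alarms: {false_alarms:,.0f}")
print(f"precision: {precision:.2%}")
```

Under these assumptions the system generates roughly 300,000 false alarms against fewer than 3,000 genuine hits, so under one percent of flagged individuals are real targets, and analysts drown in exactly the "relatively small intelligence value" data the documents describe.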