I read “Big Data Is No Longer Enough: It’s Now All about Fast Data.” The write-up is interesting because it shifts the focus from having lots of information to the infrastructure that can process the data in a timely manner. Note that “timely” means different things in different contexts. For example, to a crazed MBA stock market maven, next week is of little use. To a clueless marketing professional with a degree in art history, “next week” might be just speedy enough.
The write-up points out:
Phi Beta Iota: We cannot process more than 1% of the “big data” we have now… and most of what we have is legacy data full of holes, and most of what we collect has nothing to do with holistic analytics or true cost economics. In other words, “big data” is the death rattle of scientific reductionism.