Artificial Intelligence (AI) can be used to automatically predict and rank the credibility of tweets posted during disasters and other major events. Aditi Gupta et al. applied these information forensics techniques to automatically identify fake images posted on Twitter during Hurricane Sandy. Using a decision tree classifier, the authors were able to predict which images were fake with 97% accuracy. Their analysis also revealed that retweets accounted for 86% of all tweets linking to fake images, and that 90% of those retweets came from just 30 Twitter users.
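To give a flavor of how such a classifier works, here is a minimal sketch of the decision-tree idea in plain Python. The features and data below are invented for illustration only; Gupta et al.'s actual feature set (drawn from tweet and user metadata) is not reproduced here. A depth-1 "decision stump" searches every feature and threshold for the single split that best separates fake from credible tweets by Gini impurity:

```python
# Toy decision-tree illustration; feature names and data are hypothetical.
def gini(labels):
    # Gini impurity of a binary label set: 2p(1-p)
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(X, y):
    best = None  # (weighted impurity, feature index, threshold)
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best[1], best[2]

# Rows: [follower_count, account_age_days, is_retweet]; label 1 = fake image
X = [[10, 5, 1], [25, 10, 1], [15, 7, 1],
     [5000, 900, 0], [3200, 700, 0], [4100, 850, 0]]
y = [1, 1, 1, 0, 0, 0]

feature, threshold = best_split(X, y)
predict = lambda row: 1 if row[feature] <= threshold else 0
print(predict([20, 8, 1]))  # → 1 (flagged as fake)
```

A real decision tree recurses this split on each side until the leaves are pure; libraries such as scikit-learn's `DecisionTreeClassifier` do exactly that at scale.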
A long-overdue technological revolution is at last under way
“IT IS possible to teach every branch of human knowledge with the motion picture,” observed Thomas Edison in 1913, predicting that books would soon be obsolete in the classroom. In fact the motion picture has had little effect on education. The same, until recently, was true of computers. Ever since the 1970s Silicon Valley’s visionaries have been claiming that their industry would change the schoolroom as radically as the office—and they have sold a lot of technology to schools on the back of that. Children use computers to do research, type essays and cheat. But the core of the system has changed little since the Middle Ages: a “sage on a stage” teacher spouting “lessons” to rows of students. Tom Brown and Huckleberry Finn would recognise it in an instant—and shudder.
Now at last a revolution is under way (see article). At its heart is the idea of moving from “one-size-fits-all” education to a more personalised approach, with technology allowing each child to be taught at a different speed, in some cases by adaptive computer programs, in others by “superstar” lecturers of one sort or another, while the job of classroom teachers moves from orator to coach: giving individual attention to children identified by the gizmos as needing targeted help. In theory the classroom will be “flipped”, so that more basic information is supplied at home via screens, while class time is spent embedding, refining and testing that knowledge (in the same way that homework does now, but more effectively). The promise is of better teaching for millions of children at lower cost—but only if politicians and teachers embrace it.
Parallella is a low-cost supercomputer designed by Adapteva, combining a Xilinx Zynq-7010/7020 SoC (an FPGA plus two Cortex-A9 cores) with Adapteva's 16- or 64-core Epiphany coprocessor. The project ran a successful Kickstarter campaign, which allowed them to offer the 16-core version for $99 and the 64-core version for $750. The boards will soon ship to Kickstarter backers, and one of the campaign's promises was to fully open source the platform. Today, they fulfilled that promise.
The temperature in Phoenix, Arizona hit 119 degrees (F) on the 29th of June, a new record for the date. The heat was so intense that it led to the cancellation of 18 regional flights at the airport (the aircraft used on those routes were restricted to temperatures no higher than 118 degrees).
The extreme heat is also playing havoc with the electrical grid in the US southwest, well before the routine late-August squeeze. With everyone in the region running their air conditioners full blast to avoid cooking (more tex-mex sous-vide in airtight homes than outdoor barbecue), there's barely enough power available to meet demand. And at peak loads, the electrical grid is much more likely to fail.
These are killer temperatures. And if the grid fails right now, it’s not just an inconvenience.
It quickly becomes a matter of life or death.
If you and your community are relatively unprepared, the only way to meet the challenge of a blackout during extreme heat is to band together as a community. Community action during times like this can dramatically reduce the death toll.
However, community action after a crisis hits isn’t the best approach.
The real resilient solution is to produce more locally.
In this case, the ability to produce energy locally and to use it effectively is the key to long term resilience. It can transform a killer blackout into a relatively minor event.
But, resilience like this requires investments at the household and community level, by people like you and me.
For example, if most of the homes in a community produced solar energy, electricity would not only be available when needed; production would peak at the very moment demand was most intense. Further, homes with battery backups and natural gas generators could continue to provide energy around the clock, and if the community were connected by a microgrid, a blackout could be avoided entirely.
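A back-of-the-envelope model makes the point concrete. All the numbers below are invented for illustration (a single home's hourly demand in kW on an extreme-heat day, rooftop-solar output, and a small battery); the sketch tracks how much load goes unserved when the grid is down:

```python
# Hypothetical hourly figures for one home, midnight to midnight (kW).
# Demand peaks in the afternoon (air conditioning); so does solar output.
demand = [0.5] * 6 + [1, 2, 3, 4, 4, 4, 4, 4, 4, 3, 2, 1] + [1.0] * 6
solar = [0] * 6 + [1, 2, 4, 6, 7, 8, 8, 7, 6, 4, 2, 1] + [0] * 6

battery, capacity = 5.0, 15.0  # kWh stored now / battery maximum
unserved = 0.0                 # kWh of load nothing could cover

for d, s in zip(demand, solar):
    net = d - s  # positive: draw from battery; negative: surplus to store
    if net > 0:
        drawn = min(net, battery)
        battery -= drawn
        unserved += net - drawn
    else:
        battery = min(capacity, battery - net)

print(unserved, battery)  # → 0.0 9.0: no blackout, battery still charged
```

Because the solar peak lines up with the demand peak, the battery only has to bridge the mild overnight hours, and in this toy scenario no load goes unserved at all. Across a microgrid, homes with generators could cover neighbors whose batteries run short.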
The only way this type of resilience gets built is if you and I build it, before disaster strikes.
So, let’s get going, before we are all cooked together.
Free information will be our doom, Quartz's Jaron Lanier asserts in “Free Information, as Great as It Sounds, Will Enslave Us All.” From high-frequency trading to online marketing, Lanier insists, big data is being used by those with the resources to collect and manipulate it to enrich themselves. Meanwhile, those of us with merely paltry personal devices are the ones creating the information, creating the value that fuels such systems. It is an argument that has been advanced before, and Lanier pursues the thread:
“Something seems terribly askew about how technology is benefiting the world lately. How could it be that so far the network age seems to be a time of endless austerity, jobless recoveries, loss of social mobility, and intense wealth concentration in markets that are anemic overall? How could it be that ever since the incredible efficiencies of digital networking have finally reached vast numbers of people that we aren’t seeing a broad benefit? . . .
“While people are created equal, computers are not. When people share information freely, those who own the best computers benefit in extreme ways that are denied to everyone else. Those with the best computers can simply calculate wealth and power away from ordinary people.”
See the article for its supporting arguments. Lanier does not leave us hanging for a potential solution. He recalls a suggestion he credits to Ted Nelson, which the IT pioneer made back in 1960: embed a “universal micropayment system” into any digital communication network, so that each individual who contributes any bit of data would get a bit of compensation in return. In that reality, for any tweet each of us sent, search query we made, or even security-camera image of us that was later used by any organization (for whatever purpose), we might become a few cents richer.
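The mechanics of such a scheme are easy to sketch. Everything below is hypothetical (the contributor names, data identifiers, and the two-cent rate are invented): a ledger remembers who contributed each piece of data, and every recorded use credits that contributor:

```python
# Toy sketch of a Nelson-style universal micropayment ledger.
# All names, identifiers, and the rate are invented for illustration.
from collections import defaultdict

RATE_CENTS = 2  # hypothetical credit per use of a piece of data

contributors = {}             # data_id -> who contributed it
balances = defaultdict(int)   # contributor -> accrued cents
usage_log = []                # audit trail of (data_id, organization)

def contribute(data_id, who):
    contributors[data_id] = who

def use(data_id, organization):
    """Record a use of the data and credit its original contributor."""
    usage_log.append((data_id, organization))
    balances[contributors[data_id]] += RATE_CENTS

contribute("tweet/123", "alice")
contribute("cam/456/frame9", "bob")

use("tweet/123", "ad-network")       # alice's tweet feeds ad targeting
use("tweet/123", "search-engine")    # ...and a search index
use("cam/456/frame9", "security-firm")  # bob's camera image gets analyzed

print(balances["alice"], balances["bob"])  # → 4 2 (cents)
```

The hard parts, of course, are not the ledger but attribution and enforcement at network scale, which is precisely what Nelson's proposal would have baked into the infrastructure from the start.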
Interesting idea; can it gain any traction before the current system is set in stone?
I spotted a blog post called “Could Palantir Technologies Be Raising Additional Funding?” I have no clue who or what is behind this interesting item. The main idea is that Palantir, a high-profile company which has been in the news over litigation and other matters, has already been well funded. According to Crunchbase, the company has raised more than $300 million. For the sake of comparison, Attivio and Coveo (both in the content processing space) have been able to drum up about $30 million in funding. Most of the companies in the search and content processing space, Digital Reasoning for instance, have garnered a fraction of what long-time players Attivio and Coveo have gathered. At the time of its sale to Oracle, Endeca, another content processing and intelligence vendor, was generating an estimated $150 million in revenues. At the time of its sale to Hewlett Packard, Autonomy was nosing into the $800 million range. But the key figure for Autonomy is that it sold to the prescient managers at HP for more than $10 billion.
Let’s assume that Palantir has received funding in the $300 million range. Let’s assume that the company is not raising any additional funding. Let’s assume that the company, founded in 2004, is going to pay back its investors, operate at a profit, and fund the research needed to keep its content processing system in step with competitors such as Cybertap.
So what does the gargantuan funding suggest to me, this fine, humid Sunday morning in rural Kentucky?