GI Wilson: Maps for Post-Sandy Recovery – Good, Bad, & Ugly – Comment by Robert Steele

Earth Intelligence, Geospatial, IO Mapping
Col GI Wilson, USMC (Ret)

We always have a map problem… I know you have known this for years and years and been the single voice in the map wilderness calling out… I wonder who got all those old Soviet maps after the fall… We have never solved this problem… we just think we have. Yes? No?

After Sandy, Intelligence Agencies Scramble To Feed Maps, Data To Rescuers

Colin Clark

AOL Government, 30 October 2012


As FEMA, firemen, police and the National Guard wade into the devastation visited upon us by Hurricane Sandy, many of them are using maps and other information made available to them by intelligence agencies.

While intelligence analysts and their technical specialists usually spend their time targeting bad guys and helping troops plan to get them, some of them have gotten the rare and welcome chance to help their own countrymen at home several times since Hurricane Katrina ravaged New Orleans.

The National Geospatial Intelligence Agency provides most of the support to civil authorities during disasters. It takes photos, infrared and other data from satellites and airplanes and builds them into remarkably detailed and accurate maps.

Read full article.

Continue reading “GI Wilson: Maps for Post-Sandy Recovery – Good, Bad, & Ugly – Comment by Robert Steele”

Patrick Meier: Hybrid Mergers of Crowdsourcing and Computers

Geospatial, P2P / Panarchy
Patrick Meier

The Limits of Crowdsourcing Crisis Information and The Promise of Advanced Computing

First, I want to express my sincere gratitude to the dozen or so iRevolution readers who recently contacted me. I have indeed not been blogging for the past few weeks, but this does not mean I have decided to stop blogging altogether. I've simply been ridiculously busy (and still am!). But I truly, truly appreciate the kind encouragement to continue blogging, so thanks again to all of you who wrote in.

Now, despite the (catchy?) title of this blog post, I am not bashing crowdsourcing or worshipping at the altar of technology. My purpose here is simply to suggest that the crowdsourcing of crisis information is an approach that does not scale very well. I have lost count of the number of humanitarian organizations that said they simply didn't have hundreds of volunteers available to manually monitor social media and create a live crisis map. Hence my interest in advanced computing solutions.

The past few months at the Qatar Computing Research Institute (QCRI) have made it clear to me that developing and applying advanced computing solutions to address major humanitarian challenges is anything but trivial. I have learned heaps about social computing, machine learning and big data analytics. So I am now more aware of the hurdles but am even more excited than before about the promise that advanced computing holds for the development of next-generation humanitarian technology.

The way forward combines both crowdsourcing and advanced computing. The next generation of humanitarian technologies will take a hybrid approach—at times prioritizing “smart crowdsourcing” and at other times leading with automated algorithms. I shall explain what I mean by smart crowdsourcing in a future post. In the meantime, the video above from my recent talk at TEDxSendai expands on the themes I have just described.
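The hybrid approach described above can be sketched in a few lines. This is an illustrative sketch only: the keyword scorer below stands in for a real machine-learning classifier, and the function names and confidence thresholds are hypothetical, not drawn from any actual QCRI system. The core idea is that the machine auto-labels messages it is confident about and routes the uncertain "gray zone" to human volunteers.

```python
# Hypothetical triage sketch: route crisis messages by classifier confidence.
# Confident items are auto-labeled; uncertain ones go to a human volunteer
# queue ("smart crowdsourcing"); clearly irrelevant ones are discarded.

CRISIS_TERMS = {"flood", "trapped", "collapsed", "evacuate", "injured"}

def score(message: str) -> float:
    """Toy relevance score: fraction of crisis terms present.
    A real system would use a trained classifier here."""
    words = set(message.lower().split())
    return len(words & CRISIS_TERMS) / len(CRISIS_TERMS)

def triage(messages, high=0.4, low=0.1):
    """Split messages into auto-labeled, human-review, and discarded bins."""
    auto_relevant, human_queue, discarded = [], [], []
    for msg in messages:
        s = score(msg)
        if s >= high:
            auto_relevant.append(msg)   # machine is confident
        elif s >= low:
            human_queue.append(msg)     # crowd verifies the gray zone
        else:
            discarded.append(msg)
    return auto_relevant, human_queue, discarded
```

The point of the design is that volunteer attention, the scarce resource Meier identifies, is spent only on the messages the algorithm cannot decide.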

Phi Beta Iota:  Dr. Meier, an absolute pioneer in crisis information management who leverages shared geospatial foundations, brilliantly innovative collaborative networks of open source software, and a melange of common hand-held cell phones, has bracketed two of the four pillars of advanced intelligence. The other two are the whole-system model that assumes nothing, and the true-cost documentation that assumes nothing.

See Also:

21st Century Intelligence Core References 2007-2013

Yoda: Why OpenStreetMap Worries Old Industry (and Old Government?) — the End of Intellectual “Property” and Rise of “Value Added”

Geospatial
Got Crowd? BE the Force!

The New Cartographers: Why OpenStreetMap Worries Tech Companies

Nevermind Apple’s maps misfire, the free, volunteer-made OpenStreetMap may end up reigning supreme anyway, as companies increasingly choose it for map data over Google. But as the project grows, it’s becoming harder and harder for its members to agree on what direction to go in next. Part 2 of a 3-part series. Read part 1 here.

“There is literally not a mapping company in the world that doesn’t use OpenStreetMap in some capacity,” said Steve Coast, founder of the free, crowdsourced world map, in his keynote address to some 224 passionate geography junkies at the second annual State of the Map USA conference in Portland, Oregon, on October 13.

Already, in the last year alone, some of the biggest names in the tech sector have switched from Google Maps — which began charging for heavy use of its data in January 2012 — to OpenStreetMap (OSM) to power their map apps or websites.
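For context on what "powering a map app with OSM" means in practice: OSM-compatible servers deliver map imagery as 256-pixel tiles addressed by a standard zoom/x/y scheme (the "slippy map" convention), so switching providers is largely a matter of pointing at a different tile URL. A minimal sketch of the coordinate-to-tile conversion, using the documented Web Mercator formulas; the base URL is OSM's public tile server, which has usage limits and should not be hit in bulk:

```python
import math

def deg2tile(lat_deg: float, lon_deg: float, zoom: int):
    """Convert WGS84 lat/lon to OSM 'slippy map' tile indices (x, y)
    using the standard Web Mercator tiling formulas."""
    lat = math.radians(lat_deg)
    n = 2 ** zoom                       # tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

def tile_url(lat, lon, zoom, base="https://tile.openstreetmap.org"):
    """Build a tile URL in the {base}/{z}/{x}/{y}.png pattern that
    OSM-compatible servers share."""
    x, y = deg2tile(lat, lon, zoom)
    return f"{base}/{zoom}/{x}/{y}.png"
```

Because every OSM-compatible provider accepts the same {z}/{x}/{y} addressing, an app can swap Google's tiles for OSM-based ones by changing `base` alone, which is part of why the migration the article describes has been so rapid.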

Read full article.

Continue reading “Yoda: Why OpenStreetMap Worries Old Industry (and Old Government?) — the End of Intellectual “Property” and Rise of “Value Added””

Patrick Meier: Could Twitris+ Be Used for Disaster Response (and Other Apps?)

Geospatial
Patrick Meier

Could Twitris+ Be Used for Disaster Response?

I recently had the pleasure of speaking with Hemant Purohit and colleagues who have been working on an interesting semantic social web application called Twitris+. A project of the Ohio Center of Excellence in Knowledge-enabled Computing (Kno.e.sis), Twitris+ uses “real-time monitoring and multi-faceted analysis of social signals to provide insights and a framework for situational awareness, in-depth event analysis and coordination, emergency response aid, reputation management etc.”

Unlike many other social media platforms I’ve reviewed over recent months, Twitris+ geo-tags content at the tweet-level rather than at the bio level. That is, many platforms simply geo-code tweets based on where a person says s/he is as per their Twitter bio. Accurately and comprehensively geo-referencing social media content is of course no trivial matter. Since many tweets do not include geographic information, colleagues at GeoIQ are seeking to infer geographic information after analyzing a given stream of tweets, for example.
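The fallback logic described above can be sketched as a simple priority chain. Everything here is illustrative: the function name, the dictionary shapes, and the gazetteer are hypothetical and do not reflect Twitris+'s or GeoIQ's actual APIs.

```python
# Hypothetical geo-referencing sketch: prefer precise tweet-level
# coordinates, fall back to the author's self-reported profile location,
# and otherwise flag the tweet for content-based location inference.

def geolocate(tweet: dict, profile_gazetteer: dict) -> dict:
    """Return a best-effort location with its provenance ('source')."""
    coords = tweet.get("coordinates")
    if coords is not None:
        # Tweet-level geo-tag: the most precise signal available.
        return {"lat": coords[0], "lon": coords[1], "source": "tweet"}
    loc = tweet.get("user_location", "").strip().lower()
    if loc in profile_gazetteer:
        # Bio-level fallback: where the user *says* they are.
        lat, lon = profile_gazetteer[loc]
        return {"lat": lat, "lon": lon, "source": "profile"}
    # No usable metadata: defer to analysis of the tweet stream itself.
    return {"lat": None, "lon": None, "source": "infer-from-content"}
```

Recording the `source` alongside the coordinates matters for crisis mapping, since a tweet-level tag and a stale bio location deserve very different levels of trust.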

Read full post with graphic.

See Also:

Patrick Meier at Phi Beta Iota

Mini-Me: Earthquakes, East Coast, Fukushima Redux + Meta-RECAP

03 Environmental Degradation, 07 Other Atrocities, 08 Proliferation, 10 Transnational Crime, Communities of Practice, Corruption, Geospatial, Policies, Politics
Who? Mini-Me?

Huh?

In Japan during the 2011 earthquake and tsunami? Radiation exposure estimates now available

The Pentagon says that none of the nearly 70,000 members of the DoD-affiliated population (service members, DoD civilian employees and contractors, and family members of service members and civilian employees) who were on or near the mainland of Japan between March 12 and May 11, 2011, are known to have been exposed to radiation at levels associated with adverse medical conditions.

East Coast earthquake created a ‘new normal’

The quake was centered 3 to 4 miles beneath Mineral, a town of fewer than 500 people about 50 miles northwest of Richmond. Yet it was believed to have been felt by more people than any other in U.S. history.

Last Year’s Quake Shook Up Virginia Nukes

It was the first time ever in this country that a nuclear power station had gone through an emergency shutdown because of an earthquake. In this case it was a rare 5.8 magnitude seismic event with an epicenter a few miles away that ruined Louisa County school buildings, cracked the Washington Monument, and shook the North Anna plant beyond what it was designed to deal with.

Why quake forecast maps often fail


Three of the largest and deadliest earthquakes in recent history occurred where earthquake hazard maps didn’t predict massive quakes, scientists say. A combination of bad assumptions, bad data, bad physics, and bad luck explains why the hazard maps failed.

Earthquake Damage: Are Bad Maps to Blame?

A new study argues that earthquake-hazard maps didn't give engineers and seismologists a full picture of several recent quakes' dangers.

Study links fracking and earthquakes

In hydraulic fracturing, or fracking, millions of gallons of water, mixed with sand and chemicals, are injected into rock thousands of feet underground to extract natural gas. Frohlich said the most likely explanation for the quakes is that once injected, the fluids apply pressure to faults in the area and unstick them.

Fukushima Hangs by the Devil’s Thread

The molten cores at Units 1, 2 & 3 have threatened all life on Earth. The flood of liquid radiation has poisoned the Pacific. Fukushima’s cesium and other airborne emissions have already dwarfed Three Mile Island, Chernobyl and all nuclear explosions including Hiroshima and Nagasaki.

But at Unit 4, more than 1500 rods remain suspended in air. Called “a bathtub on the roof” by CNN anchor John King, the damaged pool teeters atop a building decimated by seismic shocks and at least one hydrogen explosion. The question is not if, but when it will come crashing down.

See Also:

Continue reading “Mini-Me: Earthquakes, East Coast, Fukushima Redux + Meta-RECAP”

Patrick Meier: Innovation and the State of the Humanitarian System + RECAP

Geospatial, IO Deeds of Peace, IO Impotency
Patrick Meier

Innovation and the State of the Humanitarian System

Published by ALNAP, the 2012 State of the Humanitarian System report is an important evaluation of the humanitarian community’s efforts over the past two years. “I commend this report to all those responsible for planning and delivering life-saving aid around the world,” writes UN Under-Secretary General Valerie Amos in the Preface. “If we are going to improve international humanitarian response we all need to pay attention to the areas of action highlighted in the report.” Below are some of the highlighted areas from the 100+ page evaluation that are ripe for innovative interventions.

Accessing Those in Need

Operational access to populations in need has not improved. Access problems continue and are primarily political or security-related rather than logistical. Indeed, “UN security restrictions often place severe limits on the range of UN-led assessments,” which means that “coverage often can be compromised.” This means that “access constraints in some contexts continue to inhibit an accurate assessment of need. Up to 60% of South Sudan is inaccessible for parts of the year. As a result, critical data, including mortality and morbidity, remain unavailable. Data on nutrition, for example, exist in only 25 of 79 countries where humanitarian partners have conducted surveys.”

Could satellite and/or aerial imagery be used to measure indirect proxies? This would certainly be rather imperfect, but perhaps better than nothing? Could crowdseeding be used?

Information and Communication Technologies

Continue reading “Patrick Meier: Innovation and the State of the Humanitarian System + RECAP”

Patrick Meier: Geo-Fencing and Grass Roots Emergency Communications and Geo-Referenced Sense-Making

Advanced Cyber/IO, Geospatial
Patrick Meier

How People in Emergencies Use Communication to Survive

“Still Left in the Dark? How People in Emergencies Use Communication to Survive — And How Humanitarian Agencies Can Help” is an excellent report published by the BBC World Service Trust earlier this year. It is a follow up to the BBC’s 2008 study “Left in the Dark: The Unmet Need for Information in Humanitarian Emergencies.” Both reports are absolute must-reads. I highlight the most important points from the 2012 publication below.

Are Humanitarians Being Left in the Dark?

Continue reading “Patrick Meier: Geo-Fencing and Grass Roots Emergency Communications and Geo-Referenced Sense-Making”