Patrick Meier: Live Crisis Map of Disaster Damage Reported on Social Media

Crowd-Sourcing, Data, Design, Geospatial, Governance, Innovation, P2P / Panarchy, Resilience
Patrick Meier

Live Crisis Map of Disaster Damage Reported on Social Media

Digital humanitarian volunteers have been busy tagging images posted to social media in the aftermath of Typhoon Yolanda. More specifically, they’ve been using the new MicroMappers ImageClicker to rate the level of damage they see in each image. Thus far, they have clicked over 7,000 images. Those tagged as “Mild” or “Severe” damage are then geolocated by members of the Standby Volunteer Task Force (SBTF), who have partnered with GISCorps and ESRI to create this live Crisis Map of the disaster damage tagged using the ImageClicker. The map takes a few seconds to load, so please be patient.

YolandaPH Crisis Map 1

The more pictures that are clicked using the ImageClicker, the more populated this crisis map will become. So please help out if you have a few seconds to spare—that’s really all it takes to click an image. If there are no pictures left to click or the system is temporarily offline, then please come back a little later, as we’re uploading images around the clock. And feel free to join our list-serve in the meantime if you wish to be notified when humanitarian organizations need your help in the future. No prior experience or training is necessary. Anyone who knows how to use a computer mouse can become a digital humanitarian.

The SBTF, GISCorps and ESRI are members of the Digital Humanitarian Network (DHN), which my colleague Andrej Verity and I co-founded last year. The DHN serves as the official interface for direct collaboration between traditional “brick-and-mortar” humanitarian organizations and highly skilled digital volunteer networks. The SBTF Yolanda Team, spearheaded by my colleague Justine Mackinnon, for example, has also produced this map based on the triangulated results of the TweetClicker:

YolandaPH Crisis Map 2

There’s a lot of hype around the use of new technologies and social media for disaster response. So I want to be clear that our digital humanitarian operations in the Philippines have not been perfect. This means that we’re learning (a lot) by doing (a lot). Such is the nature of innovation. We don’t have the luxury of locking ourselves up in a lab for a year to build the ultimate humanitarian technology platform. This means we have to work extra, extra hard when deploying new platforms during major disasters—because not only do we do our very best to carry out Plan A, but we often have to carry out Plans B and C in parallel just in case Plan A doesn’t pan out. Perhaps Samuel Beckett summed it up best: “Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.”

Patrick Meier: Digital Humanitarians: From Haiti Earthquake to Typhoon Yolanda

Crowd-Sourcing, Data, Design, Geospatial, Governance, Innovation, Mobile, P2P / Panarchy, Resilience
Patrick Meier

Digital Humanitarians: From Haiti Earthquake to Typhoon Yolanda

We’ve been able to process and make sense of a quarter of a million tweets in the aftermath of Typhoon Yolanda. Using both AIDR (still under development) and Twitris, we were able to collect these tweets in real-time and use automated algorithms to filter for both relevancy and uniqueness. The resulting ~55,000 tweets were then uploaded to MicroMappers (still under development). Digital volunteers from the world over used this humanitarian technology platform to tag tweets and now images from the disaster (click image below to enlarge). At one point, volunteers tagged some 1,500 tweets in just 10 minutes. In parallel, we used machine learning classifiers to automatically identify tweets referring to both urgent needs and offers of help. In sum, the response to Typhoon Yolanda is the first to make full use of advanced computing, i.e., both human computing and machine computing to make sense of Big (Crisis) Data.
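
To give a concrete (if highly simplified) sense of what the automated filtering step involves, here is a minimal sketch in Python. It is not AIDR or Twitris code; the keyword list, normalization rules and sample tweets are illustrative assumptions only.

```python
# Minimal sketch of relevancy + uniqueness filtering for a tweet stream.
# Not the actual AIDR/Twitris pipeline; keywords and rules are assumptions.
import re

RELEVANCE_KEYWORDS = {"yolanda", "haiyan", "typhoon", "damage", "flood",
                      "trapped", "casualties", "relief", "evacuate", "help"}

def normalize(text):
    """Lowercase and strip URLs, mentions and punctuation so near-duplicates collide."""
    text = re.sub(r"https?://\S+|@\w+", "", text.lower())
    return " ".join(re.findall(r"[a-z']+", text))

def is_relevant(text):
    """Crude keyword test; a production system would use a trained classifier."""
    return bool(set(normalize(text).split()) & RELEVANCE_KEYWORDS)

def filter_stream(tweets):
    """Yield tweets that are relevant and not near-duplicates of earlier ones."""
    seen = set()
    for tweet in tweets:
        key = normalize(tweet)
        if key and key not in seen and is_relevant(tweet):
            seen.add(key)
            yield tweet

sample = [
    "Severe damage in Tacloban after #YolandaPH, families trapped on rooftops",
    "Severe damage in Tacloban after #YolandaPH, families trapped on rooftops http://t.co/x",
    "Having coffee, lovely morning",
]
print(list(filter_stream(sample)))  # only the first tweet survives
```

In the actual deployment, this kind of automated filtering is what reduced the quarter of a million collected tweets to the ~55,000 that were passed on to volunteers.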

ImageClicker YolandaPH

We’ve come a long way since the tragic Haiti Earthquake. There was no way we would’ve been able to pull off the above with the Ushahidi platform. We weren’t able to keep up with even a few thousand tweets a day back then, not to mention images. (Incidentally, MicroMappers can also be used to tag SMS). Furthermore, we had no trained volunteers on standby back when the quake struck. Today, not only do we have a highly experienced network of volunteers from the Standby Volunteer Task Force (SBTF) who serve as first (digital) responders, we also have an ecosystem of volunteers from the Digital Humanitarian Network (DHN). In the case of Typhoon Yolanda, we also had a formal partner, the UN Office for the Coordination of Humanitarian Affairs (OCHA), that officially requested digital humanitarian support. In other words, our efforts are directly in response to clearly articulated information needs. In contrast, the response to Haiti was “supply based” in that we simply pushed out all information that we figured might be of use to humanitarian responders. We did not have a formal partner from the humanitarian sector going into the Haiti operation.

Yolanda Prezi

What this new digital humanitarian operation makes clear is that preparedness, partnerships & appropriate humanitarian technology go a long way to ensuring that our efforts as digital humanitarians add value to the field-based operations in disaster zones. The above Prezi by SBTF co-founder Anahi (click on the image to launch the presentation) gives an excellent overview of how these digital humanitarian efforts are being coordinated in response to Yolanda.

While there are many differences between the digital response to Haiti and Yolanda, several key similarities have also emerged. First, neither was perfect, meaning that we learned a lot in both deployments; taking a few steps forward, then a few steps back. Such is the path of innovation: learning by doing. Second, as in Haiti, there’s no way we could do this digital response work without Skype. Third, our operations were affected by telecommunications going offline in the hardest-hit areas. We saw an 18.7% drop in relevant tweets on Saturday compared to the day before, for example. Fourth, while the (very) new technologies we are deploying are promising, they are still under development and have a long way to go. Fifth, the biggest heroes in the response to Haiti were the volunteers—both from the Haitian Diaspora and beyond. The same is true of Yolanda, with hundreds of volunteers from the world over (including the Philippines and the Diaspora) mobilizing online to offer assistance.

A Filipino humanitarian worker in Quezon City, Philippines, for example, is volunteering her time on MicroMappers. As is a customer care advisor from Eurostar in the UK and a fire officer from Belgium who recruited his uniformed colleagues to join the clicking. We have other volunteer Clickers from Makati (Philippines), Cape Town (South Africa), Canberra & Gold Coast (Australia), Berkeley, Brooklyn, Citrus Heights & Hinesburg (US), Kamloops (Canada), Paris & Marcoussis (France), Geneva (Switzerland), Sevilla (Spain), Den Haag (Holland), Munich (Germany) and Stokkermarke (Denmark), to name just a few! So this is as much a human story as it is one about technology. This is why online communities like MicroMappers are important. So please join our list-serve if you want to be notified when humanitarian organizations need your help.

Patrick Meier: Big Data & Disaster Response: Even More Wrong Assumptions

Crowd-Sourcing, Data, Design, Geospatial, Governance, Mobile, P2P / Panarchy
Patrick Meier

Big Data & Disaster Response: Even More Wrong Assumptions

“Arguing that Big Data isn’t all it’s cracked up to be is a straw man, pure and simple—because no one should think it’s magic to begin with.” Since citing this point in my previous post, Big Data for Disaster Response: A List of Wrong Assumptions, I’ve come across more mischaracterizations of Big (Crisis) Data. Most of these fallacies originate in the Ivory Tower, from social scientists who have carried out one or two studies on the use of social media during disasters and repeat their findings ad nauseam as if their conclusions were the final word on a very new area of research.

The mischaracterization of “Big Data and Sample Bias”, for example, typically arises when academics point out that marginalized communities do not have access to social media. First things first: I highly recommend reading “Big Data and Its Exclusions,” published by Stanford Law Review. While the piece does not address Big Crisis Data, it is nevertheless instructive when thinking about social media for emergency management. Secondly, identifying who “speaks” (and who does not speak) on social media during humanitarian crises is of course imperative, but that’s exactly why the argument about sample bias is such a straw man—all of my humanitarian colleagues know full well that social media reports are not representative. They live in the real world where the vast majority of data they have access to is unrepresentative and imperfect—hence the importance of drawing on as many sources as possible, including social media. Random sampling during disasters is a Quixotic luxury, which explains why humanitarian colleagues seek “good enough” data and methods.

Continue reading “Patrick Meier: Big Data & Disaster Response: Even More Wrong Assumptions”

Patrick Meier: Mining Mainstream Media for Emergency Management 2.0

Data, Design, Innovation
Patrick Meier

Mining Mainstream Media for Emergency Management 2.0

by Patrick Meier

There is so much attention (and hype) around the use of social media for emergency management (SMEM) that we often forget about mainstream media when it comes to next generation humanitarian technologies. The news media across the globe has become increasingly digital in recent years—and thus analyzable in real-time. Twitter added little value during the recent Pakistan Earthquake, for example. Instead, it was the Pakistani mainstream media that provided the immediate situational awareness necessary for a preliminary damage and needs assessment. This means that our humanitarian technologies need to ingest both social media and mainstream media feeds. 

Newspaper-covers

Now, this is hardly revolutionary. I used to work for a data mining company ten years ago that focused on analyzing Reuters Newswires in real-time using natural language processing (NLP). This was for a conflict early warning system we were developing. The added value of monitoring mainstream media for crisis mapping purposes has also been demonstrated repeatedly in recent years. In this study from 2008, I showed that a crisis map of Kenya was more complete when sources included mainstream media as well as user-generated content.

So why revisit mainstream media now? Simple: GDELT, the Global Database of Events, Language and Tone, which my colleague Kalev Leetaru launched earlier this year. GDELT is the single largest public and global event-data catalog ever developed. Digital Humanitarians need no longer monitor mainstream media manually. We can simply develop a dedicated interface on top of GDELT to automatically extract situational awareness information for disaster response purposes. We're already doing this with Twitter, so why not extend the approach to global digital mainstream media as well?
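
As a rough illustration of what such an interface might start from, here is a minimal Python sketch that pulls one of GDELT's daily event export files and keeps only events geolocated to a country of interest. The URL pattern, column positions and FIPS country code are assumptions that should be checked against GDELT's own documentation before use.

```python
# Sketch only: fetch a GDELT daily event export and filter by country.
# URL pattern and column indices are assumptions; verify against the GDELT codebook.
import csv
import io
import urllib.request
import zipfile

GDELT_DAILY = "http://data.gdeltproject.org/events/{date}.export.CSV.zip"  # assumed pattern

# Assumed 0-based positions in the tab-delimited export (check the codebook).
COL_EVENT_CODE = 26            # CAMEO event code
COL_ACTION_COUNTRY = 51        # ActionGeo_CountryCode (FIPS)
COL_ACTION_LAT, COL_ACTION_LON = 53, 54

def fetch_events(date, country_fips="RP"):  # "RP" = Philippines in FIPS 10-4
    """Yield (event_code, lat, lon) for events geolocated to the given country."""
    with urllib.request.urlopen(GDELT_DAILY.format(date=date)) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
        name = archive.namelist()[0]
        rows = csv.reader(io.TextIOWrapper(archive.open(name), encoding="latin-1"),
                          delimiter="\t")
        for row in rows:
            if len(row) > COL_ACTION_LON and row[COL_ACTION_COUNTRY] == country_fips:
                yield row[COL_EVENT_CODE], row[COL_ACTION_LAT], row[COL_ACTION_LON]

# Example: events recorded shortly after Yolanda made landfall.
# for event in fetch_events("20131110"):
#     print(event)
```

A real interface would obviously do far more—deduplicate events, map CAMEO codes to humanitarian categories, and plot the results—but the point is that the raw material is already public and machine-readable.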

Continue reading “Patrick Meier: Mining Mainstream Media for Emergency Management 2.0”

Stephen E. Arnold: Better Data Is Out There [Just Not From the US Government, or the Banks, or the Corporations, or Most Universities and Media….]

Data
Stephen E. Arnold

Better Data Is Out There

Many have been operating under the assumption that the digital age provides us with reliable and accurate information. David Soloff noticed that this was incorrect when he compared grocery store prices against a government claim that they had dropped for the first time in more than half a century; he found that prices had in fact increased by 5%. Because people are relying on flawed data, Soloff founded Premise Data Corp. to sell better data. SFGate details Soloff in “Google-Backed Startup Seeks Clearer Economic Signals Through Better, Faster, Stronger Data.” Backing the company are Google Ventures, Andreessen Horowitz and Harrison Metal.

Premise gathers data with a “global Internet trawl” that reads data from the Internet, as well as using the old-fashioned approach of sending people into the field. The company plans to sell its “better” data to financial institutions, packaged goods companies, and government and international organizations. So far its only customer is Bloomberg, but landing a big name like that is not a bad start.

John Morgan, an economist at UC Berkeley, does not think it will be as easy to collect data as Premise hopes. He points out that governments change data for their own political aims and that stores are not too keen on having people take photos of their wares. These are obvious observations, but Morgan goes on to say that not many people are going to want to buy Premise’s product:

“Meanwhile, he’s dubious that many consumer product companies will pay for this information because there are already many reliable sources on pricing for packaged goods. He’s also doubtful governments will be in the market for this information because they’ll insist on control over the collection and analysis. Morgan said the remaining question is whether Premise can earn a comfortable profit supplying tools to remaining potential customers, such as financial institutions, while paying a worldwide army of data collectors.”

It looks like we will have a choice of data vendors in the future. Who provides the best data? Who is going to provide Google with the better results? A new market has just opened up, and Wall Street has not caught on yet.

Whitney Grace, October 29, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

Patrick Meier: World Disasters Report: Next Generation Humanitarian Technology

Crowd-Sourcing, Data, Design, Economics/True Cost, Governance
Patrick Meier

World Disasters Report: Next Generation Humanitarian Technology

This year’s World Disasters Report was just released this morning. I had the honor of authoring Chapter 3 on “Strengthening Humanitarian Information: The Role of Technology.” The chapter focuses on the rise of “Digital Humanitarians” and explains how “Next Generation Humanitarian Technology” is used to manage Big (Crisis) Data. It complements the groundbreaking report “Humanitarianism in the Network Age,” published by UN OCHA earlier this year.

Learn more, includes video.

Patrick Meier: Humanitarian Crisis Computing 101

Crowd-Sourcing, Data, Governance
Patrick Meier

Humanitarian Crisis Computing 101

Disaster-affected communities are increasingly becoming “digital” communities. That is, they increasingly use mobile technology & social media to communicate during crises. I often refer to this user-generated content as Big (Crisis) Data. Humanitarian crisis computing seeks to rapidly identify informative, actionable and credible content in this growing stack of real-time information. The challenge is akin to finding the proverbial needle in the haystack, since the vast majority of reports posted on social media are often not relevant for humanitarian response. This is largely a result of the demand versus supply problem described here.
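
A minimal sketch of the machine side of this triage, assuming a small set of volunteer-labelled tweets is available to train on: the examples and labels below are invented for illustration, and a real system such as AIDR learns from thousands of crowd-labelled tweets per event.

```python
# Sketch: train a tiny text classifier on labelled tweets, then triage new ones.
# Training examples and labels are invented; this is not AIDR's actual model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

labelled_tweets = [
    ("Roads to Tacloban blocked, urgent need for clean water and medicine", "informative"),
    ("Families stranded on rooftops near the coast, please send rescue boats", "informative"),
    ("Thoughts and prayers for everyone affected by the typhoon", "not_informative"),
    ("Watching the news coverage of the storm tonight", "not_informative"),
]

texts, labels = zip(*labelled_tweets)
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(texts, labels)

incoming = ["Urgent need for clean water and medical supplies in Ormoc"]
print(model.predict(incoming))  # likely: ['informative']
```

In practice the classifier’s labels come from the crowd itself: volunteers’ clicks double as training data, which is what lets the machine take over more of the tagging as a deployment progresses.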

 . . . . . . . . .

The smaller the micro-stack, the easier the tasks and the faster they can be carried out by a greater number of volunteers. For example, instead of having 10 people classify 10,000 tweets based on the Cluster System, microtasking makes it very easy for 1,000 people to classify 10 tweets each. The former would take hours while the latter takes mere minutes. In response to the recent earthquake in Pakistan, some 100 volunteers used MicroMappers to classify 30,000+ tweets in about 30 hours, for example.
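
The arithmetic is easy to see in code. This toy sketch (not MicroMappers itself) simply splits a large pile of tweets into fixed-size micro-tasks:

```python
# Toy illustration of microtasking: split 10,000 items into tasks of 10.
def make_microtasks(items, batch_size=10):
    """Split a list of items into fixed-size micro-tasks."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

tweets = [f"tweet {i}" for i in range(10_000)]
tasks = make_microtasks(tweets, batch_size=10)
print(len(tasks))  # 1,000 micro-tasks of 10 tweets each

# 10 volunteers working through all 10,000 tweets would take hours;
# 1,000 volunteers doing one micro-task each are done in minutes.
```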

Read full post with utterly brilliant photographs that make all this clear.