As part of my participation in the TechChange Course “Tech Tools and Skills for Emergency Management”, I conducted an interview with colleague Adam Papendieck at Tulane University’s Disaster Resilience Leadership Academy. TechChange posted the interview on their own blog this morning, but I thought I would share it here as well.

Mini Biography

Adam Papendieck has an MPH from Tulane University and a technical background in GIS, Statistics and Information Systems.  He is currently the Sr. Program Manager for Technology at the Payson Center for International Development at Tulane University, where his role is to leverage appropriate and innovative information technologies in support of research projects, funded Public Health capacity-building projects in East Africa, and crisis informatics activities with the Disaster Resilience Leadership Academy.  He has worked on applied ICT activities such as the creation of a dynamic web mapping application for the World Vision US corporate information portal, the design and implementation of open source thin client computer labs in Rwanda, the creation of e-learning platforms at African institutions of higher education, various crisis mapping initiatives and disaster analytics activities for the Gulf Oil Spill, Hurricane Katrina and other events.

Interview Notes

I spoke with Adam today (Monday, September 26, 2011) about the technologies we’ve encountered in the TC103 course, his views on some of them, and where he sees the future of the field going. Adam is particularly interested in crowdsourcing and has experience working with Ushahidi, both on the development/applied side for the Gulf Oil Spill last year, and on the evaluation side following the earthquake in Haiti. Here are some of Adam’s (paraphrased) comments on the field, the merits and pitfalls of some of these technologies, and his vision for the future:

Crisis Informatics and Social Networking

We’re in a place where the only people who used to know things in crisis situations were the groups responsible for creating Situation Reports and using that information to guide response, such as those in government, the BINGOs, etc. But now, groups like Ushahidi and CrisisMappers, whether or not they and their specific, individual technologies will still be around in five years, have opened us up to the idea that extra-institutional groups, organized around technology skill sets and different motivations, can have a real impact on what happens post-crisis, and even pre-crisis. And that’s happening; they’re having that impact.

This goes back to social networking in terms of lubricating this system, making it easy for these people to get together over Skype, Facebook, Twitter, etc. Some of the individuals may have connections with what’s happening on the ground in a crisis, and they get together and organize themselves with these social networking tools and collaborative development and writing platforms. These groups are now nimble and organized enough to quickly present something in a relatively polished format that competes with old-fashioned information systems. We have the old existing structure of sector responders, and now there’s a new structure thanks to groups like Ushahidi and CrisisMappers.

A Period of Rapid Innovation

We’re in a period of innovation; people are having ideas and trying everything. Ushahidi is a good example: it has characteristics that, years down the line, will make it be seen as a very important evolutionary link to wherever we’re going. Unfortunately, there are misguided implications and applications of that kind of software. It’s being used in many different sectors, such as neighborhood monitoring, crime mapping and health, with varying degrees of success and effort being put into it. It’s important that you now see a new appreciation for evaluating these crisis maps and learning from them. The groups that are serious about advancing the technology, like Ushahidi, are honestly interested in evaluation, and in turning around to incorporate that evaluation data and user feedback so they can develop further, really looking at their strengths and weaknesses.

Quality & Reliability of Data

Assessing the quality of data has a lot in common with futures speculation; there is so much information to sort through that trying to identify patterns and assess the reliability of information is very difficult. It’s possible to use computational models, or to use Delphi methods to convene panels of experts who can field a stream of information efficiently enough to plug into it quickly, assess it rapidly, identify what’s important, and get it back out quickly. There’s real interest in streamlining the QA of these new crowdsourcing systems, because it’s the higher-level QA that’s missing right now.

Leveraging the Crowd

There’s a lot of purchase in the realm of disaster resilience and crisis informatics, covering new ground in volunteer mobilization. The most exciting thing to me is crowdsourcing: it’s new and disruptive, and it’s the new information source. The pool is there, and it’s a matter of figuring out how best to use it so it can have the impact our more formal assessments have, really leveraging the crowd. That means somehow getting the crowd to provide good information, being careful that you have real information feedback loops set up, and working to incentivize the input of data. There’s a lot of goodwill in this community to get things going, but to sustain it for the long term, and even for monitoring, you have to identify subsets of the crowd of users who have something at stake, and you have to make sure that the information that comes back is represented in ways they appreciate, so that their participation is incentivized.

Going back to data quality, trustworthiness is also an issue. Offshoot projects like SwiftRiver are designed to be solutions by crowdsourcing the quality-assurance piece, but there are other options, like computational models (text analysis, for example) in which the data submitted by different people is weighted somehow based on their reliability, instead of relying wholly on volunteers who don’t know the specifics or the context. So it’s interesting to think of applying computational linguistics to assess validity. Another approach is “controlling the crowd”: limiting its size and the people who are inputting, which in some ways goes against the original idea of crowdsourcing. If you only allow certain people to submit to the common database, you’re restricting users to a higher level of expertise in order to ensure good-quality data, to determine in what context data should be classified, and so on. This is as opposed to the “Twitter model”, where you’re basically just screening through hashtags to find what’s real and what’s not, but you can’t control the source.
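To make the reliability-weighting idea above concrete, here is a minimal sketch in Python. It is purely illustrative, not part of Ushahidi or SwiftRiver: the function names, the confirmed/total history format, and the smoothing constants are all assumptions. The idea is simply that a report from a submitter whose past reports were often confirmed carries more weight than one from an unknown source.

```python
# Toy sketch (all names hypothetical): weight crowdsourced reports by each
# submitter's historical reliability, rather than treating every report equally.

def reliability_weight(confirmed: int, total: int,
                       prior: float = 0.5, prior_n: int = 2) -> float:
    """Smoothed fraction of a submitter's past reports that were later confirmed.

    The prior pulls unknown submitters toward 0.5 so a single lucky (or unlucky)
    report doesn't dominate their score.
    """
    return (confirmed + prior * prior_n) / (total + prior_n)

def score_report(submitter_history: dict, submitter: str) -> float:
    """Score a new report by its submitter's track record (unknown -> prior)."""
    confirmed, total = submitter_history.get(submitter, (0, 0))
    return reliability_weight(confirmed, total)

# A submitter with 9 of 10 past reports confirmed outweighs an unknown source.
history = {"local_fisher": (9, 10), "anonymous": (0, 0)}
assert score_report(history, "local_fisher") > score_report(history, "anonymous")
```

A real system would of course learn these weights from richer signals (text analysis, cross-report corroboration), but even this simple smoothed ratio captures the shift from "trust the crowd wholesale" to "weight the crowd by track record".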

The Value of a New Contextual Awareness

There’s been a lot of criticism, and in some ways there is a lot to criticize, about initiatives like 4636 in Haiti, because it’s so hard to identify the real impact of those activities. But there’s a clear gain in contextual awareness going on that is so much more than what we get through the news, official press releases and government statements, and that’s valuable. It’s different information, coming from someone who is having a different experience than a journalist or an NGO worker or a government employee, and that’s a good awareness and perspective to have: an authentically new contextual awareness. And it IS valuable.

Where it’s difficult is where we’re trying to use this tool to have an impact for the victims of an earthquake, and in order to do that we have the traditional sectoral response operation, with clear roles, established ways of doing things, certain ways of exchanging information and certain ways that they don’t. You can’t expect crowdsourced information initiatives to situate easily in that environment and immediately be successful, because the existing response operation is not that well oiled to begin with. The mission for crowdsourced information tools is to integrate with and respect the existing sectoral response community, and to show how they can add value to the existing structure. They have the potential to be very disruptive once there’s respect for the way this information can add to contextual awareness and databases; and because this information is OPEN, it will lead to redundancies and an appreciation for shared open databases, which will be new and happening and exciting. But we’re not quite there yet.

Oil Spill Crisis Map

Thinking about the Ushahidi project for the Gulf Oil Spill Map with the Louisiana Bucket Brigade, and how we’d change things the next time around, I’d want to work on the last leg of the information feedback loop. We got information, it fed into the system, we mapped it, we got this contextual awareness, we were classifying things well, and data organization was good (easier for this crisis because of its scale and focus on environmental issues); we did a lot of things right. But we stopped short of grabbing that data, packaging it and injecting it into the responding agencies. We could have done it. It wasn’t a technology issue, it was one of manpower: we needed a group, or at least someone, who had screened the reports, gathered them, put them in something more low-tech and accessible like an email or a PowerPoint, and sent it to the wildlife protection agencies who were responding. That’s the human-power approach; this is what text analysis is supposed to solve by dynamically creating these mechanisms, replacing a human link in the information feedback loop. We did a little of that, but what would really have completed the project is if the information could have been read through, assessed (quality checked) and used by responders, and the people on the ground, the sources of that data, would KNOW that it was useful; there would be a demonstrated response to the participation of the people on the ground who sent in information. You want that impact to be as transparent as possible, and participation is absolutely key: people want to see things happening from their texts, and if they don’t, they’re not going to use it next time.
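The "last leg" described above is mostly a plumbing problem: take already-screened reports and package them into something low-tech that a responding agency will actually read. The sketch below is hypothetical; the field names (`verified`, `category`, `location`, `description`) are assumptions for illustration, not Ushahidi’s actual data model.

```python
# Hypothetical sketch of the "last leg": turn screened crisis-map reports into
# a plain-text digest that could be pasted into an email to responding agencies.

def build_digest(reports: list) -> str:
    """Keep only verified reports and format them as a short plain-text digest."""
    verified = [r for r in reports if r.get("verified")]
    lines = [f"Daily digest: {len(verified)} verified report(s)"]
    for r in verified:
        lines.append(f"- [{r['category']}] {r['location']}: {r['description']}")
    return "\n".join(lines)

reports = [
    {"verified": True, "category": "oiled wildlife", "location": "Grand Isle, LA",
     "description": "Pelicans observed with oil on feathers"},
    {"verified": False, "category": "odor", "location": "Venice, LA",
     "description": "Strong petroleum smell onshore"},
]
print(build_digest(reports))
```

The point of keeping the output as plain text is exactly the one made above: the bottleneck was never technology but the human step of gathering, screening and delivering, and the simplest possible format is the easiest to inject into an existing response workflow.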

See the original post on the TechChange Blog. 

About TechChange

TechChange partners with universities, organizations and software developers to train leaders to leverage emerging technologies for social change. TechChange envisions a world where a highly trained corps of creative and tech-savvy professionals can effectively and quickly respond to the most critical humanitarian, development and peacebuilding challenges of our time.


The four-week online professional development certificate course “Tech Tools and Skills for Emergency Management” (TC103) will explore how new communication and mapping technologies are being used to respond to disasters, create early warning mechanisms, improve coordination efforts and much more. From the earthquake recovery efforts in Haiti and Japan to the monitoring of election violence in countries like Kenya, this course will consider a variety of real-world examples from organizations working in the field and analyze some of the key challenges related to access, implementation, scale, and verification that working with new technology presents. The course is designed to assist professionals in developing concrete strategies and technological skills to work amid this rapidly evolving landscape.