Archive for October, 2011


As part of my participation in the TechChange Course “Tech Tools and Skills for Emergency Management”, I conducted an interview with colleague Adam Papendieck at Tulane University’s Disaster Resilience Leadership Academy. TechChange posted the interview on their own blog this morning, but I thought I would share it here as well.

Mini Biography

Adam Papendieck has an MPH from Tulane University and a technical background in GIS, Statistics and Information Systems.  He is currently the Sr. Program Manager for Technology at the Payson Center for International Development at Tulane University, where his role is to leverage appropriate and innovative information technologies in support of research projects, funded Public Health capacity-building projects in East Africa, and crisis informatics activities with the Disaster Resilience Leadership Academy.  He has worked on applied ICT activities such as the creation of a dynamic web mapping application for the World Vision US corporate information portal, the design and implementation of open source thin client computer labs in Rwanda, the creation of e-learning platforms at African institutions of higher education, various crisis mapping initiatives and disaster analytics activities for the Gulf Oil Spill, Hurricane Katrina and other events.

Interview Notes

I spoke with Adam today (Monday, September 26, 2011) about the technologies we’ve encountered in the TC103 course, his views on some of them, and where he sees the future of the field going. Adam is particularly interested in crowdsourcing and has experience working with Ushahidi, both on the development/applied side for the Gulf Oil Spill last year, and on the evaluation side following the earthquake in Haiti. Here are some of Adam’s (paraphrased) comments on the field, the merits and pitfalls of some of these technologies, and his vision for the future:



I’m pleased and excited to announce the first major project of the Ports in the Storm blog: a study of Qualitative Data-Gathering Methods of Major International Humanitarian Organizations. This study began as an independent study course for the summer of 2011, designed as an exploration of the qualitative data-gathering methods most often employed by major international humanitarian aid organizations, under the supervision of Payson Adjunct Assistant Professor Nathan Morrow.

This study has grown into a larger and more exciting project than originally anticipated, and as such I have created a series of resources based on my findings, including an index of sources, a series of compilations of the methodology and use of a variety of qualitative methods, a discussion of prescribed methods vs. methods employed by the organizations studied, some best practices, a gap analysis, and a proposal for the minimum tools and training to be employed in each phase of the humanitarian program cycle.

These elements are set up as a series of blog posts from the past couple of weeks, and are all accessible via the page on this blog entitled “Qualitative Data-Gathering Methods of Major International Humanitarian Organizations”.

It is my sincere hope that this study and the resources compiled therein will be useful for others in the field of international development and humanitarian aid studies, as a resource of best practices in qualitative research. I welcome and look forward to feedback – please let me know what you think and feel free to provide suggestions for additional resources and/or research!


Availability of Resources & Reports

A fundamental difficulty with this study is that it relies on the toolkits, guidelines, reports and other documents publicly accessible from the organization websites. This introduces the very real possibility (indeed, likelihood) that these documents are not a representative sample of the work being done by these organizations. However, one could reasonably assume that the documents publicly accessible (and therefore included in this study) were selected to be posted online based on their quality, their relevance to current humanitarian crises, or their treatment of current issues in humanitarian aid, and are therefore worthy of consideration here.

Two types of documents were sought for inclusion in this study (see Chart 1):

  1. Guidelines or toolkits, which outline prescribed practices to be used in the field when carrying out a particular stage of the program cycle; and
  2. Reports, such as assessments, appraisals, or evaluations of actual aid responses or programs carried out by the organization.

Chart 1: Documents Reviewed by Type


Resources for Qualitative Data-Gathering & Research Design

Berg, B. (2008) Qualitative Research Methods for the Social Sciences (7th Edition). Boston, MA: Allyn & Bacon.

Bernard, H. R. (2006) Research Methods in Anthropology: Qualitative and Quantitative Approaches (4th Edition). Lanham, MD: AltaMira Press.

Perecman, E. & Curran, S., eds. (2006) A Handbook for Social Science Field Research: Essays & Bibliographic Sources on Research Design and Methods. London, UK: SAGE Publications.

Trochim, W. M. K. Research Methods Knowledge Base. (online resource)

Additional Resources for Specific Techniques

Emergency Capacity Building Project (ECB). The Good Enough Guide: Impact Measurement and Accountability in Emergencies. Published by Oxfam GB and World Vision International. 2007. Accessible here.

Academy for Educational Development, Population Communication Services (AED/PCS). CAFS Handbook: Participatory Techniques. 2002. Accessible here.

Centers for Disease Control & Prevention (CDC), Evaluation Research Team. Evaluation Brief No. 13: Data Collection Methods for Program Evaluation: Focus Groups. July 1998. Accessible here.

Maj, M. et al. Guidance for Implementing Station Days: A Child-Centered Monitoring & Evaluation Tool. Catholic Relief Services (CRS). Baltimore, MD. 2009. Accessible here.

PolicyLink. Community Mapping – What is it? (online resource) 2002. Accessible here.

Spiers, J.A. “Tech Tips: Using Video Management/Analysis Technology in Qualitative Research.” International Journal of Qualitative Methods, Vol. 3, No. 1, April 2004. Accessible here.

STC. “Participatory Video: A Qualitative Method of Monitoring & Evaluation.” Design, Monitoring & Evaluation – Save the Children (blog). Posted 20 October 2009. Accessible here.

UCLA Center for Health Policy Research, Health DATA Program. Section 4: Key Informant Interviews. Accessible here.

US Agency for International Development (USAID). Performance Monitoring & Evaluation TIPS, Number 2: Conducting Key Informant Interviews. Washington, DC. 1996. Accessible here.

World Bank. Transect Walk and Diagramming: Procedures and Examples. (online resource) Accessible here.


Based on a review of the available documents and the methods discussed therein, here are some initial findings:


The stages of the program cycle where data collection methodology is most emphasized, both in general and for qualitative methods in particular, are assessment and monitoring & evaluation. Among the remaining stages, some documents addressed data collection for appraisal, but very few discussed data collection methods for program design or baseline determination. None of the organizations studied had documents covering all stages of the program cycle. (see Table 1)

Table 1: Eight Organizations & Methods Found, by Stage of Program Cycle
