M&E Sourcebook: Methods and Process

Report writing and presentation of results

The evaluation process must be as open as possible, with the results made widely available. For evaluations to be useful, they must be used: feedback to both policy-makers and operational staff is essential.

The way in which impact assessment findings are recorded, presented and shared is of great, and often underestimated, importance.

Future challenges in disaster reduction include measuring progress, thorough reporting and dissemination, and advocacy based on documented successful practices.

Presenting M&E results

Willingness to acknowledge and learn from experience is essential: M&E must lead to improvements in agencies' work to reduce risk. M&E reports are potentially valuable documents. They allow practical lessons to be learned within and across programmes and regions. They provide a basis for discussion about better practice and policy change. They also contribute to institutional memory, which is important in organisations that suffer from rapid staff turnover. The need for better feedback is generally acknowledged and widely discussed in aid agency circles; the institutional obstacles are similarly well understood; and guidance on good practice is becoming more widely available.

Planning and monitoring feedback loops

Most DRR evaluations are forward-looking, seeking to assess the appropriateness of project activities, highlight strengths and weaknesses, and draw lessons for future work. They can be used to modify project approaches (both in detail and orientation), assist strategic planning, and feed into organisational strategies.


Examples: Monitoring and evaluation feedback loops

Some good examples of monitoring and evaluation findings being fed back into planning:

  • ITDG's ongoing monitoring and reviews led to concrete changes in its Livelihood Options for Disaster Risk Reduction in South Asia project. (See Lockwood and Alonson 2003: 26)


Examples: Strategic planning

  • DFID's evaluation of PAHO's PED programme in 2003 was intended to assist PAHO in strategic planning under new directorship and in developing a five-year workplan for donors. (See Gander et al 2003: 3)
  • ECHO's regional and global DP/DRR reviews are designed to feed into organisational strategies. (See de Haulleville and Halatov 2002)
  • A review of the Bangladesh Red Crescent Society's (BDRCS) community-based cyclone preparedness project in Cox's Bazaar found that "The largest attribute that has enabled the project to remain steadfast to the stated approach has been the openness to learning; learning from the experience of others but more importantly from the project's own experiences through a process of action-reflection-learning" (Madiath 2002: 14). Reviews of the BDRCS's cyclone preparedness work in the early 1990s influenced its reorientation towards a more participatory approach. (See Venghaus et al 2000: 18-19)


However, it is not clear how good agencies are at absorbing lessons from DRR evaluations at project, programme or policy levels; it appears that they are poor at absorbing both the particular and the general lessons that evaluations offer. Agencies may place too little emphasis on lesson-learning and on feeding lessons into the management of ongoing activities. Often the review or evaluation report is filed rather than acted upon. Many organisations have poor information storage and retrieval systems, and few staff have time to reflect upon lessons learned. In NGOs particularly, overwork and time pressures can constitute a systemic weakness that prevents thinking and innovation.

Evaluators may themselves need to make efforts to ensure that lessons are adopted. One evaluation team found that, although an agency invested significant resources in regular evaluations and monitoring missions, the actual changes these produced in the programme were not always proportional to the inputs. The team used an evaluation workshop to explore why previous recommendations had not been implemented; this identified a number of institutional and other influences (see Venghaus et al 2000: 27-30).

Dissemination of results

The OECD DAC guidelines recommend "systematic dissemination" of results, but greater transparency is needed in DRR M&E, where a general failure to share evaluations hinders learning. This culture of concealment runs counter to the principle of accountability that many agencies claim to follow.

The best collection of DRR evaluation reports is the ALNAP database, but most of its documentation is confidential. Many agencies distribute information on successful projects, but this can tend towards agency propaganda. However, publication of evaluations is slowly becoming more common.

Evaluation findings should be fed back to all project stakeholders before reports are submitted, to allow for discussion and clarification. It is not clear how often this happens. Participatory evaluations are more likely to do so: participation creates "ownership" of the final product among stakeholders, increasing the likelihood that lessons will be acted upon, although participatory feedback workshops can be time-consuming. Evaluations by external consultants often limit feedback to debriefing sessions with the commissioning agency.

Participatory evaluations

Some good examples of participation in evaluation include:

  • The Canadian Council for International Co-operation's Reconstruction and Rehabilitation Fund, reviewing partners' work following floods and cyclones in Bangladesh in 1988-91, recommended discussion workshops for partners to share experiences and discuss how earlier recommendations were being turned into action. (See Buchanan et al 1992: 25)
  • PAHO has created a Partnership for Health Preparedness (PHP): a forum for liaison, dialogue and collective reporting to its three core donors, DFID, USAID and CIDA. Annual meetings and joint programme reports have improved donors' understanding and allowed a more open exchange between partners. The process has also enabled PAHO to reflect on the overall effects of its programme and to give more consideration to output-, outcome- and impact-level results, instead of its previous over-emphasis on individual programme activities in separate reports to each donor. (See Gander et al 2003: 36-38)


Evaluation findings are sometimes challenged by the organisations or programmes evaluated. The comments of those evaluated are rarely recorded, but the experience of those involved in DRR suggests that it is unusual for evaluation findings to go unchallenged, and there have been occasions when the main findings have been rejected.

Further reading and website resources

  • Buchanan, A., B. Mackey and V. Warmington (1992) 'Cyclone Reconstruction and Rehabilitation: Bangladesh Programme Review February 11-28, 1992'. Unpublished report. Reconstruction and Rehabilitation Fund, Ottawa.
  • de Haulleville, A. and G. Halatov (2002) Evaluation of ECHO's Strategic Orientation to Disaster Reduction. ECHO, Brussels.
  • Gander, C. et al. (2002) Evaluation of IHA's (International Humanitarian Assistance Division) Disaster Preparedness Strategy: Towards a New Disaster Risk Management Approach for CIDA. CIDA, Canada.
  • Lockwood, H. and A.C. Alonson (2003) 'ITDG Livelihood Options for Disaster Risk Reduction in South Asia: project review'. Unpublished report. Department for International Development, London.
  • Venghaus, G., A. Syed, S. White and N. Ullah (2000) 'Review of Bangladesh Red Crescent Society Disaster Preparedness'. Unpublished report. Bangladesh Red Crescent Society, Dhaka.

Evaluations available online: