This three-step evaluation methodology was tested several times before being adopted by the OECD-DAC Network on Development Evaluation in 2012.16
It was re-assessed in 2014 when the EU commissioned a synthesis of seven evaluations undertaken since 2010, looking at the strengths and weaknesses of the three-step approach.17
The specific tools and evaluation techniques used by each evaluation team were compared and assessed in order to develop recommendations on possible improvements. The recommendations covered methodological aspects as well as managerial and process issues:
- A contextual analysis should be included in each evaluation.
- Step 2 analysis18 should consider the possibility of using primary rather than secondary data analysis and/or more qualitative approaches (such as benefit–incidence surveys or perception surveys).
- Development partners’ management responses to evaluation recommendations need to be strengthened.
- Evaluation reporting formats should be simplified.
- The classification and presentation of evidence collected should be simplified to facilitate comparability across evaluations.
In addition, the study noted that the evaluation approach could become an integral part of domestic policy processes if it were led by the partner country rather than by the development partners.
These recommendations were based on seven evaluations. Looking across the 17 evaluations examined in this chapter, it is clear that the application of the methodology provided more robust results at the step 1 level for sector budget support than for general budget support.19
This was not due to the type of budget support but to the fact that, by coincidence, in five of the six sector budget supports evaluated, recipient governments chose to earmark EU funds to specific (and narrowly defined) spending programs. This made the effects of budget support more traceable and allowed a counterfactual approach to be taken for step 1. At the same time, these five countries stood out for their poor monitoring of policy actions and outcomes, making it more difficult to undertake the step 2 analysis. The 17 evaluations confirmed that, in order to assess policy and budget support effectiveness, strengthening partner countries’ statistical institutions, statistical and monitoring systems, and accountability systems through improved and regular policy impact analysis needs to remain a priority.
- 16. The methodology was tested in evaluations of budget supports in Mali, Zambia, and Tunisia in 2011. See https://www.oecd.org/countries/zambia/evaluatingbudgetsupport.htm.
- 17. In addition to the three 2011 evaluations cited above, the synthesis included the evaluations of budget support in Tanzania (2013), Mozambique (2014), South Africa (2013), and Morocco (2014). See https://www.oecd.org/dac/evaluation/Evaluation-Insights-Evaluating-the-Impact-of-BS-note-FINAL.pdf
- 18. Most of the evaluations relied on the analysis of secondary data sourced from existing administrative or survey data. Primary data collection was mostly limited to information gained through focus group discussions and structured interviews. The synthesis discussed the possibility of undertaking specific survey work to provide primary data for a more precise and focused analysis.
- 19. The evaluations can be found at https://ec.europa.eu/international-partnerships/strategic-evaluation-reports_en