Mixed methods evaluations embedded in a long-term development project can become a valuable source of learning that strengthens implementation. This article demonstrates how strategically embedding evaluation into a USAID-funded education initiative benefited the project.

Implemented by EDC, Time to Learn (TTL) is a five-year USAID-funded project (2012–2016) designed to improve reading in community schools and equitable access to education services for orphans and vulnerable children in six provinces of Zambia. As the TTL research and evaluation partner, EnCompass is responsible for conducting a baseline study, midline and endline impact evaluations, two performance evaluations, and in-depth case studies. The evaluations are intended to measure the impact of project interventions on learner literacy; outcomes related to teacher performance; and the performance (process and outputs) of the project.

There are significant benefits of embedding strong evaluation partners in long-term projects:

  • Each evaluation builds on the last, and evaluation plays a strategic role throughout the project—allowing a full story of the project to emerge through evaluation
  • The evaluators develop subject matter and context expertise, and are thus better able to interpret the data and ensure it is used and applied to learning and adapting
  • Longer relationships with the evaluation team build trust with stakeholders and the client, which contributes to capacity building
  • Long-term embedding provides an ideal setup for country ownership, as it allows engagement of in-country partners over time.

Prior to each evaluation, EnCompass designed and delivered data collection training for USAID Time to Learn partners from the Ministry of Education and the University of Zambia, as well as for Time to Learn staff. The 2015 training prepared data collectors for the midline impact evaluation. Data collection instruments included the Classroom Observation Protocol, the Early Grade Reading Assessment, and a community school head teacher questionnaire.

There are also significant benefits to combining mixed methods with a participatory, capacity-strengthening approach.

Context: Adding a qualitative component to the overall evaluation approach helped the project access and better understand the context. This was especially important at baseline, when little data were available on community schools. The qualitative elements of the evaluation design allowed for a more nuanced understanding of the context.

Explanatory: Qualitative evaluation helped shed light on the quantitative results. For example, we knew from the baseline that parental involvement was statistically predictive of better literacy scores. When we saw this, we wanted to know: What does parental involvement really mean? We could not survey people, because we would not have known what questions to ask. So we conducted two case studies, selecting the two communities/schools as a nonrepresentative but purposive sample. We learned from one case study that some parent community committees were erecting buildings—infrastructure support. In the other case, parents were playing an accountability function. Overall, the depth of the two cases illuminated the relationships and pathways that seem associated with better education outcomes.
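The quantitative finding behind this example is the kind that typically comes from a simple regression of literacy scores on a parental-involvement measure. The sketch below is purely illustrative: the data are synthetic and the variable names (`involvement`, `literacy`) are hypothetical, not drawn from the TTL dataset or the project's actual analysis.

```python
# Illustrative sketch with SYNTHETIC data: testing whether a hypothetical
# parental-involvement index predicts literacy scores via ordinary least
# squares. None of this reflects the actual TTL variables or results.
import random

random.seed(42)

n = 200
# Hypothetical involvement index (0-10) per learner.
involvement = [random.uniform(0, 10) for _ in range(n)]
# Synthetic literacy score that partly depends on involvement, plus noise.
literacy = [30 + 3.0 * x + random.gauss(0, 8) for x in involvement]

# Simple OLS slope and intercept computed by hand from first principles.
mean_x = sum(involvement) / n
mean_y = sum(literacy) / n
sxx = sum((x - mean_x) ** 2 for x in involvement)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(involvement, literacy))
slope = sxy / sxx
intercept = mean_y - slope * mean_x

print(f"slope={slope:.2f}, intercept={intercept:.2f}")
```

A positive, statistically significant slope is what "predictive of better literacy scores" means quantitatively; it says nothing about *how* involvement operates, which is exactly the gap the case studies filled.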

Program Implementation Adjustment: Based on lessons from the case studies, the project and the community of implementers are discussing possible actions to support the kinds of parental involvement that seem most associated with positive outcomes.

Lessons from our experience as a long-term evaluation partner:

1. Evaluation is not a one-off activity. If you want to mix methods intentionally, you have to allow the qualitative findings to inform the quantitative work, and the quantitative work to respond to the hypotheses that the qualitative findings generate.

2. To maintain independence, we have built safeguards:

  • EnCompass is a subcontractor, and that helps maintain some distance, even when embedded in a project
  • The evaluation team focuses on its integrity through internal reflection
  • The evaluation team presents to the EnCompass home team as a check on the credibility of its point of view.

3. We build in flexibility. We may like to plan and control the schedule, especially for the quantitative component—timing, random sampling, and so on—but we cannot control events five years out. A good, utilization-focused evaluation has to respond to changing programs. This presents a challenge: the evaluation needs to change and respond to changes in the intervention, the community, and the client's needs (for example, there are things they want to know now that they did not want to know earlier).
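One practical way to keep the quantitative design both controlled and flexible across rounds is to make the random sampling reproducible, so a midline team can document and re-derive exactly how the baseline sample was drawn. The sketch below is a generic illustration with hypothetical school IDs and sample sizes; it does not describe the TTL sampling design.

```python
# Illustrative sketch (hypothetical sampling frame): drawing a reproducible
# random sample of community schools for an evaluation round. Recording the
# seed in the evaluation design lets later rounds reproduce the draw.
import random

# Hypothetical frame of 120 community schools.
school_ids = [f"CS-{i:03d}" for i in range(1, 121)]

rng = random.Random(2015)  # seed documented in the (hypothetical) design
sampled = sorted(rng.sample(school_ids, 30))  # simple random sample of 30

print(f"sampled {len(sampled)} of {len(school_ids)} schools")
```

Using a dedicated `random.Random` instance, rather than the module-level functions, keeps the draw insulated from other code that might touch the global generator.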

4. We bring the right mix of skills: the evaluation team combines expertise in quantitative methodology, qualitative methodology, facilitation, and presentation to policy makers and implementers.

5. Five years is a long time. The evaluation team must maintain passion for the work and an open curiosity.

The evaluation team appreciates being a core part of improving implementation through learning and reporting. We engage program implementers in conversations to interpret results, and we contribute to implementers' strategic thinking. There are significant challenges to conducting evaluation over a longer period, but we believe that by focusing on the integrity of evaluation and on evaluation use, we demonstrate the value for money of evaluation.