Consultant - Final Evaluation of the FFP-funded Yemen Emergency Food Assistance III - Yemen Programming

Sana'a, Yemen


Project Location: Sana’a and Aden, Yemen


The FFP-funded Yemen Emergency Food Assistance III (YEFA III) program is a food assistance program jointly implemented by Mercy Corps, CARE, and ACTED with the aim of reaching 9,583 households in eight districts across five governorates in Yemen. The program utilizes complementary approaches to meet the basic food needs of conflict-affected households through the distribution of food vouchers over nine consecutive distribution rounds and through improving the practical knowledge of communities in proper hygiene, sanitation, and nutrition.

The consortium partners work closely with local vendors who participate in the redemption of food baskets, following the approved FSAC minimum. Community awareness raising on nutrition, sanitation, and hygiene is implemented via community health volunteers, each trained by consortium partners and equipped with relevant and practical information, education, and communication (IEC) materials. Geographically, the YEFA III program is implemented in Sana’a (Manakha district), Al Mahwit (Bani Saad district), Taiz (Dimnat Khadir and Al Shamaitein districts), Lahj (Radfan and Halimayn districts), and Al Dhale’e (Al Hussein and Al Shuayb districts) governorates.

To promote food security for targeted vulnerable communities across five governorates, the YEFA III program aims to improve access to adequate food for at least 9,583 vulnerable conflict-affected households in targeted areas of Al Dhale’e, Lahj, Al Mahwit, Sana’a, and Taiz governorates. Assistance is delivered through commodity vouchers redeemable for basic food items (wheat flour, vegetable oil, beans, sugar, and salt) with the aim of meeting 80% of the daily kcal requirement for an average family of seven.

The program’s goal is to improve food security for at least 9,583 vulnerable conflict-affected households in targeted areas.

The first objective, which directly targets food security, is improved access to an adequate food basket for at least 9,583 vulnerable conflict-affected households in targeted areas as measured through the following indicators:

  • Indicator P 1.1: Percentage of Households with Poor, Borderline, and Acceptable Food Consumption Score (FCS), disaggregated by sex of household head.
  • Indicator P 1.2: Percentage of Households with Little or No Hunger, Moderate Hunger, and Severe Hunger on the Household Hunger Scale (HHS).
  • Indicator P 1.3: Mean Reduced Coping Strategy Index (rCSI), disaggregated by sex of household head.
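For reference, the FCS and rCSI indicators above can be computed following the standard WFP constructions. The sketch below is a minimal illustration only: the food-group weights, coping-strategy severity weights, and category cut-offs shown are the commonly used WFP defaults and should be confirmed against the YEFA III baseline methodology before use.

```python
# Minimal sketch of the standard WFP constructions of the Food Consumption
# Score (FCS) and reduced Coping Strategy Index (rCSI). Weights and cut-offs
# are common WFP defaults, not values taken from the YEFA III baseline.

# FCS: number of days (0-7) each food group was consumed in the last 7 days,
# multiplied by the group's standard weight and summed.
FCS_WEIGHTS = {
    "staples": 2.0, "pulses": 3.0, "vegetables": 1.0, "fruit": 1.0,
    "meat_fish": 4.0, "milk": 4.0, "sugar": 0.5, "oil": 0.5,
}

def food_consumption_score(days_consumed):
    return sum(FCS_WEIGHTS[g] * min(d, 7) for g, d in days_consumed.items())

def fcs_category(score, thresholds=(21.0, 35.0)):
    # Standard cut-offs: <=21 poor, 21.5-35 borderline, >35 acceptable.
    poor, borderline = thresholds
    if score <= poor:
        return "poor"
    return "borderline" if score <= borderline else "acceptable"

# rCSI: frequency (days 0-7, last 7 days) of five coping behaviours,
# each weighted by its standard severity weight.
RCSI_WEIGHTS = {
    "less_preferred_foods": 1, "borrow_food": 2, "limit_portion_size": 1,
    "restrict_adult_intake": 3, "reduce_meals": 1,
}

def reduced_csi(days_used):
    return sum(RCSI_WEIGHTS[s] * min(d, 7) for s, d in days_used.items())
```

A household reporting only staples, oil, and sugar daily plus two days of pulses, for example, scores in the borderline FCS band, which is the kind of profile the voucher basket is designed to move toward "acceptable."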

The second objective, which focuses on health and hygiene, is improved hygiene, sanitation, and nutrition awareness and practices for 9,583 households (67,081 individuals), with success measured by two indicators:

  • Indicator P 2.1: Percent of primary caretakers who know at least three IYCF practices.
  • Indicator P 2.2: Percent of primary caretakers who know at least three critical times for hand washing.

Purpose / Project Description:

YEFA III is seeking a final evaluation consultant to conduct an end-of-project evaluation covering the entire implementation period: January 2019 – May 2020 (17 months). The main purpose of the external evaluation is to determine the effectiveness of YEFA III in achieving its overall aim of improving food security for vulnerable conflict-affected households in the targeted areas. The final evaluation will focus on assessing the achievements and overall performance of the program. This will be done through analysis of the existing project data and the consultant’s own qualitative assessments. In addition, the consultant will assist in hiring and training a team of enumerators to collect primary quantitative data through an endline survey. The project will come to a close in two stages: first in December 2019 in Lahj governorate, and second in Al Dhale’e, Al Mahwit, Sana’a, and Taiz governorates in April 2020.

The project design, scope, implementation, and the capacity to achieve the project objectives and reach the stated targets will be assessed. The evaluation will also consider YEFA III’s effects on local markets, as well as how it affected certain groups of interest (women and men, youth, IDPs/host communities, etc.). The effectiveness and relevance of the modality and quantities will be considered. Finally, as the implementation area differs between program partners by geography and the timing of implementation, it is vital that the evaluation consultant compare and contrast findings in order to accurately represent these differences in a single final report.

Actionable recommendations to inform the programming strategy for potential future replications of this project will be included in the final report. Collated and analyzed lessons learned, challenges faced, and best practices obtained during implementation will be made available to inform the key stakeholders of this project, primarily ACTED, CARE, and Mercy Corps, local government partners, USAID, and Food for Peace.

Evaluation Objectives:

The final evaluation report shall measure the ability of the program to meet its stated goals, respond to the specific evaluation questions identified below and include recommendations to inform the design of similar activities in the future. The overall objectives will be to evaluate the:

  • Achievements of the activity in relation to the goal, objectives, results, and targets.
  • Activity’s effects on local markets, and how it affected certain groups of interest (women and men; the youth population; boys and girls, etc.).
  • Effectiveness and relevance of the modality, transfers, and complementary interventions to achieve activity outcomes.
  • Best practices, lessons learned, strengths, and challenges in the activity design (including the LogFrame) and implementation for achieving project results.

Key Evaluation Questions

The following key questions will guide the end-of-project evaluation:


  • To what extent have the activity’s interventions adhered to planned implementation - schedules, participant targeting, resource transfer composition and quantities, inputs and service delivery, and outputs - and achieved intended goals, purposes, and outcomes?
  • Did interventions reach the targeted groups and individuals within the implementation areas? Were there differences between the groups reached within each partners’ area of implementation? What factors contributed to these differences?
  • How effective was the targeting approach in achieving the project goal?
  • Were interventions appropriate and effective for the target group based on the nature of their vulnerabilities? To what extent did the project activities address real needs in the targeted communities?
  • To what extent were the project aims and objectives relevant to the participants’ needs? To the country’s needs? Are they still relevant?
  • What were the main internal and external factors influencing the achievements or non-achievements of the program? What factors promoted or inhibited adherence to plans and achievement of goals?
  • What, if any, are recommended changes to targeting for future programs?

Effectiveness and Efficiency of Interventions and Intervention Implementation:

  • To what extent did the activity consider gender equity, protection, age, physical and emotional challenges of the participants, and risks to participation in various interventions in project design and implementation?
  • How did access to the project differ for different population groups (e.g. based on gender, age, political affiliation, housing status, etc.)? Were any groups unable to participate? What factors contributed to inclusion/exclusion? How could similar interventions be more inclusive in the future?
  • What was the level of efficiency with regard to cost per project participant, timely delivery of the goods or services, and adjusting the transfer amount based on price and need changes? What factors of the project financial management processes and procedures affected project implementation?
  • Was the process of achieving results efficient? Specifically, did the actual or expected results (outputs and outcomes) justify the costs incurred? Were the resources effectively utilized? Were the activities cost-efficient? What factors contributed to this?
  • How effective were the structures, strategies, and tools used in the project management and implementation of the project, both at the implementation partner level and consortium level? What were the strengths and weaknesses of the structures, strategies, and tools?
  • How effective were the structures, strategies, and tools used for planning, monitoring, and accountability, both at the implementation partner level and consortium level? What were the strengths and weaknesses of the structures, strategies, and tools?
  • Did the project design (planning, modalities of delivery, budget allocation, etc.) flexibly respond to context changes over time, if any? Were activities implemented on time? Were objectives achieved on time? What caused any delays?
  • How were problems and challenges within the program design or implementation identified, reviewed, and managed? How did monitoring information and feedback from the targeted populations contribute to this? How effective has the project been in responding to the needs of the respondents, and what results were achieved?
  • How effective was the collaboration between each program partner and with the consortium, with particular reference to communication, cooperation, and partners’ perceptions of the project? To what extent did the consortium approach contribute to the efficiency of the project implementation?

Unintended Consequences and Lessons Learned:

  • What changes—expected and unexpected, positive and negative—did targeted participants, community members, and other stakeholders associate with the activity’s interventions? What factors appear to facilitate or inhibit these changes? How do these changes correspond to those hypothesized by the activity’s LogFrame?
  • Which aspects of the project appear to be more or less influential in achieving the stated goals of the program? What aspects have been least successful and/or most difficult to achieve? What were the factors that contributed to this?
  • Were there any unintended consequences on local markets? What factors contributed to this?
  • To what extent have stakeholders been involved in decision-making during implementation?
  • Was the modality selected and quantity distributed appropriate for the response and target population? Are there more efficient ways and means that would have delivered better results (outputs and outcomes) with the available inputs? What are recommended changes for future programs?
  • What lessons were learned regarding targeting, program and activity design, and implementation?
  • What lessons have been learned regarding accountability, monitoring, and evaluation?
  • What are the key recommendations for future programming?

Linkages, Layering, and Exit Strategies:

  • What exit strategies and approaches to phase out assistance were included, and how effective have these been (including contributing factors and constraints)?
  • Did project activities overlap with or duplicate other similar interventions (funded nationally and/or by other donors)?
  • To what extent can the benefits of the intervention be sustained after the completion of this project?

Consultant Activities:

Desk Review

In order to use existing sources/information and avoid duplication, a comprehensive desk review will be undertaken prior to the start of any in-country activities. This will cover analysis of relevant documents, information, data/statistics, triangulation of different studies, etc. Key project documentation shall be shared with the consultant to facilitate the process of evaluation.

Data collection/field work

The external consultant is expected to conduct a mixed methods evaluation using tools and a workplan approved by the YEFA III Specialist and MEL TWG prior to the start of the evaluation. Data collection shall involve visits to a sample of project locations, meetings with program partners, targeted participants and other key stakeholders. The consultant will lead the qualitative and quantitative data collection, including supervising data collection teams, and completing the analysis within the approved timeline.

  • Qualitative: The qualitative evaluation must capture lessons learned and best practices through a variety of qualitative methods. The evaluation team will design the overall qualitative study approach and should consider a variety of primary data collection methods, including: semi-structured in-depth interviews, focus group discussions, and observations. The evaluation team leader and members will be responsible for collecting and analyzing qualitative data. Data will be collected from key stakeholders through interviews, discussions, consultative processes, and observations.
  • Quantitative: The final evaluation will include primary data collection and analysis of quantitative survey data. The endline survey tool will be designed by the consultant and must utilize the same data collection instruments, level of statistical precision, and statistical power as the baseline survey. The field operations - from hiring and training of enumerators and testing of tools to transportation logistics - will be led by the consultant with support from the consortium partners. The evaluation shall be designed to detect statistically significant changes in estimates from baseline to endline for key indicators.
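As an illustration of matching the baseline's statistical precision and power, a standard sample-size calculation for detecting a change in a proportion (e.g. the share of households with acceptable FCS) is sketched below. The proportions used are hypothetical placeholders, not project figures; actual values and any design effect for cluster sampling must come from the YEFA III baseline design.

```python
# Sketch: simple-random-sample size per survey round needed to detect a
# change in a proportion between baseline and endline (two-sided test).
# The example change (0.30 -> 0.45) is a hypothetical placeholder.
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Households per round to detect a change from p1 to p2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, significance
    z_b = NormalDist().inv_cdf(power)          # critical value, power
    p_bar = (p1 + p2) / 2
    n = (z_a * sqrt(2 * p_bar * (1 - p_bar))
         + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p1 - p2) ** 2
    return ceil(n)

n = sample_size_two_proportions(0.30, 0.45)
```

For a clustered survey, the result would be inflated by the design effect used at baseline so that the endline retains the same effective precision.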

Consultant Deliverables:

The Consultant will deliver a final report comprised of the following structure and content:

  • Cover Page, List of Acronyms
  • Table of Contents
  • Executive Summary: Clear and concise stand-alone summary of the evaluation, with particular emphasis on the main findings, conclusions, lessons learned, and recommendations.
  • Introduction: Description of the evaluated intervention, its logic, history, organization and stakeholders. Presentation of the evaluation’s purpose and questions.
  • Methodology: Description of the sampling strategy and methods used for data collection. This section should be sufficiently detailed to help the reader judge the accuracy of the report and the findings.
  • Limitations:  A description of the limitations. This section should address constraints and limitations of the methodology, and the implications of these limitations for the findings, including whether and why any of the evaluation findings are inconclusive. 
  • Results: Factual evidence relevant to the evaluation questions. This section should provide a clear assessment of progress with respect to indicators, targets, objectives, and/or evaluation questions. A comparison of baseline to endline point estimates, with tests of statistically significant changes, should be included. Reference baseline evaluation information as well as program logic, the MEL Framework, etc.
  • Synthesis, Recommendations and Lessons Learned: An interpretation of the data and results as provided by the evaluation team. It should include actionable recommendations for current or future program improvements, draw out organizational lessons learned, and generally comment on data and results. Everything presented in this section should be directly linked back to the information presented in the Results section of the report.
  • Conflicts of Interest: Disclose any conflicts of interest or the appearance of conflicts of interest, including the interest of program staff in having a successful program.
  • Annexes: A complete file of data collection instruments in English and translations; list of stakeholder groups with number and type of interactions; SOW, TOR, documents developed under the end-of-program evaluation plan, any data sets (these can be provided in electronic format), any required photos, participant profiles or other special documentation needed.
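The baseline-to-endline comparison required in the Results section can use any appropriate test; a common choice for indicator proportions is a two-sided two-proportion z-test, sketched below. The counts shown are hypothetical placeholders, not project data.

```python
# Sketch: two-sided two-proportion z-test for a baseline-to-endline change
# in an indicator such as the share of households with acceptable FCS.
# The example counts are hypothetical placeholders, not project data.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical: 120/400 households acceptable at baseline vs 180/400 at endline.
z, p = two_proportion_z_test(120, 400, 180, 400)
```

With clustered samples, the standard error would additionally be adjusted for the design effect before reporting significance.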

Timeframe / Schedule: 

The anticipated length of this consultancy is approximately 70 days, beginning in December 2019. The desk review and tool design will take place prior to departure for Yemen. Field-level data collection will commence in Lahj governorate during the month of January, while data collection in the remaining districts will start in April 2020. It is possible that this consultancy will be split into two parts, with the consultant making two separate trips to Yemen to collect data.

The Consultant will report to:

The YEFA III Program Director

The Consultant will work closely with:

Key Program and MEL staff from all YEFA III partners, the YEFA III MEL Specialist, and the YEFA III Program Director.

Required Experience & Skills:


  • Master’s degree or higher in a relevant field;
  • Minimum of 5-10 years of experience in conducting studies, evaluations, collecting data and producing quality baseline/midterm/end line study reports, preferably for international non-governmental organizations or multilateral agencies;
  • Demonstrated experience leading, designing and implementing evaluations in general;
  • Demonstrated experience in designing, planning and conducting final evaluations for programs operating in insecure and complex settings (experience in Yemen preferred);
  • Excellent spoken and written English;
  • Excellent communication skills, including the ability to communicate effectively in a multi-cultural environment;
  • Demonstrated experience with quantitative and qualitative research methods, including paper-and-pencil survey instruments;
  • Demonstrated experience in presenting final evaluation results through written reports and remote presentations that include key findings, conclusions and recommendations;
  • Experience in Yemen.


Preferred Experience & Skills:

  • Experience evaluating USAID-funded projects;
  • Experience in evaluating consortium projects;
  • Experience in evaluating emergency programs that include voucher/cash modalities;
  • Experience within refugee/displaced population context;
  • Data visualization skills highly desirable;
  • Arabic language proficiency at B1 level or above;
  • Current Yemeni visas (north and south).

To Apply:

Individuals interested in applying for this consultancy must submit materials to Mercy Corps by Thursday, November 28th. In addition to your CV, please submit a brief cover letter which includes a list of relevant projects/experience, expected fees/daily rate, and availability.