Decentralized evaluation for evidence-based decision making
WFP Office of Evaluation
Decentralized Evaluation Quality Assurance System (DEQAS)
Quality Checklist for Decentralized Evaluation Inception Report
Version November 2015
[title of the decentralized evaluation]

Overall
General:
- Length: report does not exceed 20 pages (excluding annexes)
- Accessibility: report is written in a clear and accessible manner
- Report clarifies and builds on the Terms of Reference, extending its evidence base and analysis
- Report provides a clear operational plan for how the team will carry out the decentralized evaluation
- Report reflects a common understanding between the evaluation team and the Evaluation Manager on expectations and standards
- Report demonstrates ownership of the process by the evaluation team
Editing:
- Template has been followed and all elements included
- Table of contents is included and lists tables, graphs, figures and annexes
- Tables and diagrams are used as relevant
- List of acronyms is included
Cover page:
- Template for the cover page has been followed
- Title of the decentralized evaluation is identical to that in the TOR
- Date and status of the report (draft/final) are indicated on the cover page
Comments/Status:

1. Introduction
Overall: Key information on the evaluation and its subject is included in 1-2 pages.
Expected content:
- Main objectives of the evaluation are clearly stated (accountability and learning)
- Purpose of the Inception Report clearly stated
- Expected users of the Inception Report clearly stated
Assessment criteria:
- Introduction clearly sets the scene for the evaluation, including its subject, timing and objectives
- Purpose of the Inception Report clearly stated, and expected users specified
Comments/Status:

2. Context
Overall: Succinct overview of the surrounding context, as pertaining to the subject of the evaluation, setting the scene for the evaluation in 3-4 pages.
Expected content:
- Overview of the geographical context directly relevant to the evaluation, including:
  - Poverty and food security
  - Government policies and priorities, including policy gaps
  - Humanitarian issues
  - Gender dimensions of the context in relation to food security, the nutrition situation, architecture in the country and indicators
  - Key external events
  - Features of international assistance in the area
  - Other WFP work in the area
  - Work of other key actors relevant to the subject of the evaluation
Assessment criteria:
- Contextual information is focused and concise
- Information is relevant and important to understanding the context
- Information is explicitly geared to the evaluation subject, rather than being generically presented
Comments/Status:

3. Subject of the Evaluation
Overall: Comprehensive description of the evaluation subject in 1-2 pages.
Expected content:
- Key features of the evaluation subject, including:
  - Type of intervention (operation, activity, thematic area, transfer modality, pilot project)
  - Geographic scope of the evaluation subject
  - Relevant dates: approval date, start date, end date
  - Planned outputs at design: beneficiary numbers (planned and revised) disaggregated by gender/activity; amount of transfers (food, cash, vouchers); any other outputs
  - Planned outcomes at design
  - Key activities
  - Main partners (Government; NGOs; bilateral; multilateral)
  - Resources (% funded of total requirements) and key donors; if the subject is funded from pooled funds, show the resources allocated
- Assessment of the Logical Framework or similar tool from an evaluation perspective, or its reconstruction
- Other relevant preceding/concurrent activities/interventions/operations
- Any amendments to the initial design
- Gender equality and women's empowerment dimensions relevant to the subject of the evaluation and its context
- Assessment of whether quality gender analyses were undertaken and whether this analysis was properly integrated in programme design
- Include reference to:
  - Past evaluations/reviews related to the subject
  - Maps/graphs for illustration
Assessment criteria:
- Information is relevant and important to understanding the subject of the evaluation:
  - What it is
  - When it was designed
  - What the key inputs are ($ value)
  - What the planned outputs are (beneficiaries, MT, cash and voucher $)
  - What the target/scope is
  - What the planned outcomes are
  - Who is involved in its implementation
- Soundness of the logical framework is assessed; if reconstructed, it has been discussed and agreed with the evaluation commissioner
- Relevant issues from past evaluations and reviews are highlighted
- Gender dimensions are explained
- Differences between the original design and implementation are explained, if appropriate
Comments/Status:

4. Stakeholder Analysis
Overall: Comprehensive mapping of stakeholders, including their interests in, involvement in and needs from the evaluation, in 1-2 pages.
Expected content:
- Building on the related TOR section, the stakeholder analysis should identify:
  - Who the different groups involved in the evaluation subject are (including beneficiaries)
  - Why they have a stake in the subject of the evaluation and in the evaluation itself
  - How they will be involved in the evaluation process
Assessment criteria:
- All relevant stakeholders have been identified
- Relevant analysis of who should be involved in the evaluation is included, along with the respective interests of stakeholder groups
- Relevant analytical tools applied
- Considerations regarding beneficiaries' perspectives are included
- Beneficiary analysis is disaggregated by gender
- Stakeholder analysis is coherent with the proposed methodology and evaluation matrix
Comments/Status:

5. Evaluation approach and methodology
Overall: Methodology and specific methods present a comprehensive and systematic approach, sufficient to generate trust in the credibility of the evaluation as well as its independence and impartiality, in 3-4 pages.
5.1 Proposed approach and methodology
Expected content:
- Explanation and justification of the evaluation criteria selected, and how they will be applied
- Inclusion of the evaluation questions to be addressed
- Full description of the methodological approach, including a mixed-methods approach
- Clear description of the evaluation matrix and how it will be used
- Clear statement of how gender will be addressed in the methodology
Assessment criteria:
- Proposed methodological approach is coherent, logical and in line with the TOR
- Methodological approach is comprehensive and presents a systematic approach that will generate trust in the credibility of the evaluation
- Mixed-methods approach is specified
- Evaluation matrix is present, contains the required elements (see "expected content" below) and meets the required quality standards
- Proposals for the integration of gender into the methodology are sufficient to ensure a credible and comprehensive approach
Comments/Status:

5.2 Site Mapping
Expected content:
- Presentation of the geographic coverage of the evaluation
- Explanation of the sampling for the selection of areas to be visited (selection criteria explicit)
- Gender considerations explained
- Assurance on the impartiality of the selection process
Assessment criteria:
- Relevant site mapping tool has been used to present the analysis, if appropriate
- The site mapping is linked to the analysis of the operation
- The analysis informs the selection of areas to be visited during the mission, according to a sound rationale and an impartial approach
Comments/Status:

5.3 Data Collection Methods and Tools
Expected content:
- Description of, and justification for, the specific methods to be applied
- Chosen methods explicitly linked to the evaluation matrix and informed by the stakeholder analysis
- Specific consideration of how the proposed methods will address gender issues
- Approaches to addressing any data gaps identified at inception stage
- Description and supply (in annexes) of data collection tools
- Description of the sampling strategy
- Explanation of the analytical methods to be applied, including how data will be triangulated for drawing conclusions
- How data will be cleaned, where relevant
Assessment criteria:
- The data collection methods to be applied in the evaluation are justified and described in full, with strategies described for addressing data gaps
- Consideration of how the data collection methods will address gender considerations
- Consideration of how data collection activities will be undertaken in a gender-sensitive manner
- Transparent presentation of data collection tools
- Sampling methods are robust and impartial
- Proposals for analysis, including triangulation, are likely to generate credible conclusions
Comments/Status:

5.4 Limitations and risks
Expected content:
- Limitations or gaps in evidence are presented
- Indication of how the evaluation team will mitigate limitations
- Risks are identified and mitigation strategies proposed
Assessment criteria:
- Risks are correctly identified, and mitigation strategies are realistic
- Limitations anticipated in the evaluation due to e.g. availability of data, timing of field visits or security considerations are clearly stated, alongside how these will be mitigated
Comments/Status:

5.5 Ensuring Quality
Expected content:
- Description of quality assurance mechanisms to ensure the impartiality, independence, credibility and utility of the evaluation
Assessment criteria:
- Mechanisms for ensuring utility (e.g. communication and learning plan in place), credibility (robust methodology, clear mechanisms for minimising bias, i.e. impartiality) and independence (use of an external evaluation team) are clear and explicit
Comments/Status:
6. Organization of the evaluation
Overall: Comprehensive operational plan gives confidence that the evaluation can be implemented as planned, in up to 6 pages.

6.1 Team composition and workplan
Expected content:
- Description of the expertise of each team member in line with ToR requirements (including gender expertise) and their respective roles and responsibilities
- Inclusion of a workplan for each team member in line with deliverables, if appropriate
- Intended mechanisms for ensuring teamwork and coordination
Assessment criteria:
- Team expertise matches all competencies required in the ToR (including gender)
- There is clear complementarity among team members' skill sets
- Gender expertise is included
- Tasks to be undertaken by each team member are clear and in line with the consultants' profiles
- There is a clear plan to ensure coordination and teamwork among team members
Comments/Status:

6.2 Timeline
Expected content:
- Clear presentation of the timeline, revised if applicable, with associated deliverables
Assessment criteria:
- The (revised) timeline respects the time required for each evaluation phase
- The (revised) timeline has been agreed with the Evaluation Manager
- Associated deliverables, with specific dates, are included
Comments/Status:

6.3 Data collection mission schedule
Expected content:
- Evaluation mission schedule (by days/team member/locations/stakeholders etc.)
Assessment criteria:
- Evaluation mission schedule provides a practical tool to facilitate WFP planning
- List of stakeholders is consistent with the stakeholder analysis
- Detailed plan is presented in the annex
Comments/Status:

6.4 Information/Support Required
Expected content:
- Description of the support (logistical/operational) required during the evaluation
Assessment criteria:
- List of support is clear and has been agreed with the WFP Evaluation Manager
Comments/Status:

Annexes
Expected content:
- Annexes support and expand on the text in the main report, including:
  - Map of the intervention/project area
  - Evaluation matrix
  - Data collection tools
  - Evaluation mission schedule
  - Documents gathered
Comment:
- Relevant and up-to-date map is included
- Annexes are listed and numbered
- Annexes are referenced, where appropriate, in the main report
- Data collection tools are included
- Not all working documents are to be included
Comments/Status:

Quality assessment – Evaluation Matrix
Expected content:
- The evaluation matrix should provide an overview of how each of the key evaluation questions, as identified in the Terms of Reference, will be addressed. It should include:
  - Breakdown of the main questions into sub-questions
  - A set of indicators to measure performance, explicitly referring to the logic model used
  - Possible benchmarks (including good practice standards, performance assessments of comparator agencies, etc.)
  - Links to the relevant parts of the methodology that will contribute to answering the sub-questions
  - Explanation of how the findings will be triangulated
  - Sources of information (specifying whether secondary data will be used and where primary data is needed)
Assessment criteria:
- The matrix summarises the evaluation methodology and addresses each of the evaluation questions in the TOR
- The number of sub-questions is adequate to keep the evaluation team focused on answering all the main questions and to attain depth of analysis (i.e. not too many and not too few)
- The sub-questions are developed to guide the evaluation team, but are not as detailed as a survey instrument or interview guide
- For each evaluation question, sub-questions, performance indicators (building on those in the logic model or logframe), possible benchmarks and sources of information are specified
- The matrix clearly demonstrates that triangulation will take place
- The evaluation matrix is informed by the stakeholder analysis
- The matrix refers to the relevant evaluation criteria (including relevance, coherence (internal and external), coverage, efficiency, effectiveness, impact, sustainability and connectedness) and standards (e.g. SPHERE standards)
Comments/Status: