Job Title: Consultancy: Evaluability Assessment of the National School Safety Framework (NSSF)
Job Location: Gauteng, Pretoria
Deadline: December 25, 2024
Summary of the key activities for the EA:

Clarity of Programme Objectives: Assess whether the NSSF's objectives are clearly defined and understood by relevant stakeholders.

Programme Design and Logic: Develop, review, and revise the NSSF's Theory of Change (ToC) and assess whether the ToC or logic model logically links activities, outputs, outcomes, and impacts. This involves reviewing the programme's implementation plan, activities, and expected results.

Stakeholder Engagement: Identify and consult with key stakeholders, including government officials at various levels (national, provincial, district, and circuit), educators, learners, other government departments, funders, and other relevant parties. Solicit stakeholders' perspectives and expectations to build an understanding of the framework's context and the feasibility of an evaluation. Ascertain that a shared understanding of the objectives, results, and implementation strategy exists among implementers and key stakeholders. Consultations could follow a hybrid approach, face-to-face or online, depending on the situation.

Data Availability and Quality: Assess the availability and quality of the data needed for the evaluation. This includes reviewing existing monitoring and evaluation systems (e.g. school readiness surveys, schools' incident reports, provincial reports), data collection methods, and data sources. Confirm information requirements and data sources, and articulate and/or refine indicators of success.

Evaluation Questions: Develop and refine key evaluation questions that align with the framework's objectives and stakeholders' information needs. Ensure these questions are clear and answerable given the available data and resources.

Evaluation Design: Identify appropriate evaluation designs and methods that can answer the evaluation questions. These could include experimental designs (e.g. randomized controlled trials), quasi-experimental designs where appropriate (e.g. matched comparison groups), or non-experimental designs (e.g. pre-post comparisons), as well as contribution analysis.

Feasibility and Resources: Assess the feasibility of conducting the evaluation, considering the available resources, budget, time, and expertise. Ensure that the evaluation is realistic and manageable within the programme's constraints.

Sampling Strategy/Criteria: Develop a sampling strategy for programme documents and primary data sources (government, UNICEF, and other key partners).

Instrument Development and Validation: A conceptual/measurement matrix (or checklist) for the EA will be developed, along with draft data collection and data analysis instruments. These will be reviewed and approved as part of the inception phase. The EA consultant will propose an approach to pilot and validate the instruments in order to mitigate any conceptual and/or measurement error, and will submit an updated version of the data collection instruments in the early days of the execution phase.

Four distinct data analysis components are proposed for the EA: (i) stakeholder analysis; (ii) desk-based review of planning documents, processes, and activities, and analysis of programming tools; (iii) analysis of key interventions/activities at the country level; and (iv) analysis of monitoring indicators and data.

Stakeholder analysis: Key stakeholders, including UNICEF and targeted non-UNICEF partners, will be identified, and their roles and inputs will be assessed through a stakeholder analysis.

Desk-based review and analysis of national documents (policies, laws, legislation, etc.): The EA consultant will conduct broad background reading of government, UNICEF, and other key stakeholder documents.
Analysis of key national programme activities/interventions, including those of UNICEF and other relevant partners: The EA consultant will review both the technical and management aspects of NSSF initiatives at the country level.

Analysis of government monitoring and evaluation systems and data: A particular focus will be on capacities for monitoring and evaluation, the relevance and evaluability of the results framework at the national level, and outcome and output indicators. The EA consultant will carefully review the sources and reliability of information, determine what gaps there may be in the information required, and suggest activities needed to fill those gaps.
To qualify as an advocate for every child you will have…

An advanced university degree (Master's or higher) in Social Services, Research, or Development
A minimum of five years of relevant professional experience in monitoring and evaluation, with a proven record of conducting evaluations
Knowledge of how UNICEF, other UN or development agencies, and the South African government system work at national, provincial, and local levels
Excellent conceptual and analytical skills
Good skills in Microsoft Word, Excel, PowerPoint, and other necessary software (e.g. Adobe, graphics tools)
Excellent and concise English writing skills
A proactive and energetic approach to the work
Ability and commitment to work to a tight time frame
Ability to manage and supervise evaluation teams and ensure timely submission of quality evaluation reports
Ability to deal with multi-stakeholder groups
Ability to write focused evaluation reports
Willingness and ability to travel to the different project sites in the country, and possession of a valid driver's license