SDN Monitoring SOP Draft 30042024


STANDARD OPERATING PROCEDURE

Monitoring and Evaluation

Office: Sudan Country Office

Department: Programme/M&E

Reference Number: SDN/PRG/05/01/2024

Version: 1.0

Date of Implementation: Effective immediately

FOR INTERNAL USE ONLY – CIRCULATION OUTSIDE OF WFP IS STRICTLY FORBIDDEN


SOP ID No: SDN/PRG/05/01/2024
Monitoring and Evaluation SOP

Drafted by: Zinsou Kpavode, Head of M&E; Abdalla Elsheikh, M&E Officer

Reviewed by: Abraham Abatneh, Head of Programme

Cleared by: Khalid Osman, Deputy Country Director / Operations

Approved by: Eddie Rowe, Country Director


Table of Contents

1. SOP Context
   1.1. Background
   1.2. Purpose
2. Monitoring Procedures
   STEP 1: Prepare or revise performance monitoring narratives and logical frameworks
   STEP 2: Prepare CSP Monitoring, Review and Evaluation (MRE) Plan and draft M&E Budget
   STEP 3: Develop a monitoring toolkit
   STEP 4: Collect primary data and collate secondary data
      4.1. Process monitoring
      4.2. Output monitoring
      4.3. Outcome monitoring
   STEP 5: Capture, compile and validate data
   STEP 6: Analyse data and prepare information products
   STEP 7: Make use of monitoring findings: Take action and document lessons
   STEP 8: Conduct evaluations or reviews
      8.1. Planning
      8.2. Inception
      8.3. Data collection and management
      8.4. Reporting and dissemination
3. Roles and Responsibilities
Annexes
   Annex 1: Sudan CO process monitoring master tool
   Annex 2: Monitoring findings' action plan tracking system
   Annex 3: List of acronyms


1. SOP Context
1.1. Background

WFP Sudan’s operation is one of the largest and most complex operations globally, with
over 8 million unique beneficiaries receiving assistance in 2023 amid unprecedented
insecurity and access challenges following the conflict which erupted in April 2023.

In this context, monitoring is a key priority to inform operational decision-making throughout the Country Strategic Plan (CSP) cycle, from the design of the CSP through to the establishment of a logical framework. Monitoring generates evidence by measuring the outcomes, outputs and processes of WFP's programmes, in line with the Strategic Outcomes and corresponding Activities in the CSP logframe.

This involves continuous data collection, validation, triangulation, analysis, reporting and dissemination of findings and recommendations. On a secondary level, monitoring generates data for evaluative purposes and corporate reporting, as well as for further evidence-building at all organizational levels.

To this effect, the Sudan Country Office has developed a robust monitoring system aligned with the Corporate Results Framework and based on the Sudan CSP logical framework. The M&E system is tailored to the nature of the emergency and the current humanitarian response, and complies with the corporate Minimum Monitoring Requirements. The monitoring system is designed to measure process, output, outcome and cross-cutting indicators.

1.2. Purpose

The Sudan CO monitoring SOPs describe the common procedures, timing and responsibilities for the design and implementation of the Sudan CO monitoring system. They are intended to ensure that anyone assigned responsibility for managing the M&E function at the Country Office and Field Offices follows the same steps and timelines, and is guided by the same underlying principles and processes in monitoring data collection, analysis and reporting. They provide monitoring staff with the information needed to perform their tasks appropriately by describing, step by step, how the monitoring system is implemented in Sudan, following the corporate WFP CSP monitoring cycle (see Figure 1).

Ultimately, the SOPs contribute to improvements in WFP Sudan's monitoring activities, help better inform programme adjustments, and showcase results.


The primary users of these monitoring SOPs include Country Office management, the programme unit, M&E staff, the Hub and Field Offices, and the reports and donor relations unit. The proposed roles and responsibilities of relevant staff are included under each functional area and step of the monitoring cycle shown in Figure 1. The monitoring SOPs are mandatory for the conduct of Sudan CSP monitoring, but remain adaptable and subject to revision and re-approval to align with changes in CO strategy, operations and context.

Figure 1: WFP CSP Monitoring Cycle

The SOPs are fully in line with the Corporate Standard Operating Procedures for CSP Monitoring and the Corporate Monitoring Guidance.

2. Monitoring Procedures
The eight steps of the Monitoring SOPs are described in this section.

STEP 1: Prepare or revise performance monitoring narratives and logical frameworks.

The development of logframes is an important first step in the CSP monitoring cycle, establishing intended results and accompanying indicators to be tracked over the life of the CSP. The development or review of logical frameworks for the CSP and funding proposals is the responsibility of the CO M&E Unit. Technical units should be consulted for inputs during the development or review of the logical framework. CSP logframe development is done during CSP development or Budget Revisions (BRs).

Revision of the logical framework during any BR should be informed by CSP strategic changes or adjustments. If there is a challenge in reporting any of the committed indicators, or any other challenge warranting changes to the logframe, the M&E Unit should ensure that a review of the CSP logframe is part of the BR objectives, so that proposed changes can be made.

The Head of Programme is responsible for approving the logframe at CO level; this approval is done outside the COMET system. The CSP logical framework should be validated in COMET only after clearance by the Head of M&E and approval by the Head of Programme. After approval at CO level, the logframe is submitted to RBN for validation and subsequent submission to HQ for approval.

In addition, all logframes developed for different funding proposals should be aligned with the CSP logframe in terms of the types of indicators committed and targets, as applicable. Aligning logical frameworks for donor reporting with the CSP logical framework limits changes to the monitoring systems and additional monitoring needs, thus ensuring efficiency. However, in exceptional cases, new indicators may be incorporated based on donors' requirements.

STEP 2: Prepare CSP Monitoring, Review and Evaluation (MRE) Plan and draft M&E Budget.

To ensure that indicators are systematically monitored and reported on, and that reviews and evaluations are planned and budgeted for, a VAM Monitoring & Evaluation Planning and Budgeting Tool is required. The tool captures all indicators included in the CSP logframe, along with the timing, frequency and costing of activities throughout a CSP, and related staffing needs, equipment and services. It is linked to the corporate budget structure and feeds into the Country Portfolio Budget (CPB). The tool is updated annually, in January, to ensure that adequate resources are planned for monitoring needs. The update includes reporting on actual activities undertaken in the past year and actual expenditures, and adjusting planning for the upcoming years of the Country Strategic Plan if needed.

The logframe can include indicators other than those in the indicator compendium, responding to specific donor requirements. In addition to outcome and output indicators, monitoring of process indicators and cross-cutting indicators (for gender equality and women's empowerment, protection, accountability to affected populations, nutrition integration and environmental sustainability) should also be included. The tool describes how each indicator will be monitored and is presented as a matrix. It also includes basic information on reviews and evaluation plans.

The process for preparing the VAM Monitoring & Evaluation Planning and Budgeting Tool should be participatory, and the final tool endorsed by the Head of Programme. The tool should be prepared by the CO M&E Unit in consultation with activity managers, programme and BPU staff, immediately after CSP approval, and submitted within the first quarter. This timeline allows for ease of reporting, especially of the previous year's expenditures after the annual financial closure. The approved plan and implementation report, once cleared by the Head of Programme, are shared with the Regional Office through the Monitoring and Review Advisor for review and to facilitate clearance by HQ.

STEP 3: Develop a monitoring toolkit.

The MRE Plan lists the sources of data and processes through which the data will be
collected as well as the data collection methods. The main data collection methods in
the Sudan CO include process monitoring, output monitoring and outcome monitoring.

Process monitoring is the continuous monitoring of the implementation of WFP operations, reflecting how well activities are implemented in the field, how compliant partners are with implementation guidelines, and how accurate the information in partner reports is. Output monitoring measures the assistance provided, such as the number of beneficiaries reached, the quantity of food distributed, or the value of cash-based transfers made, in addition to other outputs. Outcome monitoring focuses on intermediate results and is conducted through representative sampling, systematically collecting and analysing data and comparing achievements against targets to determine whether WFP's interventions are leading to the intended changes at country level.

The tools (checklists, questionnaires and databases) required to conduct each type of monitoring need to be developed. This involves adapting the corporate methodological guidelines and data collection templates provided in Survey Designer1, or creating new ones as required. All monitoring tools are expected to be developed by the CO M&E unit in consultation with the technical units and Hub programme staff, based on the approved CSP logframe indicators in addition to various programmatic needs. Reviews and evaluations are done in consultation with Country Office management and relevant units. Surveys and checklists are first developed in English and then translated into Arabic by WFP, and the translated forms are inserted into MoDa2. Final forms designed in MoDa are then cross-checked and tested to ensure that the flow, visibility and validity conditions are aligned and compatible. Details of the tools by monitoring exercise are provided in the CSP Monitoring, Review and Evaluation (MRE) plan and M&E toolkit. The monitoring tools can be updated on a regular basis, as required, in accordance with corporate guidance and relevant CO units' needs.

1 Survey Designer is an application that allows the CO M&E team to quickly and easily build standardized assessment and monitoring surveys. Standardized data collection tools assure that collected data meet corporate quality standards and enable data integration.

The monitoring tools, accompanying methods, analysis and implementation plan comprise the toolkit for CSP monitoring. All staff involved in the implementation of the CSP should be informed of its contents.

Development of the methods and required tools should begin as soon as the MRE Plan is complete and should be finalized before CSP activities begin. Once the tools are finalized and tested, the Analysis Plan should be prepared.

STEP 4: Collect primary data and collate secondary data.

Data collection procedures are described in line with the different types of monitoring.

4.1. Process monitoring

Process monitoring is the continuous monitoring of the implementation of WFP operations in the field. The Sudan CO conducts six main types of process monitoring, summarized in Table 1 below.

Table 1: Scope of process monitoring

- Distribution monitoring — Purpose: to ensure that the process of transferring food, nutrition products, cash and vouchers, and non-food items (NFIs) is done according to WFP recommended practices. Data collected at: distribution sites.
- Post-Distribution Monitoring (PDM) — Purpose: to collect data from households after they received the assistance, once they have been consuming or using the food or cash-based transfers. Data collected at: beneficiary households.
- Activity implementation monitoring (AIM) — Purpose: to monitor WFP activities such as asset creation (FFA), school feeding (including HGSF) and nutrition interventions. Data collected at: activity sites.
- Warehouse monitoring — Purpose: to verify the condition of food storage areas for CP warehouses. Data collected at: warehouses.
- Retail monitoring — Purpose: to monitor shops that participate in WFP cash-based transfer activities, and similar nearby shops that are not part of the WFP programme. Data collected at: shops.
- Market Diversion Monitoring — Purpose: to perform quick checks for signs of diversion through spot checks on retailers (WFP- and non-WFP-contracted) within the vicinity of in-kind General Food Distributions. Data collected at: shops.

2 MoDa is the main platform for data collection for WFP. The application, available on both web and mobile, allows M&E teams to collect data in a safe and secure manner.

Process monitoring data is collected each month, based on the approved Monthly Monitoring Plan (MMP), by WFP field monitors (FMAs) and Third-Party Monitors (TPMs).

4.1.1. Process Monitoring Planning

Monthly Monitoring Plans (MMPs) are developed at Hub Office level according to the implementation plan at the beginning of each month, outlining the monitoring activities and the sites selected. MMPs are developed using the Risk-Based Monitoring Framework (RBMF), which provides standardized templates for planning and tracking monitoring visits and monitoring coverage. This approach helps to better allocate monitoring resources and to ensure that the Minimum Monitoring Requirements are achieved across the year. The information from the RBMF templates helps the CO to develop detailed monthly monitoring plans based on the full list of activity sites, prioritizing sites that were not monitored in the previous month(s), sites with a high number of beneficiaries, sites with a high risk of issues based on previous reports, and sites that may soon become inaccessible due to security or weather conditions.
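The prioritization logic described above can be sketched in code. This is an illustrative example only: the field names, weights and scoring rules are assumptions for the sketch, not the actual RBMF template.

```python
# Hypothetical sketch of risk-based site prioritization for an MMP.
# All field names and weights are illustrative assumptions.
def priority_score(site):
    score = 0.0
    if site["months_since_last_visit"] >= 1:       # not monitored recently
        score += 2 * site["months_since_last_visit"]
    score += site["beneficiaries"] / 10_000        # weight by caseload size
    if site["issues_flagged_previously"]:          # high risk based on past reports
        score += 3
    if site["access_at_risk"]:                     # may soon become inaccessible
        score += 5
    return score

sites = [
    {"name": "Site A", "months_since_last_visit": 2, "beneficiaries": 15_000,
     "issues_flagged_previously": False, "access_at_risk": False},
    {"name": "Site B", "months_since_last_visit": 0, "beneficiaries": 4_000,
     "issues_flagged_previously": True, "access_at_risk": True},
]

# Highest-priority sites first for inclusion in the monthly plan.
ranked = sorted(sites, key=priority_score, reverse=True)
```

In this sketch, a site at risk of becoming inaccessible outranks one with a larger caseload, reflecting the prioritization criteria listed above; actual weights would be set by the CO M&E team.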

The Hub Office monitoring plans are shared with the CO M&E team, which then consolidates them, provides guidance, oversees the monitoring plans and tracks CO monitoring coverage. The consolidated MMPs are then shared with the Hub Offices and the TPM service provider for implementation.

The WFP Sudan CO has commissioned a TPM firm to complement its capacity and collect monitoring data, with the aim of increasing monitoring coverage in areas that WFP staff cannot access for security reasons. This ensures that WFP meets the corporate Minimum Monitoring Requirements (MMR) and benefits from the efficiency of private-sector firms in this field. WFP Sudan outsources around 70-80 percent of its process monitoring activities to the TPM service provider, particularly in access-constrained areas where WFP staff are unable to conduct monitoring activities.3

3 For more information, kindly refer to the Third-Party Monitoring SOP.


The MMPs are revised, as required, in the middle of the month to incorporate any new information received from programme units regarding the distributions.

The process monitoring planning flow is summarized in Figure 2, which also indicates responsible parties and expected timelines.

Figure 2: Process monitoring planning flow

1. Share the implementation/distribution plan with the Hub M&E focal points. Responsible: Hub Programme Team. Timeline: 15th of every month.
2. Develop the Hub Office MMP using the RBMF tools and share it with the CO M&E team. Responsible: Hub M&E focal point. Timeline: 20th of every month.
3. Review and consolidate the CO MMP, with feedback to the Hub. Responsible: CO M&E team. Timeline: 25th of every month.
4. Share the final MMP with the Hub Offices and TPM for implementation. Responsible: CO M&E team. Timeline: 28th of every month.

* The MMP is revised, as required, in the middle of the month to incorporate any new information received from programme units.

4.1.2. Process Monitoring data collection

Once the approved MMP is shared by the CO M&E team, Hub Offices and the TPM service provider plan their monitoring field visits. Both WFP FMAs and Third-Party Monitors are then deployed to the field for data collection according to the approved plan. Data collection is done using tablets with the ODK Collect application. Data is collected offline and uploaded to the MoDa server whenever monitors have internet access.

The Sudan CO has adopted a process monitoring master tool that allows data to be collected for each type of monitoring described earlier. Specific controls have been programmed to direct monitors to the relevant sections of the tool based on the type of monitoring being conducted. The MoDa master process monitoring tool is available in Annex 1.

Upon finalization of data collection at the selected locations, monitors conduct exit debriefs with the Cooperating Partners responsible for that area, while at Hub Office level WFP monitors debrief the Head of Programme.


In addition, during process monitoring visits, WFP field monitors or Third-Party Monitors may discover issues of concern that require immediate action and need to be reported without delay. These may include fraudulent activities or personnel misconduct, as well as circumstances which may negatively impact programme implementation and the well-being of beneficiaries. Issues that may hinder the provision of assistance in a dignified, safe and transparent manner shall be escalated immediately by the monitors, as described in the Sudan CO Monitoring Issues Escalation SOP.

4.1.3 Process monitoring Roles and Responsibilities

Responsibilities and timelines for process monitoring are summarized in Table 2.

Table 2: Process monitoring roles and responsibilities

- Hub Programme Team: Share the implementation/distribution plan with the Hub M&E focal points. Timeline: 15th of every month.
- Hub M&E focal point: Develop the Hub Office MMP using the RBMF tools and share it with the CO M&E team. Timeline: 20th of every month.
- CO M&E Unit: Review and consolidate the CO MMP (25th of every month); share the final MMP with the Hub Offices and TPM for implementation (28th of every month).
- Hub M&E focal point and TPM: Plan monitoring field visits and deploy monitors according to the approved MMP. Timeline: until the 5th of every month.
- WFP field monitors and TPM monitors: Collect data using tablets with the ODK Collect application; upload data to the MoDa server; immediately escalate issues identified during monitoring visits using the intake form. Timeline: 5th of the month to the 5th of the following month.
- Hub M&E focal point: Download data from MoDa and update the RBMF monitoring coverage tool. Timeline: 5th to 10th of every month.
- CO M&E Unit: Compile and clean monitoring data; produce and share the monthly process monitoring reports. Timeline: 10th to 15th of every month.


4.2. Output monitoring

As per the United Nations Development Group definition, outputs are changes in the skills, abilities or capacities of individuals or institutions, or the availability of new products and services, that result from the completion of activities within a development intervention within the control of the organization. They are achieved with the resources provided and within the period specified. The purpose of output monitoring is to ensure that planned progress against output results is sustained. Data collected through output monitoring enables managers to take the corrective actions needed to reach planned or intended output results, which serve the larger objectives and goals of projects and programmes. Output monitoring in the Sudan CO is managed by the Information Management unit.

4.2.1. Output Monitoring planning

In the planning phase, output indicators and the related planning figures are set in the
Needs Based Plan/Project Plan. The Needs Based Plan/Project Plan provides data on the
identified annual and overall beneficiary targets and needs for the entire duration of
the Project/Country Strategic Plan, including total beneficiaries, food and/or CBT rations,
and assistance calendar. Output indicators and planning figures should also be included
in partner agreements and thereafter in the monitoring visit plans.

4.2.2. Output Monitoring data collection

Output data in WFP Sudan are collected through standardized monthly Cooperating Partners' (CP) Distribution Reports (DRs), which are completed and submitted by CPs (or by WFP staff when implementing direct distributions). The report provides data on beneficiaries reached (disaggregated by sex and age) and the amount of food commodities or cash-based transfers (CBT) distributed. Other output data are collected through periodic activity progress reports (depending on the context) as well as CPs' final project reports (also known as "completion reports"), which the CO designs and standardizes in agreement with partners and includes in Field Level Agreements.


4.3. Outcome monitoring

Outcome monitoring measures the degree of achievement of the short- to medium-term results of WFP Sudan programmes resulting from outputs. Monitoring of outcome results is guided by the MRE plan and uses accepted social research methodologies to collect data that demonstrate strong evidence of achieved results. Outcome monitoring is further required to measure the degree of achievement of the outcome indicators included in the Sudan Country Strategic Plan (CSP) logframe.

4.3.1. Outcome Monitoring planning

The CO M&E team is responsible for planning and coordinating outcome monitoring data collection. The CO M&E team should prepare a detailed "project plan" prior to any outcome monitoring survey. The plan should include the following elements:

- main objectives and evidence questions;
- information needs, i.e. the outcome indicators of interest;
- contextual information;
- timeframe (for tool design, training of enumerators, data collection and analysis);
- survey budget;
- methodology, including sampling approach and sample size calculation;
- data collection tools;
- data cleaning, analysis and data management plan; and
- report writing.

The main objectives described in the plan can also be used to communicate about the
data collection activity with all stakeholders and can be translated for sharing, for
example, with relevant authorities such as HAC or COR to obtain the required permits.

Outcome monitoring is conducted at Strategic Outcome (SO) level, twice a year for SO1 and once a year for SO2 and SO3, in accordance with the CSP outcome indicator data collection frequency set out in the MRE plan.

The sample size for outcome monitoring will be representative of the target population, aiming at statistical reliability with a 95 percent confidence level (minimum 90 percent), ±5 percent precision, and a previously estimated prevalence of the main indicator to be assessed.
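The parameters above map onto the standard Cochran sample-size formula for proportions. As a minimal sketch (the optional finite-population correction is an assumption of this example, not a requirement stated in the SOP):

```python
from math import ceil
from statistics import NormalDist

def sample_size(prevalence, precision=0.05, confidence=0.95, population=None):
    """Cochran sample-size formula for a proportion, with an optional
    finite-population correction when the target population is small."""
    # Two-sided z-score for the chosen confidence level (e.g. 1.96 for 95%).
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    n = (z ** 2) * prevalence * (1 - prevalence) / precision ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)  # finite-population correction
    return ceil(n)
```

With an assumed prevalence of 50 percent (the most conservative value), 95 percent confidence and ±5 percent precision, this yields 385 households; at the minimum 90 percent confidence it yields 271.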


4.3.2. Outcome Monitoring data collection

WFP Sudan conducts sample household surveys to track the CSP outcome indicators on food security and livelihoods, nutrition indicators, and cross-cutting indicators covering gender, protection, accountability to affected populations, and environment. Outcome monitoring for institution-based activities such as School Feeding, Home Grown School Feeding and nutrition treatment is done through desk reviews and the Nutrition Database. The common methods used by WFP Sudan to collect outcome data are therefore household surveys, using both quantitative and qualitative data collection methods. The outcome monitoring exercise under SO1 is conducted jointly with the VAM team through the Food Security Monitoring System (FSMS) exercise. Data collection is primarily carried out by staff from both WFP and its government partners. However, if the FSMS cannot be carried out or is delayed, the CO M&E Unit may request the assistance of the TPM service provider for the outcome monitoring data collection exercise. In the event of security challenges or access constraints, remote monitoring could be envisaged to collect outcome data, subject to the availability of a beneficiary phone number database and network coverage.

Data is collected at household level using a questionnaire programmed on Android devices using Open Data Kit (ODK) and uploaded into the MoDa corporate software application. This application allows offline data collection and subsequent uploading of data to the server when an internet connection is available. With real-time upload of collected data, data collection progress and initial results are monitored through the MoDa platform, allowing quality assurance (completeness, accuracy) of the collected data.

Data collection tools are reviewed before each outcome monitoring activity to align with
corporate guidelines and incorporate new or adjusted needs. Before data collection,
enumerators are trained on the revised tools (questionnaires), sampling techniques and
basic research principles of data collection.

Desk review-based outcome results tracking, specifically for School Feeding and nutrition activities, is directly managed by the responsible technical units. For nutrition activities, data on nutrition performance indicators are reported and tracked through the National Nutrition Information System (NIS), while School Feeding outcome indicators are reported and tracked by the School Feeding Unit. In addition, user satisfaction surveys are conducted annually by the responsible technical units (UNHAS, Logistics Cluster) to report on user satisfaction-related outcome indicators. The CO M&E Unit is responsible for tracking implementation to ensure results are reported on time and uploaded in COMET, in addition to informing operational adjustments.


4.3.3 Outcome monitoring Roles and Responsibilities

Table 3 summarizes the key roles and responsibilities for planning, preparing and implementing outcome monitoring.

Table 3: Outcome monitoring roles and responsibilities

Outcome monitoring:
- CO M&E Team: Plan and prepare for outcome monitoring surveys; develop data collection tools; conduct data collection training; coordinate data collection; compile and clean data; analyse, report and disseminate the results; follow up on implementation of actions responding to recommendations; capture results in COMET.
- CO Technical Units: Review the scope of outcome monitoring; review the data collection tools; identify and implement proposed actions responding to recommendations; use results for decision-making.
- Hub Offices: Participate in reviewing the tools; support the implementation of outcome monitoring surveys (obtaining required permits, data collection, etc.); share results with CPs; identify and implement proposed actions responding to recommendations.

User satisfaction survey:
- CO M&E Team: Follow up with responsible technical units on survey implementation; capture results in COMET.
- CO Technical Units: Plan and implement the survey; analyse the results; share the results.


STEP 5: Capture, compile and validate data.

The completeness and accuracy of data are essential for their use in decision-making. It is therefore critical to ensure that all data collected during monitoring are uploaded to the MoDa server before moving to the analysis step. This requires following up with data collectors to ensure submission of all expected datasets, and with partners to ensure that output data from regular reporting, as per agreements, are uploaded into COMET.

Data validation/cleaning involves checking the completeness, quality and comparability of data to ensure it is adequate for analysis. In cases where data collection is outsourced, the contractual document should clarify the exact format in which the data will be received, whether any preliminary analysis will be required, and in what format the preliminary products will be presented. Datasets from outsourced processes must also be received within the timelines stated in the relevant contractual documents.

For the outcome and output levels, data should be processed and validated before final
values are entered into COMET. Where representative sampling is used, validation
includes comparing datasets with the expected number of cases based on the
sampling and determining whether the dataset is adequate for analysis. For output
data, validation and reconciliation must be done against relevant documents and other
data sources. Data required for corporate reporting – Annual Country Reports (ACRs) –
should be reconciled for consistency on a quarterly basis.

In addition, the CO M&E Unit will implement the corporate guidance addressing the
need for systematic and consistent practices to ensure that monitoring systems produce
high-quality data measuring the outputs, outcomes, cross-cutting priorities, and
processes of WFP’s programmes at the CO level. The guidance outlines two approaches
for ensuring high-quality data:

- Preventive measures: The first approach, setting up systems, processes, and tools
mainly prior to data collection, ensures that quality is an inherent characteristic of
the data produced. This is an ex-ante method for guaranteeing data quality through
the design of robust monitoring systems, anticipative action, and management of
staff resources.
- Detective controls: The second approach occurs after data collection and consists
of ex-post checks on the data generated and reported to identify possible
shortcomings in the five dimensions of data quality.4 These checks are known as
‘detective controls’ and apply at different stages of data collection, aggregation,
and reporting.

4
Completeness, timeliness, reliability, validity, and integrity. For more information, kindly refer to the
corporate guidance on Data Quality.
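The detective controls described above can be illustrated with a minimal sketch that flags records failing some of the data quality dimensions. The field names, required fields, and plausibility ranges below are hypothetical examples, not rules taken from the corporate guidance on Data Quality.

```python
# Illustrative ex-post 'detective control' checks on a submitted record,
# touching three of the five dimensions: completeness, timeliness, and
# validity. All field names and thresholds are assumptions for the sketch.
from datetime import date

REQUIRED = ("site", "visit_date", "hh_size")

def detective_checks(record, collection_deadline):
    """Return a list of quality flags for one monitoring record."""
    flags = []
    # Completeness: all required fields present and non-empty.
    for field in REQUIRED:
        if not record.get(field):
            flags.append(f"missing {field}")
    # Timeliness: the visit falls within the reporting window.
    if record.get("visit_date") and record["visit_date"] > collection_deadline:
        flags.append("submitted after deadline")
    # Validity: values fall within plausible ranges.
    if record.get("hh_size") and not (1 <= record["hh_size"] <= 30):
        flags.append("implausible household size")
    return flags

rec = {"site": "Port Sudan", "visit_date": date(2024, 5, 20), "hh_size": 45}
print(detective_checks(rec, date(2024, 5, 15)))
# → ['submitted after deadline', 'implausible household size']
```

Checks of this kind would be applied at each stage of collection, aggregation, and reporting, with flagged records returned to the responsible office for correction.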

STEP 6: Analyse data and prepare information products.

Once monitoring data is compiled, cleaned, and validated, it must be analysed to
discern trends, achievements, and challenges, guided by the Analysis Plan. Such analysis
should include consideration of the reasons for under-achievement of indicators and
the actions required to improve performance. Similarly, success factors contributing to
good performance should be identified for dissemination. A key output of this analysis
is the development of information products, which might include, for example,
monitoring reports, public information materials, or input for donor briefs. These
information products should be circulated internally for decision-making and learning
purposes and externally5, as appropriate, for information sharing and accountability.
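The kind of comparison this step implies, actual indicator values against targets to flag under-achievement, can be sketched as follows. The indicator names, values, and targets are hypothetical examples, not actual CSP data.

```python
# Illustrative sketch: flag under-achieved indicators for the analysis
# narrative. Indicator names, values, and targets are hypothetical.

def achievement_rate(actual, target):
    """Actual value as a percentage of target, or None if no target is set."""
    return round(100 * actual / target, 1) if target else None

indicators = {
    "FCS - acceptable (%)": (62.0, 75.0),   # (latest value, annual target)
    "PDM coverage (%)":     (91.0, 90.0),
}
for name, (actual, target) in indicators.items():
    rate = achievement_rate(actual, target)
    status = "on track" if rate >= 100 else "under-achieved"
    print(f"{name}: {rate}% of target ({status})")
```

Flagged indicators would then feed the narrative on reasons for under-achievement and the corrective actions proposed in the information products.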

The CO M&E unit is responsible for producing the periodic reports and information
products summarized in Table 4.

Besides the periodic reports listed in Table 4, the Sudan CO also develops dashboards in
accordance with information needs. To facilitate communication and coordination, a
directory of the Sudan CO M&E dashboards6 is available and should be updated
regularly.

In addition, assessment, review, and evaluation reports are also expected to be
prepared and disseminated as appropriate.

Table 4: Periodic M&E reports/information products

Report/information product type | Frequency | Timeline
Monthly process monitoring report | Monthly | 15th of the following month.
Quarterly process monitoring report | Quarterly | End of the first month of the new quarter.
Semi-annual process monitoring report | Bi-annually | End of the first month of the new semester.
Outcome monitoring report | Bi-annually | One month after completion of data collection.
Annual Country Report | Annually | As per corporate guidance.
Weekly Monitoring Issues Escalation Report | Weekly | Every Monday.
Monthly Monitoring Issues Escalation Report | Monthly | 5th of the following month.

5
Before being shared externally, the reports should be vetted in content for an external audience and
approved by the Head of M&E and Head of Programme.
6
The Sudan CO M&E dashboards directory is available here.

Regarding the Annual Country Report (ACR), while overall coordination sits with the
Performance and Reports Unit, the CO M&E unit holds the following responsibilities:

• Coordinate with technical units in setting annual targets for all outcome
indicators and ensure that COMET is updated in a timely manner.
• Ensure timely availability/generation of output and outcome results in
alignment with the approved CSP logframe.
• Ensure timely, complete, and quality data entry of output and outcome results
in COMET.
• Prepare a waiver for all indicators not being reported and seek management
approval before sharing with the Regional Bureau.
• Support technical units with the preparation of narratives for outcome-level
results, by ensuring that results, including short narratives, are shared with
technical units in preparation for the ACR.

The CO M&E team will maximize the internal dissemination of monitoring evidence to
ensure that data collection has been relevant and that the necessary information has
been derived from it. Approval from management is required before sharing
monitoring information products with external audiences.

STEP 7: Make use of monitoring findings: Take action and document lessons.

The main purposes of monitoring are to inform decision-making, spur actions to
improve implementation, and inform the design or revision of CSPs. It is therefore
important that procedures be put in place to ensure that monitoring findings are
utilized and actions are taken based on these findings.

To this effect, an escalation system is established to detect and address operational
issues, maintain efficiency, facilitate programmatic adjustments, and ensure that WFP
Sudan remains accountable to the people it assists.7

In addition, regular meetings between M&E, programme, other technical staff, and
management are required to review monitoring findings and implement actions to
improve programme implementation and design (see Table 5). The output of these
meetings should be action plans stating the agreed actions based on monitoring
findings, those responsible for implementation, and timelines. A tracking system is
established to ensure that agreed action plans are implemented, as presented in
Annex 2.

Table 5: Regular meetings on monitoring findings8

# | Targeted audience | Frequency | Timeline
1 | CO M&E; Management (DCD, Head of Programme) | Quarterly | End of the first month of the new quarter.
2 | CO M&E; Programme Team; Head of Programme | Monthly | Second programme meeting of the month.9
3 | Hub M&E Officer; Hub Management (Head of Hub, Hub Head of Programme); Hub programme and technical staff | Monthly | Last week of the month.

STEP 8: Conduct evaluations or reviews.

This section describes the procedures for conducting reviews and evaluations. These
procedures may be applicable to other specialized assessments.

7
Please refer to the Sudan CO Monitoring Issues Escalation SOP for more information.
8
In addition to the regular meetings, ad hoc meetings to review monitoring findings could be organised as
required.
9
The Sudan CO programme team organizes a bi-weekly programme meeting. This meeting should also be
used to discuss monitoring findings, among other topics.
According to the WFP Evaluation Policy (2022), a review is the periodic or ad hoc
assessment of the performance of a programmatic intervention, or a specific aspect of
it, to inform operational decision-making and support learning and accountability.
Reviews do not have to conform to specified external reporting or publication
requirements or to the international standards applicable to evaluation, but they must
comply with the standards of the United Nations System-wide Action Plan on Gender
Equality and the Empowerment of Women. The Evaluation Policy also defines an
evaluation as an assessment, conducted as systematically and impartially as possible, of
an activity, project, programme, strategy, policy, topic, theme, sector, operational area,
or institutional performance. It analyses the level of achievement of both expected and
unexpected results by examining the results chain, processes, contextual factors, and
causality using appropriate criteria such as relevance, effectiveness, efficiency, impact,
and sustainability. Evaluations adhere to the United Nations definition of evaluation.

There are three categories of evaluations within the WFP context: impact evaluations
(OEV-led), centralized evaluations (OEV-led), and decentralized evaluations (managed by
the CO/RBN).

This SOP, however, focuses on decentralized evaluations, specifically those managed at
country office level. The Decentralized Evaluation Quality Assurance System (DEQAS)
provides detailed guidance on the management of decentralized evaluations. Detailed
information on the different types of evaluation can be accessed here.

There are various types of decentralized evaluation, determined by the nature of the
subject of the evaluation, as follows:

• Activity evaluation: Assesses an ongoing or completed WFP activity, from design to
implementation and results. It can cover one or several activities within a CSP
but should not attempt to cover the entire portfolio. It can also cover one activity
across several CSPs in multiple countries.
• Pilot evaluation: Assesses a pilot intervention, defined as an experiment or test,
often small-scale at first, before introducing an intervention more widely.
Evaluations of pilot projects generate evidence on their relevance; their results,
whether intended or not; and how the pilot project has impacted target
communities, to determine whether the pilot can be scaled up in the same
country or elsewhere and, if so, under which conditions. The evaluation of a pilot
is critical prior to any scale-up or replication and before decisions are made on
the design of a potential successor intervention.
• Transfer modality evaluation: Assesses, notably in view of beneficiary preferences,
the appropriateness of the choice of transfer modality(ies), defined as the
mode in which assistance is transferred to intended beneficiaries (in-kind,
commodity vouchers and cash). Transfer modality evaluations assess the
intended or unintended results of various transfer modalities and their relative
efficiency, with the aim of understanding when, why and how a given transfer
modality or combination of transfer modalities best achieves the desired
outcomes compared to another.
• Thematic evaluation: Covers a theme across the entire CSP portfolio or selected
activities in one or more countries, or globally. Thematic evaluations provide a
“big picture” perspective on how WFP is performing; how the organization could
further improve in a given thematic area, such as gender equality and the
empowerment of women, capacity strengthening, and protection; and they
identify good practices in a given thematic area and within a range of
operational contexts.
• CSP outcome evaluation: Assesses all ongoing or completed activities under a CSP
outcome in one country. It should not attempt to cover the entire country office
portfolio.

Decentralized evaluations or reviews follow six key steps (planning, preparation,
inception, data collection, reporting, and dissemination), which are described in the
following sub-sections.

8.1. Planning

Planning of reviews and evaluations should be done either as part of CSP design or
during annual or periodic reviews of the CSP. CSP reviews respond to CO ad hoc
needs or demands (i.e., for learning), donor demands, or funding proposal
requirements. Planning should respond to the learning and accountability needs of the
CSP. Planning at CSP design is encouraged to ensure that funding needs are also
budgeted for. All planned reviews and evaluations and their implementation timelines
should be included in the annual VAM, M&E planning, and budget tool. Table 6
provides the key planning steps for reviews and evaluations.

Table 6: Key planning and preparation steps for reviews and evaluations

Reviews:
1. Ensure that the surveys have been included in the MRE plans and budgeted for.
2. Agree as to whether the survey will be done by an externally or internally recruited consultant.
3. Prepare a TOR with clear purpose, scope, and budget in consultation with the relevant technical unit.
4. Seek endorsement from the technical unit, clearance from the Head of M&E, and approval by the Head of RAM.
5. Organize the internal review team or recruit the external consultants/TDY.

Evaluations:
1. Ensure that the surveys have been included in the MRE plans and budgeted for.
2. Agree on the type of evaluation (thematic/activity, etc.) and whether the survey will be done by an externally or internally recruited team.
3. Identify the evaluation manager and establish the evaluation committee (Head of M&E).
4. Prepare a TOR with clear purpose, scope, and budget in consultation with the relevant technical unit.
5. Establish the evaluation committee and evaluation reference group.
6. Approval of TOR as per DEQAS guidelines:
- Draft ToR
- DE quality support service on the ToR
- Finalize the TOR
7. Recruitment of the evaluation firm/team.

8.2 Inception

The inception phase is mostly applicable where a decision is made to use an external
consultant to conduct reviews or evaluations. It entails an initial review of programme
documentation and literature, based on the subject/scope of the review or evaluation.
If implementation is done internally, the concept note/TOR serves this purpose. Based
on findings from this initial review, an inception report is developed, which should
include the scope, overall and detailed objectives, evaluation/review questions linked to
the specific objectives, methodology, information on data sources, sampling plans and
methodologies, data collection plans, key deliverables, and timeline. The draft
review/evaluation inception report is shared with M&E and technical units for feedback
and then finalized.

8.3 Data collection and management

The approach to data collection and management differs depending on whether the
review or evaluation is managed internally by the Sudan CO or externally. In the case of
a review/evaluation managed by the WFP CO, data collection tools are developed by
WFP, data collection is done by WFP monitors, and analysis is done by the M&E unit at
the country office. When the review/evaluation is managed externally, the development
of tools, data collection planning, and analysis are done by the external team(s). WFP
provides coordination and administrative support. If the review/evaluation team does
not have the capacity to recruit data collection teams, WFP provides support through its
monitors or coordinates the recruitment of data collection teams through hub offices.
For evaluations, WFP M&E support is restricted to coordination and facilitation only. All
processes will be guided by the approved inception report. Table 7 summarizes the key
steps for data collection and analysis.

Table 7: Key steps of data collection and analysis

Reviews (internal):
1. Prepare a survey tool and pre-test by WFP.
2. Develop a deployment plan in consultation with FOs.
3. Data collection, including close data quality checks during the survey.
4. Share the preliminary findings for feedback.
5. Incorporate feedback in the final review findings.

Reviews (external):
1. Review and provide inputs on the survey tool to the review team.
2. Liaise with FOs and CPs to support the review team's field mission.
3. Field work by the review team.
4. Provide feedback on preliminary findings shared by the review team.

Evaluations:
1. Review and provide inputs on the survey tool to the evaluation team.
2. Liaise with FOs and CPs to support the evaluation team's field mission.
3. Data collection to be conducted by the evaluation team.
4. Provide requested secondary data.
5. Coordinate provision of feedback on preliminary findings from relevant CO technical units.
6. Share feedback on the draft evaluation findings with the evaluation team.

8.4. Reporting and dissemination

Review and evaluation reporting uses the recommended corporate templates10. The
reporting process has three stages: preparation of the first draft report; preparation of
the second draft report, following the incorporation of comments from the targeted
audience; and the final report, after presentation of results based on the second draft.

The three stages ensure quality reporting, in addition to ensuring that the content of
the report is within and fully covers the scope of the evaluation. Specifically for
evaluations, the reports undergo a quality check by an externally outsourced quality
support service using a standard checklist.
10
Available here

Throughout the reporting process, the report undergoes different approval stages.
While the evaluation report approval process strictly follows the corporate guidelines
(DEQAS), the approval process for reports from other assessments and reviews is
decided at the CO. Reviews are generally approved by CO management, while
evaluation reports are approved by the chair of the evaluation committee (CD/DCD).

Dissemination of evaluation reports follows the corporate guidelines, and the reports
are made publicly available online (evaluations). Publication of review reports, or
conformity to specified external reporting or publication requirements, is not
mandatory; however, all reviews are expected to address the UNSWAP standards on
gender. Depending on the needs, such as a donor request, publication can be included
in the reporting.

The dissemination and follow-up phase has two main purposes, both of which aim at
maximizing the use of the evaluation findings. The key deliverables include: 1)
development of a management response to the evaluation recommendations; and 2)
publication of the report and dissemination of the findings.

Following the approval of review/evaluation reports, the M&E unit coordinates the
process of reviewing the recommendations and assigning them to the relevant
technical units. The technical units are tasked with identifying corrective actions in
response to the recommendations, with implementation timelines, using the corporate
management response template. The M&E unit follows up on the implementation of
these actions based on the agreed timelines.

3. Roles and Responsibilities

Table 8 below provides the overall roles and responsibilities for the monitoring, review,
and evaluation function within the Sudan Country Office.

Table 8: Roles and responsibilities of Sudan CO M&E function

Unit Roles and responsibilities


Country Office • Ensure availability of adequate human resources for the implementation of
M&E Unit monitoring activities.
• Design/review the CSP logframe for CO and RB clearance and HQ approval.
• Design and review funding proposal logframes.
• Prepare annual and quarterly budgets for monitoring, review, and evaluation
activities.
• Prepare annual Monitoring, Review, and Evaluation (MRE) plans for the
country office.
• Prepare annual work plans for monitoring, review, and evaluation activities.
• Prepare/review and update data collection tools in coordination with
technical units.
• Consolidate Monthly Monitoring Plans and track CO monitoring coverage
using the RBMF.
• Prepare and disseminate timely monitoring reports, and review/evaluation
results.
• Ensure regular entry of outcome data into COMET.
• Conduct regular meetings between M&E, programme, other technical staff,
and management to review monitoring findings and implement actions
related to improving programme implementation and design.
• Follow up on implementation of actions related to improving programme
implementation and design based on monitoring, review, and evaluation
findings.
• Coordinate the implementation of the Monitoring Issues Escalation System.
• Coordinate monitoring, review, and evaluation data collection activities.
• Ensure that all monitoring activities are aligned to the corporate
requirements and that all country office specific monitoring needs are met.
• Conduct oversight visits to hub offices.
• Support M&E staff in acquiring skills that support the implementation of their
activities.
• Coordinate the implementation of Third-Party Monitoring as appropriate.
• Provide technical support on M&E related issues.
• Maintain close communication with Regional Bureau (RB) team for latest
guidelines, updates, inputs, and support.
• Participate and contribute towards logical framework development and
review.
• Participate and contribute towards indicator target setting for both output
and outcome indicators.
Country Office • Provide specific information needs to be incorporated into data collection
Technical units tools.
• Provide programme briefs and technical trainings to field monitoring team
as required.
• Use monitoring, review, and evaluation findings to prepare programme’s
improvement actions and ensure implementation.
Hub Offices M&E • Implement monitoring activities at the hub level in coordination with the CO
M&E Unit.
• Prepare Monthly Monitoring Plans based on the planned operations and
share them with the CO M&E Unit.
• Ensure adequate and balanced monitoring coverage using the RBMF.
• Coordinate the deployment of Monitors (both WFP and TPM) to collect data
in the field according to the approved MMPs.
• Support the CO M&E Unit in the implementation of the Monitoring Issues
Escalation System.
• Conduct field level training for TPMs based on monitoring tools and M&E
training provided by CO.
• Communicate the roles of the TPMs to Cooperating Partners (CPs) at the field level
and liaise with both TPMs and CPs for day-to-day organization.
• Conduct monthly random spot checks to verify TPMs’ work.

Field Monitors • Collect monitoring data according to the monitoring plans.
• Escalate in a timely manner any issues of concern that require immediate action
and need to be reported without delay.
• Provide implementation/operation plans to Hub M&E Officers to prepare the
Monthly Monitoring Plans.
Hub Office • Prepare/ contribute towards actions to recommendations from monitoring,
technical units review and evaluation results and ensure implementation in coordination
with CO based technical units.
• Use Monitoring, review and evaluation results for decision making.
• Support the implementation of monitoring of activities.
Hub Offices management • Encourage a culture of using results/evidence for decision
making.
• Ensure appropriate segregation of duties for the monitoring team at the hub
level.
• Resourcing and budget allocation for monitoring activities.

Management • Ensure that operational and strategic decision-making is evidence-based.
• Encourage a culture of using results/evidence for decision making.
• Support the implementation of monitoring of activities.

In addition to the internal roles and responsibilities described in Table 8, WFP Sudan
partners (namely CPs and TPMs) also play an important role in the M&E function within
the Sudan Country Office. Within the M&E workstreams, CPs are expected to:

• Share the required information on food and cash distributions, such as exact
locations and dates, to facilitate monitoring visits by WFP Field Monitors and TPM
monitors.
• Maintain close communication with WFP FMAs and TPMs to conduct monitoring
visits.
• Certify TPM attendance at distribution sites during the spot checks conducted by
WFP.
• Prepare and submit the monthly reports to WFP in a timely manner, as indicated
in the Field Level Agreements (FLAs).
• Support outcome monitoring data collection as required.
• Participate in review/evaluation exercises as required.

These roles and responsibilities should be communicated to all the CPs during the
onboarding stage. The TPM roles and responsibilities are described in detail in the
Sudan CO SOPs for Third-Party Monitoring (TPM).

With regard to segregation of duties, it is important to separate duties between the
M&E team and the teams involved in programme implementation. Such separation
helps avoid conflicts of interest and ensures that staff who design programmes and
Field Level Agreements (FLAs) are not the same as those who monitor programme
implementation and outcomes. CO management, together with hub office
management, also needs to ensure that M&E teams are fully dedicated to their
monitoring activities and are not diverted to perform other programme-related tasks.

Annexes

Annex 1: Sudan CO process monitoring master tool

Link to the Process Monitoring Master Tool

Annex 2: Monitoring findings’ action plan tracking system

Link to the tracking system documentation

Annex 3: List of acronyms

ACR Annual Country Report


BPU Budget and Programming Unit
BR Budget Revisions
CBT Cash-based transfers
CO Country Office
COMET Country Office Tool for Managing (programme operations) Effectively
COR Sudan Commission for Refugees
CP Cooperating Partners
CPB Country Portfolio Budget
CSP Country Strategic Plan
DEQAS Decentralized Evaluation Quality Assurance System
FFA Food for Asset
FMA Field Monitoring Assistant
FSMS Food Security Monitoring System
HAC Sudan Humanitarian Aid Commission
HGSF Home Grown School Feeding
M&E Monitoring and Evaluation
MMP Monthly Monitoring Plan
MMR Minimum Monitoring Requirements
MRE Monitoring, Review and Evaluation
ODK Open Data Kit
OEV Office of Evaluation
RBMF Risk-based Monitoring Framework
RBN Regional Bureau Nairobi
SO Strategic Objective
SOP Standard Operating Procedures
TOR Terms of Reference
TPM Third Party Monitoring
VAM Vulnerability Analysis and Mapping
