Monitoring and Evaluation - HSM 2

The document outlines the purpose and importance of Monitoring and Evaluation (M&E) in health systems, emphasizing the systematic collection and analysis of data to improve project performance and health outcomes. It details the goals and objectives of the Kenyan National M&E System, the framework for effective M&E, types of evaluations, and the processes involved in monitoring and evaluating health interventions. Additionally, it discusses the components of Terms of Reference (TOR) for evaluations and the tools necessary for effective monitoring and evaluation.


17.1 Purpose of Monitoring and Evaluation (M&E) in a Health System

Monitoring is a systematic process covering the routine collection, analysis, and use of information about how well a project or programme is performing. It involves continuous review of the performance of all the components in the project to ensure that input deliveries, work schedules, targeted outputs, and other required actions are proceeding as per the work plans (MOMS & MPHS, 2011).

Evaluation is the periodic assessment of a project or programme to determine achievements against clearly set performance targets. The purpose of conducting an evaluation is to assess whether the project is making progress toward achieving its overall goals and objectives, and to provide opportunities for mid-course corrections to project implementation, if necessary (MOMS & MPHS, 2011).

Monitoring and evaluation (M&E) are fundamental components of any programme that aims to continuously improve and provide better health outputs and outcomes. Although there are differences between monitoring and evaluation, the two processes work together toward the same end result: producing information that can be used to continuously improve the performance of a given facility, department or programme, and to learn about what is working and what is not.

17.2 Goal and Objectives of the Kenyan National M&E System

The goal of the National M&E System is to provide timely and reliable information that enables the tracking of progress and enhances informed decision-making at all levels in the implementation of interventions under the health sector mandate in the country. The specific objectives are to:

i. Establish a reliable M&E system at national and county levels;

ii. Strengthen the M&E capacity of MoHS and health facilities to collect, analyse and use data for decision-making and health system improvement;

iii. Promote the importance of M&E, the need for systematic data collection, and the use of results and lessons learned in the further planning of health interventions by the government and its partners;

iv. Increase understanding of trends and explain changes in disease incidence or prevalence over time, as well as in morbidity and mortality rates and ratios;

v. Ensure accountability, transparency and the quality of information to achieve the desired results.
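Objective (iv) concerns trends in disease incidence and prevalence over time. The two underlying measures are standard epidemiological rates, sketched below in Python; the function names and district figures are illustrative, not drawn from the Kenyan system.

```python
def incidence_rate(new_cases: int, population_at_risk: int, per: int = 100_000) -> float:
    """New cases arising in a period, expressed per `per` population at risk."""
    return new_cases * per / population_at_risk

def prevalence(existing_cases: int, total_population: int, per: int = 100_000) -> float:
    """All existing cases at a point in time, expressed per `per` population."""
    return existing_cases * per / total_population

# Hypothetical district figures for one reporting year
print(incidence_rate(450, 250_000))  # 180.0 new cases per 100,000
print(prevalence(3_200, 250_000))    # 1280.0 cases per 100,000
```

Comparing such rates across reporting periods is what reveals the trends this objective refers to.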

17.3 Monitoring and Evaluation Framework

Effective M&E is based on a clear logical pathway of results in which results at one level are
expected to lead to results at the next level, leading to the achievement of the overall goal.
Consequently, if there are gaps in the logic, the pathway will not lead to the required results.

The major levels of an M&E framework are: inputs, processes, outputs, outcomes and impacts.

 Inputs – The people, equipment, materials and resources that are put into a project in order to implement it.

 Processes – The activities performed in delivering the project, such as training, meetings, treatment and distribution, among others. Processes associated with service delivery are very important and involve quality, unit costs, access and coverage.

 Outputs – The first level of results associated with the project, e.g. the number of people trained or services delivered in order to achieve outcomes. These are short-term results.

 Outcomes – The second level of results associated with the project (mid-term results), e.g. changes in health status, behaviour or skills. These must be related to the project goals.

 Impacts – The third level of results: the long-term consequences of a project, e.g. decreased mortality and morbidity over time.

17.4 M&E Indicators

An indicator is a measure used to track progress; like a signpost, it shows whether a project is on course. Valid and measurable indicators are crucial in an M&E system. Each of the M&E levels (inputs, processes, outputs, outcomes and impacts) has indicators to verify whether or not the desired objectives or activities have been implemented and achieved.

A minimal set of indicators is advisable in any M&E system. The following three golden
rules of M&E provide a good basis:

i. Define indicators that can be measured;

ii. Collect data that is useful for decision-making or from which lessons can be learned,
and;

iii. It is better to approximate an answer for a few important questions than to have an
exact answer for many unimportant questions.
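One way to honour the first two rules is to store each indicator with an explicit level, data source and target, so that an indicator with no measurement route or no decision-making use is caught at design time. A minimal sketch in Python; the field names are illustrative, not taken from any national standard.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str            # what is measured
    level: str           # input, process, output, outcome or impact
    data_source: str     # where the value comes from
    target: float        # value the project aims to reach
    actual: float = 0.0  # latest measured value

    def achievement(self) -> float:
        """Share of the target achieved so far (0.0 to 1.0 or above)."""
        return self.actual / self.target if self.target else 0.0

# Hypothetical output-level indicator
trained = Indicator("Health workers trained", "output",
                    "training attendance registers", target=40, actual=30)
print(f"{trained.achievement():.0%}")  # 75%
```

Keeping the set of such records small is exactly the "minimal set of indicators" the text advises.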

Table 17.1 Type and Sources of Information Required for Monitoring in the Health Sector

For each category of information, the table specifies what to monitor and evaluate, what records to keep, who collects the data, who uses the information, how the information is used, and what decisions can be made.

Work plan
 What to monitor and evaluate: timing of activities; availability of personnel and resources.
 Records to keep: monthly/quarterly/annual work plans; work schedules.
 Who collects the data: health facility implementation team; HoD.
 Who uses the information: HMT; HoD.
 How to use the information: ensure staff and other resources are available.
 Decisions: reschedule activities and the deployment of resources as needed.

Costs and expenditures
 What to monitor and evaluate: utilisation of resources; resources mobilised; expenditure; cash and bank transactions.
 Records to keep: project budget; accounting records; receipts; MOH/donor reports.
 Who collects the data: accountant.
 Who uses the information: HMT; HMC; auditor; MoH headquarters; donor partners.
 How to use the information: ensure funds are available to implement activities; ensure compliance with GoK and funding regulations.
 Decisions: authorise expenditures; make budget and project revisions; determine the need for other funding sources.

Staff supervision and development
 What to monitor and evaluate: knowledge, attitudes and skills of staff; experience, commitment and education of staff; salaries, benefits and other forms of compensation; job performance.
 Records to keep: work schedules; performance reviews; job descriptions.
 Who collects the data: HoD; HRM staff; quality teams.
 Who uses the information: HMC; HRM staff.
 How to use the information: staff motivation; staff development; resolving workplace problems.
 Decisions: task allocation; training needs; recruitment; disciplinary action; promotions.

Management of facility assets and other physical materials
 What to monitor and evaluate: equipment (e.g. computers, motor vehicles) and other stock; stock movement; compliance with procurement regulations.
 Records to keep: assets and stock registers; assets and stock movement logs/registers; invoices; inspection/service/audit reports.
 Who collects the data: HoD Supplies and Logistics; maintenance.
 Who uses the information: HoD.
 How to use the information: ensure availability of required physical resources; ensure good condition of physical resources.
 Decisions: authorisation for utilisation; minimum quantity to be kept; when to order; amount to keep in reserve for emergencies.

Performance results
 What to monitor and evaluate: outputs, outcomes and impact.
 Records to keep: minutes of project review meetings; minutes of departmental meetings; attendance registers.
 Who collects the data: HMC; project implementation team.
 Who uses the information: HMC; HoD; project team; donor agency.
 How to use the information: ensure objectives are realistic; assess the quality of services provided; assess the appropriateness of interventions.
 Decisions: revise objectives; retrain staff; revise project strategy and approach.

17.5 Types of Evaluation

Baseline/formative

This evaluation is conducted before implementation of the plan to assess needs and potential. It can also determine the feasibility of the plan.

Midterm Evaluation

Conducted during the implementation period to identify areas that require change or modification, detect deficiencies, and ensure immediate redesign of intervention strategies to forestall failed implementation.

Summative/End of Project Evaluation

This is conducted at the end of the project to assess the outcomes achieved as an effect of project activity implementation.

Post Project Evaluation

Evaluation conducted to measure programme sustainability after its successful implementation and closure.

Impact Evaluation

Evaluation to assess the long-term effects associated with successful project implementation.

17.6 Monitoring Process

The monitoring process takes the logical steps below, depending on whether one views the process from an accountability, a manager's or an evaluator's perspective. It involves:

i. Recording data on key indicators as a result of activities carried out;

ii. Analysing and processing data for consumption;

iii. Storing and retrieving information for use by different stakeholders;

iv. Reporting activity results based on activity timeframe;

v. Providing feedback to appropriate managers and stakeholders internally and externally.
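Steps (i), (ii) and (iv) above form a simple record-analyse-report cycle. The sketch below assumes monthly service records kept as plain mappings; the indicator names and figures are invented for illustration.

```python
from collections import defaultdict

# Step i: record data on key indicators as activities are carried out
records = [
    {"month": "Jan", "indicator": "clients_tested", "value": 120},
    {"month": "Jan", "indicator": "clients_on_art", "value": 45},
    {"month": "Feb", "indicator": "clients_tested", "value": 140},
]

# Step ii: analyse and process the data for consumption
totals = defaultdict(int)
for r in records:
    totals[r["indicator"]] += r["value"]

# Step iv: report activity results for the reporting timeframe
for indicator, total in sorted(totals.items()):
    print(f"{indicator}: {total}")
# clients_on_art: 45
# clients_tested: 260
```

In practice steps (iii) and (v) would add persistent storage and distribution of the report to stakeholders.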

17.7 Evaluation Process

The evaluation process will entail:

i. Designing the evaluation strategy;

ii. Holding a participatory planning meeting;

iii. Developing the evaluation plan;

iv. Implementing the evaluation plan;

v. Analysing the evaluation results;

vi. Participatory reflection on the results;

vii. Implementing improvements.

17.8 M&E Conceptual Framework

The M&E conceptual framework illustrates the assumed sequence of cause and effect that leads to a particular ultimate result. In the health sector, the ultimate result is a positive health impact on clients in any of the health areas.

Figure 17.1 The M&E Conceptual Framework

Input → Process → Output → Outcome → Impact

 Input: resources, staff, supplies.
 Process: activities, e.g. training, distribution.
 Output: services, trainees.
 Outcome: knowledge, improved services.
 Impact: incidence and prevalence rates.

Source: MSH (2013).

The conceptual framework demonstrates the process of monitoring and evaluation.

What you need to do at each level:

 Input level: Monitor whether resources, staff, supplies etc. are being provided.
 Process level: Monitor whether activities are happening.

 Output level: Monitor whether required outputs are generated by activities carried
out according to planned schedule.

 Outcome level: Evaluate whether there is gain in the expected areas.


 Impact level: Evaluate or conduct demographic health survey to show impact.

The results at every level are fed back to the preceding level, and the lessons learned are used to improve health provision and outcomes, as well as effectiveness and efficiency.

17.9 Evaluation Terms of Reference (TOR)

What is a TOR?

The TOR is a definition and structured description of the scope of work and the schedule to be carried out by the person, company or evaluation team undertaking an evaluation.

Characteristics of TORs

The terms of reference recall the background and specify the scope of the evaluation, its process, products and technical aspects, and state the main motives for the evaluation and the questions asked. They sum up the available knowledge and outline an evaluation methodology, describing the distribution of work, the schedule and the responsibilities among the people participating in the evaluation process. They also specify the qualifications required of candidate teams or individuals, as well as the criteria to be used to select the evaluation team.

The TOR serves as a 'contract' between the project/institution and the evaluators, outlining key elements, and should reflect strategic choices on what to focus on. The optimal TOR is one that satisfies the interests of all stakeholders concerned, although this is not always possible. Given the range of motivations for undertaking an evaluation, the TOR should retain enough flexibility for the evaluation team to determine the best approach to collecting and analysing data.

Components of TOR

At a minimum, TORs for all evaluations are expected to address the following sections.

A. Title

B. Background and context

Overview and historical context of project under evaluation, project justification and
implementation experiences/challenges, project documents and revisions thereof, project
objectives and expected outcomes.

C. Purpose of the evaluation (objectives)

Who commissioned the evaluation? Why at this point? What is it expected to accomplish? What decisions might the evaluation inform? Who will use the evaluation results, and how will they be involved?

D. Scope of work for the evaluation

There is a need to determine the unit of analysis to be covered: a project, a cluster of projects, a programme or a process within a project, together with the time period and geographical coverage.

E. Evaluation criteria and key evaluation questions

Identify the key evaluation questions to be answered by the evaluation, along with their related evaluation criteria: project relevance, efficiency, effectiveness, impact and sustainability.

F. Evaluation methodology

The methods used to collect and analyse the data, on which the quality of the evaluation depends, e.g. desk reviews, questionnaires, surveys, structured interviews, discussions, workshops, field visits, observations and retrospective baseline construction, together with the data sources and possible references for the evaluation.

G. Expected deliverables/outputs

Planned field missions and expected deliverables and respective timeframes including:

 Inception report, containing a refined work plan, methodology and evaluation tools.

 Draft evaluation report in line with institution evaluation policy and guidelines.
 Final evaluation report, including annex with management response.

 Presentation of evaluation findings, lessons and recommendations to the project and its stakeholders.

H. Timeframe

From evaluation inception to presentation of results.

I. Evaluation team composition

Qualified, independent and impartial evaluators not involved in project design or implementation; the team should be gender balanced, with balanced geographical representation.

J. Management of evaluation process

A roles and responsibilities matrix for all the evaluation stakeholders: evaluators, managers, technical unit, field staff and implementers.

K. Budget

L. Annexes:

 Job descriptions of evaluators.


 List of background documents for the desk review.

 List of stakeholders.

 Project/institution standard format and guidelines for evaluation reports.

17.10 Monitoring and Evaluation Tools

Basic monitoring tools are used to collect input, process and output indicators. The tools and
formats should focus on results and progress towards outcomes. These include:

Work Plan

The work plan is a planning tool that serves as a guide for implementing the action steps (activities) needed to achieve the stated overall goal and, particularly, the specific objectives of a project. It provides the framework for evaluating progress toward objectives and is the primary document used to monitor ongoing progress, adjust activities as needed and evaluate outcomes. The work plan is a 'living' document and, as such, it may change over the duration of implementation to reflect a more realistic implementation process.

Monitoring Plan

A monitoring plan is a set of requirements for monitoring and verification of the objectives achieved by a project during implementation. The tools used may include:

 Monthly service statistics summary registers;


 Financial reports;

 Monthly/quarterly institutional reports;

 Checklists;

 Questionnaires;

 Interview guides;

 Focus group discussion guides;

 Observation guides;

 Internet (secondary data).

Evaluation tools

Evaluation tools are used for assessing effect indicators (outcome and impact indicators). They focus on assessing programme outcomes and impact. These include:

i. Performance monitoring plan

A performance monitoring plan (PMP) serves as a roadmap for monitoring and evaluating programme performance throughout its lifespan. It is a detailed plan for managing indicators in order to monitor project performance, outcomes and impact. The PMP contains the performance indicators and their definitions, the data sources, the methods of data collection or calculation, when the data are collected, who is responsible, why the data are important, and who will use the data and for what purpose.
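The fields listed in the paragraph above map naturally onto one PMP row per indicator. A hedged sketch follows; the column set mirrors the paragraph, while the row's content is invented for illustration.

```python
# One PMP row, with one key per field the plan is expected to hold
pmp_row = {
    "indicator": "Proportion of facilities reporting monthly",
    "definition": "Facilities submitting the monthly summary / total facilities",
    "data_source": "District health information system",
    "collection_method": "Routine report extraction",
    "frequency": "Monthly",
    "responsible": "District M&E officer",
    "use": "Track reporting completeness for data-quality review",
}

# A complete PMP is then simply a list of such rows, one per indicator.
required = {"indicator", "definition", "data_source", "collection_method",
            "frequency", "responsible", "use"}
assert required <= pmp_row.keys()  # every required field is present
```

Validating each row against the required field set is a cheap way to keep the plan complete as indicators are added.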

ii. Evaluation plan

An evaluation plan is a written document that states the objectives of the evaluation, the questions, the information to be collected and the timeframe of the evaluation. The plan should contain sections describing the key questions to be addressed in the areas of expected learning from the evaluation as part of the evaluation framework, the programme implementation objectives, the outcome objectives and performance measures, and the procedures for managing and monitoring the evaluation.

17.11 Work Plans

Definition

A work plan is an annual or multi-year summary of tasks, timeframes and responsibilities used to support the implementation and evaluation of a programme. It is a valuable tool with a detailed account of how staff propose to accomplish their goals during project implementation: what actions need to happen, who will do them, when they will be completed and what resources will be required. A work plan is also used as a monitoring tool to ensure that the production of outputs and progress towards outcomes and impact are timely and reflect project goals.

Key elements of a work plan

i. Clearly defined goals, outputs and outcomes;

ii. Activities – tasks to achieve outputs, outcomes;

iii. Costs (budget) – indication of the activity’s cost;

iv. Monitoring and evaluation – ensures that measures to monitor and assess the
effectiveness of an activity are included such as recording achievements, collecting data,
and assessments.

Developing a work plan

Work planning is a comprehensive process that helps programme staff translate project/programme goals into operational terms on an annual basis. Monitoring and evaluation are integral parts of a work plan and provide a basis for tracking achievements and revising strategies on how best to achieve project goals.

Work plans set out how a project will achieve its clearly defined goals by converting project goals into smaller, manageable outcomes and tasks, ensuring that the skills, experience and resources available are used efficiently and sustainably. A work plan also helps a supervisor know what projects and activities supervisees will be working on over the next several months.

A work plan generally includes a brief introduction or overview of a project and a breakdown
of how individual project-related tasks will be accomplished through activities. The detailed
breakdown is usually tabulated with columns capturing specific activity descriptions,
outputs/outcomes, a timeline for completion, cost projections for implementation and staff
responsible (Table 17.2). It is mandatory to include monitoring and evaluation activities in
the work plan.

Table 17.2 Work Plan Template

The template has columns for: objective; activity description; output/outcomes/deliverables; responsibility; assumptions; budget (Ksh); a monthly timeline (June–May); and progress remarks.

Objective 1.1: Provide essential drugs for adults and the elderly with common health conditions, to increase their survival rate by 10%.

a) Conduct HIV/AIDS testing for adults and the elderly
 Output/deliverable: 10 clients receiving ART every month per facility.
 Responsibility: Jones Salama.
 Assumptions: 1,000 units of testing kits.
 Budget: Ksh 200,000.
 Timeline: every month, June–May.

b) Conduct outreaches quarterly
 Output/deliverable: 1 outreach every quarter.
 Responsibility: Cyril Mwema.
 Assumptions: transport, tent hire, brochures.
 Budget: Ksh 450,000.
 Timeline: quarterly.
Implementing Work Plans

Implementation is the process of taking a work plan and its concepts and putting them into action. The work plan serves as a guide for what needs to be accomplished, by whom, and within what timeframe. However, surprises do come up and changes to the work plan may be necessary. It is critically important that staff involved in work plan implementation are made aware of such changes and of how the changes may affect their role in the implementation process.

Throughout the implementation process, data related to the measures identified should be
collected. These data will be important in the monitoring and evaluation process to determine
whether or not the programme had the intended outcome.

Monitoring and Evaluating Work Plans

Monitoring of the work plan is done by assessing whether activities were implemented as initially planned, usually through monthly/quarterly activity implementation reviews. Enhanced process evaluation entails examining whether activities are being carried out correctly, on time and within budget. The results of the evaluation should be used to enhance or revise implementation.
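The 'on time and within budget' check in an implementation review can be reduced to a simple variance test per activity. A sketch, with a hypothetical 10% tolerance and invented figures:

```python
def within_budget(planned: float, actual: float, tolerance: float = 0.10) -> bool:
    """True if actual spend is within `tolerance` of the planned budget."""
    return actual <= planned * (1 + tolerance)

# Hypothetical activity budgets (Ksh) against actual spend
print(within_budget(200_000, 195_000))  # True: under budget
print(within_budget(450_000, 520_000))  # False: exceeds the 10% tolerance
```

The same pattern applies to schedule: compare the planned completion month against the actual one and flag activities that slip beyond an agreed margin.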

17.12 Evaluation Reports

According to the Business Dictionary, a report is a document containing information organised in a narrative, graphic, or tabular form, prepared on an ad hoc, periodic, recurring, regular, or as-required basis.

In programme management, a report is:

i. A compilation of descriptive information;

ii. A communication tool used to present M&E results, turning raw data and information into knowledge;

iii. An opportunity for project implementers to inform themselves and others (stakeholders,
partners, donors, etc.) on the progress, problems, difficulties encountered, successes and
lessons learned during implementation of programs and activities.

Reports may refer to specific periods, events, occurrences, or subjects, and may be
communicated or presented in oral or written form. Some questions to answer before writing a
report are:

 Have you considered the needs and characteristics of the readers (e.g. executives, the technical team, staff, donors, the general public)?

 If it is a public health report, does it make health care performance information clear,
meaningful, and usable by consumers?

Why report?

 Reporting enables the assessment of progress and achievements and helps focus
audiences on the results of activities, enabling the improvement of subsequent work
plans.

 Reporting helps form the basis for decision-making and learning at the programme level.
 Reporting communicates how effectively and efficiently a programme is meeting its
objectives.

Elements of a good report

 Self-explanatory statement of facts relating to a specific subject(s).


 Systematic and logical presentations of relevant ascertained facts, figures, conclusions and
recommendations.

 Time bound for timely decision making.


 Concise and objective.

 Appropriate grammar, language and tone for the consumers (avoid technical jargon).

 Complete and compact document.

Monitoring and evaluation reports

Types of reports

 Progress report -usually quarterly, semi-annual or annual.


 Evaluation report- mid-term, end-term evaluation.

 End of project report.

Guidelines for writing M&E reports

 Provide a one-page executive summary and ensure it accurately captures the content and recommendations of the report.

 Be as concise as possible given the information that needs to be conveyed.

 Focus on relevant results being achieved compared with the expected results as defined in
the log frame/performance monitoring plan and check that the expected results are
realistic.

 Specify actions to overcome problems and accelerate performance where necessary. (The
basis of this narrative is what you had planned and how you are responding. For example,
why something that was planned did not take place and what you plan to do about it).

 If findings are included in the report, make sure they are objectively verifiable.
 Be clear on your audience (directors, government, donor, technical persons, staff) and ensure
that the information is meaningful and useful to the intended reader.

 Ensure timely submission of progress reports


 Be consistent in your use of terminology, definitions and define any technical terms or
acronyms.

 Present data with the help of figures, summary tables, maps, photographs, and graphs.
 Include references for sources and authorities (if any) and a table of contents.

Progress report format

Cover page

 Name of institution;
 Reporting period;
 Name of person responsible for reporting/contact person.

Table of contents

Acronyms

Executive summary

This section should have one introductory paragraph and major highlights of findings and key lessons learned (1 to 2 pages).

Report body
This section should consist of a table of the hierarchical objectives with a short paragraph
describing significant outcome results, why your targets were met/not met and what steps to take,
lessons learned (if any) and highlights of activities for the next period. Tables, maps,
photographs, and graphs may be used where appropriate to enhance clarity and results
interpretation.

Figure 17.2 Sample Project Achievements Table

The table has columns for: objective; output/outcome/deliverable indicator; planned achievement; achieved; remarks on achievement; and next period activities (including when these are to be achieved).


 Achievements on outcome indicators;
 Achievements on milestones;

 Achievements on impact indicators.
