Evaluation & Impact Assessment
[Figure: Families of evaluation theories. Programme evaluation theory draws on social science theory; examples shown include Utilisation-Focused Evaluation, Jennifer Greene's values-engaged approach, and Theory-Driven Evaluation, each summarised by who developed it, what it is based on, and its key principles.]
Empowerment Evaluation
Empowerment evaluation, developed by David Fetterman, uses evaluation to foster program improvement and self-determination. The evaluator engages a diverse range of program stakeholders and acts as a "critical friend" or "coach" while guiding them through the evaluation process. Empowerment evaluation seeks to increase the probability of program success by providing stakeholders with the tools and skills to self-evaluate and mainstream evaluation within their organization. Fetterman outlines three main steps for conducting empowerment evaluation: (1) develop and refine the mission, (2) take stock and prioritize the program's activities, and (3) plan for the future. Read more on Fetterman's empowerment evaluation approach.
Theory-Driven Evaluation
Huey Chen, PhD, is one of the main contributors to Theory-Driven Evaluation. His approach focuses on the theory of change and the causal mechanisms underlying the program. Chen recognizes that programs exist in an open system consisting of inputs, outputs, outcomes, and impacts. He suggests that evaluators should start by working with stakeholders to understand the assumptions and intended logic behind the program. A logic model can be used to illustrate the causal relationships between activities and outcomes. Chen offers many suggestions for constructing program theory models, such as the action model (i.e., a systematic plan for arranging staff, resources and settings to deliver services) and the change model (i.e., a set of descriptive assumptions about the causal processes underlying the intervention and outcome). Evaluators should consider using this approach when working with ...
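To make the logic-model idea concrete, here is a minimal sketch in Python; the program content (trainers, field schools, yields) is a hypothetical example, not drawn from Chen's writing, and only illustrates how a logic model links inputs, activities, outputs, outcomes and impact.

```python
# Minimal sketch of a logic model as a simple data structure (hypothetical example).
# The ordering of the keys reflects the assumed causal chain:
# inputs -> activities -> outputs -> outcomes -> impact.
logic_model = {
    "inputs":     ["trainers", "budget", "training materials"],
    "activities": ["run farmer field schools", "distribute improved seed"],
    "outputs":    ["farmers trained", "seed packets delivered"],
    "outcomes":   ["adoption of improved practices", "higher yields"],
    "impact":     ["improved household food security"],
}

def describe(model: dict) -> None:
    """Print the assumed causal chain stage by stage."""
    stages = list(model)
    for i, stage in enumerate(stages):
        arrow = " -> " + stages[i + 1] if i + 1 < len(stages) else ""
        print(f"{stage.upper()}{arrow}")
        for element in model[stage]:
            print(f"  - {element}")

if __name__ == "__main__":
    describe(logic_model)
```

Writing the stages out in causal order makes the program's assumptions explicit and gives the evaluator something concrete to test against observed results.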
Conclusion
These evaluation theories highlight the diversity of approaches within the field, each offering unique strengths tailored to different contexts and objectives. Whether the focus is on utility, stakeholder values, empowerment, or understanding causal mechanisms, these theories provide evaluators with frameworks that ensure the evaluation process is meaningful, inclusive, and impactful. By selecting the appropriate evaluation theory, evaluators can enhance the effectiveness of their work and contribute to the continuous improvement of social programs.
LFA is:
• An instrument for logical analysis and structured thinking in project planning
• A framework, a battery of questions which, if they are used in a uniform way,
provide a structure for the dialogue between different stakeholders in a project.
• A planning instrument, which encompasses the different elements in a process of
change (problems, objective, stakeholders, plan for implementation etc.). The
project plan may be summarized in a LFA matrix, the log frame.
• An instrument to create participation/ accountability/ ownership.
LFA is used during all the phases of a project cycle and is suitable for capacity development.
"One basic idea in the LFA method is that one should not siart talking about what one
wants to do (the_activities) ~ut ~bout the p_roblem that~~ be',olved and about what
one wants to achieve/ the ob1ecttves". - Sida (2002)~ ' V"
Information may be collected from them through an "LFA Workshop" or "GOPP" (Goal Oriented Project Planning).
Step 3 Probl~m Analysis/ Situation Analysis - an analysis of the problem that shall be solved
by the project and the reasons for its existence.
The causes are analyzed in order to find the reasons for the focal problem and, thereby, the
solutions/the relevant activities. The effects demonstrate the arguments (the needs) for
implementing the change/ the project.
The basic questions that a problem analysis should answer are the following:
• What is the main/focal problem that shall be solved with the aid of the project? (Why is change/a project needed?)
A problem tree is always "read" from the bottom up. The problems below lead to the
problems above. A problem analysis should preferably be made during a workshop to
which different stakeholders are invited.
Step 4 Objective Analysis - the picture of the future situation
The project group should set 3 levels of objectives: Overall objectives, Project purpose and Results.
Relationship between the problem analysis and the objective analysis:
Problem tree → Objective tree
• Effects → Development objectives
• Focal problem → Project purpose / Immediate objective
• Causes → Immediate results / Outputs
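A small sketch of how this mapping can be applied: each negative statement in the problem tree is reworded as a positive desired state, so causes become results/outputs, the focal problem becomes the project purpose, and effects become development objectives. The problem statements below are hypothetical.

```python
# Hypothetical problem tree, paired with the positive objective that replaces
# each negative state when the objective tree is drawn.
problem_to_objective = {
    # causes -> immediate results / outputs
    "farmers lack access to improved seed": "farmers have access to improved seed",
    "extension services are weak":          "extension services are strengthened",
    # focal problem -> project purpose / immediate objective
    "crop yields are low":                  "crop yields are increased",
    # effects -> development objectives
    "household incomes are low":            "household incomes are improved",
}

levels = {
    "Causes -> Immediate results / outputs": [
        "farmers lack access to improved seed",
        "extension services are weak",
    ],
    "Focal problem -> Project purpose": ["crop yields are low"],
    "Effects -> Development objectives": ["household incomes are low"],
}

# Read the problem tree from the bottom up and print the matching objectives.
for level, problems in levels.items():
    print(level)
    for problem in problems:
        print(f"  problem:   {problem}")
        print(f"  objective: {problem_to_objective[problem]}")
```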
Narrative summary: it defines the project structure. The components of the project are (a) development objectives, (b) immediate objectives, (c) outputs, (d) activities, and (e) inputs.
• Development objectives: It is also called as a broad objective or goal. It describes the development benefits which the respective target groups can expect to gain from the project or programme. It contains hints on the kind of benefits which are expected to accrue to the target groups and by what type of capabilities they should be enabled to keep up or improve their conditions in changing economic, social and institutional environments. Therefore, it is a long-term goal that a project/programme aims to achieve in synergy with other development interventions.
• Immediate objectives: Also called as direct objectives or project purpose. The purpose of a programme or a project describes the changes in behavior, structures or capacity of the target groups which directly result from the utilization of the deliverable outputs or results the programme or project is expected to yield. Thus, it is the specific change in behavior or condition to be seen at the end of the project/programme.
• Outputs: Also called as results, which describe the goods and services, the direct deliverables which are contributed/provided by a project or programme. The outputs or results must express the nature, scope and intensity of the support or the solution being sought. Thus, it is the tangible result of the project's activities which provides the opportunity for change.
• Activities: Measures or actions carried out by the project/programme in order to achieve and obtain the outputs/results. Therefore, these are the specific substantive tasks performed by the project team and/or associates.
• Inputs: Also called as resources, which are identified as the necessary means for performing activities; all project resources are provided by the participating parties.
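As an illustration of the five components, the sketch below holds a narrative summary in a simple data structure; the project content is hypothetical and the class name is only an assumption made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class NarrativeSummary:
    """The five components of a log frame's narrative summary."""
    development_objective: str                       # long-term goal
    immediate_objective: str                         # project purpose
    outputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    inputs: list[str] = field(default_factory=list)

# Hypothetical example project.
summary = NarrativeSummary(
    development_objective="Improved food security in the target district",
    immediate_objective="Smallholder crop yields increased by the end of the project",
    outputs=["500 farmers trained", "demonstration plots established"],
    activities=["conduct training sessions", "set up demonstration plots"],
    inputs=["trainers", "budget", "seed and tools"],
)
print(summary.development_objective)
```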
Objectively verifiable indicators: For each cell of the narrative summary, objectively verifiable indicators need to be developed. They should be SMART. It means that the indicators should be specific, measurable, attainable, relevant and timely.
- Specific: Key indicators need to be specific and to relate to the conditions the activities seek to change.
- Measurable: An indicator must be able to be measured in either quantitative or qualitative terms. Quantifiable indicators are preferred because they are precise, can be aggregated and allow further statistical analysis of the data. However, development process indicators may be difficult to quantify, and qualitative indicators should be used.
- Attainable: The indicator must be attainable at reasonable cost using an appropriate collection method. It should be feasible in terms of finances, equipment, skills and time.
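A rough sketch of how a draft indicator could be checked against the SMART requirements; the field names and the example indicator are assumptions made for illustration, and relevance still needs a judgment call rather than a simple field check.

```python
# Hypothetical structure for a draft objectively verifiable indicator.
indicator = {
    "statement": "Share of trained farmers using improved seed",
    "unit": "% of trained farmers",                   # specific: tied to the condition to change
    "baseline": 10,                                   # measurable: quantitative baseline
    "target": 60,                                     # measurable: quantitative target
    "collection_method": "annual household survey",   # attainable at reasonable cost?
    "deadline": "end of year 3",                      # timely: a clear time frame
}

def smart_gaps(ind: dict) -> list[str]:
    """Return the SMART-related fields that are still missing or empty."""
    required = ["statement", "unit", "baseline", "target",
                "collection_method", "deadline"]
    return [f for f in required if ind.get(f) in (None, "")]

missing = smart_gaps(indicator)
print("SMART gaps:", missing or "none")
```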
Implementation
The operational phase of a project commences when implementing activities begin in order to achieve the expected outputs/results. In most cases, this may be one or two years after the project concept had been established at the end of the design phase (PPM). In the meantime, log frame conditions may have changed, so that a verification of the PPM must take place during the operational planning.
Implementation should have a plan of operation and it should be established by the project team and will be documented as (a) work plans / work schedules, (b) project budget / resource plans, (c) personnel plans, (d) material and equipment plan / procurement plan / staff training plans.
Why have the different steps in the LFA analysis?
• Relevance: With the assistance of steps 1-4 (context, problem analysis, stakeholder analysis, objective analysis), we can make sure that we are doing the right thing, by involving the relevant stakeholders, dealing with the right problems and establishing the correct objectives, which enables us to select the right activities at a later stage. These steps ensure that the project is relevant in a problem-solving perspective.
• Feasibility: With the assistance of steps 5-7 (activity plan, resource planning, indicators of objective fulfillment), we can make sure that we are doing things in the right way, that the programme is made up of the right activities and with sufficient resources (personnel, equipment, time) to solve the problem.
• Sustainability: With the assistance of steps 8-9 (analysis of risks and assumptions), we can assess whether the project will be able to continue by itself, without external support, and that the project purpose is viable in the long term.
The Log Frame in Monitoring & Evaluation
Log frame hierarchy | Type of Monitoring & Evaluation activity | Level of information
Goal | Ex-post evaluation | Outcomes / Impact / Sustainability
Purpose | Evaluation at completion and ongoing review | Outcomes / Effectiveness
Outputs | Monitoring and review | Output
Activities, Inputs | Monitoring | Input / Outputs
Advantages
1. It ensures that fundamental questions are asked and weaknesses are analysed, in order to provide decision makers with better and more relevant information.
2. It guides systematic and logical analysis of the inter-related key elements which constitute a well-designed project.
3. It improves planning by highlighting linkages between project elements and external factors.
Limitations of LFA
Block 1: Programme Evaluation
Unit 1: Introduction to Evaluation
Monitoring
• Monitoring is the ·systematic process of observing and recording on a
regular basis, the activities carried out in a project, to ensure that the
activities are in line with the objectives of the programme.
• Monitoring takes into account optimum utilization of resources, to assist the managers in rational decision making. It keeps track of the progress and checks the quality of the project or programme against set criteria and checks adherence to established standards.
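As a sketch of what keeping track of progress against set criteria can look like, the snippet below compares hypothetical actual values with planned targets and flags shortfalls; the indicator names, figures and the 90% threshold are illustrative assumptions.

```python
# Hypothetical monitoring data: planned targets vs. actual achievement to date.
plan =   {"farmers trained": 500, "demo plots set up": 20, "budget spent (%)": 50}
actual = {"farmers trained": 430, "demo plots set up": 12, "budget spent (%)": 55}

for indicator, target in plan.items():
    achieved = actual.get(indicator, 0)
    progress = achieved / target * 100
    flag = "on track" if progress >= 90 else "attention needed"
    print(f"{indicator}: {achieved}/{target} ({progress:.0f}%) - {flag}")
```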
What is evaluation?
Programme evaluation is a continual and systematic process of assessing the value or potential value of an extension programme to guide decision-making for the programme's future.
When we evaluate ....
► We examine the assumptions upon which an existing or proposed
programme is based.
► We study the goals and objectives of the programme.
► We collect information about a programme's inputs and outcomes.
► We compare it to some pre-set standards.
► We make a value judgment about the programme.
► We report findings in a manner that facilitates their use.
Why evaluate?
Demands on extension for programme efficiency, programme effectiveness
and for public accountability are increasing. Evaluation can help meet these
demands in various ways.
► Planning
• To assess needs
• To set priorities
• To direct allocation of resources
• To guide policy
► Analysis of programme effectiveness or quality
• To determine achievement of project objectives
• To identify strengths and weaknesses of a programme
• To determine if the needs of beneficiaries are being met
• To determine the cost-effectiveness of a programme
• To assess causes of success or failure
► Direct decision-making
• To improve programme management and effectiveness
• To identify and facilitate needed change
• To continue, expand or terminate a programme
► Maintain accountability
• To stakeholders
• To funding agencies
• To the general public
► Programme impact assessment
• To discover a programme's impact on individuals and/or
communities
► Advocate
When to evaluate?
There are several basic questions to ask when deciding whether to carry out an evaluation. If the answers to these questions are "No", this may not be the time for any evaluation.
► Is the programme important or significant enough to warrant an evaluation?
► Is there a legal requirement to carry out an evaluation?
► Will the results of the evaluation influence decision-making about the programme? Will the evaluation answer questions posed by your stakeholders or those interested in evaluation?
► Are sufficient funds available to carry out the evaluation?
► Is there enough time to complete the evaluation?
Role of evaluator?
The role of an evaluator is continually expanding. The traditional role of evaluator was a combination of an expert, scientist and researcher who uncovered clear-cut cause-and-effect relationships. Today evaluators are often educators, facilitators, consultants, interpreters, mediators and/or change agents.
An evaluator's credibility
An evaluator is judged by his or her competence and personal style. Competence is developed through training and experience. Personal style develops over time through a combination of training, experience and personal characteristics.
Competence:
► Background in the programme area being evaluated
► Capacity to understand a programme's context, goals and objectives
► Conceptual skills to design the evaluation
► Mastery of qualitative and quantitative approaches to evaluation data collection
► Basic quantitative and qualitative data analysis skills
► Report writing and presentation skills
Personal style:
► Communication skills
► Confidence
► Strong interpersonal skills
Evaluation criteria
1. Relevance - the extent to which the objectives of the development intervention are consistent with beneficiaries' needs and problems, country needs, global priorities and partners' and donors' policies; whether the objectives continue to be relevant.
2. Effectiveness - the extent to which the objectives of the development intervention were achieved, or are expected to be achieved, taking into account their relative importance.
3. Efficiency - examines how resources (inputs, funds, expertise, time) have been converted to results and whether the results were achieved at a reasonable cost.
4. Sustainability - the extent to which the benefits from a development intervention continue after major development assistance has been completed and the probability of continued long-term benefits.
5. Impact - the positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.
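The efficiency criterion can be illustrated with a simple cost-effectiveness calculation; the costs and outcome counts below are hypothetical figures for two interventions pursuing the same result.

```python
# Hypothetical comparison of two interventions pursuing the same outcome.
interventions = {
    "Intervention A": {"total_cost": 120_000, "beneficiaries_reaching_outcome": 800},
    "Intervention B": {"total_cost": 150_000, "beneficiaries_reaching_outcome": 1_250},
}

# Cost per beneficiary reaching the outcome: lower means more cost-effective.
for name, data in interventions.items():
    cost_per_outcome = data["total_cost"] / data["beneficiaries_reaching_outcome"]
    print(f"{name}: {cost_per_outcome:.2f} per beneficiary reaching the outcome")
```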
Approaches of Programme Evaluation:
1. Quantitative Approaches: Utilize numerical data and statistical analysis to measure program outcomes, such as surveys, pre-post tests, and administrative records (a worked pre-post sketch follows this list).
2. Qualitative Approaches: Focus on understanding the experiences, perceptions, and perspectives of program participants and stakeholders through methods like interviews, focus groups, and case studies.
3. Mixed-Methods Approaches: Combine quantitative and qualitative methods to provide a more comprehensive understanding of program effectiveness and impact.
4. Theory-Based Approaches: Use theories and conceptual frameworks to guide the evaluation process, including logic models, theories of change, and program theories.
5. Participatory Approaches: Involve stakeholders in the evaluation process, from planning and design to data collection and analysis, to enhance relevance, ownership, and utilization of evaluation findings.
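A minimal sketch of the pre-post test mentioned under quantitative approaches, assuming the scipy library is available; the participant scores are invented for illustration.

```python
from statistics import mean
from scipy.stats import ttest_rel  # paired t-test, suited to pre-post designs

# Hypothetical knowledge scores for the same 8 participants before and after training.
pre  = [52, 60, 45, 70, 58, 63, 49, 55]
post = [61, 68, 50, 74, 65, 70, 54, 60]

result = ttest_rel(post, pre)
print(f"mean change: {mean(post) - mean(pre):.1f} points")
print(f"paired t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

A paired test is used because the same participants are measured before and after the programme, so each person serves as their own comparison.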
Evaluation Principles
General principles of evaluation
As an evaluator, one must keep in mind the following points:
1. Clearly specify what is to be evaluated. Are you interested in evaluating knowledge acquired or attitudes developed?
2. Select an evaluation technique in terms of its relevance to the characteristics or performance to be measured. The tools for evaluating knowledge will be different from the tools for evaluating attitudes.
3. Comprehensive evaluation requires a variety of evaluation techniques. Even for evaluating knowledge you will require a variety of test items (such as multiple choice or short answer). This is a written test; you may also require an oral test or an open book examination to evaluate the knowledge base.
4. Awareness of the limitations of evaluation techniques improves their use.