Module 6 Educational Evaluation

Module 6 of the Educational Evaluation course at Isabela State University focuses on the significance of educational evaluation and various evaluation approaches, methods, and techniques. It emphasizes the CIPP evaluation model and outlines the role of evaluation in improving educational quality, decision-making, and program effectiveness. The module also covers formative and summative evaluations, guiding principles for evaluators, and the importance of accreditation in maintaining educational standards.


BSEd Math (Isabela State University)


REPUBLIC OF THE PHILIPPINES


ISABELA STATE UNIVERSITY
ROXAS CAMPUS

COLLEGE OF EDUCATION

MODULE 6: Educational Evaluation

SEd PROF 222: Assessment in Learning 2

Submitted by:

ROSENDO, KRISTINE JANE

RARAMA, ASHLIE

SALADINO, KATE MAUREEN

SALAZAR, JHAY-MARK

Submitted to:

Prof. MARILENE MATUSALEM

Lessons in the module

In this module, you will be exploring these topics:

Lesson 1: Educational Evaluation
Lesson 2: Evaluation Approaches
Lesson 3: Evaluation Methods & Techniques
Lesson 4: The CIPP Evaluation Model

MODULE 6

Educational Evaluation

INTRODUCTION

Hello! Welcome to Module 6. In this module, you are to explore EVALUATION as a tool towards improving the quality of educational services and programs. Specifically, you will examine the SIGNIFICANCE OF EDUCATIONAL EVALUATION and A VARIETY OF EVALUATION APPROACHES, METHODS AND TECHNIQUES. This module emphasizes the CIPP EVALUATION MODEL, which remains a practical lens for a better grasp of the impact of evaluation on academic institutions and the educational system.

LEARNING OUTCOMES

Upon the completion of this module, you should be able to:

a) articulate the role of evaluation in ensuring the quality of education academic institutions provide;
b) compare various evaluation approaches, methods and techniques; and
c) explain the CIPP evaluation model as well as its utility in schools.

LEARNING CONTENT

Lesson 1: Educational Evaluation

What is Educational Evaluation?

Educational evaluation is an essential part of educational policy-making, planning, and implementation. It is a systematic, continuous and comprehensive process of determining the merit, worth, and significance of school initiatives and programs (Navarro, R. L. & Santos, R., 2013). Its main tenet is the holistic appraisal of a learner, his/her environment and accomplishments (Cajigal, R. & Mantuano, M., 2014).

Educational evaluation is not limited to the teacher-student engagement. All programs or activities in the school that went through the process of deliberate planning and implementation require an assessment of their worth and value (Reyes, E. & Dizon, E., 2015). Educational evaluation is deemed imperative as a tool in the continuous quality improvement of schools as institutions. In layman's terms, educational evaluation is the process of ascertaining the quality of education provided by schools.

As a tool for decision-making, educational evaluation generates data that may trigger changes in the current practices, programs, initiatives, activities and policies of schools. The results of evaluation shall become the basis for the formulation of appropriate educational decisions and actions (Kubiszyn, T. & Borich, G., 2000).

Instructional & Grading: Inside the classroom, teachers reach instructional decisions with respect to the extent of attainment of the intended learning outcomes. Data are obtained from test results and performance scores. Analysis will lead teachers to implement adjustments in the delivery of the lessons and the design of assessment tasks. This also includes decisions for promotion or retention of students in a particular grade level.

Diagnostic: Assessment of the strengths and weaknesses of the learners allows teachers to identify the root cause/s of the difficulty. Diagnostic assessment provides relevant information regarding the readiness of the students. Intervention and remediation programs must be based on needs assessment.

Selection & Placement: Evaluation data may also be gathered to select the students to be admitted to a program or activity. Moreover, the placement decision is made once the student is admitted to the school and usually intends to identify students who need remediation or enrichment classes.

Guidance & Counseling: Guidance and counseling initiatives are deemed more suitable if they are products of assessment. This includes the use of clinical observations. Evaluation results may become the basis for guidance and counseling initiatives in response to the needs of the learners.

Program/Curriculum: Based on the results of evaluation, school administrators may decide to continue, discontinue, revise or pursue a program, activity, or curriculum. Evaluation shall lead to better planning and implementation in succeeding school endeavors. Hence, evaluation should be an imperative in every school's processes and procedures.

Administrative Policy: Given the available resources of the school, a thorough evaluation of the efficiency of utilization of funding and assets shall provide the basis for modifications in plans, policies and processes. Decisions on whether to acquire new facilities, machinery and materials and whether to add more staff must be based on gathered data.

What are the guiding principles for evaluators?

According to the American Evaluation Association (2018), the five guiding principles for evaluators are as follows: systematic inquiry, competence, integrity, respect for people, and common good and equity.

Evaluation is an intrinsic and essential component of teaching and learning. The results of an evaluation in an educational setting may determine whether a student passes to the next grade level, a teacher gets promoted, a particular textbook will be used, a course will continue to be offered, a laboratory will require renovation, and a school regulation will be modified. Educational assessment typically uses preselected measurements such as norm-referenced standardized tools to measure and evaluate the quality of learners, instructors, classes, institutions or the educational system as a whole (ACSME, 2007).

Competency evaluation is a means for teachers to determine the ability of their students, not necessarily through a standardized test. Performance evaluation ascertains the extent of capability to demonstrate a particular skill. Course evaluation evaluates the quality of the delivery of a given course while program evaluation determines if a program "works". All of these are components of educational evaluation.

The evaluation process goes through four phases: planning, implementation, analysis, and reporting.


Planning: In the planning phase, there must be constructive alignment among objectives, programs and evaluation criteria. What are the program's conceptual underpinnings? What information is needed to make decisions? Which stakeholders will be directly involved in the process? Designing the data collection tool is also a foremost concern in this phase.

Implementation: In the implementation phase, the primary concern is the administration of the data collection tool. Extra care in data gathering and handling is a must to ensure the authenticity of findings.

Analysis: In the analysis phase, objectivity in interpretation and credibility of the findings are to be established. Appropriate quantitative and qualitative data analysis tools must be utilized carefully.


Reporting: In the reporting phase, the evaluation results are translated in concordance with the context of the recipients of the findings. Data presentation must lead to clarity and not confusion. Consequently, the results will lead to planning for program changes.

All these four phases complete the evaluation cycle regardless of the evaluation approach
employed by the academic institutions.

As evidence of the significance of educational evaluation, many schools pursue accreditation endeavors. DepEd, CHED and TESDA have established respective standards for K-12, tertiary and technical-vocational education. These standards have become the basis of the evaluation tools of several external accrediting agencies such as PACUCOA, PAASCU, ACSCU and AACCUP.

Activity 1: Word Cloud

Directions: Search the meanings of the words and acronyms shown in the word cloud below.

Which terms seem unfamiliar to you?


Activity 2: Pattern Fan

Directions: Fill up the template shown by writing the important ideas you have learned about educational evaluation.

(Template: a fan diagram with "Educational Evaluation" at the center)

Lesson 2: Evaluation Approaches

What are Evaluation Approaches?

Evaluation approaches refer to the different ways to view, design, and conduct evaluation activities. Some evaluation approaches provide solutions to problems; others improve existing processes and procedures. Generally, any evaluation process may employ either a formative or a summative approach depending upon the intent of the evaluation activity.


What is Formative evaluation?

Formative evaluation is an on-going process that allows feedback to be implemented during a program cycle. Formative evaluation is deemed a process-oriented approach where feedback is generated while the program is being run (Boulmetis, J. & Dutwin, P., 2005). Formative evaluation includes several types (Trochim, W., 2020): needs assessment, evaluability assessment, structured conceptualization, implementation evaluation, and process evaluation.

What is Summative evaluation?

Summative evaluation takes place at the end of a program cycle, providing an overall description of its effectiveness. Summative evaluation measures the extent of attainment of the program objectives. The results enable schools to determine the future direction of a program or initiative. Summative evaluation includes several types (Trochim, W., 2020): outcome evaluation, impact evaluation, cost-effectiveness and cost-benefit analysis, secondary analysis, and meta-analysis.


Moreover, House (1978) and Stufflebeam & Webster (1980) classified approaches for conducting evaluations based on epistemology, perspective, and orientation.

In terms of the ways of obtaining knowledge, the objectivist epistemology is associated with utilitarian ethics, which concurs that something is good if society as a whole is happy about it, and holds that it is possible to externally validate the knowledge acquired through publicly exposed evaluation methods and data. The subjectivist epistemology is associated with intuitionist/pluralist ethics, which posits that there is no single interpretation of "good" and that evaluation entails looking into both the explicit and the tacit knowledge.

In terms of perspective, evaluation approaches may be categorized as elitist or mass-based. An elitist perspective focuses on the views of the administrators and/or experts in the field or profession. On the contrary, the mass-based perspective puts the consumers at the apex of evaluation and is highly participatory in nature. The consumers may refer to the students, parents, community, and employers.

In terms of orientation, evaluation approaches may be clustered into political, question and values orientations. The political orientation or pseudo-evaluation approaches tend to selectively present information and are skewed towards certain perspectives or ideas. These types of evaluation include public relations-inspired evaluation (a feel-good evaluation focused on the positives of a program), politically controlled evaluation (multiple truths uncovered) and evaluation by pretext (the client has a hidden agenda for conducting the evaluation that is unknown to the evaluator).

The question orientation or quasi-evaluation approaches entail the collection of evidence to ascertain whether any change that has occurred is due to the program or intervention or to other confounding factors. An elitist quasi-evaluation employs experimental research (causal relationships), management information systems (scientific efficiency), testing programs (individual differences), objective-based studies (outcome-objective relationship) and content analysis (communication data). A mass-based quasi-evaluation, however, determines the extent of accountability based on well-defined performance expectations and accurate accounting of outcomes.

The values orientation or true evaluation approaches are concerned not only with goals, but also with whether the goals are worth achieving. The evaluator considers the impact, accomplishments and consequences of the program. A decision-oriented approach promotes the use of evaluation as a premise for educational decisions and planning activities. Policy studies include evaluation approaches that focus on assessing the potential costs and benefits of competing policies. A consumer-oriented approach determines how the school has satisfied the clientele's needs and expectations.

Additionally, accreditation is a mechanism that allows academic institutions to prove that they meet a general standard of quality. It is the formal recognition by an authoritative body of the competence to operate with respect to specified criteria. As a process, it is a form of peer review in which an association of schools, colleges and universities evaluates a particular institution based on an agreed set of norms, encouraging improvement of every affiliate member. As a result, schools receive recognition from the agency for having met the prescribed minimum requirements.

Certification, on the other hand, represents a written assurance by a third party of the conformity of a product, process or service to specified requirements. In the Philippine context, this may refer to the grant to operate certain programs in schools and universities.

Connoisseurship, as an outgrowth of art appreciation, advocates the use of qualitative evaluation. It attempts to discern the subtle but significant aspects of classroom life, schooling and education as a whole.

The adversary approach makes use of debate as its methodology. Two opposing views on issues are presented with a neutral party acting as the referee. Moreover, the client-centered approach places the unique needs of the clients at its core.

Evaluation ascertains how well a program, a practice, an intervention or an initiative achieves its goals. It helps in determining what works well and what could be improved. The selection of the evaluation approach to employ, however, is dictated by the intent of the institution to be evaluated.

Activity 3: Background Knowledge Probe

Directions: Based on your understanding of the types of assessment, identify which type must
be employed according to the intention of the evaluator. Indicate your response by putting a
check under the appropriate column.

Intention (check one: Formative Assessment or Summative Assessment)

1. Group students according to their achievement levels.
2. Provide timely feedback to students.
3. Help students to feel safe to take risks and make mistakes in the classroom.
4. Certify learning and award qualification.
5. Diagnose student learning needs.
6. Motivate students to increase effort and achievement.
7. Actively engage students in their own learning process.
8. Provide information about student performance to stakeholders.

Activity 4: Balancing Act


Directions: Choose three evaluation approaches. Describe the advantages and disadvantages
of each approach. Present your response using the template below.

Lesson 3: Evaluation Methods and Techniques

What are the different Evaluation Methods and Techniques?

Evaluation helps schools seek answers to questions such as "How are we doing?", "How do we know?", and "What are we going to do now?" It is ideal for investigating the influence of courses of action on the school's vision, mission, goals, learning and teaching practices, responses to changes, and operational procedures. Quality education program evaluation includes both qualitative and quantitative measures and evidence.

In deciding which evaluation methodology to employ, academic institutions must deal with theoretical and practical issues. Theoretical issues include the value of the type of data, the perceived scientific rigor of the data and the underlying philosophies of evaluation. Practical issues encompass the credibility of results, the skills of the staff, and financial and time constraints (NSF, 2010).

Quantitative methods focus on "what" and "how many" while qualitative methods focus on "why" and "how". To choose between them, you may use the flowchart below.

This comparison of the two methods is too simplistic. Both methods may or may not satisfy the canons of scientific rigor. Quantitative methods may seem precise if used properly and carefully; but if respondents fail to comprehend the items in a survey completely, then findings may be badly affected. Qualitative method setbacks, however, include the difficulty of gathering credible data sources, the time-consuming and costly nature of data collection, and the intricacy of data analysis and interpretation (Patton, 2002). Nowadays, to take advantage of the strengths of each method, the use of mixed methods is advocated.
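To make the quantitative side concrete, here is a minimal sketch of how an evaluator might summarize Likert-scale survey responses. The data, the `summarize` function, and the choice of reported statistics are illustrative assumptions for this module, not a prescribed procedure.

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to one survey item on a 5-point Likert scale
# (1 = strongly disagree ... 5 = strongly agree). Invented for illustration.
responses = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]

def summarize(likert_scores):
    """Return a simple quantitative summary for one survey item."""
    return {
        "n": len(likert_scores),
        "mean": round(mean(likert_scores), 2),
        # how many respondents chose each scale point
        "distribution": dict(sorted(Counter(likert_scores).items())),
        # share of respondents who agree or strongly agree (scores 4-5)
        "percent_favorable": round(
            100 * sum(s >= 4 for s in likert_scores) / len(likert_scores), 1
        ),
    }

print(summarize(responses))
```

In a mixed-methods design, numbers like these would be read alongside a qualitative strand, such as coded open-ended comments, rather than on their own.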

Different evaluation techniques have different purposes, work in different contexts, and give you different types of feedback. Depending on what you expect to obtain from the evaluation, you might find some techniques more useful than others. Listed below are the common techniques employed in education evaluations (NSF, 2010).
Technique and Salient Features

Surveys
- gather descriptive responses to a wide range of close-ended or open-ended questions
- require a large number of respondents
- may be administered by pen-and-paper or via web-based online data collection systems
- can be easily analyzed by existing software
- provide a general picture but may fail to consider the audience's context

Interviews
- regard the participants' viewpoint as meaningful and recognizable
- require a well-selected group of participants based on defined inclusion criteria
- may be done face-to-face or through telephone/video interview
- may use a carefully worded questionnaire (structured) or free-wheeling probing (unstructured)
- require skillfulness and flexibility in interviewing
- prone to information distortion by the interviewee
- produce a vast volume of information that may be difficult to transcribe

Focus groups
- combine elements of interviewing and participant observation
- explicitly use group dynamics to generate data and insights
- may be conducted in a room or through web-based discussion platforms
- may be used at both the formative and summative stages of an evaluation
- less costly than individual in-depth interviews

Observations
- gather firsthand data on interventions, processes, or behaviors
- occur in natural, unstructured, and flexible settings
- need qualified and highly trained observers
- may push some participants to behave differently
- may be prone to distortion due to the selective perception of the observer

Tests
- provide a means to assess a subject's knowledge and capacity to apply knowledge
- may be in selected-response or constructed-response formats
- may be interpreted based on a certain norm or criterion
- are criticized as fragmented, superficial and punitive
- provide objective information that can be scored in a straightforward manner
- may be distorted via coaching or cheating

Checklists
- use a standard list of action items, steps, or elements that the clientele should have demonstrated in completing a task, program or activity
- can be cheap and easy and cover a wide array of factors
- depth and breadth are limited

Document Studies
- use existing documents and secondary data
- useful in analyzing trends and patterns over time
- prone to doubts about their authenticity, completeness, and suitability
- time-consuming to analyze, and data may be difficult to access

Key Informant
- entails selection or invitation of participants based on their skills, background and involvement in the program
- provides an "insider" perspective concerning the issue evaluated
- prone to informants' biases and impressions
- requires observance of a professional relationship between evaluator and informants to avoid tainting the results

Case Studies
- provide a specific illustrative case or exemplar of the issue evaluated
- allow a thorough exploration of interactions between treatment and contextual factors
- require well-trained data collection and reporting teams
- may be exposed to excessive interpretation and generalization
Other evaluation techniques include cohort studies, social network analysis, self-completion questionnaires, feasibility studies, force field analysis, etc.

Educational evaluation may need both qualitative and quantitative methods because of the diversity of issues addressed. The choice of methods should fit the need for the evaluation, the availability of resources and time, and the capability of the staff. While every evaluator has his/her own preference, the dominant notion is that no one method is always best.

Activity 5: Word Search

Directions: Placed in this puzzle are terms related to evaluation methodologies. Can you find all of them?

L D M R H I S T O R I C A L I
F I X E X P E R I M E N T F E
Z B E N C H M A R K I N G O K
E V A L U A T I O N Q J Q C Q
Q U A L I T A T I V E V L U Z
F E A S I B I L I T Y L A S C
K S I S Y L A N A E S N V I O
L E C R O F Q L R O T C A F M
W E I V R E T N I I W B W V P
X O B S E R V A T I O N Y M E
P D L K H C R A E S E R H A T
U L Q A U Q T E U N Y Q V R I
O E B Y T I L A U Q M K S K T
R I C N V E S R U O C T J E O
G F F E Y S N T R O H O C T R

Activity 6: Pyramid

Directions: From the lessons you have learned, fill up the pyramid of thoughts below (IDEAS at the top, then CONCEPTS, then GENERALIZATIONS).

Lesson 4: The CIPP Evaluation Model

What is the CIPP Evaluation Model?

The CIPP (context, input, process, and product) evaluation model claims that evaluation is conducted to reach a well-founded decision. It does not assume a linear relationship among its components. This model can be used for both formative and summative kinds of evaluation activity. By alternately focusing on program context, inputs, process, and products, the CIPP model encompasses all phases of an educational program: planning, implementation and evaluation. The first three elements of the CIPP model are suitable for formative evaluation while the fourth element is ideal for summative studies. The components of the model are summarized in the model adapted from Stufflebeam (2003) below.

Context Evaluation: The context evaluation component of the model establishes the connection between the program goals and evaluation. The evaluator describes the environment and determines the needs of the program beneficiaries. The unmet needs, problems, issues and challenges are identified and evaluated.

Input Evaluation: The input evaluation component of the model determines how resources are utilized to achieve program objectives and goals. Data regarding the school's mission, goals, and plans are collected, leading to the assessment of the responsiveness of program strategies to the stakeholders' needs. A comparison to alternative strategies used in similar programs is also aimed at in this stage. The input evaluation complements the context evaluation.

Process Evaluation: The process evaluation component of the model reviews the program quality. It ascertains whether the program is implemented as planned. Program activities are monitored, documented and assessed. Feedback mechanisms and continuous quality improvement are of utmost concern at this stage.

Product Evaluation: The product evaluation component of the model measures the impact of the program on target beneficiaries. Evaluators assess the program's effectiveness and sustainability. As a summative component, decisions whether to continue, modify or terminate the program are established in this stage.

As a whole, the CIPP model looks at evaluation both in terms of processes and products in all the various phases of school program, project, intervention, curriculum, or initiative implementation. Outcomes and projected objectives are matched, and the discrepancies between them are considered as the basis for future plans and decisions.
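As an illustration only, the four CIPP components could be organized as a simple data structure when planning an evaluation. The component names and their formative/summative roles follow the module (after Stufflebeam, 2003); the guiding questions and the `CIPPComponent` class are hypothetical conveniences for this sketch.

```python
from dataclasses import dataclass, field

@dataclass
class CIPPComponent:
    """One component of a CIPP evaluation plan (illustrative structure)."""
    name: str               # Context, Input, Process, or Product
    purpose: str            # "formative" or "summative" use
    guiding_question: str   # assumed wording, not from the module
    findings: list = field(default_factory=list)

def build_cipp_plan():
    # Component order and purposes follow the module's description.
    return [
        CIPPComponent("Context", "formative", "What needs must the program address?"),
        CIPPComponent("Input", "formative", "How are resources used to meet the goals?"),
        CIPPComponent("Process", "formative", "Is the program implemented as planned?"),
        CIPPComponent("Product", "summative", "Did the program attain its intended outcomes?"),
    ]

plan = build_cipp_plan()
# The first three components support formative evaluation, per the module.
formative = [c.name for c in plan if c.purpose == "formative"]
print(formative)
```

An evaluator could append monitoring notes to each component's `findings` list as the cycle progresses, keeping formative evidence separate from the summative product judgment.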

Activity 7: Classify Them

Directions: Identify the tool that can be used in each level of evaluation activities.

Tools: Opinion polls, Interest inventories, Observation guides, Focus-group discussion, Personality inventories, Interview guides, Tracer studies, Rating scales, Anecdotal records, National test results

Classroom Level Evaluation | School System Level Evaluation

ASSESSMENT TASK
Choose the option that provides the correct response.

1. How is assessment related to a course’s learning objectives?


a. Assessment and learning objectives are essentially the same thing.
b. The learning objectives are based on the way students are assessed.
c. Teachers use assessment to ensure a course’s objectives are met.
d. They are not at all related to one another.

2. If a teacher gives an exam and everyone fails, what should he/she do?
a. Give the exam again.
b. Determine why students missed the questions they missed.
c. Make the exam easier.
d. Adjust his/her teaching style.


3. Feedback is important because …


a. It allows students to learn from their mistakes.
b. It makes the student feel good about themselves.
c. It explains the grade that was assigned.
d. Teachers are supposed to give their students feedback.

4. Which is NOT true about formative evaluation?


a. Its focus is program improvement.
b. It judges the worth of a program while in progress.
c. It is primarily diagnostic in nature.
d. It is concerned with the overall effectiveness of a program.

5. Which is NOT true about summative evaluation?

a. It is done at the completion of a program.
b. Gathered data determine the worth of the program.
c. It is generally high stakes.
d. It entails comparing against some benchmark.

6. When is a focus group preferable to an in-depth interview?

a. Peer pressure would inhibit responses and cloud results.
b. Subject matter is not so sensitive.
c. Group interaction is deemed nonproductive.
d. A greater volume of issues must be covered.

7. Which is a good tool for obtaining information when in-depth probing is not necessary?

a. observation c. case study
b. survey d. key informant

8. If the university was implementing a new online learning scheme this school
year, which might be regarded as stakeholders?
a. students and teachers c. IT support officers
b. staff development officers d. All of these

9. Which key question is addressed in the input evaluation stage?

a. What are the impediments to meeting necessary or useful needs?
b. How cost-effective is each identified approach?
c. Was the program running efficiently?


d. Were the intended outcomes of the program realized?

10. Which key question is addressed in the product evaluation stage?


a. What are the longer-term implications of the program outcomes?
b. Did participants accept and carry out their roles?
c. How feasible is each of the identified approaches?
d. What relevant opportunities exist?

ONLINE REFERENCE

https://pdfcoffee.com/assessment-of-learning-2-module-pdf-free.html

REFERENCES

American Evaluation Association. (2018). American Evaluation Association guiding principles for evaluators. Retrieved from https://www.eval.org/p/cm/ld/fid=51 on July 10, 2020.

Australian Conference on Science and Mathematics Education (ACSME). (2007). WEAS: A web-based educational assessment system. Retrieved from https://dl.acm.org/doi/pdf/10.1145/1233341.1233365 on July 10, 2020.


Boulmetis, J. & Dutwin, P. (2005). The ABC’s of evaluation: Timeless techniques for
program and project managers (2nd ed.). San Francisco: Jossey-Bass.

Cajigal, R. M. and Mantuano, M. D. (2014). Assessment of learning 2. Manila: Adriana


Publishing Co., Inc.

House, E. R. (1978). Assumptions underlying evaluation models. Educational Researcher, 7(3), 4-12.

Navarro, R. L. & Santos, R. (2013). Authentic assessment of student learning outcomes: Assessment of learning 2. Manila: Lorimar Publishing Inc.

National Science Foundation (2010). The 2010 user-friendly handbook for project evaluation. Retrieved from https://www.purdue.edu/research/docs/pdf/2010NSFuser-friendlyhandbookforprojectevaluation.pdf on July 10, 2020.

Patton MQ. (2002). Qualitative evaluation and research methods. Newbury Park (CA):
Sage Publishing Inc.
Powell Tate (2020). Evaluation approaches & types. Retrieved from http://toolkit.pellinstitute.org/evaluation-101/evaluation-approaches-types/ on July 10, 2020.

Reyes, E. and Dizon, E. (2015) Curriculum development. Manila: Adriana Publishing Co., Inc.

Spaulding, D.T. (2008). Program evaluation in practice: Core concepts and examples for
discussion and analysis. San Francisco, CA: Jossey-Bass.

Stufflebeam, D. L., & Webster, W. J. (1980). An analysis of alternative approaches to evaluation. Educational Evaluation and Policy Analysis, 2(3), 5-19. Retrieved from https://www.jstor.org/stable/1163593.
