Title: Evaluation Design and Models
1. Evaluation Design and Models
- Presented by
- Cynthia Eschenburg
- Jennifer Hofmann
- Hsin-Ju Hsu
- Jim Mowery
2. What is the Purpose of Evaluation?
- To appraise quality.
- Through a systematic examination.
- By focusing on a specific issue.
- To provide a formal report.
- To provide the information needed to make a decision.
3. How is Evaluation Similar to Research?
- Both engage in disciplined inquiry
- Both use measurement devices
- Both analyze their data systematically
- Both describe their endeavor in formal reports
- Both rely on a technical set of tools
- It would often be impossible to differentiate between them according to their activities.
4. How does Evaluation Differ from Research? Part 1 of 4
- Researchers want to draw conclusions.
- Evaluators are more interested in decisions.
- Researchers are interested in understanding phenomena, often for no other purpose than to understand them better.
- Evaluators want to understand phenomena better in order to guide someone's actions.
5. How does Evaluation Differ from Research? Part 2 of 4
- An ideal research investigation yields findings that can be generalized to a wide variety of comparable situations.
- Evaluation, on the contrary, is typically focused on a particular educational program.
- Evaluators seek to determine how worthwhile a program is in order to facilitate a decision. Researchers search for scientific truth without any desire to attach estimates of worth to their findings.
6. How does Evaluation Differ from Research? Part 3 of 4
- The necessity to attach quality estimates cannot be escaped. These estimates are usually couched as comparisons, since comparisons facilitate decisions.
- The value ingredient is not requisite in research.
- Evaluations enable educators to make better decisions.
- Research describes.
7. Salient Differences Between Evaluation and Research
8. Definition of Formative Evaluation (Popham)
- Formative evaluation refers to appraisals of quality focused on instructional programs still capable of being modified.
- Formative evaluators attempt to appraise programs in order to inform the program developers how to ameliorate deficiencies in their instruction.
9. Formative (Process) Evaluation
- Provides information for program improvement, modification, documentation, and management
- Intent is to strengthen a program by providing feedback on its implementation, progress, and success
- Information is collected early in the program so that changes can be made to enhance program effectiveness, rather than waiting until the program is over
- Useful for all programs, but especially appropriate for those lasting several years
- This type also describes the program in sufficient detail so that others may adapt it to their own situations
Source: http://www.feraonline.com/typeeval.html#t3, October 8, 2002
10. Evidence: Formative Evaluation
- Utilization
- Access
- Student competency in technology use
- Teacher Observations
- Student Feedback
- Quality of Curriculum
11. Formative Evaluation
- Use of Evidence
- Ensure students are utilizing, and have equal access to, the technology (see the sketch after this list)
- Provide additional instruction in using the technology if needed
- Evaluate feedback for program improvement(s)
- Make adjustments to the curriculum if it is not correctly aligned
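As a minimal sketch of how the utilization and access evidence above might be tabulated, the Python below summarizes module usage by student group; the log records, group labels, and module names are hypothetical.

```python
# Hypothetical sketch: tabulating formative-evaluation evidence
# (utilization and equal access) from invented usage-log records.
from collections import defaultdict

# Assumed record format: (student_group, module, sessions) -- illustrative only.
usage_log = [
    ("ESL", "unit_01", 3), ("non-ESL", "unit_01", 5),
    ("ESL", "unit_02", 1), ("non-ESL", "unit_02", 4),
]

sessions_by_group = defaultdict(int)
sessions_by_module = defaultdict(int)
for group, module, sessions in usage_log:
    sessions_by_group[group] += sessions
    sessions_by_module[module] += sessions

# Large gaps between groups would prompt the follow-up actions listed above,
# such as additional instruction or improved access.
print("Sessions by group:", dict(sessions_by_group))
print("Sessions by module:", dict(sessions_by_module))
```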
12. Define the Summative Role of Evaluation
- Summative evaluation refers to appraisals of quality focused on completed instructional programs.
- Summative evaluations look at programs in terms of overall success. In other words, the review is made after the final program has been rolled out and participants have had a chance to apply what they learned.
13. What Evidence Would We Seek for a Summative Evaluation?
- Usage. Is the website being used? Which modules? How often? Are the people using the modules getting better grades?
- Changes in performance. Is there any difference in performance this year versus previous years, when the website wasn't available?
- Audience differences. Are there differences between English-speaking and non-English-speaking students? Differences between income groups?
- Comparison to test group. Examine the differences between the classes with access to the supplemental information and classes without it (see the sketch after this list).
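As a minimal sketch of the "comparison to test group" evidence, the Python below compares mean scores for classes with and without website access; the score lists are hypothetical and only the standard library is assumed.

```python
# Hypothetical sketch: rough two-sample comparison of final scores for
# classes with and without access to the supplemental website.
from math import sqrt
from statistics import mean, stdev

with_access = [78, 85, 92, 88, 81, 90]     # invented scores
without_access = [72, 80, 75, 84, 70, 77]  # invented scores

diff = mean(with_access) - mean(without_access)

# Welch-style standard error of the difference in means.
se = sqrt(stdev(with_access) ** 2 / len(with_access)
          + stdev(without_access) ** 2 / len(without_access))

print(f"Mean difference: {diff:.1f} points (approx. standard error {se:.1f})")
# A real summative evaluation would follow up with a formal significance
# test (e.g. a two-sample t-test) before drawing conclusions.
```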
14. Using the research scenario, contrast the evaluator's and researcher's goals.
- The evaluator would assess the value of integrating technology into the science curriculum as opposed to maintaining the current system.
- The researcher would assess the value of integrating technology into a science curriculum.
15. Management-oriented Evaluation
Hsin-Ju Hsu, October 2002
16. Daniel Stufflebeam's CIPP Model (1987)
- CIPP's classic evaluation procedures have influenced many evaluations to this day.
- Four components:
- Context: a human-needs-related problem
- Input: program development
- Process: quality-control monitoring
- Product: measurement of effectiveness
17. Steps for the CIPP Model
- Focus the evaluation
- Identify decision-makers and the decisions to be made as a result of the evaluation.
- Collect the information
- What sources of information, instruments, methods, and sampling procedures can be used?
- Organize the information
- What format will the collected information adhere to?
18. Steps for the CIPP Model (cont.)
- Analysis of information
- What data analysis techniques can be used?
- Reporting of information
- Who is the audience? How and when should the final report be made?
- Administer the evaluation
- After all decisions about the evaluation have been made in collaboration with decision-makers, conduct the evaluation according to the plan (a checklist sketch follows this list).
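As a minimal sketch, the Python below arranges the six CIPP steps above into a checklist structure an evaluator might fill in; the class name, fields, and guiding questions are paraphrases of the slides, not part of the CIPP literature.

```python
# Hypothetical sketch: the six CIPP steps as a fill-in-as-you-go checklist.
from dataclasses import dataclass, field

@dataclass
class CIPPStep:
    name: str
    guiding_question: str
    findings: list = field(default_factory=list)  # filled in as the evaluation runs

evaluation_plan = [
    CIPPStep("Focus the evaluation", "Who are the decision-makers, and what decisions follow?"),
    CIPPStep("Collect the information", "Which sources, instruments, methods, and samples?"),
    CIPPStep("Organize the information", "What format will the collected information adhere to?"),
    CIPPStep("Analyze the information", "Which data analysis techniques apply?"),
    CIPPStep("Report the information", "Who is the audience, and how and when is it reported?"),
    CIPPStep("Administer the evaluation", "Is the evaluation running according to the plan?"),
]

for step in evaluation_plan:
    print(f"{step.name}: {step.guiding_question}")
```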
19. Case Study: F2F Research Scenario
- Purpose
- Coordinate a new program to evaluate the needs of integrating technology into the science curriculum at the assigned elementary school, which serves students from different ethnic groups, from middle-class and impoverished families, and ESL students for whom English is not the primary home language.
- Method
- Use the CIPP evaluation model to evaluate the program.
20. Case Study: F2F Research Scenario (cont.)
- CIPP Four Components
- Context Evaluation
- Use Internet-based lesson supplements to enhance academic achievement in science.
- Consider socioeconomic status, ethnicity, and English as a Second Language (ESL) status.
- Compare present, probable, and possible outputs.
- Input Evaluation
- Evaluate how to employ resources to achieve the program objective.
- Evaluate strategies and resources.
- Homepage, science units (10), URLs (5).
- Access and opportunity to complete units.
21. Case Study: F2F Research Scenario (cont.)
- CIPP Four Components (cont.)
- Process Evaluation
- IT staff design a home page so that students can access the course website from home.
- Middle-class students can access the Internet at home; impoverished students can use a computer at school if they don't have one at home.
- Product Evaluation
- All students can learn science and computer skills through Internet-based lessons.
- Evaluate the supplemental use of technology as it relates to academic achievement across student populations.
22. Case Study: F2F Research Scenario (cont.)
- CIPP Steps
- Focus the evaluation
- Conduct a program for a technology-based science curriculum. IT staff build a web page that students can access from anywhere.
- Collect the information
- Integrate Internet resources, web-based lessons, media, and technology instruments into the science curriculum.
- Organize the information
- Include instructional design, curriculum materials, and student performance.
23. Case Study: F2F Research Scenario (cont.)
- CIPP Steps (cont.)
- Analysis of information
- Analyze student performance by grading assignments or portfolios (see the sketch after this list).
- Reporting the information
- Write a semester report and announce the results of the program's evaluation to the principal or committee.
- Administer the evaluation
- Identify whether the use of technology can improve student achievement.
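As a minimal sketch of the analysis and reporting steps above, the Python below summarizes graded-assignment scores across the student populations named in the scenario; all records and group labels are hypothetical.

```python
# Hypothetical sketch: group-mean score summary for the semester report.
from collections import defaultdict
from statistics import mean

# Assumed record format: (esl_status, income_group, score) -- invented data.
records = [
    ("ESL", "impoverished", 74), ("ESL", "middle-class", 80),
    ("non-ESL", "impoverished", 78), ("non-ESL", "middle-class", 85),
    ("ESL", "impoverished", 70), ("non-ESL", "middle-class", 88),
]

scores = defaultdict(list)
for esl, income, score in records:
    scores[(esl, income)].append(score)

# Group means indicate whether technology use is lifting achievement
# evenly across the populations the program targets.
for (esl, income), values in sorted(scores.items()):
    print(f"{esl:8s} {income:14s} mean score: {mean(values):.1f}")
```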