1
Comprehensive School Reform Program Evaluation
  • Mary Church
  • CSRD Technical Assistance Workshop
  • April 17, 2000

2
What is Program Evaluation?
  • The systematic collection of information about
    the activities, characteristics, and/or outcomes
    of programs to make judgments about the program,
    improve program effectiveness, and/or inform
    decisions about future programming
  • - Michael Quinn Patton, Utilization-Focused
    Evaluation

3
Program Planning and Evaluation Model
4
Stage 1: Problem Definition
  • Begin with a problem statement
  • The problem statement addresses the following
    questions:
  • What is the current situation?
  • What is its impact?
  • What is the impact of correcting the problem?
  • Is addressing the problem worthwhile?
  • Is addressing the problem doable?

5
Stage 2: Define Solution
  • Generate promising solutions
  • Determine root cause(s) of problem
  • Fishbone Diagram
  • Select one solution

6
Stage 3: Define Objectives
  • Define Goal(s)
  • Define Objectives
  • Specific
  • Measurable
  • Guide program planning

7
Stage 4: Plan Program
  • Before beginning the evaluation, it is important
    to have a clear understanding of how the program
    works. The first step in the evaluation process
    is developing a program logic model.
  • Program Logic Model: a theoretical picture of how
    a program operates. It describes, logically and
    step by step, how a program is intended to work
    to achieve its objectives.
  • Inputs: program resources
  • Activities: program activities
  • Outputs: program products

8
Program Logic Model
9
CSR Program Logic Model
  • Inputs: model program expenditures, teacher time,
    student time
  • Activities: training of teachers, teaching of
    students, materials
  • Outputs: number of trainings to teachers, number
    of classes taught, number of students taught
  • Outcomes: increased academic performance
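As a minimal illustration, the logic model above can be recorded as a plain data structure; the Python below is a hypothetical sketch using the slide's own labels, not part of the CSR materials.

    # Hypothetical sketch: the CSR logic model as a Python dict.
    csr_logic_model = {
        "inputs": ["model program expenditures", "teacher time", "student time"],
        "activities": ["training of teachers", "teaching of students", "materials"],
        "outputs": ["number of trainings to teachers",
                    "number of classes taught",
                    "number of students taught"],
        "outcomes": ["increased academic performance"],
    }

    # Print the model stage by stage, in logic-model order.
    for stage, items in csr_logic_model.items():
        print(f"{stage.upper()}: {', '.join(items)}")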
10
Stage 5: Design Evaluation
  • Program evaluations address two basic questions:
  • Formative: Is the program being implemented as
    planned?
  • Summative: Are the efforts producing the intended
    outcomes?

11
Program Logic Model
  • INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES
  • Formative evaluation: inputs and activities
  • Summative evaluation: outputs and outcomes
12
Designing the Evaluation
  • 1. WHAT do we want to know about the program?
    e.g., list five things that you'd like to know,
    things you aren't certain about, that would make
    a difference in the operation of your program.
  • 2. WHO has this information? e.g., program
    clients, staff, other agencies.
  • 3. WHEN will information be collected? e.g.,
    when entering the program, at completion of the
    program, and from whom?
  • 4. HOW can we get the information? e.g.,
    surveys, interviews, observation, focus groups.

13
Data Collection Methods
  • Document Review
  • Assessment of Student Achievement
  • Interview
  • Questionnaire Survey
  • Focus Group
  • Observation

14
Document Review
  • Definition: Examination of pre-existing documents
    and data sources
  • Focus
  • Nature and level of school reform activities
  • Incidence of events of interest
  • Existing student achievement information
  • Advantages
  • Data already exist
  • Low cost
  • Typically unobtrusive
  • Relatively unbiased
  • Disadvantages
  • Lack of quality control
  • Validity and reliability may be unknown
  • Can be limited in scope

15
Assessment
  • Definition: Test of student performance
  • Focus
  • Student performance in cognitive and affective
    domains
  • Advantages
  • Objective data often with known reliability and
    validity
  • Can be low cost (standardized testing)
  • Provides a generally accepted portrayal of
    schooling outcomes
  • Can include large samples of students
  • Disadvantages
  • May provide a limited and narrow picture of
    student performance
  • Can be high cost (performance-based assessments)
  • May need careful sampling

16
Interview
  • Definition: One person asks questions of another.
    May be conducted face-to-face or by telephone.
  • Focus
  • School staff/parent/student perceptions
  • School staff/parent satisfaction
  • Improvement suggestions
  • Degree of implementation
  • Anticipated and unanticipated outcomes
  • Advantages
  • In-depth information
  • Quality control
  • High response rate
  • Opportunity to probe
  • Disadvantages
  • Relatively costly
  • Needs trained data collectors
  • Data can be biased
  • May require careful sampling

17
Questionnaire Survey
  • Definition: A document containing questions and
    other types of items designed to solicit
    information appropriate for analysis.
  • Focus
  • School staff/parent/student perceptions
  • School staff/parent/student satisfaction
  • Improvement suggestions
  • Degree of implementation
  • Anticipated and unanticipated outcomes
  • Advantages
  • Relatively low cost
  • Can include structured and open-ended information
  • Relative ease of administration
  • Can cover a large number of respondents
  • Disadvantages
  • Response rate often a problem
  • Needs careful sampling
  • Data can be biased
  • Open-ended data may be difficult to analyze

18
Focus Group
  • Definition: Convening of 6-12 persons to discuss
    and provide data on a particular issue.
  • Focus
  • School staff/parent/student perceptions
  • School staff/parent satisfaction
  • Implementation issues
  • Improvement suggestions
  • Degree of implementation
  • Anticipated and unanticipated outcomes
  • Advantages
  • In-depth information on program implementation
    and outcomes
  • Relatively free of response rate problems
  • Interactive discussion among stakeholders
  • Disadvantages
  • Relatively high cost
  • Needs trained facilitators
  • May be difficult to achieve appropriate
    representation in recruitment of participants
  • Group dynamics can bias discussion

19
Observation
  • Definition: Observation and recording of events
    and processes.
  • Focus
  • Program implementation
  • Classroom activities
  • School climate
  • Instructional practices
  • Advantages
  • Can provide contextual information
  • Can increase objectivity and authenticity of data
  • Disadvantages
  • Needs trained observers
  • Relatively high cost
  • Can be obtrusive
  • May not reflect typical reality
  • Often just a snapshot of program implementation

20
Stage 6: Implement Program
  • Gain organizational commitment
  • Execute plan
  • inputs
  • activities
  • outputs

21
Stage 7: Conduct Evaluation
  • Collect data on program implementation and outcomes

22
Collecting Data
  • Design a chart that summarizes the evaluation,
    showing each measure to be used, when it will be
    given, to whom it will be given, and who will
    collect the data (a minimal sketch follows this
    list).
  • For each measure used, develop instructions and
    procedures.
  • Appoint an individual as data coordinator to
    monitor and track the data collection process.
  • Provide adequate training to data collector(s).
  • Provide interim reports that include the results
    of data collection processes.
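The sketch below shows one way to lay out the summary chart described in the first item; it is a self-contained Python illustration, and the measures, schedules, and role names are invented placeholders rather than CSR requirements.

    # Hypothetical sketch: an evaluation data-collection summary chart.
    # Each row: (measure, when given, given to, who collects the data).
    evaluation_plan = [
        ("Teacher questionnaire", "Fall and spring", "All teachers", "Data coordinator"),
        ("Classroom observation", "Monthly", "Sample of classes", "Trained observers"),
        ("Standardized test", "Spring", "All students", "Test coordinator"),
    ]

    header = ("Measure", "When given", "Given to", "Data collector")
    rows = [header] + evaluation_plan

    # Compute column widths so the chart prints as an aligned table.
    widths = [max(len(row[i]) for row in rows) for i in range(len(header))]
    for row in rows:
        print("  ".join(cell.ljust(w) for cell, w in zip(row, widths)))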

23
Stage 8: Report Findings
  • Start with the original purpose of the
    evaluation. For example, when reporting the
    results, always start with a review of your
    evaluation questions. This will help organize
    your analysis.
  • If you want to improve your program by
    identifying its strengths and weaknesses, you can
    organize data into program strengths, weaknesses,
    and suggestions for improvement.
  • If you want to fully understand how your program
    works, you can organize data in the chronological
    order in which clients go through the program.
  • If you focus on determining your program's
    outcomes, you can categorize data by each outcome.

24
  • Data tables
             1st Qtr   2nd Qtr   3rd Qtr   4th Qtr
    East        20.4      27.4      90        20.4
    West        30.6      38.6      34.6      31.6
    North       45.9      46.9      45        43.9
  • Bar charts
  • Pie charts

Descriptive charts are useful
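As a minimal sketch of turning the sample table above into a bar chart, the Python below assumes the matplotlib and numpy libraries are available; it is an illustration, not part of the original deck.

    # Hypothetical sketch: grouped bar chart of the quarterly sample data.
    import matplotlib.pyplot as plt
    import numpy as np

    quarters = ["1st Qtr", "2nd Qtr", "3rd Qtr", "4th Qtr"]
    data = {
        "East":  [20.4, 27.4, 90.0, 20.4],
        "West":  [30.6, 38.6, 34.6, 31.6],
        "North": [45.9, 46.9, 45.0, 43.9],
    }

    x = np.arange(len(quarters))   # one slot per quarter
    width = 0.25                   # bar width within each slot
    for i, (region, values) in enumerate(data.items()):
        plt.bar(x + i * width, values, width, label=region)

    plt.xticks(x + width, quarters)  # center labels under the middle bars
    plt.ylabel("Value")
    plt.title("Sample data by region and quarter")
    plt.legend()
    plt.show()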
25
Stage 9: Use Findings
  • External use
  • validate or justify the program to outsiders
  • be accountable to funders, board members,
    policymakers, public
  • retain/increase program funding

26
  • Internal use
  • expand successful services, modify unsuccessful
    ones
  • use for future planning or program replication
  • understand, verify, or increase the impact of
    services on customers
  • document with data what you think is happening
  • use results as a public relations tool with
    customers or the community
  • increase staff pride through documentation of
    accomplishments and quality
  • provide an opportunity for staff to step back and
    reflect on what the program is all about

27
Questions?