Title: Impacts

Transcript and Presenter's Notes
1
CONDUCTING PROGRAM EVALUATIONS FOR FEDERAL PROGRAMS
Brooke Blair, ALSDE; Mark Ward, ALSDE; Erin McCann,
SEDL; Mary Lou Meadows, SEDL
2
Where is Home?
3
Session Objectives
  • Participants will:
  • Increase their understanding of the connection
    between program evaluation, the Federal Programs
    Monitoring document, and the eGAP Consolidated
    Application.
  • Increase their understanding of the differences
    among immediate, short-term, intermediate, and
    long-term outcomes.
  • Increase their knowledge of indicators and
    performance measures for reporting the
    effectiveness of actions using short-term
    and intermediate outcomes.

4
Needs Assessment Data
  • One measure, by itself, gives some useful
    information . . . but
  • Comprehensive measures used together and over
    time provide much richer information.
  • Together, these measures can provide a powerful
    picture that can help us understand the school's
    impact on student achievement.
  • These measures, when used together, give schools
    the information they need to get the results they
    want.

5
Bernhardt's Model of Data Categories
  • Demographics
  • School Processes
  • Perceptions
  • Student Learning

Bernhardt, V. (2004). Data analysis for continuous
school improvement (2nd ed.). Larchmont, NY: Eye
on Education.
6
Bernhardt's Model of Data Categories: Demographics
Examples: Enrollment, Attendance, Drop-out Rate,
Ethnicity, Gender, Grade Level, Language
Proficiency
7
Bernhardt's Model of Data Categories: Perceptions
Examples: Perceptions of learning environment,
Values and beliefs, Attitudes, Observations
8
Bernhardt's Model of Data Categories: Student
Learning
Examples: Norm-referenced tests,
Criterion-referenced tests, Teacher observations
9
Bernhardt's Model of Data Categories: School
Processes
Examples: Scheduling, Common Planning Time,
Special Services Referrals, School Policies
10
Time
Why do you think time would be an important
variable in data collection?
11
Compliance Assistance Review Document
  • Examples of Programs Requiring a Needs
    Assessment
  • Title I
  • Title II
  • Title III
  • McKinney-Vento
  • Neglected/Delinquent

12
Data Quality
13
No Child Left Behind Act of 2001, Title I: Best
Use of Funds
  • SEC. 1001. Statement of purpose
  • (4) holding schools, LEAs accountable for
    improving the academic achievement of all
    students, and identifying and turning around
    low-performing schools that have failed to
    provide a high-quality education to their
    students, while providing alternatives to
    students in such schools to enable the students
    to receive a high-quality education
  • (5) distributing and targeting resources
    sufficiently to make a difference to LEAs and
    schools where needs are the greatest

(Title I, Improving the Academic Achievement of
the Disadvantaged)
14
Key Considerations forProgram Evaluation
  • The types of data used to determine success.
  • The activities that are associated with success.
  • How the results are being used to drive future
    improvement efforts.
  • How you are prioritizing needs to make the
    greatest impact.
  • AND whether you are achieving the desired
    outcomes.

15-18
(No Transcript)
19
Outcomes/Impacts
  • Immediate
  • Short-Term
  • Intermediate
  • Long-Term

Adapted from Innovation Network, Inc., Logic
Model Workbook, www.innonet.org
20
IMMEDIATE IMPACTS
  • Direct results of an activity
  • Number of participants who attended a workshop
  • Number of students attending a tutoring program
  • Number of materials provided
  • Web site designed and activated
  • Policy manual written and approved
  • Position descriptions developed
  • Job positions filled

21
SHORT-TERM IMPACTS
  • Changes in Learning as a result of an activity
  • New knowledge
  • New skills
  • Changed attitudes, opinions, or values
  • Changed motivation
  • Changed aspirations

22
INTERMEDIATE IMPACTS
  • Changes in Action as a result of gains in
    learning
  • Modified behavior
  • Changed practice
  • Changed decisions
  • Changed policies

23
LONG-TERM IMPACTS
  • Changes in Condition as a result of actions
    taken
  • Human
  • Economic
  • Civic/Community
  • Environment

24
Strategy and Action Steps
  • Strategy: Provide supplemental reading/literacy
    instruction for students identified as at risk.
  • Actions:
  • Purchase Read with Ease (a computer-assisted
    learning program).
  • Hire lab instructors, or reallocate teacher time
    to allow for time to work in the lab with at-risk
    students.
  • Schedule lab hours for at-risk students before
    and after school.
  • Train lab instructors in the use of Read with
    Ease.
  • Lab instructors provide support to at-risk
    students in the computer reading lab.

25
Strategy Action Steps and Types of Expected
Outcomes/Impacts
  • Purchase Read with Ease -- Immediate
  • Hire lab instructors -- Immediate
  • Schedule lab hours -- Immediate
  • Train lab instructors -- Immediate, Short,
    Intermediate
  • Lab instructors provide support -- Short,
    Intermediate, Long
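For a district tracking these mappings electronically, the pairing above can be sketched as a small lookup structure. This is a minimal, hypothetical Python sketch (the function and variable names are the author's illustration, not part of the presentation):

```python
# Hypothetical sketch: map each action step of the reading/literacy
# strategy to the outcome/impact types it is expected to produce.
# The categories follow the logic-model taxonomy used in these slides.

EXPECTED_IMPACTS = {
    "Purchase Read with Ease": ["Immediate"],
    "Hire lab instructors": ["Immediate"],
    "Schedule lab hours": ["Immediate"],
    "Train lab instructors": ["Immediate", "Short-Term", "Intermediate"],
    "Lab instructors provide support": ["Short-Term", "Intermediate", "Long-Term"],
}

def actions_expecting(impact_type):
    """Return the action steps expected to yield a given impact type."""
    return [action for action, impacts in EXPECTED_IMPACTS.items()
            if impact_type in impacts]

print(actions_expecting("Intermediate"))
# → ['Train lab instructors', 'Lab instructors provide support']
```

A structure like this makes it easy to answer the evaluation question in reverse: given a desired intermediate or long-term outcome, which funded actions are supposed to produce it, and is evidence being collected for each?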
26
Evidence of Outcome/Impact: Performance Measures
27
Evidence of Outcome/Impact: Performance Measures
28
Strategy and Action Steps
  • Strategy: Provide school-based reading/literacy
    professional development for administrators,
    teachers, and other instructional staff.
  • Actions:
  • Hire a reading coach to facilitate ongoing
    reading/literacy professional development at the
    school.
  • Reading coach and principal meet weekly to
    discuss reading/literacy issues related to
    students and teachers.
  • Instructional staff meet weekly for one hour on
    reading/literacy instruction.
  • Reading coach assists instructional staff in
    meetings and in the implementation of new
    reading/literacy strategies.

29-30
(No Transcript)
31
Measuring Impacts: Performance Measurements
  • Surveys, interviews, focus groups
  • teachers, administrators, coaches/mentors,
    students, parents, community
  • Pre-post tests of knowledge/skill
  • professional development participants, teachers,
    students
  • Observations
  • of teachers, administrators, coaches, students
  • Document/records reviews
  • participation/attendance records, lesson plans,
    journals/logs, student homework/projects, class
    grades, performance on benchmark and standardized
    tests

32-35
(No Transcript)