Title: Evaluating SES Providers
Evaluating SES Providers
- Steven M. Ross
- Allison Potter
- Center for Research in Educational Policy
- The University of Memphis
- http://www.memphis.edu/crep
Determining Evaluation Measures
- Effectiveness: Increased student achievement in reading/language arts or mathematics.
- Customer satisfaction: Positive perceptions by parents of SES students.
- Service delivery and compliance: Positive perceptions by principals, teachers, LEA staff, etc.
Figure 1. Components of a Comprehensive SES Evaluation Plan
- Service Delivery and Compliance
  - District Coordinator Survey
- Customer Satisfaction
  - Principal/Liaison Survey
  - Provider Survey
  - Teacher Survey
  - Parent Survey
- Effectiveness (Student Achievement)
  - State Tests
  - Additional Tests
Effectiveness Measures
1. Student-level test scores from state-mandated assessments
- Considerations
  - Available only for certain grades (e.g., 3 and higher)
  - Lack of pretest scores prevents gains from being determined
Effectiveness Measures
2. Supplementary individualized assessments in reading/language arts or math
- Considerations
  - Without pretest scores and comparison students, SES gains cannot be determined
  - Validity may be suspect if assessments are not administered by trained independent testers
Effectiveness Measures
3. Provider-developed assessments in reading/language arts or math
- Considerations
  - Test results may not be valid or suitable for states' evaluation purposes
  - Tests may favor providers' strategies
Customer Satisfaction Measures
1. Parent and family perceptions
- Considerations
  - Parent respondents may not be representative of the population served by the provider
  - Sample sizes will vary with provider size
  - Comparisons are limited because parents are familiar with only one provider
Customer Satisfaction Measures
2. Student perceptions
- Considerations
  - Young students may have difficulty judging the quality of services and communicating their impressions
  - Time consuming and may require parent permission to obtain
Service Delivery and Compliance Measures
1. Records of services provided, student attendance rates, and costs
- Considerations
  - States may obtain data from a variety of sources, including providers, teachers, principals, and district staff
  - Corroborating data from multiple sources can increase the accuracy of evaluation conclusions
Service Delivery and Compliance Measures
2. Feedback from SES customers
- Considerations
  - First-hand impressions or observations may be lacking
  - Translation may be needed to reach parents who do not speak English
  - Obtaining representative samples may be difficult
Service Delivery and Compliance Measures
3. Feedback from district staff
- Considerations
  - Districts may lack first-hand impressions or observations of tutoring services
  - Some districts may also be SES providers
Service Delivery and Compliance Measures
4. Feedback from school staff
- Considerations
  - Teachers may also be SES instructors, or may lack first-hand impressions of providers
  - Teachers may need to provide information on multiple providers, which can be confusing and time consuming
  - Identifying teachers from whom to solicit responses may be difficult
Technology and Database Considerations
States will need to collect a large amount of data to evaluate SES providers, which may require a relational database that connects:
- Achievement data and related characteristics for all students who are eligible for SES
- Each student served by SES with a specific SES provider
- Details about the services offered by each SES provider
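The linkage described above can be sketched as a small relational schema. The example below uses Python's built-in sqlite3 module; all table names, columns, and records are hypothetical and not drawn from any actual state data system.

```python
# Minimal sketch of a relational database linking SES-eligible students,
# provider enrollment, and provider details. All names and values are
# illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE students (
    student_id INTEGER PRIMARY KEY,
    grade INTEGER,
    prior_score REAL,          -- achievement data
    ses_eligible INTEGER       -- 1 if eligible for SES
);
CREATE TABLE providers (
    provider_id INTEGER PRIMARY KEY,
    name TEXT,
    subject TEXT               -- services offered
);
CREATE TABLE enrollments (     -- links each served student to one provider
    student_id INTEGER REFERENCES students(student_id),
    provider_id INTEGER REFERENCES providers(provider_id),
    sessions_attended INTEGER
);
""")
cur.execute("INSERT INTO students VALUES (1, 4, 320.0, 1)")
cur.execute("INSERT INTO providers VALUES (10, 'Acme Tutoring', 'reading')")
cur.execute("INSERT INTO enrollments VALUES (1, 10, 24)")

# Join the three tables to report services received per student
row = cur.execute("""
    SELECT s.student_id, p.name, e.sessions_attended
    FROM students s JOIN enrollments e USING (student_id)
                    JOIN providers p USING (provider_id)
""").fetchone()
print(row)  # (1, 'Acme Tutoring', 24)
```

A single join like this is what makes per-provider achievement and service-delivery reporting possible once the three data sources are connected.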
Evaluation Designs: Student Achievement
A. Benchmark Comparison
- Rating: Low to Moderate rigor
- Percentage of SES students, by provider, attaining proficiency on the state assessment
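The benchmark measure reduces to a simple proportion per provider. A minimal sketch, using fabricated scores and a hypothetical proficiency cut score of 400:

```python
# Benchmark comparison: percentage of each provider's SES students scoring
# at or above the state proficiency cut. All values are hypothetical.
PROFICIENCY_CUT = 400

scores_by_provider = {
    "Provider A": [385, 410, 402, 450, 390],
    "Provider B": [420, 399, 431],
}

pct_proficient = {
    provider: 100 * sum(s >= PROFICIENCY_CUT for s in scores) / len(scores)
    for provider, scores in scores_by_provider.items()
}

for provider, pct in pct_proficient.items():
    print(f"{provider}: {pct:.0f}% proficient")
# Provider A: 60% proficient
# Provider B: 67% proficient
```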
Evaluation Designs: Student Achievement
A. Benchmark Comparison
- Upgrades
  - Percentage of SES students in all performance categories (Below Basic, Basic, etc.)
  - Comparison of performance relative to the prior year and to state norms
  - Comparison to a control sample
Evaluation Designs: Student Achievement
- Advantages
  - Inexpensive and less demanding
  - Easily understood by practitioners and the public
  - Linked directly to NCLB accountability
- Disadvantages
  - Doesn't control for student characteristics
  - Doesn't control for schools
  - Uses broad achievement indices
Evaluation Designs: Student Achievement
B. Multiple Linear Regression Design
- Rating: Moderate rigor
- Compares actual gains to predicted gains for students enrolled in SES, using district data to control for student variables (e.g., income, ethnicity, gender, ELL, special education status).
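The residual-gain logic behind this design can be sketched with ordinary least squares: fit a regression of gains on control variables for non-SES students, predict gains for SES students, and average the actual-minus-predicted difference. The example below uses fabricated data with a built-in SES effect of 4 points; the covariates and variable names are illustrative only, not a prescribed model.

```python
# Sketch of the regression design (Model B) with fabricated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical district data: pretest score and one demographic control
pretest = rng.normal(350, 40, n)
low_income = rng.integers(0, 2, n)
ses = rng.random(n) < 0.5  # hypothetical SES enrollment flag

# Fabricated gains with a built-in SES effect of 4 points
gain = 20 - 0.02 * pretest + 5 * low_income + 4 * ses + rng.normal(0, 8, n)

# Fit the prediction model on non-SES students only
X = np.column_stack([np.ones(n), pretest, low_income])
beta, *_ = np.linalg.lstsq(X[~ses], gain[~ses], rcond=None)

# Estimated effect: mean of actual minus predicted gains for SES students
predicted = X @ beta
effect = (gain[ses] - predicted[ses]).mean()
print(f"estimated SES effect: {effect:.2f} points (true effect: 4)")
```

With real district data, the covariate list would be far richer, and the estimate would still not control for school effects, as noted below.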
Evaluation Designs: Student Achievement
B. Multiple Linear Regression Design
- Advantages
  - More costly than the Benchmark design, but still relatively economical
  - Student characteristics are statistically controlled
- Disadvantages
  - Doesn't control for school effects
  - Less understandable to practitioners and the public
  - Effect sizes may be less stable than for Model C
Evaluation Designs: Student Achievement
C. Matched Samples Design
- Rating: High Moderate to Strong rigor
- Match and compare SES students to similar students attending the same school (or, if not feasible, a similar school)
- Use multiple matches if possible
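A toy sketch of the within-school matching step, using hypothetical records: each SES student is paired with the non-SES student in the same school whose pretest score is closest, and the effect is estimated as the mean gain difference across pairs. Real implementations would match on multiple covariates and use multiple matches where possible.

```python
# Matched samples design (Model C): nearest-neighbor matching on pretest
# score within the same school. All records are fabricated.
ses_students = [
    {"school": "A", "pretest": 300, "gain": 12},
    {"school": "A", "pretest": 340, "gain": 9},
    {"school": "B", "pretest": 310, "gain": 15},
]
comparison_pool = [
    {"school": "A", "pretest": 305, "gain": 7},
    {"school": "A", "pretest": 338, "gain": 8},
    {"school": "B", "pretest": 312, "gain": 10},
    {"school": "B", "pretest": 360, "gain": 6},
]

diffs = []
for s in ses_students:
    # restrict candidates to the same school, then take closest pretest
    same_school = [c for c in comparison_pool if c["school"] == s["school"]]
    match = min(same_school, key=lambda c: abs(c["pretest"] - s["pretest"]))
    diffs.append(s["gain"] - match["gain"])

effect = sum(diffs) / len(diffs)
print(f"mean SES-minus-match gain: {effect:.2f}")  # 3.67
```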
Evaluation Designs: Student Achievement
C. Matched Samples Design
- Advantages
  - Some control over school effects
  - Easily understood by practitioners and the public
  - Highest potential rigor of all designs
- Disadvantages
  - More costly and time consuming
  - Within-school matches may be difficult to achieve
Data Collection Tools
- Surveys for LEAs, principals/site coordinators, teachers, parents, and providers
- Common core set of questions for all groups to permit triangulation
- Response choices of "frequently," "occasionally," "not at all," and "don't know"
- Open-ended question: "Additional comments"
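A small illustration of how a common core of questions permits triangulation: the same item is tallied separately for each respondent group so the distributions can be compared side by side. The groups and responses below are fabricated.

```python
# Tally one common-core survey item across respondent groups.
# All responses are hypothetical.
from collections import Counter

responses = {
    "teachers":   ["frequently", "occasionally", "not at all", "frequently"],
    "parents":    ["frequently", "frequently", "don't know"],
    "principals": ["occasionally", "frequently"],
}

for group, answers in responses.items():
    counts = Counter(answers)
    total = len(answers)
    summary = ", ".join(f"{k}: {v}/{total}" for k, v in counts.items())
    print(f"{group:<11} {summary}")
```

Large disagreements between groups on the same item (e.g., providers reporting frequent communication while teachers report none) flag areas needing follow-up.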
Data Collection Tools
- Selected survey questions
  - What was the start date of provider services?
  - In which subjects did your students receive services from this provider?
  - Are you employed by the provider for which you are completing this survey?
  - How often does the provider...
    - Communicate with you during the school year?
    - Meet the obligations for conducting tutoring sessions?
Data Collection Tools
- Selected survey questions
  - The provider...
    - Adapted the tutoring services to this school's curriculum
    - Aligned their services with state and local standards
    - Offered services to Special Education and ESL students
    - Complied with applicable federal, state, and local laws
Data Collection Tools
- Selected survey questions
  - Overall assessment
    - I believe the services offered by this provider positively impacted student achievement
    - Overall, I am satisfied with the services of this provider
Data Collection Tools
Sample questionnaire responses to the open-ended question (teachers):
- "The program began much too late in the school year (after testing) to impact learning this year. I have never spoken to the instructors. I have no knowledge as to the structure of the program."
- "The provider never called his classroom teacher, never looked at student records, or coordinated efforts until finally his classroom teacher got through and spoke of learning problems."
- "I saw great gains with the kids who were served by this provider; they benefited from this program."
Data Collection Tools
- Provider survey: selected questions
  - Describe the format of your services
    - Program duration
    - Setting
    - Format (small groups, individual)
  - What is your general instructional plan?
  - Describe qualifications of tutors (including data on background checks)
  - List information regarding students served, goals achieved, and tutoring sessions attended
Rubric for Overall Evaluation of Provider Effectiveness
Decision Tree for SES Providers
(Figure showing decision stages, including "Probation I")
CONCLUSION
- SES evaluation models that are both suitably rigorous and practical for states to employ are still evolving.
- Each state has unique needs, priorities, access to resources, and procedures for implementing SES.
- States may face a trade-off between practicality concerns (cost and time) and rigor (the reliability and accuracy of findings).
CONCLUSION
- Each state should begin its SES evaluation planning process by identifying
  a) the specific questions that its SES evaluation needs to answer, and
  b) the resources that can reasonably be allocated to support further evaluation planning, data collection, analysis, reporting, and dissemination.
CONCLUSION
- Work through the hierarchy of evaluation designs presented here and select the design that allows the highest level of rigor.
- States may wish to engage third-party evaluation experts to help plan and conduct these evaluations.