Title: Statewide Educational Accountability Under NCLB
1. Statewide Educational Accountability Under NCLB
- A Discussion of Selected System Design Variables for Presentation to the CCSSO Workshop on AYP Implementation
- William J. Erpenbach
- St. Louis, Missouri
- September 11, 2003
2. Statewide Educational Accountability Under NCLB
- Based on the recently published paper, Central Issues Arising from an Examination of State Accountability Workbooks and U.S. Department of Education Reviews Under the NCLB Act of 2001.
- In collaboration with:
- Ellen Forte-Fast, President and CEO, edCount, Inc.
- Abigail Potts, CCSSO
3. State Accountability Workbooks and Plans
- A brief examination of some of the approval decisions and how they played out against:
- Components of an Integrated System of Accountability.
- Design variables identified in Chapter 1 of CCSSO's December 2002 publication, Making Valid and Reliable Decisions in Determining Adequate Yearly Progress.
- ED's Accountability Peer Review Guidance.
- Conclusions presented are based on ED's approvals and information gleaned from States.
4. Principles of Accountability and ED's Peer Review Guidance
- Adequate Yearly Progress Model and Method
- Calculation methods; annual decisions; 37 cells; subgroup accountability based primarily on academic assessments.
- Other Academic Indicators
- Graduation rate; another indicator.
- Inclusion and Participation Rates
- The Full State Accountability System
- All schools; all students; rewards and sanctions; report cards.
- Reliability and Validity of the System
- Valid and reliable decisions.
- See Appendix B.
5. Framework for Integrated Systems of Educational Accountability
- Academic Content and Student Achievement Standards
- Aligned Student Assessments (multiple measures of student achievement)
- Integrated School, District, and State Educational Accountability System
- Standards-Based Decision-Making
- Related Professional Development
- Public Reporting of Results
6. Backdrop to the State Reviews and Decisions
- Sequence
- Early 12/02: Final Accountability Regulations.
- Seven States (CO, IN, LA, MA, MS, NY, OH)
invited by ED to submit Workbooks early.
- Mid 12/02: CCSSO's AYP Publication.
- State Meetings with ED Officials begin 12/02.
- Late 12/02: ED releases Workbooks to States.
- Early 01/03: CCSSO Workshop for States (1st 5 approvals announced just prior).
- State Accountability Workbooks due to ED 01/31/03.
- ED's Decision Process
- NCLB Enacted 01/02.
- Subsequent Regulations.
- Internal unpublished policy papers/decisions.
- Peer Reviews: Process and Reports to ED.
- ED negotiates with SEAs.
- Hickok letters to SEAs.
- ED Decisions Announced.
7. Things to Keep in Mind
- The paper is not intended to be an analysis of the peer review process or the decisions rendered by ED.
- Hopefully, ED will release a summary of its own.
- The paper is not intended to aid and abet gaming of the accountability system requirements.
- It is often difficult to know all of what a State has designed into its system. State Workbooks did not necessarily always address all of the elements, and ED didn't necessarily always pursue answers in these cases.
- Unfortunately, validity and reliability questions did not receive much attention in Peer Reviews and ED's decisions. (See pp. 45-46, 59 references in Title I, and 7 of 19 questions in the Peer Report.)
- Before considering changes in your State's model, ask yourself about the technical, policy, political, and practical effects of ED's various decisions on your State's AYP model.
8. Things Learned Along the Way
- Six months and 52 State Reviews later: lessons learned.
- Don't try to read the law literally!
- There is no use in trying to be logical or rational!
- Like making sausage, policy decision-making is sometimes not a pretty process!
- Don't expect anything to happen in a timely or sequential fashion. It won't!
- A sense of humor always helps!
- The law doesn't always mean what it seems to say!
- You're likely to find new law in the regulations!
- Disclosure will almost always cause problems.
- Just when you think you've seen the final surprise, there will be another one real soon!
- "Approved" doesn't necessarily mean that!
- Today's answer might change tomorrow (don't bank on it)!
9. Interesting State Strategies
- Use of confidence intervals of at least 95% for every indicator, including safe harbor and "count" items (see the sketch after this list).
- Minimum n's, including a higher minimum n for the SWD subgroup and other subgroups (p. 23).
- Rigorous FAY definition (pp. 28-29).
- Use of an index for percent proficient (pp. 15-17).
- Defining exit criteria for the LEP or SWD subgroups, extending the time exited students may be included in these subgroups for AYP determinations (pp. 34-37).
- AYP Trajectories (AMOs and IGs).
- Small Schools (p. 23).
- Not rolling up data over multiple years to make subgroup determinations.
- Use of progress on other academic indicators rather than specific targets.
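- To make the confidence-interval strategy above concrete, here is a minimal sketch, assuming the common approach of building a 95% confidence interval around a group's observed percent proficient and crediting the group if the interval reaches the annual measurable objective (AMO). The function name, the normal approximation, and the one-sided comparison are illustrative assumptions, not any State's approved method.

```python
import math

def meets_amo_with_ci(n_proficient: int, n_tested: int, amo: float,
                      z: float = 1.96) -> bool:
    """Illustrative check: does a 95% confidence interval around the observed
    percent proficient reach the annual measurable objective (AMO)?

    Uses a normal approximation to the binomial; States varied in the exact
    formula and in how the interval was applied.
    """
    p = n_proficient / n_tested                # observed proportion proficient
    se = math.sqrt(p * (1 - p) / n_tested)     # standard error of that proportion
    upper = p + z * se                         # upper bound of the interval
    return upper * 100 >= amo                  # AMO expressed as a percent

# Example: 23 of 40 students proficient against a 65% AMO. The point estimate
# (57.5%) misses, but the interval reaches the AMO, so the group is credited.
print(meets_amo_with_ci(23, 40, 65.0))  # True
```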
10. Assessments and Accountability Systems
- Assessments: which to include in the accountability system (p. 6). (Although not technically covered under the accountability system reviews, it seems impossible not to at least consider these, given that AYP is based primarily on a State's academic assessments.)
- Some States switched to reading only, dropping Language Arts from AYP determinations (pp. 8-9). ED stopped mentioning the need for re-reviews; this could still be an issue for Timeline Waiver States.
11. Student Academic Achievement Standards
- Revisiting student academic achievement standards: where the cut scores are set and how they are applied (pp. 7-8).
- Some States did this, and others added the use of Standard Errors of Measurement (p. 27).
12. Defining Achievement Levels
- Defining student achievement labels (what it means to be proficient, etc., pp. 7-8).
- A few States did this; others included as "proficient" students in a lower level; and others used an index giving partial proficiency credit for lower-scoring students.
13. Uniform Averaging Procedures
- Impact of uniform averaging procedures (pp. 24 and 51). A sketch follows this list.
- Many States opted to utilize these procedures to make AYP determinations. ED also approved some non-uniform models.
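- A minimal sketch of what uniform averaging can do for a school follows, assuming the common pattern of letting a group make AYP on either its current-year percent proficient or a multi-year average, whichever is more favorable. The function names and the choice to pool counts across years (rather than average the yearly percentages) are assumptions, not any particular State's approved procedure.

```python
def percent_proficient(year_counts):
    """Pooled percent proficient over one or more years.

    year_counts: list of (n_proficient, n_tested) pairs, oldest to newest.
    Pooling counts weights each year by the number tested; some States
    instead averaged the yearly percentages directly.
    """
    proficient = sum(p for p, _ in year_counts)
    tested = sum(t for _, t in year_counts)
    return 100.0 * proficient / tested

def meets_amo_with_averaging(history, amo):
    """AYP status using the better of the current year or the multi-year average."""
    current = percent_proficient(history[-1:])   # most recent year only
    averaged = percent_proficient(history)       # uniform average over the years given
    return max(current, averaged) >= amo

# Example: the current year dips to 58%, but the three-year average (65.3%)
# still clears a 62% AMO.
history = [(70, 100), (68, 100), (58, 100)]
print(meets_amo_with_averaging(history, amo=62.0))  # True
```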
14. Starting Points, Annual Measurable Objectives, and Intermediate Goals
- Determination of Starting Points and setting the Annual Measurable Objectives and Intermediate Goals (pp. 37-45). A trajectory sketch follows this list.
- Variations approved included:
- Ohio's AMOs and IGs Model.
- Starting Points based on other than 2001-02 for Timeline Waiver States.
- At least one State was approved for letting LEAs set their own starting points, although those below the State's are required to meet the State's in order to make AYP.
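- For context on what a starting point, intermediate goals, and AMO trajectory can look like, here is a minimal generic sketch: equal stairstep increases from the starting point up to 100% proficient by 2013-14, with the objective held flat between steps. The three-year step schedule and the function name are assumptions for illustration only; this is not Ohio's model or any other State's approved trajectory.

```python
def amo_trajectory(start_pct, start_year=2002, end_year=2014, step_years=3):
    """Generic stairstep of annual measurable objectives (AMOs).

    Holds the AMO flat between intermediate goals and raises it in equal
    increments every `step_years` years until it reaches 100% in `end_year`.
    Approved State schedules varied; this is only an illustration.
    """
    n_steps = -(-(end_year - start_year) // step_years)    # ceiling division
    increment = (100.0 - start_pct) / n_steps
    amos, current = {}, start_pct
    for year in range(start_year, end_year + 1):
        amos[year] = round(current, 1)
        if (year - start_year + 1) % step_years == 0:       # end of a step interval
            current = min(100.0, current + increment)
    amos[end_year] = 100.0                                   # all students proficient by 2013-14
    return amos

# Example: a 40% proficient starting point rises 15 points every three years.
for year, amo in amo_trajectory(40.0).items():
    print(year, amo)
```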
15. Minimum n for Accountability (pp. 20-24)
- Thirty to 40 is fairly typical. Many States are also applying confidence intervals.
- Two States are using 0 with CIs.
- Some States used a different n for SWDs, e.g., 45 vs. 30.
- One State will use 52 for subgroups and 30 for all students.
- Another State will use 40 for all students and employ a 50/10/200 rule for subgroups: subgroups with 200 or more will be considered for AYP; subgroups between 50 and 199 will be considered if they represent at least 10% of the entire student body; but subgroups below 50 will not. (A sketch of this rule follows below.)
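- The 50/10/200 rule above is concrete enough to express directly. The sketch below simply encodes the rule as stated on this slide; the function name and the handling of anything the slide does not specify are assumptions.

```python
def subgroup_counts_for_ayp(subgroup_size: int, school_enrollment: int) -> bool:
    """Apply the 50/10/200 subgroup rule described above.

    - 200 or more students: always considered for AYP.
    - 50 to 199 students: considered only if the subgroup is at least 10%
      of the entire student body.
    - Fewer than 50 students: not considered.
    """
    if subgroup_size >= 200:
        return True
    if subgroup_size >= 50:
        return subgroup_size / school_enrollment >= 0.10
    return False

# A 60-student subgroup is below 10% of a 700-student school (not held
# separately accountable) but 12% of a 500-student school (accountable).
print(subgroup_counts_for_ayp(60, 700))  # False
print(subgroup_counts_for_ayp(60, 500))  # True
```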
16. Continuing Exited Students in LEP and SWD Subgroups
- Including exited LEP students and Students with Disabilities in those subgroups for AYP determinations (pp. 34-35).
- ED approved several State plans requiring each State to have specific exit criteria; thus, students may continue to remain in these subgroups as long as they have not satisfied the criteria to be exited. Many related issues.
- Related session later today.
17. Apportioning Memberships Across Subgroups
- Students counted in multiple achievement determinations.
- At least one State proposed apportioning membership across subgroups; it was not approved. ED's position appears consistent with respect to not apportioning membership across subgroups (pp. 16 and 49).
18. Other Design Variables and Decisions That Emerged
- Dual Accountability Systems (pp. 18-19).
- Out-of-Level Testing (Instructional-Level Assessments, pp. 30-31).
- Accountability based on non-augmented NRTs (p. 10).
- IEP defines Standard Number of Years to graduate (p. 44).
- Use of most recent scores (pp. 11-13).
- Application of Confidence Intervals to Safe Harbor (pp. 24-27) and "count" determinations, a late shift (p. 50). A safe harbor sketch follows this list.
- Using number tested FAY rather than number enrolled FAY to calculate proficiency (p. 15).
- Opportunity to review and present evidence: wide variations (pp. 27-28).
- State capacity impacted the scope of the accountability design.
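- For readers less familiar with the safe harbor provision referenced above, the sketch below illustrates the general NCLB rule: a group that misses the AMO still makes AYP if the percentage of students who are not proficient drops by at least 10% from the prior year (and the group meets the participation and other-indicator conditions, omitted here). Layering a confidence interval on this comparison, as some States did, is also omitted; the names and simplifications are assumptions.

```python
def makes_safe_harbor(prior_pct_proficient: float,
                      current_pct_proficient: float) -> bool:
    """Simplified safe harbor check (percentages expressed 0-100).

    Safe harbor is met when the share of students who are NOT proficient
    falls by at least 10% (relative) from the prior year. Participation-rate
    and other-indicator conditions, and any confidence interval a State
    might apply, are omitted from this sketch.
    """
    prior_not_proficient = 100.0 - prior_pct_proficient
    current_not_proficient = 100.0 - current_pct_proficient
    return current_not_proficient <= 0.90 * prior_not_proficient

# Example: 40% proficient last year means 60% not proficient; safe harbor
# requires at most 54% not proficient this year (at least 46% proficient).
print(makes_safe_harbor(40.0, 46.0))  # True
print(makes_safe_harbor(40.0, 44.0))  # False
```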
19. "This is another fine mess you've gotten me into, Ollie!"
- Any of thirty-seven (37) cells.
- Same Academic Subject.
- Either Academic Subject.
- Participation Rate.
- Other Academic Indicator.
- Academic Subject and Participation Rate paired.
- Academic Subject and Other Academic Indicator paired.
- Academic Indicator and Participation Rate paired?
- What are the possible AYP Patterns (pp. 14-15)? (One common way to count the 37 cells is sketched below.)
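- One common way to arrive at the 37 cells, assuming a State with nine reportable student groups (all students plus eight subgroups), is shown below. States with a different number of subgroups would count differently, so treat the figures as an illustrative assumption rather than a universal formula.

```python
# Illustrative count of AYP "cells" for a school, assuming nine student
# groups (all students plus eight subgroups, which varies by State). Each
# group is judged on proficiency and on participation in two subjects, and
# the school is also judged on one other academic indicator.
groups = 9
subjects = 2            # reading/language arts and mathematics
measures = 2            # percent proficient and participation rate
other_indicators = 1    # graduation rate or another indicator

cells = groups * subjects * measures + other_indicators
print(cells)  # 37
```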
20. Next Steps?
- So, what's next? Thinking about revisiting your State accountability model? Some things to consider:
- Analyze the plan for areas you might want to change (especially if originally denied).
- What are the policy, political, and practical implications?
- Examine the rationale a State used in advancing its position, if of interest to you.
- Carefully examine changes with potential long-range impact, such as minimum n and confidence intervals.
21. To Contact Me
- Bill Erpenbach
- erpenwj@chorus.net
- 608-836-3226
- 608-836-1738 (fax)