Title: Using Data to Drive Change
- MAG Conference
- November 2006
How do we use data to influence decisions?
- School Level by Bruce Hislop, PGCPS
- Program Level by Clare Von Secker and Steve Bedford, MCPS
- Instructional Level by Carolyn Wood, HCPS, and Steve Perakis, CCPS
- Interventions by Bob Lissitz, MARCES, UMCP
Determining Reasonable AMO Targets for Schools Far Above or Below State AMO Targets
- Bruce Hislop
- Accountability Reporting Officer, PGCPS
- Maryland Assessment Group Conference
- November 16, 2006
AMO Scenarios For A Given School
SY07 Target For A Yellow School
SY07 Target For A Pink School
SY07 Target For A Green School
SY07 Target For A Red School
Identification of Red Schools
- Distance below SY06 AMO
- Level in School Improvement (RP, RI)
SY07 Target For A Red School
Three Four-year-to-target Models
- Linear Growth
- 70-70-85-100
- 50-50-75-100
SY07-SY10 Targets For A Red School: Linear Growth
SY07-SY10 Targets For A Red School: 70-70-85-100
SY07-SY10 Targets For A Red School: 50-50-75-100
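The three schedules can be compared with a small sketch. One plausible reading, which the slides do not state explicitly, is that each model's four numbers give the cumulative percent of the gap between a school's current proficiency rate and 100% that must be closed by each year, SY07 through SY10; under that reading, Linear Growth corresponds to 25-50-75-100. The function and schedule names below are illustrative, not PGCPS's actual implementation.

```python
def targets(current, schedule):
    """Yearly proficiency targets for SY07-SY10, assuming each schedule
    value is the cumulative percent of the gap to 100% closed that year.
    (This interpretation of the slide labels is an assumption.)"""
    gap = 100.0 - current
    return [round(current + gap * pct / 100.0, 1) for pct in schedule]

LINEAR = (25, 50, 75, 100)      # equal increments each year
MODEL_70 = (70, 70, 85, 100)    # big first-year jump, then hold, then close
MODEL_50 = (50, 50, 75, 100)    # gentler first-year jump

# A hypothetical red school currently at 40% proficient:
print(targets(40.0, LINEAR))    # evenly spaced targets
print(targets(40.0, MODEL_70))  # front-loaded targets
print(targets(40.0, MODEL_50))
```

Under any of the three schedules the SY10 target is 100%, as NCLB requires; the models differ only in how the intermediate years distribute the climb.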
Which Model Is Right For You?
The model is determined by levels of challenge and opportunity:
- Percent FARMS
- Percent of classes taught by HQT
- Years in principalship
The degree of each factor is weighted, giving a weighted option scale; the sum of the weights determines the accountability model.
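A minimal sketch of how such a weighted option scale could work. The factor cut points, weight values, and score thresholds below are invented for illustration; the slides give only the three factors and the sum-of-weights rule.

```python
def model_for_school(pct_farms, pct_hqt_classes, principal_years):
    """Pick an accountability model from a weighted sum of challenge factors.
    All cut points and weights here are hypothetical examples."""
    score = 0
    # Higher poverty (FARMS) = more challenge
    score += 2 if pct_farms >= 75 else 1 if pct_farms >= 50 else 0
    # Fewer classes taught by highly qualified teachers = more challenge
    score += 2 if pct_hqt_classes < 60 else 1 if pct_hqt_classes < 80 else 0
    # A newer principal = more challenge
    score += 2 if principal_years < 2 else 1 if principal_years < 5 else 0
    # Higher total weight -> more gradual four-year model
    if score >= 5:
        return "50-50-75-100"
    if score >= 3:
        return "70-70-85-100"
    return "Linear Growth"

print(model_for_school(90, 50, 1))   # high-challenge school
print(model_for_school(10, 95, 10))  # low-challenge school
```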
This plan is under development and will most likely undergo further revisions.
Using PSAT Data to Drive Instructional Change
- Clare Von Secker, Ph.D., and Stephen L. Bedford
- Montgomery County Public Schools
- November 16, 2006
Goals of PSAT Testing
- Preparation for the SAT
- Identify students who
- Need additional support
- Usually score in the range of 20 to 35
- Have Honors/AP potential
- At or above mean verbal score of 44
- At or above mean math score of 45
PSAT Honors/AP Identification
- Identified as many as 2,000 sophomores per year who had Honors/AP potential but
- Were enrolled in regular-level English and mathematics courses in Grade 10
- Were still enrolled in regular-level English and mathematics courses in Grade 11
Identify Other Considerations/Factors/Barriers
- Enrollment decisions take into account students'
- Enrollment in other Honors courses
- Academic performance
- Attendance
- Interests, motivation, and recommendations
Data Management Needs
- Decision-makers needed a way to
- Compile all relevant student Honors/AP indicators
- Sort quickly and efficiently through large amounts of student-level information
- Get buy-in from stakeholders (teachers, students, parents)
Key Indicators
- Stakeholders identified
- PSAT participation
- PSAT verbal, math, and writing scores
- Semester 1 enrollment in Honors-level core courses
- Total number of semester 1 Honors courses
- Semester 1 English courses and marks
- Semester 1 mathematics courses and marks
- Grade point average (GPA)
- Attendance rate
- Participation in services for limited English proficiency and special education
- Gender and race/ethnicity (to assure equity)
Student variables are listed as column headers in Excel
- The column headers included for each school are
- Grade
- High school attended
- Student last name, first name, and MCPS ID
- Yes/No flag showing whether the student took the PSAT in Grade 10
- PSAT verbal, math, and writing scores
- Yes/No flags showing enrollment in Honors-level English, mathematics, science, social studies, and any course
- Total number of Honors core courses taken semester 1
- Semester 1 English and mathematics courses and final marks
- Grade point average (GPA)
- Attendance rate
- Gender and race/ethnicity
- Yes/No flag showing participation in services for limited English proficiency or special education
- Name of the student's guidance counselor
- Columns for counselors' notes
Filters identify all values of each variable
- The values for the variable showing students' Semester 1 Grade 10 math course include
- Below Algebra 1A (remedial mathematics courses)
- Algebra 1A
- Geometry A
- Honors Geometry A
- Algebra 2A
- Algebra with Analysis 2A (Honors Algebra 2)
- Precalculus
- Precalculus with Analysis (Honors Precalculus)
How many of the students with PSAT scores of 45 or higher are not enrolled in Honors mathematics during the first semester (Honors math = No)? The search found 97 out of a total of 552 students in the sophomore class who should be considered for Honors-level mathematics in Grade 11.
The search found 33 students with PSAT math scores greater than or equal to 45 who were not enrolled in any Honors course during the first semester.
- The search found 18 students with
- PSAT math scores greater than or equal to 45
- not enrolled in any Honors course during the first semester
- GPAs greater than or equal to 2.5
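The successive Excel filter queries above can be sketched in plain Python. The field names and sample records below are illustrative only; MCPS's actual spreadsheet columns are listed on the earlier slide.

```python
# A tiny stand-in for the student-level spreadsheet (hypothetical rows).
students = [
    {"name": "A", "psat_math": 52, "honors_math": "No",  "any_honors": "No",  "gpa": 3.1},
    {"name": "B", "psat_math": 47, "honors_math": "No",  "any_honors": "Yes", "gpa": 2.2},
    {"name": "C", "psat_math": 61, "honors_math": "Yes", "any_honors": "Yes", "gpa": 3.8},
]

# First filter: PSAT math >= 45 AND not enrolled in Honors math (Honors math = No)
candidates = [s for s in students
              if s["psat_math"] >= 45 and s["honors_math"] == "No"]

# Stricter filter: additionally enrolled in NO Honors course and GPA >= 2.5
strongest = [s for s in candidates
             if s["any_honors"] == "No" and s["gpa"] >= 2.5]

print([s["name"] for s in candidates])  # matches the first query
print([s["name"] for s in strongest])   # matches the stricter query
```

Each added condition narrows the candidate pool, mirroring how the counts on the slides fall from 97 to 33 to 18 as filters are stacked.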
- Hispanic female student
- PSAT scores of 64 (verbal), 54 (math), and 62 (writing)
- not enrolled in any Honors course
- GPA greater than or equal to 2.5
- C in English 10
- D in Algebra 2
Mastering Algebra: Using Teacher-Made Assessments to Support Teaching and Learning
- What We Can Learn About Students and Instruction From Locally Developed Mid-Term Exams
Meeting the HSA Challenge
- Beginning with students entering grade 9 in September 2005, all students must pass four High School Assessments, including Algebra/Data Analysis, or earn a minimum score on each and a Composite Score, as a prerequisite to earning a Maryland high school diploma.
What We Know
- Students enrolled in grades 7, 8, and 9 during 2005-06 must PASS the HSA, BUT
- In 2006, only 12% of our 7th graders took and passed Algebra, and only about 33% of our 8th graders had taken and passed Algebra
- In the past, about 40% of students have completed 9th grade without having passed Algebra
The Scope of the Challenge in Algebra: Where We Stand
- AS OF SEPTEMBER 2006
- Current 10th graders
- 1,800 students have passed HSA Algebra
- 1,300 students have yet to pass the HSA
- Current 9th graders
- 1,200 students have passed HSA Algebra
- 2,000 students have yet to pass the HSA
Math Skill Among Middle School Algebra Takers Is High (2006 Data)
Math Skill Level of Students Taking Algebra in High School Is Low
Success in Algebra: What It Takes
- Curriculum aligned to Core Learning Goals
- Instruction aligned with research-based best practice and the individual needs of students
- Frequent assessment that is FORMATIVE, to inform instruction and motivate student effort
Course Pathways to Mastery of Algebra 1
- Fully prepared middle school students complete Algebra in ONE YEAR (Grade 7)
- Somewhat prepared middle school students complete the standard Algebra course in TWO YEARS (Grades 7-8 or Grades 8-9)
- Struggling students (including those who previously failed the HSA) complete the Cognitive Tutor Algebra course in TWO YEARS (Grades 8-9 and Grades 9-10)
- NOTE: Cognitive Tutor was introduced for the first time last year in some schools and grade levels
Cognitive Tutor
- Based on a cognitive model that simulates student thinking and problem-solving in math
- (Partially) computer-delivered
- Individualized to meet the needs of the student
- Continuing assessment to support diagnostic-prescriptive teaching
- Simulates one-on-one coaching
- Meets the scientifically based research requirements of No Child Left Behind
Multiple Roles of Assessment
- Certify student attainment of standards
- Individual students
- Programs (based on aggregated scores)
- Identify students for intervention
- Placement in an intervention program
- Instructional assistance as needed
- Inform teachers' instructional decisions (pacing, delivery, amount and kind of content)
- Group item-level data: item analysis, item discriminations
- Individual item-level data: items and item clusters
- Evaluate and improve instructional programs
- Success rates on local end-of-course tests
- Success rates on state tests
Building Mid-Term Exams
- The Mathematics Office needed assessments aligned with the curriculum and the HSA to
- Provide practice to students
- Show teachers what students need to know
- Identify intervention needs
- Contribute to student course grades
- Influence instruction
Examination of Two Locally Developed Assessments
- Algebra and Cognitive Tutor tests administered as mid-term exams
- The tests reflect different skill sequences
- BOTH focus on Indicators within Goal 1 (Functions and Algebra) and Goal 3 (Collect, Organize, Analyze, and Present Data)
- The proportion of items aligned with Indicators varies
- Both include 20 SR items, 10 SPR items, and 3 or more CR items
- Both given under HSA-like conditions
Reviewing the Products
- Are we measuring anything?
- Are we measuring what we intend to measure?
- How do we know?
Key Question
- How well do two mid-term exams that are designed for students enrolled in Algebra 1 and Cognitive Tutor Algebra, constructed by teachers, and aligned with state learning outcomes predict student success on the HSA and provide useful feedback to guide teachers' instructional decisions for individuals and groups?
Sub-Questions
- How reliable are the tests as instruments?
- How valid are the tests for formative and summative use?
- FROM A CONTENT/CONSTRUCT perspective
- To what degree are the tests aligned with the HSA/CLGs (content validity)? With MSA?
- Test construction
- Format and item construction
- Test characteristics
- Item structure
- Item-total relationships
- FROM A PREDICTION perspective
- Predicting success on the HSA
- FROM A USER (INSTRUCTION) perspective
- Supporting instructional decisions
- Informing students and parents
Study Sample
- Algebra 1B
- 988 9th graders
- 61% proficient on MSA in Grade 8 (2005)
- 88% passed the HSA in May 2006
- Cognitive Tutor Algebra
- 45 9th graders
- 16% proficient on MSA in Grade 8 (2005)
- 34% passed the HSA in May 2006
Are these instruments measuring anything?
- How reliable are they?
- Evidence of internal consistency
- Cronbach's alpha
- Algebra 1B version: alpha = 0.82
- Cognitive Tutor version: alpha = 0.76
- Item-total correlations: 0.15 - 0.62
- What can we say?
- Reliability is modest but OK for a classroom test
What will the tests tell us?
- Inferences about student competence in algebra (overall) and the likelihood they will pass the HSA
- Inferences about students' knowledge of the indicators
- Inferences about students' ability to manage different kinds of items
Are the Midterms Measuring Mathematics Knowledge? Algebra/Data Analysis?
- Evidence from MSA and HSA Algebra
- Algebra 1B Midterm (Grade 9 ONLY, N = 988)
- Correlation with MSA Total (GR 8) = 0.58
- Correlation with MSA Algebra Subscore (GR 8) = 0.46
- Correlation with MSA Total (GR 7) = 0.50
- Correlation with MSA Algebra Subscore (GR 7) = 0.38
- Correlation with HSA Algebra Total: r = 0.58
- Correlation with HSA Algebra SS1 (Patterns and Functions): r = 0.44
Are the Midterms Measuring Mathematics Knowledge? Algebra/Data Analysis?
- Evidence from MSA and HSA
- Algebra/Cognitive Tutor Midterm (Grades 9 and 10; Grade 9 ONLY, N = 574)
- Correlation with MSA Total (GR 8) = 0.45
- Correlation with MSA Algebra Subscore (GR 8) = 0.33
- Correlation with HSA Algebra Total: r = 0.50
- Correlation with HSA Algebra SS1 (Patterns and Functions): r = 0.46
Cognitive Tutor Exam Parallels the HSA Topics and Item Design
- Items by Topic
- (1.1) Analyze patterns and functions: 9 items
- (1.2) Use language of math to model and interpret real-world situations: 19 items
- (3.1) Collect, represent, and organize data: 5 items
- Items by Type
- 10 student-produced response
- 20 selected response
- 2 brief constructed response
- 1 extended constructed response
Algebra 1B Exam Parallels HSA Topics and Item Design
- Items by Topic
- (1.1) Analyze patterns and functions: 17 items
- (1.2) Use language of math to model and interpret real-world situations: 10 items
- (3.1) Collect, organize, and present data: 5 items
- (3.2) Apply statistics and probability in the real world: 1 item
- Items by Type
- 10 student-produced response
- 20 selected response
- 4 brief constructed response
Evidence Supporting Interpretation of Item Clusters
- Looked for correlations among items purportedly measuring the same indicator: no consistent patterns
- Found some tendency for like items (formats) to cluster
Evidence of Predictive Validity
- Algebra 1B
- Use raw score to predict HSA Algebra status
- Adjusted R-squared = 0.341
- Constant = 391.60, B = 1.31
- SE of estimate = 19.51
- Cognitive Tutor
- Use raw score to predict HSA Algebra status
- Adjusted R-squared = 0.616
- Constant = 353.05, B = 2.27
- SE of estimate = 18.08
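Applying the reported coefficients is a one-line calculation: predicted HSA scale score = constant + B x midterm raw score. The sketch below uses the slide's coefficients; the sample raw score of 30 is an arbitrary illustration, and the standard error of estimate gives a rough sense of the prediction's spread.

```python
def predict_hsa(raw_score, constant, slope):
    """Predicted HSA scale score from a midterm raw score,
    using the linear-regression coefficients reported on the slide."""
    return constant + slope * raw_score

# Algebra 1B model: constant 391.60, B 1.31 (SE of estimate 19.51)
algebra_pred = predict_hsa(30, 391.60, 1.31)
# Cognitive Tutor model: constant 353.05, B 2.27 (SE of estimate 18.08)
tutor_pred = predict_hsa(30, 353.05, 2.27)
print(algebra_pred, tutor_pred)
```

The steeper slope (B = 2.27) and higher adjusted R-squared for the Cognitive Tutor exam mean each additional raw-score point moves the prediction further, and the prediction explains more of the variance, than in the Algebra 1B model.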
Success Rates for Ninth Graders
- Students enrolled in Algebra 1B
- OVERALL PASS RATE: 88%
- Percent of students at MSA (GR 8) BASIC who passed: 72%
- Students enrolled in Cognitive Tutor
- OVERALL PASS RATE: 40%
- Percent of students at MSA (GR 8) BASIC who passed: 35%
What We Learned
- A reasonably reliable measure resembling the HSA and aligned with Content Standards can be constructed by teachers
- SOME evidence that total scores are related to later HSA performance and could help to predict student success
- Teachers and students are exposed to a test-like event in preparation for a high-stakes test
- More work is needed to construct an assessment that provides useful formative feedback to teachers
- No real evidence that items measuring Indicator groups were related in predicted ways. Implications?
- Some evidence of relationships between item types. Implications?
- What would useful formative feedback look like, and what kind of instrument or set of instruments would be necessary?
From Prediction to Improvement: A Vision for Classroom Assessment
- "The framework for assessment and the framework for curriculum and instruction must be one and the same. . . . Using progress variables to structure both instruction and assessment is one way to ensure that the two are in alignment, at least at the planning level."
- "At the classroom level, assessment tasks must have a place in the rhythm of instruction, occurring at places where it makes instructional sense to include them. This is usually at points where teachers need to see how much progress students have made . . ."
- Mark Wilson (2005)
Next Steps
- Improve the quality and quantity of FORMATIVE data
- Consider the best ways to provide formative data given the nature of the course, the students, and the discipline (when, where, how, etc.)
- Examine the Indicators to be tested within the context of instruction
- Design, try out, and improve test items based on student responses
- Make multiple assessment opportunities available to students
- Assist teachers in scoring and evaluating student performance
- Review the rationale for mid-term exams (e.g., explaining the demands of the HSA to students and teachers, identifying students at risk on the HSA, etc.), decide upon the critical features of the exams, and work to improve current forms, e.g.:
- Structuring items to represent HSA content
- Item formats that represent HSA items
Predicting HSA Performance with Public-Release HSA Items
- Charles County Public Schools
- Steven Perakis
Predicting HSA Performance
- Charles County prepared mock HSA assessments some 2 months prior to the HSA examinations.
- The tests mirrored the HSAs in length, and we used test blueprints provided by MSDE.
- Student performance on the HSAs was tracked and merged with student performance on these examinations.
- IRT calibration was undertaken on tests written by approximately 2,000 students. The tests were composed almost exclusively of public-release items.
- The KR-21 reliability of the examinations was approximately 0.92, and the correlations between IRT-derived student theta scores and the May 2006 HSA examination scores were 0.80.
- We attempted to predict HSA scores from mock examination scores using multiple regression, logistic regression, and simple cross-tabulations.
Results
- Multiple regression: successfully predicted 53% of students' HSA scores within +/- 10 scaled score points and 81% within +/- 20 scaled score points.
- Logistic regression: using each student's actual Pass_Flag on the HSA as the dependent variable, a probability of passing each HSA was produced for every student. On the LSN mock examination, for example, among students with a predicted probability of passing above .60, 93.5% actually passed the spring HSA; these students represented 78% of the students writing that examination.
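The logistic-regression step above converts a mock-exam score into a pass probability via the logistic function, p = 1 / (1 + exp(-(b0 + b1 x score))). The coefficients below are made up for illustration; the county's fitted values are not reported on the slides. The .60 probability cut is the one the slide describes.

```python
from math import exp

def pass_probability(mock_score, b0=-8.0, b1=0.25):
    """Probability of passing the HSA given a mock-exam score,
    under hypothetical fitted logistic-regression coefficients b0, b1."""
    return 1.0 / (1.0 + exp(-(b0 + b1 * mock_score)))

# Flag students whose predicted probability clears the .60 cut
scores = [20, 30, 40, 50]
flagged = [s for s in scores if pass_probability(s) > 0.60]
print([round(pass_probability(s), 2) for s in scores])
print(flagged)
```

Because the logistic curve is monotone in the score, the .60 cut is equivalent to a single mock-score threshold, which is what makes the resulting probability tables easy for teachers to act on.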
Thoughts and Conclusions
- Individualized student performance profile reports, broken out by Core Learning Goal, focused teachers in their test-preparation run-up to the HSAs.
- Charles County found student performance on these tests, composed of public-release items, to be an effective predictor of student performance on the 2006 HSAs.
Future Study: HSA Interventions
- Dr. Robert Lissitz
- MARCES
- University of Maryland, College Park
HSA Studies
- Voluntary Collaborative Research Group between MARCES and school systems (funded by MSDE)
- Phase 1: Predicting performance on the HSA
- Phase 2: Examining interventions
- What interventions exist that should be studied?
- What studies are already underway?
- What studies should be done?
MARCES Contact Information
- rlissitz@umd.edu
- www.marces.org