Title: Using A Data System To Inform Instruction
1. Using A Data System To Inform Instruction
Presented by Tony Tripolone
Technology Leadership Institute for School District Administrators
Lower Hudson Regional Information Center, Westchester Marriott
January 9, 2006
2. Analyzing Assessment Data
- Purpose of Assessments
- Appropriate Data Comparison
- Systemic Change
- Need to Narrow Focus
3. The Benchmark
- Snapshot
- Caution: Use Multiple Measures
- Item Analysis Data
  - Identifies strengths and weaknesses
  - Helps align curriculum
  - Indicates need for parallel tasks
- Cut Points
- Graphing will show Gaps
4. P Value or Item Difficulty
- Definition
- Why P Values Are Used
- What It Means (High vs. Low P)
- How To Calculate
5. P Value or Item Difficulty Calculation
- P Value is the proportion of students in an identified norm group who answer a test item correctly; it is usually referred to as the difficulty index.
- A P Value graph is used to allow a district to compare student performance on each assessment to a larger reference group as a benchmark.
- A high P Value indicates a very easy question, while a low P Value indicates an extremely difficult question purposely designed to discriminate performance levels.
- How a P Value is calculated (a short code sketch follows this list):
  - Multiple choice questions are either right or wrong and receive a score of 1 or 0. These values are added and then divided by the number of students taking the assessment, resulting in a P Value between 0 and 1.
  - Constructed response questions have rubric scores greater than 1. The total number of points received on each question is therefore added and then divided by the number of students taking the assessment. The resulting average score is then divided by the number of possible points in the rubric, resulting in a P Value between 0 and 1.
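To make the arithmetic concrete, here is a minimal sketch of the two calculations in plain Python. The function names, the score lists, and the rubric maximum are hypothetical, chosen only for illustration; the presentation itself does not specify an implementation.

```python
def p_value_multiple_choice(scores):
    """P Value for a multiple choice item: each score is 1 (right) or 0 (wrong)."""
    return sum(scores) / len(scores)

def p_value_constructed_response(scores, max_points):
    """P Value for a constructed response item scored on a rubric.
    The average score is divided by the rubric maximum so the result is 0-1."""
    average = sum(scores) / len(scores)
    return average / max_points

# Hypothetical item data for 10 students
mc_scores = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]        # right/wrong
cr_scores = [3, 2, 4, 1, 3, 2, 4, 3, 2, 3]        # scored on a 4-point rubric

print(p_value_multiple_choice(mc_scores))          # 0.7 -> a fairly easy item
print(p_value_constructed_response(cr_scores, 4))  # 0.675
```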
6. Gap: Region Outperforms District
P Value or Item Difficulty: the proportion of students who answered the item correctly. A low P Value reflects a difficult question; a high P Value reflects an easier question.
7. Items Grouped by Key Idea and Performance Indicators
8. Midrange of Item Difficulty
- Definition
- Why Midrange Is Used
- What It Means
- How To Calculate
9. Narrowing The Focus: What Is the Midrange?
- The identified gaps provide a focus for improvement.
- Understanding that the questions were designed to discriminate between what a student needs to know and be able to do at the various accountability levels is very important in the analysis of the data.
- The data is useless without comparing district results to a larger reference group as a benchmark.
- Determining the P Value or Item Difficulty is critical in distinguishing between easy and difficult questions.
- The midrange narrows the range of assessment scores to provide a more reasonable and manageable focus for teachers evaluating their program.
- If there is a highlighted gap within the midrange, i.e., the region outperformed your district on questions you would expect your students to answer correctly, then it is reasonable to make meaningful decisions about instructional and curricular changes.
10. Midrange of Item Difficulty
- How To Calculate (see the sketch after this list):
  - Determine the range of scores by subtracting the lowest regional P Value from the highest on an assessment.
  - Multiply that difference by .20.
  - Add this product to the lowest score.
  - Subtract this product from the highest score.
- For example, if the range of scores is between .90 (the highest score) and .50 (the lowest score), then:
  .90 - .50 = .40; .40 x .20 = .08; .50 + .08 = .58; .90 - .08 = .82
- The midrange is .58 to .82 on the P Value graph.
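A minimal sketch of this calculation in Python, assuming the regional P Values for an assessment are already collected in a list (the names below are hypothetical):

```python
def midrange(p_values, trim=0.20):
    """Trim 20% of the high-low spread off each end of the regional
    P Value range, following the steps described above."""
    low, high = min(p_values), max(p_values)
    margin = (high - low) * trim
    return low + margin, high - margin

# Worked example from the slide: lowest regional P Value .50, highest .90
lo, hi = midrange([0.50, 0.62, 0.75, 0.90])
print(round(lo, 2), round(hi, 2))  # 0.58 0.82
```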
11. Midrange of Item Difficulty Is .44 - .82
Highlight the Key Idea, PI, and Question Number of the GAP above.
12. Midrange of Item Difficulty Is .44 - .82
Highlight the Key Idea, PI, and Question Number of the GAP above.
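One way to automate the highlighting step is sketched below: flag every item whose regional P Value falls inside the midrange but where the region outperformed the district. The tuple layout (question number, district P, regional P) is an assumption for illustration, not part of the presentation.

```python
def flag_gaps(items, mid_low, mid_high):
    """Return items inside the midrange where the region outperformed
    the district, i.e. the gaps worth highlighting."""
    return [(q, d, r) for (q, d, r) in items
            if mid_low <= r <= mid_high and r > d]

# Hypothetical items: (question number, district P Value, regional P Value)
items = [(1, 0.55, 0.70), (2, 0.80, 0.78), (3, 0.40, 0.60), (4, 0.91, 0.95)]
print(flag_gaps(items, 0.44, 0.82))  # [(1, 0.55, 0.7), (3, 0.4, 0.6)]
```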
13. Trend Summary Chart
- Displays 5 Years of Assessment Items
- For Collaborative Discussion
14. (No Transcript)
15. The Trend Summary Chart
- Captures the proportion of questions asked in relation to the identified Standards and Performance Indicators or Subskills assessed over a five-year period (a tally sketch follows this list).
- Highlights identified gaps or areas of weakness.
- Reflects only weaknesses identified from the midrange on the assessment data charts that one could reasonably consider needing to be addressed.
- Shows whether multiple choice or extended response questions contributed to the weakness.
- Reveals curriculum balance issues.
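The presentation shows this as a chart rather than a formula, but a rough sketch of the underlying tally, counting items per Performance Indicator per year and per question type, might look like the following. All field names and values here are assumptions for illustration.

```python
from collections import Counter

# Hypothetical item records: (year, performance_indicator, question_type)
items = [
    (2001, "4.1", "MC"), (2001, "4.1", "CR"), (2002, "4.1", "MC"),
    (2002, "5.2", "MC"), (2003, "5.2", "CR"), (2004, "4.1", "MC"),
    (2005, "5.2", "MC"), (2005, "4.1", "CR"),
]

# Tally how often each PI was assessed, and by which question type,
# across the five-year window.
by_pi_year = Counter((pi, year) for (year, pi, _) in items)
by_pi_type = Counter((pi, qtype) for (_, pi, qtype) in items)

print(by_pi_year[("4.1", 2001)])  # 2 items on PI 4.1 in 2001
print(by_pi_type[("4.1", "CR")])  # 2 constructed response items on PI 4.1
```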
16. (No Transcript)
17. Considerations After Viewing Trend Data
- Number and frequency of items asked
- Performance Indicators that seem to be targeted
- Any emerging patterns
- Consistent areas of weakness on assessments
- Consistent areas of weakness in student work throughout the year
- Appropriate balance and emphasis in your curriculum as indicated by the assessments
- Curriculum alignment issues
- Instructional changes that address the above considerations
18. Narrowing The Focus
- Select Three Areas Of Weakness
- Look At Student Work
- Work Collaboratively
- Determine Root Cause
- Align and Map Curriculum (Horizontally and Vertically)
- Develop Parallel Tasks
- Establish Periodic Benchmarks
- Analyze Data
- Begin Again
19. (No Transcript)
20. (No Transcript)
21. (No Transcript)
22. (No Transcript)
23. (No Transcript)
24. Collaboration Activity
25. (No Transcript)
26. (No Transcript)
27. (No Transcript)
28. The Collaborative Effort
- 1. Look at the actual questions to:
  - Identify SKILLS needed
  - Determine STRATEGIES necessary
- 2. Look at student work to evaluate:
  - RANGE of RESPONSE
  - What's missing? DECLARATIVE or PROCEDURAL knowledge
- 3. Develop Parallel Tasks to:
  - Provide BENCHMARK experiential opportunities
  - Increase RIGOR and RELEVANCE of expectations
29. The Collaborative Process
- Determine what you know or don't know by doing a Gap Analysis.
- Come to the discussion prepared by viewing dataMentor.
- Alignment of curriculum and instruction to standards is the agenda.
- Discussion should be collegial and supportive.
- Decision making is best with active participation by everyone.
- Freedom to share identified weaknesses is encouraged.
- Examine instructional methods to determine best practices or strategies.
- Seek suggestions for improvement.
- Remember: we want to improve student achievement for all students.
30. Does the Test Behave?
- Look at the Range of Responses distribution (a tabulation sketch follows this list)
- Determine the number and selection at each accountability level
- Identify item difficulty by correct responses for multiple choice
- Maximum points in constructed response increase as the accountability level increases
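A quick way to check such a distribution is to tabulate the responses chosen on one item. This sketch assumes multiple choice answers are stored as letters and that the answer key is known; both are assumptions for illustration.

```python
from collections import Counter

def response_distribution(responses):
    """Proportion of students choosing each option on one item."""
    counts = Counter(responses)
    total = len(responses)
    return {choice: count / total for choice, count in counts.items()}

# Hypothetical item: the key is "B"
responses = list("BBABCBDBBA")
dist = response_distribution(responses)
print(dist)        # {'B': 0.6, 'A': 0.2, 'C': 0.1, 'D': 0.1}
print(dist["B"])   # 0.6 -> the proportion correct, i.e. the item's P Value
```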
31. Range Of Responses
32. Range Of Responses
33. Range Of Responses
34. Range Of Responses
35. Go to DataMentor.org to see how this data system can seamlessly facilitate the process of using data to inform instruction. (Follow Handout Materials)
- Contact Information
- Tony Tripolone
- Administrator for Data Management and Analysis
- Wayne-Finger Lakes BOCES
- ttripolone@wflboces.org
- 585-394-9239 ext. 1045