Kentucky Performance Report
1
Kentucky Performance Report Workbook
Kentucky Department of Education Fall 2002 Assessment Meetings
August 26-29, 2002 and September 4-6, 2002
2
A New Design for Kentucky's Performance Reports
September 2002
Creating Change
September 2001
Kentucky Performance Report
Kentucky Performance Report
  • Academic Trend Data
  • (Content Area) Trend Data
  • Count and Percent Trend Data
  • Subscores
  • Data Disaggregation
  • Summary Data and Descriptive Statistics
  • Questionnaire Data
  • NRT
  • Accountability Trend
  • Growth Chart
  • Growth Chart
  • Accountability Trend
  • Disaggregation Gap Trends
  • Content Area Index Trend
  • Academic Index Comparisons
  • Number and Percent (Content Area)
  • Sub-domain (Content Area)
  • Core Content (Content Area)
  • Questionnaire Data (Content Area)
  • Performance Level Percents
  • Mean Scale Score/Standard Deviation
  • Scale Score Data Disaggregation
  • NRT Data Disaggregation

Change
Change

Kentucky Core Content Report
  • Multiple Choice Profile
  • Open Response Profile

Change
Kentucky Evaluator's Edition
  • Academic Index Comparisons
  • (Content Area) Trend Data
  • Scale Score Data Disaggregation
  • Mean Scale Score/Standard Deviation
  • Data Disaggregation per Content Area
  • NRT Data Disaggregation

Student Level Information
  • Student Item Level Report
  • Individual Student Report
  • Student Listing

No Change
3
Introduction includes important general
assessment information for understanding the
reports.
Report Cover Page includes an opening statement,
table of contents, and school identification
information.
4
(No Transcript)
5
Novice Reduction Targets
Questions:
  • What is the school's accountability goal for 2002?
  • Did the school meet its accountability goal?
  • Did the school meet novice reduction and dropout criteria for rewards?
  • What is the baseline for the school?
  • What is the standard error for the school?
  • What do the numbers on the left margin of the graph represent?
  • Did the school pass a recognition point?
  • Would this school qualify for a reward? Explain.
  • What is the school's goal for the next biennium?
Baseline
Recognition Points
Dropout Rate
Goal
Standard Error
Rewards Information
Notes
6
(No Transcript)
7
Four years of comparable school data.
Questions:
  • Did any academic area(s) show steady growth over four years? If yes, which areas?
  • Did any academic area(s) decline or show inconsistent performance? If yes, which areas?
  • Did any of the non-academic data show movement in either a positive or negative direction? Explain.
  • Does the NRT data show change? Explain.
  • The NRT Index (based on the CTBS Total Battery National Percentile) is calculated with student scores assigned on the following scale:
  • 0 points for students scoring from the 1st to the 24th national percentile, or for an incomplete battery
  • 60 points for students scoring from the 25th to the 49th national percentile
  • 100 points for students scoring from the 50th to the 74th national percentile
  • 140 points for students scoring from the 75th to the 99th national percentile
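The point scale above is a straightforward lookup. A minimal sketch of the mapping follows; the function names and the mean-based aggregation in `nrt_index` are illustrative assumptions, not the official calculation:

```python
def nrt_points(national_percentile, complete_battery=True):
    """Map a CTBS Total Battery national percentile to NRT Index points
    using the 0/60/100/140 scale described above."""
    if not complete_battery or national_percentile < 25:
        return 0        # 1st-24th percentile, or incomplete battery
    if national_percentile < 50:
        return 60       # 25th-49th percentile
    if national_percentile < 75:
        return 100      # 50th-74th percentile
    return 140          # 75th-99th percentile

def nrt_index(percentiles):
    """School-level index as the mean of per-student points
    (an assumed aggregation for illustration)."""
    return sum(nrt_points(p) for p in percentiles) / len(percentiles)
```

For example, a school whose four accountable students scored at the 24th, 25th, 50th, and 75th percentiles would earn 0, 60, 100, and 140 points respectively.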

Notes
8
(No Transcript)
9
Questions:
  • Where are the significant differences for the school? (Hint: Note the SDs and ns.)
  • What subgroup(s) in which content areas show significant differences for three or more of the four years?
  • Are there subgroups in content areas that show no significant differences for any year?
  • Where in the KPR can additional details about disaggregation be located?
  • How can the SD notation/label help a school improve? What might be the next steps?
Gap in scale scores broken out by group over four
years of data.
Statistically significant difference
Legend contains valuable information
Notes
10
(No Transcript)
11
Questions:
  • Are there any content areas that declined over the four years, were flat, or showed uneven performance?
  • Content areas should not be compared to each other. Why?
  • Instead, compare each content area to the absolute goal of 100. How close is the academic index to 100? How close is each content area to 100?
  • Did any content area(s) show consistent growth?
  • What questions could teachers and others in a school ask to identify possible causes for the patterns that appear in the scores?
Displays total academic index and content area
indices over time.
Compare these indices to the absolute standard of
100. The state goal is proficiency or 100 by the
year 2014.
Notes
12
(No Transcript)
13
Questions:
Note: Remember to compare to the absolute goal of 100.
  • In what content area(s) is the school outperforming the district, region, and state?
  • In what content area(s) is the school performing lower than the district, region, and state?
Displays school, district, region, and state
academic and specific content area indices.
Notes
14
(No Transcript)
15
Displays percent and number of students scoring
at each performance level in a specific content
area over time.
Questions:
  • How has the percent of students in each category changed over time? Explain.
  • Has the percent increased in the upper levels (proficient/distinguished) and decreased in the lower levels (apprentice/novice)?
  • What does the data show about students in the lowest performance levels?
  • How might this information impact the novice reduction targets reflected on the growth chart?
  • What kind of programs might the school consider implementing to change the current pattern of performance?
The distribution of scores among the various
categories can provide a basis for analysis.
Notes
16
(No Transcript)
17
Questions:
  • In what sub-domain are you above or below the state mean?
  • Which sub-domain area shows the biggest difference between the school and state mean?
  • Why must the information in this report be read horizontally and not vertically?
  • What implications exist for instruction and curriculum alignment?
Displays the school and state mean for items that
measure each sub-domain of a content area.
Compare school with state mean.
Do not compare scores across sub-domains.
Notes
18
(No Transcript)
19
Shows performance on open response and multiple
choice in specific Core Content categories,
compared to the state mean.
Questions:
  • Look at the last column (school minus state mean). Where are there negative values that are greater than the standard error?
  • Look for patterns in student responses (hint: blanks, zeros). Explain what you see.
  • What implications does this report page have for curriculum alignment?
Standard Error
Information in this report should not be compared vertically. This report gives raw item-level data that has not been scaled or linked using item response theory.
Observations are the number of times students
were presented items in this category. Forms may
have a different number of items in a category.
Look for patterns. In this column, do you have
more positive or negative numbers?
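The check the questions describe, a negative school-minus-state difference whose magnitude exceeds the standard error, can be sketched as a simple filter. The function name and the category data here are hypothetical, not the report's actual layout:

```python
def flag_weak_categories(rows):
    """Flag Core Content categories where (school mean - state mean)
    is negative and larger in magnitude than the standard error.

    `rows` maps category name -> (school_mean, state_mean, std_error);
    these field names are illustrative assumptions.
    """
    flagged = []
    for name, (school, state, se) in rows.items():
        diff = school - state
        if diff < 0 and abs(diff) > se:
            flagged.append((name, round(diff, 2)))
    return flagged
```

A category sitting 0.3 below the state mean with a standard error of 0.1 would be flagged; one sitting 0.1 below with a standard error of 0.2 would not.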
Notes
20
(No Transcript)
21
Questions:
  • Are there any notable differences between the school and state percentages?
  • Are there implications for using different teacher strategies or instructional practices?
  • What questions could you ask in the school to probe deeper?
  • What might be some next steps if students and teachers do not share the same perception of instruction?
Alternate Portfolio is not included.
Legend is important to understanding.
Notes
22
(No Transcript)
23
Percent of students scoring at each performance
level by group.
Questions:
  • Identify any subgroups that have a different pattern. Discuss any pattern(s).
  • Is there a specific subgroup(s) showing lower performance?
  • What implications for a student's opportunity to learn could be discussed from this report page?
Distinguished
Proficient
Apprentice
Novice
Notes
24
(No Transcript)
25
Questions:
  • Which subgroup(s) has a mean that is closest to a cut score line?
  • What type of implications does this report page have for curriculum and instruction?
  • How could a school begin to prioritize instructional services to meet the needs of students?
Mean scale score for school.
Dotted lines represent cut-points.
One standard deviation above and below the mean.
Hispanic
African-American
Notes
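The first question on this page, which subgroup mean sits closest to a cut score, amounts to a nearest-value search across the dotted cut-point lines. A small illustrative sketch; the subgroup names and scale-score values are made up:

```python
def closest_to_cut(subgroup_means, cut_scores):
    """Return the subgroup whose mean scale score lies nearest to any
    performance-level cut score (dotted lines on the chart)."""
    best_group, best_gap = None, None
    for group, mean in subgroup_means.items():
        # Distance from this subgroup's mean to its nearest cut-point.
        gap = min(abs(mean - cut) for cut in cut_scores)
        if best_gap is None or gap < best_gap:
            best_group, best_gap = group, gap
    return best_group
```

A subgroup hovering just below a cut-point is often the one for which targeted instruction can most quickly move students into the next performance level.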
26
(No Transcript)
27
Questions:
  • Describe any significant differences found in the school's subgroups that are not found at district, region, or state levels.
  • Are there any subgroups at the school level where no significant differences exist? Explain.
  • How does this type of disaggregation impact instructional choices and decisions?
Scale scores broken out by group. Data from
school, district, region, and state shown.
Standard Error in ( ) for each scale score mean.
Difference in performance of groups reported.
Asterisk denotes significant difference.
28
Writing is reported separately for Portfolios and On-Demand in three pages: Number and Percent, Performance Level Percents, and Data Disaggregation.
29
This page provides total writing trend data
(portfolio and on-demand).
30
(No Transcript)
31
National Percentile Range
Questions:
  • Does the number and percentage of students in each NP range show any trends over time? Describe those trends.
  • What trend would you want to see for a positive impact on a school's accountability index?
Displays the total number of accountable students
for the school or district over time.
Breaks out the number and percent of students
scoring in each category and identifies the
weight each category receives.
32
(No Transcript)
33
Questions:
  • Which subgroup(s) in each content area is performing in the lower percentile range? Explain any patterns.
  • What implications could this have on instruction in reading, language, and mathematics?
CTBS data by group is shown for NCE and NP.
Notes
34
Student Level Information Provided to Schools
In Addition to the Kentucky Performance Report
  • Student Listing
  • Student Item Level Report
  • Individual Student Report

35
Lists all students accountable to this school,
including students who submitted Alternate
Portfolios and those tested in other locations
but accountable to this school.
Individual student performance levels are
reported for AH and PLVS for the first time this
year.
36
Number represents the order in which the questions were presented.
The Student Item Level Report shows each student's:
  • Test form
  • Multiple-Choice Profile
  • Open-Response Profile
  • Performance Level
37
Both how the student scored against the N, A, P, D (Novice, Apprentice, Proficient, Distinguished) standard and the student's percentiles are reported.
Two copies of this report arrive from DRC, one for parents and one for the student's school record.