Title: Standard Setting: Grade 3 Mathematics
1. Standard Setting: Grade 3 Mathematics
Massachusetts Comprehensive Assessment System (MCAS)
- Sheraton Four Points Hotel
- Norwood, MA
- August 15-16, 2007
2. Wednesday, August 15: Overview of Plenary Session
- Welcome/Introductions
- Overview of MCAS Program
- Purpose of 2007 Standard Setting
- Body of Work Method and Procedures
- Ground Rules for Standard Setting
- Agenda (Wednesday-Thursday)
3. Department of Education
- Bob Bickerton, Associate Commissioner
- Wayne Fernald, MCAS Mathematics Lead Developer
- Haley Freeman, MCAS Mathematics Development Specialist
- Mark Johnson, Director of MCAS Test Development
- Bob Lee, MCAS Chief Analyst
- Matt O'Connor, Administrator for Administration, Analysis and Reporting
- Kit Viator, Director of Student Assessment
4. Measured Progress
- Sally Blake, MCAS Lead Developer, Mathematics
- Lee Butler, Administrative Assistant
- Lisa Ehrlich, Assistant Vice President
- Kevin Haley, Manager of Data Analysis
- Renee Jordan, Service Center Representative
- Mark Peters, Program Assistant
- Miechelle Poulin, Program Assistant
- Michael J. Richards, Program Manager
- Kevin Sweeney, Assistant Vice President, Research Analysis
- David Tong, Assistant Director, MCAS Program Management
- Eric Wigode, Director of MCAS Test Development
5. Standard Setting Facilitator
6. Welcome, Grade 3 Mathematics Panelists
- Karen Anderson, Associate Professor and Chair, Education Dept., Stonehill College
- Nancy Buell, Elementary Mathematics Specialist, William H. Lincoln School
- Bruce Carter, Case Manager, Urban League of Eastern Mass.
- Robert Cote, 3rd Grade Classroom Teacher, Jordan/Jackson Elementary
- Linda Gauthier, Curriculum Coordinator, Saugus Public Schools
- Cheryl Goguen, Grade 4 General Educator, Miriam F. McCarthy School
- Rebecca Gutierrez, 4th Grade Teacher, Newton Elementary School
- Steven Kaczmarczyk, Special Education Teacher, Ellen Bigelow School
- Kristine Klumpp, Grade 3 Teacher, Alden Elementary School
- Carol LaPolice, Math Instructional Leadership Specialist-Elementary, Daniel B. Brunton School
- Marlena McCoy, Grade 4 Teacher, Mittineague Elementary School
- Elaine McNamara, Title I Director and Teacher, Parker Avenue School
- Lyudmila Moiseyeva, ELL Teacher, Baker Elementary School
- Judy Moore, Grade 3 Teacher, Harvard Elementary School
- Stephanie Morris, Grade 4 Teacher, Craneville School
- Judith Richards, Mathematics Teacher, Graham Parks School
- Jennifer Rubera, Grade 4 Teacher, Pentucket Lake Elementary
- Michael Stanton, Principal, Boyden Elementary School
- Deborah Stewart, Community Representative, Urban League
7. Historical Background of the MCAS Tests
8. Purpose of the MCAS Program
- Inform/improve curriculum and instruction
- Evaluate student, school, and district performance according to Curriculum Framework content standards and MCAS performance standards
- Certify eligibility for the high school Competency Determination (CD)
9. Selected Features of MCAS
- Custom developed based on Massachusetts Curriculum Framework content standards and MCAS performance standards
- 100% of questions used to determine student scores are released annually
- Measures performance of ALL students educated with public funds
- Results reported according to raw scores and performance levels
10. Overview of the 2006 Standard Setting Event and Outcomes
- Cut scores successfully established at Warning/Needs Improvement and at Needs Improvement/Proficient
- Some panelists expressed concern about whether any test questions existed at the Above Proficient level
- Cut score at Proficient/Above Proficient set at 40 (out of 40)
- 2007 test designed to have sufficient questions at the Above Proficient level
11. Purpose: 2007 Grade 3 Mathematics Standard Setting
- Primary purpose
- Establish a cut score at Proficient/Above Proficient
- Secondary purpose
- Validate cut scores at Warning/Needs Improvement and Needs Improvement/Proficient
12. Standard Setting vs. Standards Validation
- Standard setting (top cut point)
- Process of establishing original cut scores
- Panelists are not provided initial cut points
- Standards validation (bottom two cut points)
- Process of validating cut scores
- Panelists are provided initial cut points
13. 2007 Standard Setting/Validation
- Warning / Needs Improvement: cut score to be validated
- Needs Improvement / Proficient: cut score to be validated
- Proficient / Above Proficient: cut score needed
14. Development of Content Standards
- 2000: Mathematics Curriculum Framework content standards written for grade spans (e.g., grades 5-6 and grades 7-8)
- 2004: Supplement to the CF was created, pulling out specific content standards for grades 3, 5, and 7; no brand-new standards were written
15. Content Standards vs. Performance Standards
- Content standards: What
- Describe the knowledge and skills students should acquire in a particular content area and grade
- Performance standards: How well
- Describe student work on MCAS tests at the Needs Improvement, Proficient, and Above Proficient levels
16. General MCAS Performance Level Descriptors
- Needs Improvement
- Students at this level demonstrate partial understanding of subject matter and solve simple problems
- Proficient
- Students at this level demonstrate a solid understanding of challenging subject matter and solve a wide variety of problems
- Above Proficient
- Students at this level demonstrate a comprehensive and in-depth understanding of rigorous subject matter, and provide sophisticated solutions to complex problems
17. Linking Performance Standards with Student Work
- What is standard setting?
- Establishment of cut scores to distinguish between performance levels
- What is your job?
- Use the PLDs to evaluate student work and make a recommendation for the Proficient/Above Proficient cut score
18. Purpose of Standard Setting
- Determine cut scores for reporting assessment results
- Answer the question
- How much is enough?
19. General Phases of Standard Setting/Standards Validation
- Data-collection phase
- Policy-making/decision-making phase
20. Standard-Setting Methods
- Angoff
- Bookmark
- Body of Work
21. Choosing a Standard-Setting Method
- Prior usage/history
- Recommendation/requirement by policy-making authority
- Type of assessment
The Body of Work method was chosen for the MCAS test in Grade 3 Mathematics.
22. What is the Body of Work Procedure?
- Panelists examine student work (actual responses
to test questions) and make a judgment regarding
the performance level to which the student work
most closely corresponds.
- Top cut (Standard Setting): Panelists examine student work that has not been previously classified and determine how that work should be classified.
- Lower cuts (Standards Validation): Panelists examine student work that has been initially classified into a performance level based on starting cut points and determine if they agree with these classifications or recommend changes to them.
23. Initial Classification of Student Work
- Initial classification of student work in grade 3 mathematics is based on 2006 test results.
- Step 1: Equate the 2007 grade 3 mathematics test to the 2006 test.
- Step 2: Find the raw score cuts on the 2007 form that are equivalent to the cut points established in August 2006.
- Step 3: Select student work with scores ranging from very low to very high; classify it into performance levels based on the preliminary cut points found in Step 2 (a minimal illustration of this binning follows below).
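To make Step 3 concrete, the sketch below shows how raw scores can be binned into performance levels once preliminary cut points are known. It is an illustration only, not part of the MCAS procedure; the cut values and the classify helper are hypothetical.

```python
# Minimal sketch of binning raw scores into performance levels by cut points.
# The cut points below are hypothetical, NOT actual MCAS values.
from bisect import bisect_right

LEVELS = ["Warning", "Needs Improvement", "Proficient", "Above Proficient"]

def classify(raw_score, cuts):
    """Return the performance level for a raw score.

    `cuts` is an ascending list of the minimum raw scores needed to reach each
    level above Warning (three cuts for four levels).
    """
    return LEVELS[bisect_right(cuts, raw_score)]

# Hypothetical preliminary cut points on a 40-point form (illustration only).
preliminary_cuts = [12, 22, 34]

for score in (8, 15, 27, 38):
    print(score, "->", classify(score, preliminary_cuts))
```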
24. Selected Student Work
[Table: Example Distribution of Selected Student Work, Grade 3 Math, showing bodies of work (marked X) spread across the Warning, Needs Improvement, Proficient, and Above Proficient levels.]
25. How to Classify Student Work
Materials you will need:
- Performance Level Definitions
- General
- Grade and content specific
- Bodies of Student Work
- Responses to constructed-response questions
- Multiple-choice summary sheet
26. How to Classify Student Work
- Examine the student's responses to multiple-choice questions
- Examine the student's responses to open-response questions
- Judge the student's knowledge and skills demonstrated relative to the PLDs
- Panelists do not need to reach consensus on the classifications
27. How to Classify Student Work
To help prepare you to do these ratings, you will spend time becoming familiar with the following:
- Grade 3 mathematics test
- General MCAS and grade 3 math Performance Level Descriptors
- Bodies of student work
- Responses to multiple-choice items AND constructed-response items
28. How to Classify Student Work
- You will have the opportunity to discuss your classifications and change them if desired.
- Don't worry! We have procedures, materials, and staff to assist you in this process.
29. What Next?
- Take the assessment
- Complete the Item Map
- Discuss the Performance Level Definitions
- Complete training round
- Complete individual ratings
- Receive feedback from first round of ratings
- Discuss feedback and provide final ratings
- Complete an evaluation form
30. Top 8 Most Misunderstood Things about Standard Setting
8. Standard setting is a great opportunity to
rewrite Curriculum Framework standards.
7. The process is rigged.
6. This is a good time to vent about all the
things you hate about MCAS.
5. We should use this time to rework Math performance level definitions.
31. Top 8 Most Misunderstood Things about Standard Setting
4. Standard setting is scoring.
3. Only Mathematics scholars should be doing this
work.
2. Only teachers should be doing this work.
32. Ground Rules
- Role of the facilitator is to facilitate and keep the process on track
- Process is solely focused on recommending performance standards (cut scores) for MCAS
- MCAS performance level definitions are integral to the process but are not up for debate
- Panelists' recommendations are vital; however, final cut scores are determined by the MDOE
- Each panelist must be in attendance for the duration of the process for his/her judgments to be considered
- Each panelist must complete an evaluation form at the end of the event
- Cell phones off, please!
33. Agenda
- Wednesday, August 15
- Breakfast: 8:00 am - 9:00 am
- Work session: 9:00 am - 12:00 pm
- Lunch: 12:00 pm - 1:00 pm
- Work session: 1:00 pm - 4:00 pm
- Thursday, August 16
- Breakfast: 8:00 am - 9:00 am
- Work session: 9:00 am - 12:00 pm
- Lunch: 12:00 pm - 12:45 pm
- Work session: 12:45 pm until completion
34. Room Assignment