Title: Performance Assessment
1. Performance Assessment
Dr. Elizabeth Shiner Klein
State University of New York - Cortland
2. Presentation Outline
- Introduction
- Why Implement Performance-Based Assessments?
- NCATE Position
- Performance Assessment Examples
- How to Implement Performance Assessments
3. Purpose of Assessment
- To improve teaching and learning
4. In an attempt to toughen its graduation standards, Wilmont High School now requires students to answer one last question before receiving their diploma.
5. Introduction
- What is Performance Assessment?
- Not new
- Drivers Test
- CPR Certification
6. Why Performance-Based?
- Authentic
- Individualized
- Provides artifacts for the NCATE Portfolio
7. NCATE Performance Standards
- Standards describe what teacher candidates should know and be able to do
- Evidence comes from assessments and evaluations of candidate proficiencies in relation to the standards
- It is the responsibility of ALL program faculty to make the case that candidates are meeting the standards and to DEMONSTRATE how well candidates meet them.
8. NCATE Performance Standards
- Descriptions of the rubrics or criteria used to evaluate teacher candidates' proficiencies will be included in the data submission
- Samples of candidate work illustrating different levels of performance, along with multiple types of information, will be included
- Emphasizes content knowledge as well as teaching skills in reading/language arts, science, math, and social studies
9. NCATE Sound Evidence
- Results from planned, purposeful, and continuing evaluation of candidate proficiencies, drawing on diverse sources
- Represents the scope of the standards for teacher preparation
- Measures the different attributes of the standards in appropriate and multiple ways
10. NCATE Sound Evidence
- Results from rigorous and systematic efforts by the institution to set performance levels and judge the accomplishments of its candidates
- Provides information that is credible: accurate, consistent, fair, and free of bias
- Makes use of appropriate sampling and summarizing procedures.
11. Long before he became famous as the host of Jeopardy, Alex Trebek was a high school science teacher.
12. Examples
- Music Performances
- Drama Performances
- Dance Performances
- Oral Presentations
- Science Lab Tasks
- Athletic Competitions
13. Examples
- Interviews
- Conferences
- Self-Assessments
- Learning Logs
- Observations
- Debates
- Portfolios
14. Examples
- Art Exhibits
- Poster Sessions
- Class Presentations
- Essays
- Stories and Poems
- Research Papers
15. Examples
- Reflective Journals
- Videotapes
- Science Projects
- Classroom Simulations
- Role Playing
- Defining Historical Evidence
16. Actual Classroom Example
- George Lucas Educational Foundation
- Learn and Live
- Online Text, Resources, and Video Clips
- Available at http://www.glef.org
17. Video Focus Questions
- Is it an example of authentic learning?
- How is assessment embedded in instruction?
- When does Mr. Dieckman involve the students in the criteria development?
- If all K-12 students were taught and assessed in this manner--what would it mean for teacher education and higher education in general?
18. General Scoring Criteria
- 5: Student accurately ............ with no errors. Explains why (does more than is required, works at a higher level).
- 4: Student accurately ........... with 1 or 2 minor errors. Does not explain why (or the explanation is not accurate).
- 3: Student demonstrates evidence of _______ with some errors. Has the general concept; at least half of the responses are acceptable.
19. General Scoring Criteria
- 2: Student shows some evidence of .......... with many errors. ___ correct responses.
- 1: Student has little or no ........... but tries, with significant errors.
- 0: No attempt (a zero scoring level is optional).
20. General Scoring Criteria
- Level 6: Solid work that may go beyond the requirements of the task(s).
- Level 5: Fully achieves the requirements of the task(s).
- Level 4: Substantially completes the requirements of the task(s).
21. General Scoring Criteria
- Level 3: Limited completion of the requirements of the task(s).
- Level 2: Requirements of the task(s) not completed.
- Level 1: Does not achieve any requirements of the task(s).
- Adapted from the Mathematics Assessment of the California Assessment Program (a small sketch of encoding this generic scale for reuse follows).
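Because a generic scale like the one above is meant to be reused across many different tasks, it can help to write the level descriptors down once in a reusable form. The following is a minimal, hypothetical Python sketch (not part of the original presentation): the level descriptors are taken from slides 20-21, while the `make_rubric` helper and the lab-task wording are invented for illustration.

```python
# Hypothetical sketch: encode the generic 6-level scale once and pair it
# with a task-specific behavior statement for each new assessment.

GENERIC_LEVELS = {
    6: "Solid work that may go beyond the requirements of the task(s).",
    5: "Fully achieves the requirements of the task(s).",
    4: "Substantially completes the requirements of the task(s).",
    3: "Limited completion of the requirements of the task(s).",
    2: "Requirements of the task(s) not completed.",
    1: "Does not achieve any requirements of the task(s).",
}

def make_rubric(task_description):
    """Pair the fixed level descriptors with a specific performance task."""
    return {level: f"{task_description} -- {descriptor}"
            for level, descriptor in GENERIC_LEVELS.items()}

# Example (invented): adapting the generic scale to a science lab task.
lab_rubric = make_rubric("Designs and carries out a controlled experiment")
for level in sorted(lab_rubric, reverse=True):
    print(level, lab_rubric[level])
```

The point is only the separation of concerns: the level descriptors are fixed once, and each new task supplies its own behavior statement, so faculty are not reinventing the wheel for every assessment.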
22. Validity and Reliability Concerns
- Know the behavior you are looking for
- Collect enough information on similar tasks
- An unpiloted, one-event assessment in the performance area is even more dangerous than one-shot multiple-choice testing.
23. Validity and Reliability Concerns
- Use multiple judges if possible for inter-rater reliability (see the short agreement sketch after this slide)
- Develop and use quality scoring rubrics
- Make sure the task is valid for inferring mastery of a complex capacity
- Generalizability
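As a concrete illustration of the inter-rater reliability point above, here is a minimal Python sketch (hypothetical, not from the presentation) that compares two raters' scores on the 0-5 scale from slides 18-19. It reports raw percent agreement and Cohen's kappa, which corrects agreement for chance; the rater data and function names are invented for the example.

```python
# Hypothetical sketch: two raters score the same ten performances on the
# 0-5 scale; how often do they agree, beyond what chance would predict?

from collections import Counter

def percent_agreement(a, b):
    """Fraction of performances on which the two raters gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    # Expected chance agreement: for each score, the product of the two
    # raters' marginal probabilities of giving that score, summed.
    p_e = sum(counts_a[s] * counts_b[s] for s in set(a) | set(b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

rater_1 = [5, 4, 4, 3, 2, 5, 3, 4, 1, 3]  # invented scores for illustration
rater_2 = [5, 4, 3, 3, 2, 4, 3, 4, 2, 3]

print(f"Percent agreement: {percent_agreement(rater_1, rater_2):.2f}")
print(f"Cohen's kappa:     {cohens_kappa(rater_1, rater_2):.2f}")
```

With the invented scores above this prints an agreement of 0.70 and a kappa of 0.60; in practice, the two judges would score real candidate work, compute figures like these, and discuss disagreements before finalizing the rubric.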
24. Cautions
- Time consuming
- University of California-Santa Cruz
- 1,000-question test bank and Scantron!
- Good rubrics are hard to make
- However, assessment becomes embedded in your instructional time
25. A diabolical new testing technique: math essay questions.
26. Cautions
- Students need time to adjust
- They are used to asking the instructor "tell me what I need to know" rather than the instructor asking them "what do you know?"
- May resist having to prove their competence
- Want clear criteria and directions
- So, involve them in the development of the criteria--make it their assessment, too.
27. Components of a Quality Assessment
- Good assessments are often indistinguishable from quality instructional tasks
- "... an ideal assessment would be a good teaching activity, and indeed, might even serve as a teaching activity when not used for an assessment" (Shavelson and Baxter 1992, 20).
28. Components of a Quality Assessment
- A good assessment should investigate how well the student understands the topic as well as mastery of a body of content knowledge
29. Components of a Quality Assessment
- A quality assessment should emphasize not just the correctness of the answers or performance, but also how the answer was reached or how an activity was carried out (Loucks-Horsley 1989)
30. Implementation Strategies
- Assessment should be planned as instruction is planned
- Faculty should have a partner with whom to share ideas, rubrics, and performance assessments, and who can assist with inter-rater reliability
- Generic rubrics can be developed to avoid reinventing the wheel
31. Implementation Strategies
- Strategies should include student peer assessment
- Cooperative grouping should be used for completing some assessments
- Faculty should expect to learn from trial and error (AEL/VEA 1992, vi)
32. Concluding Remarks
- Start off simple and work toward more authentic performance assessments.
- You should be assessing in the same manner that you teach--using alternative assessment techniques may require some to change teaching styles.
33. Concluding Remarks
- Borrowed assessments must be adapted to your learning environment based on the needs of your students and your curriculum
- Higher education faculty--K-12 teachers are assessment resources for you!
34. References
Appalachia Educational Laboratory (AEL) and Virginia Education Association (VEA). (1992). Alternative assessments in math and science: Moving toward a moving target. Charleston, WV.
Barba, R. H. (1995). Science in the multicultural classroom: A guide to teaching and learning. Boston: Allyn and Bacon.
Loucks-Horsley, S. (1989). Science assessment: What is and what might be. Educational Leadership, 96.
NCATE/ACEI. Performance-based program review draft guidelines. Available at http://ncate.org/pilots.htm.
Shavelson, R. J., & Baxter, G. P. (May 1992). What we've learned about assessing hands-on science. Educational Leadership, 49(6), p. 20, 6p.
Worthen, B. R. (February 1993). Critical issues that will determine the future of alternative assessment. Phi Delta Kappan, 74(6), p. 444, 10p.
35. Questions?
- This presentation available at http://education.cortland.edu/faculty/kleine/assessment.html