Title: Developing Rubrics to Clarify Expectations
1. Developing Rubrics to Clarify Expectations
- Terri Flateby
- University of South Florida
- New College of Florida
- September 19, 2008
2. Overview of Assessment Process
- 1. Select or develop measurable student learning outcomes (course or program)
- 2. Select or develop measures consistent with the outcomes
- 3. Assess student learning outcomes
- 4. Analyze assessment results
- 5. Make adjustments in curriculum, instructional strategies, or activities to address the weaknesses and strengths identified
3. Assessment Purposes
- Summative Assessment
- Is comprehensive in nature and end-oriented (achievement of learning outcomes at the institutional or program level)
4. Assessment Purposes (cont.)
- Formative Assessment
- Has an improvement focus and helps determine students' progress toward achieving outcomes. When expected learning progress is not reached, it allows for changes in the course or program. It is more ongoing.
5. Rubrics
- Are used for summative or formative purposes.
- Applicable to performance-based assignments:
- Papers
- Projects
- Discussions
6. Rubrics
- Consist of:
- Criteria used to evaluate performance
- Levels to describe the potential range of performance
7. Rubrics show how learners' products will be assessed, including the criteria that will be used and the performance levels for each of the criteria. (A minimal data sketch follows.)
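To make the two components concrete, here is a minimal sketch (in Python; not part of the original slides) of a rubric as a data structure: each trait carries its own set of named, described performance levels. All trait names, point values, and descriptions below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Level:
    """One performance level for a trait: a name, a point value, a description."""
    name: str
    points: int
    description: str

@dataclass
class Trait:
    """One criterion to be evaluated, with its range of performance levels."""
    name: str
    levels: list[Level]

# Hypothetical two-trait rubric; the wording loosely echoes later slides.
rubric = [
    Trait("Quality of Details", [
        Level("Best", 4, "Details develop each element of the text with evidence."),
        Level("Acceptable", 2, "Details relate to the main idea but lack depth."),
    ]),
    Trait("Main Idea", [
        Level("Best", 4, "The main idea is clear and sustained throughout."),
        Level("Acceptable", 2, "A main idea is present but unevenly developed."),
    ]),
]
```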
8. Types of Measures Used to Evaluate Student Performance
- Norm-referenced
- A student's achievement or performance is compared to a group
- Often measures achievement of content
- Criterion-referenced
- A student's performance is judged against a specific set of criteria
- A product such as a report, a work of art, a design, or a performance of any type is judged
(The contrast is sketched after this list.)
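A small sketch of the distinction, using made-up scores and cutoffs: a norm-referenced result depends on the group, while a criterion-referenced result depends only on the fixed criteria.

```python
def norm_referenced_rank(score: float, group_scores: list[float]) -> float:
    """Norm-referenced: compare one student's score to the group (percentile rank)."""
    below = sum(1 for s in group_scores if s < score)
    return 100.0 * below / len(group_scores)

def criterion_referenced_level(score: float, cutoffs: dict[str, float]) -> str:
    """Criterion-referenced: judge the score against fixed cutoffs, ignoring the group."""
    for label, minimum in sorted(cutoffs.items(), key=lambda kv: -kv[1]):
        if score >= minimum:
            return label
    return "Unacceptable"

group = [55, 62, 70, 70, 81, 90]
print(norm_referenced_rank(70, group))  # about 33: depends on who else was tested
print(criterion_referenced_level(70, {"Proficient": 80, "Competent": 65}))  # Competent
```

The same score of 70 would earn a different percentile rank in a stronger group, but the same criterion-referenced level.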
9. Types of Rubrics
- Holistic (mostly appropriate for summative assessment)
- Analytic (mostly appropriate for formative assessment)
10. Holistic Rubrics
- Holistic: a single score is given by combining criteria.
- Example: CLAST holistic scoring rubric, score of 6:
- "The paper presents or implies a thesis that is developed with noticeable coherence. The writer's ideas are usually substantive, sophisticated, and carefully elaborated. The writer's choice of language and structure is precise and purposeful, often to the point of being polished. Control of sentence structure, usage, and mechanics, despite an occasional flaw, contributes to the writer's ability to communicate the purpose."
CLAST Technical Manual, 2003
11. Analytic Rubrics
- Analytic: a score is given for each of the traits or characteristics of the assignment, either generic or assignment-specific. (Both scoring approaches are sketched below.)
- Example of one trait: Quality of Details
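To contrast the two types, here is a hedged sketch assuming a 1-6 scale like the CLAST example; the trait names and scores are invented. A holistic reader weighs everything at once into one score, while analytic scoring keeps each trait visible, which is what makes it useful formatively.

```python
# Invented per-trait scores on a 1-6 scale.
trait_scores = {"Quality of Details": 5, "Main Idea": 6, "Reasoning": 3}

# Holistic: one overall judgment combining all criteria; the parts are not reported.
holistic_score = 5  # assigned directly by the reader, not computed from parts

# Analytic: one score per trait, so strengths and weaknesses stay visible.
for trait, score in trait_scores.items():
    print(f"{trait}: {score}/6")
print("Weakest trait:", min(trait_scores, key=trait_scores.get))  # Reasoning
```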
12. Trait: Quality of Details
13. Steps to Develop Scoring Rubrics
- 1. Identify what you will assess, e.g., writing, a lab report, an assessment plan.
- 2. Identify characteristics or elements of the product. Writing: quality of details, main idea, reasoning, etc. Assessment plan: goals/student outcomes, use of results to improve programs.
- 3. Describe the best work for each category, element, or trait: "Details help to develop each element of the text and provide supporting statements, evidence, or examples necessary to explain or persuade effectively."
- 4. Describe the minimal acceptable level: "Details are related to the purpose and main idea of the paper, but do not provide sufficient clarity, depth, or accuracy to explain or persuade effectively."
Adapted from MJA, May 15, 2002
14. Steps to Develop Scoring Rubrics (continued)
- 5. Describe an unacceptable product: "Details are loosely related to the elements of the text, but do not support those elements with sufficient clarity, depth, or accuracy."
- 6. Describe the levels between the best and the worst acceptable.
- 7. Revise with colleagues.
(A worked sketch of steps 3-6 follows.)
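A worked sketch of steps 3 through 6 for the "Quality of Details" trait, using the level descriptions quoted on these two slides; the point values (6/4/2) are an assumption for illustration.

```python
# Level descriptions quoted from the slides; point values are hypothetical.
quality_of_details = {
    6: "Details help to develop each element of the text and provide supporting "
       "statements, evidence, or examples necessary to explain or persuade effectively.",
    4: "Details are related to the purpose and main idea of the paper, but do not "
       "provide sufficient clarity, depth, or accuracy to explain or persuade effectively.",
    2: "Details are loosely related to the elements of the text, but do not support "
       "those elements with sufficient clarity, depth, or accuracy.",
}
# Step 6 would fill in levels 5 and 3 only if those distinctions stay meaningful;
# step 7: revise the wording with colleagues before scoring with it.
```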
15. CLAQWA Analytic Scale Rubric
- Located at http://www.usf.edu/assessment
16. Trait: Reasoning
17. Guidelines for Developing Traits and Rubrics
- The level of specificity depends on the assignment, its weight, and what is necessary to determine competence or proficiency.
- Traits are nouns, e.g., "quality of details."
- Rubrics describe rather than direct: "Details provide," not "details should provide."
- The number of levels depends on how finely levels can be discriminated. It is unnecessary to have the same number for each trait. Stop adding levels when the distinctions are meaningless or very difficult to determine.
Walvoord, B. & Anderson, V. (1998)
18. Guidelines for Developing Traits and Rubrics (continued)
- Relationships among levels:
- Related but additive/subtractive
- Distinctly different, e.g., Bloom's Taxonomy levels
- Descriptions of levels should be as objective and concrete as possible; try to avoid words such as "adequate," "appropriate," and "good" unless their meaning is commonly known (it depends on who will score). Examples can be used to explain.
- Appropriate for use in courses with multiple sections and in programs.
- Appropriate for program assessment.
19. Creating an Analytic Rubric
20. Identifying Weaknesses with Analytic Scoring
21. Checklist for a Good Rubric
- Rubric categories: Do the categories reflect the major learning objectives?
- Levels: Are there distinct levels with assigned names and point values?
- Criteria: Are the descriptions clear? Are they on a continuum that allows for student growth?
22. Checklist for a Good Rubric (cont.)
- Student-friendly: Is the language clear and easy for students to understand?
- Instructor-friendly: Is it easy for the instructor to use?
- Validity: Can the rubric be used to evaluate the work? Can it be used for assessing needs? Can students easily identify the growth areas they need to work on?
(The structural items above are sketched as a quick check below.)
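The structural items on this checklist can be verified mechanically; the judgment items (clear language, usability, validity) still need human readers. A hypothetical sketch, assuming a trait-to-levels dictionary shape like the one used earlier:

```python
def check_rubric(rubric: dict[str, dict[int, str]]) -> list[str]:
    """Flag structural checklist problems: too few levels or empty descriptions."""
    problems = []
    for trait, levels in rubric.items():
        if len(levels) < 2:
            problems.append(f"{trait}: needs at least two distinct levels")
        for points, description in levels.items():
            if not description.strip():
                problems.append(f"{trait}: level {points} lacks a description")
    return problems

print(check_rubric({"Reasoning": {6: "Reasoning is sound and well supported.", 4: ""}}))
# ['Reasoning: level 4 lacks a description']
```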
23. Benefits of Using Rubrics
- Provide clear expectations
- Lead to consistency in grading student work
- Are appropriate for peer review
- Are useful for self-evaluation
- Can help identify student weaknesses
- Explain what improvements are necessary or desired
- Provide common criteria for multiple sections or a program
24. Reference for Rubrics
- http://www.winona.edu/air/rubrics.htm
25. Rubrics References
- Huba, Mary and Jan E. Freed. Learner-Centered Assessment on College Campuses. Boston: Allyn and Bacon, 2000.
- Walvoord, Barbara E. and Virginia J. Anderson. Effective Grading: A Tool for Learning and Assessment. San Francisco: Jossey-Bass, 1998.
- Wiggins, Grant. Educative Assessment: Designing Assessments to Inform and Improve Student Performance. San Francisco: Jossey-Bass, 1998.