1
Developing Assessment Instruments
  • Dick & Carey
  • Chap. 7

2
Criterion-Referenced Tests
  • Designed to measure explicit behavioral
    objectives
  • Used to evaluate
    • learner performance
    • effectiveness of the instruction

3
Criterion-Referenced
  • Also called objective-referenced
  • Refers directly to an explicit criterion or
    specified performance
  • A criterion-referenced test must
    • match each test item to a performance objective
    • stipulate the degree of mastery of the skill

4
Types of Criterion Tests
  • Pretest
    1. Consists of items that
       • measure entry behavior skills
       • test skills to be taught
       • draw from skills below the entry behavior line
    2. Helps determine the appropriateness of required
       entry skills
    3. Used during the formative evaluation process; may
       be discarded in the final version of the
       instruction

5
Types of Criterion Tests
  • Posttest
    1. Assesses all the objectives, focusing on terminal
       objectives
    2. Helps identify ineffective instructional segments
    3. Used during the design process; may eventually be
       modified to measure only terminal objectives

6
Designing Tests for Learning Domains
  • Intellectual skills and verbal information
    • paper-and-pencil tests
  • Attitudinal
    • state a preference or choose an option
  • Psychomotor
    • performance quantified on a checklist
    • subordinate skills tested in paper-and-pencil
      format

7
Determining Mastery Levels
  • Approach 1
    • mastery defined as the level of performance
      normally expected from the best learners
    • arbitrary (norm-referenced)
  • Approach 2
    • defined in statistical terms, as performance
      beyond mere chance (see the sketch below)
    • mastery varies with the critical nature of the
      task
    • example: nuclear work vs. painting a house
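A minimal sketch of the "beyond mere chance" idea, assuming a hypothetical
20-item test of four-option multiple-choice questions: the probability of
reaching a cutoff score by guessing alone follows a binomial distribution,
so the cutoff can be raised until that probability falls below an acceptable
level. The item count, option count, and 5% chance criterion are illustrative
only.

  # Sketch: pick a mastery cutoff that guessing alone is unlikely to reach.
  # Item count, option count, and the 5% chance criterion are illustrative.
  from math import comb

  def p_at_least_by_chance(n_items, cutoff, p_guess=0.25):
      """Probability of scoring cutoff or more purely by guessing (binomial)."""
      return sum(comb(n_items, k) * p_guess**k * (1 - p_guess)**(n_items - k)
                 for k in range(cutoff, n_items + 1))

  n_items = 20
  for cutoff in range(n_items + 1):
      if p_at_least_by_chance(n_items, cutoff) < 0.05:
          print(f"Mastery cutoff of {cutoff}/{n_items} exceeds chance (p < .05)")
          break

For a task whose critical nature is high (the nuclear example above), the
chance criterion would be tightened and the cutoff raised accordingly.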

8
Writing Test Items
  • What should test items do?
  • Match the behavior of the objective
  • Use the correct verb to specify the behavior
  • Match the conditions of the objective

9
Writing Test Items
  • How many test items do you need?
    • determined by the learning domain
    • intellectual skills require three or more items
    • for a wide range of content, use a random sample
      (see the sketch below)
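One way to read "use a random sample" is to draw items at random from a
larger pool written for each objective; a minimal sketch, with a hypothetical
item pool, is below.

  # Sketch: draw a random sample of items for each objective from an item pool.
  # The pool contents are hypothetical; three items reflects the
  # "three or more" guideline for intellectual skills on this slide.
  import random

  item_pool = {
      "objective_1": ["item_1a", "item_1b", "item_1c", "item_1d", "item_1e"],
      "objective_2": ["item_2a", "item_2b", "item_2c", "item_2d"],
  }

  ITEMS_PER_OBJECTIVE = 3

  test_form = {
      objective: random.sample(items, ITEMS_PER_OBJECTIVE)
      for objective, items in item_pool.items()
  }
  print(test_form)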

10
Writing Items (continued)
  • What types (true/false, multiple choice, etc.) to
    use?
    • clues are provided by the behavior listed in the
      objective
    • review Types of Test Items in this chapter
      (p. 148)

11
Writing Items (continued)
  • Item types are tempered by
    • amount of testing time
    • ease of scoring
    • amount of time to grade
    • probability of guessing
    • ease of cheating, etc.
    • availability of simulations

12
Writing Items (continued)
  • What types are inappropriate?
    • true/false for a definition
      • tests discrimination, not definition
  • Acceptable alternatives when the best possible
    format is not available
    • for simulations, have learners list the steps

13
Constructing Test Items
  • Consider
    • vocabulary
    • setting of the test item (familiar vs. unfamiliar)
    • clarity (all necessary information included)
    • trick questions (double negatives, misleading
      information, etc.)

14
Other Factors
  • Sequencing Items
    • consider clustering by objective (see the sketch
      below)
  • Test Directions
    • clear and concise
    • both general and section-specific
  • Evaluating Tests / Test Items
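One way to read "clustering by objective" is simply to sort the assembled
items so that all items measuring the same objective appear together; a
minimal sketch, with hypothetical item records:

  # Sketch: cluster test items by the objective each one measures.
  # The item records are hypothetical.
  from itertools import groupby

  items = [
      {"id": "q3", "objective": "obj_2"},
      {"id": "q1", "objective": "obj_1"},
      {"id": "q4", "objective": "obj_2"},
      {"id": "q2", "objective": "obj_1"},
  ]

  items.sort(key=lambda item: item["objective"])
  for objective, group in groupby(items, key=lambda item: item["objective"]):
      print(objective, [item["id"] for item in group])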

15
Measuring Performance, Products, Attitudes
  • Write directions to guide learner activities, and
    construct an instrument to evaluate them (next
    slide)

16
Evaluating Performance, Products, Attitudes
  • Construct an instrument to evaluate these
    activities
    • a product, performance, or attitude
  • Sometimes includes both a process and a product
    • for example, TRDEV 518

17
Test Directions for Performance, Products,
Attitudes
  • Determine the
    • amount of guidance
    • special conditions
      • time limits, special steps, etc.
    • nature of the task (i.e., complexity)
    • sophistication level of the audience

18
Assessment Instruments for Performance, Products,
Attitudes
  • Identify the elements to be evaluated
    • cleanliness, finish, tolerance of the item, etc.
  • Paraphrase each element
  • Sequence the items on the instrument
  • Select the type of judgment for the rater
  • Determine instrument scoring
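A minimal sketch of the steps above as a simple data structure: elements
identified and paraphrased into checklist items, sequenced on the instrument,
with a judgment type selected for the rater. The elements come from the
slide; the paraphrased wording and the yes/no judgment are hypothetical.

  # Sketch: a performance checklist assembled from the steps on this slide.
  # Paraphrased wording and the yes/no judgment type are hypothetical.
  checklist = {
      "judgment_type": "yes/no",           # judgment selected for the rater
      "elements": [                        # identified, paraphrased, sequenced
          {"element": "cleanliness", "item": "Work area is left clean"},
          {"element": "finish",      "item": "Surface finish is smooth"},
          {"element": "tolerance",   "item": "Item is within stated tolerance"},
      ],
  }

  for position, entry in enumerate(checklist["elements"], start=1):
      print(f"{position}. {entry['item']}  [ ] yes  [ ] no")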

19
Formats for Assessments of Performance,
Products, Attitudes
  • Checklist
  • Rating Scale
  • Frequency Counts
  • Etc.

20
Performance, Products, Attitudes -- Scoring
  • Guidelines?
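One possible set of scoring rules for the formats on the previous slide, with
hypothetical rater responses: count the checked elements on a checklist, and
sum or average the ratings on a rating scale.

  # Sketch: scoring a checklist (count of yes) and a rating scale (mean rating).
  # The recorded responses are hypothetical.
  checklist_results = [True, True, False, True]   # yes/no judgment per element
  rating_results = [4, 3, 5, 4]                   # 1-5 rating per element

  checklist_score = sum(checklist_results)
  rating_score = sum(rating_results) / len(rating_results)

  print(f"Checklist: {checklist_score}/{len(checklist_results)} elements observed")
  print(f"Rating scale: mean rating {rating_score:.1f}")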

21
Evaluating Congruency
  • Skills, objectives, and assessments should refer
    to the same behaviors
  • To check for congruency
    • construct a Congruency Evaluation Chart
      • include subskills, behavioral objectives, and
        test items
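A minimal sketch of such a chart as a table mapping each subskill to its
behavioral objective and test items, flagging any row where the columns do
not line up; the rows are hypothetical.

  # Sketch: a congruency evaluation chart as subskill -> objective -> test items.
  # Rows are hypothetical; an empty cell flags a congruency gap.
  chart = [
      {"subskill": "identify parts", "objective": "obj_1", "test_items": ["q1", "q2", "q3"]},
      {"subskill": "assemble unit",  "objective": "obj_2", "test_items": []},
  ]

  for row in chart:
      status = "OK" if row["objective"] and row["test_items"] else "GAP"
      items = ", ".join(row["test_items"]) or "-"
      print(f"{row['subskill']:<16} {row['objective']:<8} {items:<14} {status}")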