Transcript and Presenter's Notes

Title: Evaluating Adult Programs


1
Evaluating Adult Programs
2
What is Evaluation?
  • Systematic approach to assess the
    conceptualization, design, implementation, and
    utility of programs.
  • Asking questions
  • Collecting answers
  • Making decisions based on those answers

3
Why Evaluate?
  • When you don't know where you are going, you are
    probably already there.
  • Yogi Berra

4
Why Evaluate?
  • Make revisions and modifications
  • Determine effectiveness
  • Accountability
  • Decision making
  • When evaluation is performed, you always find out
    something.

5
What to Evaluate?
  • Leader/instructor
  • Program, instruction
  • Technology
  • Environment
  • Support services
  • Levels of use
  • Cost
  • Outcomes
  • Management
  • (Hezel, 1995)

6
What to Evaluate
  • According to Borg and Gall (1989), program
    evaluation may involve any of these areas:
  • Instructional methods
  • Curriculum materials
  • Programs
  • Organizations
  • Educators
  • Students

7
Evaluation leads to decisions
  • Which programs will be continued?
  • Which curriculum materials will be used?
  • What methods of instruction will be incorporated?
  • What activities are beneficial?
  • Who will facilitate learning?

8
Types of Evaluation
  • Formative
  • Summative

9
Formative Evaluation
  • Focuses on appraisal of program quality
  • When: during the development stage
  • Why: collect information about the process
  • Result: implement adjustments
  • Usually conducted by development personnel from
    within the program

10
Major focus: Improvement
  • What are the program's goals and objectives?
  • Is material sufficient to achieve objectives?
  • Do activities lead to achievement of objectives?
  • Is material technically accurate?
  • Are learning activities relevant?
  • What resources are required?
  • What modifications are needed?
  • What problems exist?

11
Summative Evaluation
  • Focuses on effectiveness of a completed program.
  • When: at the conclusion of a program (viewed as
    product evaluation)
  • Why: determine the effects
  • Result: report about the effectiveness
  • Major focus: should a program be continued?

12
Forms of Evaluation
  • Process evaluation
  • Product evaluation

13
Process Evaluation
  • Methods used to achieve objectives
  • Occurs while learning is taking place
  • Examines the process
  • Focuses on what does and does not need revision

14
Product Evaluation
  • Looks at results of a program after its
    implementation
  • Focuses on summative questioning techniques
  • Examination for accuracy or obsolescence

15
Five Evaluation Models
  • Objectives Approach (Tyler)
  • Goal-Free (Scriven)
  • CIPP (Stufflebeam)
  • Hierarchy of Evaluation (Kirkpatrick)
  • Naturalistic (Guba)

16
Objectives Approach (Tyler)
  • Consistency between goals, experiences, and
    outcomes
  • Pretest-posttest design
  • Behavior measured by norm-referenced or
    criterion-referenced tests
  • Measures student progress
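
  As a rough illustration of the pretest-posttest idea above, the sketch
  below compares hypothetical pretest and posttest scores against an assumed
  criterion-referenced mastery cutoff. The scores, the 80% cutoff, and the
  function name are invented for illustration and are not part of Tyler's
  model itself.

    # Minimal sketch of a pretest-posttest comparison (criterion-referenced).
    # All scores, the mastery cutoff, and names here are hypothetical.
    MASTERY_CUTOFF = 0.80  # assumed criterion: 80% of items correct

    def gain_and_mastery(pretest, posttest, total_items):
        """Return each learner's gain score and whether the criterion was met."""
        results = []
        for pre, post in zip(pretest, posttest):
            gain = post - pre                               # simple gain score
            mastered = (post / total_items) >= MASTERY_CUTOFF
            results.append({"pre": pre, "post": post,
                            "gain": gain, "mastered": mastered})
        return results

    # Hypothetical data: five learners, 20-item test given before and after.
    pretest_scores = [8, 11, 9, 14, 10]
    posttest_scores = [15, 17, 13, 19, 16]
    for r in gain_and_mastery(pretest_scores, posttest_scores, total_items=20):
        print(r)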

17
Goal-Free (Scriven)
  • External evaluator unaware of stated goals and
    objectives
  • Determine value and worth of program based on
    outcomes or effects and quality of those effects

18
CIPP (Stufflebeam)
  • Evaluation is a tool to help make programs better
  • Collects information from a variety of sources to
    provide a basis for making better decisions
  • Based on four phases:
  • Context
  • Input
  • Process
  • Product

19
Hierarchy of Evaluation (Kirkpatrick)
  • Four levels of evaluation:
  • Level 1: Reaction (participant satisfaction)
  • Level 2: Learning (participant knowledge, mastery)
  • Level 3: Behavior (transference of skills)
  • Level 4: Results (community impact)
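
  To make the hierarchy more concrete, the short sketch below maps each
  Kirkpatrick level to a hypothetical data-collection instrument. The levels
  come from the model; the example instruments are invented for illustration
  and are not prescribed by it.

    # Illustrative only: the four levels are Kirkpatrick's; the instrument
    # listed for each level is a hypothetical example.
    kirkpatrick_plan = {
        1: ("Reaction", "end-of-session satisfaction survey"),
        2: ("Learning", "pretest/posttest of content mastery"),
        3: ("Behavior", "follow-up observation of skill use on the job"),
        4: ("Results", "community-level indicators gathered months later"),
    }

    for level in sorted(kirkpatrick_plan):
        focus, instrument = kirkpatrick_plan[level]
        print(f"Level {level} ({focus}): {instrument}")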

20
Naturalistic (Guba)
  • Takes into account participants' definitions of
    key concerns and issues
  • Advocates qualitative modes of data collection
  • Allows subjects to set the investigative agenda
    and determine criteria for evaluation
  • Uses language and modes of presenting findings
    that are accessible to participants

21
Evaluation Strategies
  • Worth of evaluations
  • When they are useful, feasible, proper, and
    accurate
  • When not to evaluate
  • There is no purpose, or the need is unclear
  • The data will not be used
  • Possibly one-time-only programs

22
Evaluation puts pieces together to make a picture