1
Formative and Summative Evaluations
  • Instructional Design
  • For Multimedia

2
Evaluation Phases
  • Formative Evaluation: occurs during Analysis, Design, and Development
  • Summative Evaluation: occurs after implementation
3
Formative Evaluation
  • Occurs before implementation
  • Determines the weaknesses in the instruction so
    that revisions can be made
  • Makes instruction more effective and efficient

4
Formative Evaluation Is Especially Important When
  • Designer is novice
  • Content area is new
  • Technology is new
  • Audience is unfamiliar
  • Task performance is critical
  • Accountability is high
  • Client requests/expects evaluation
  • Instruction will be disseminated widely
  • Opportunities for later revision are slim

5
Formative Evaluation Phases
  • Design Reviews
  • Expert Reviews
  • Learner Validation: One-to-One Evaluation, Small-Group Evaluation, and Field Trials
  • Ongoing Evaluation
6
Design Reviews
  • Should take place after each step of the design
    process
  • Goal Review
  • Review of Environment and Learner Analysis
  • Review of Task Analysis
  • Review of Assessment Specifications

7
Design Reviews: Goal Review
8
Design Reviews: Environment and Learner Analysis Review
9
Design Reviews: Task Analysis Review
10
Design Reviews: Assessment Specification Review
11
Expert Reviews
  • Should take place when instructional materials
    are in draft form
  • Experts include
  • Content Experts
  • Instructional Design Experts
  • Content-specific Educational Experts
  • Learner Experts

12
Expert Reviews: Content Experts
  • Subject matter experts (SMEs) review for accuracy
    and completeness
  • Is the content accurate and up-to-date?
  • Does the content present a consistent
    perspective?
  • Example: Physics expert

13
Expert Reviews: Instructional Design Experts
  • Reviews for instructional strategy and theory
  • Are the instructional strategies consistent with principles of instructional theory?
  • Example: Instructional Designer

14
Expert Reviews: Content-Specific Educational Expert
  • Reviews for pedagogical approach in content area
  • Is the pedagogical approach consistent with
    current instructional theory in the content area?
  • Example: Science education specialist

15
Expert Reviews: Learner Expert
  • Reviews for appropriateness of vocabulary, examples, and illustrations
  • Are the examples, practice exercises, and
    feedback realistic and accurate?
  • Is the instruction appropriate for target
    learners?
  • Example: 6th-grade teacher

16
Expert Reviews: Process
  • Distribute draft material to experts
  • Collect comments and prioritize them into categories such as the following (a tracking sketch follows this list)
  • Critical: revisions should be made immediately
  • Non-critical: disregard or address at a later date
  • More info: find more data or information
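
One lightweight way to apply these categories is a simple comment log grouped by priority. The sketch below is illustrative only; the structure, field names, and sample comments are assumptions, not part of the presentation:

```python
from dataclasses import dataclass

# Priority labels mirror the slide: critical / non-critical / more-info.
@dataclass
class ReviewComment:
    reviewer: str   # e.g., "content expert (SME)"
    location: str   # where in the draft the issue appears
    note: str       # the reviewer's comment
    priority: str   # "critical" | "non-critical" | "more-info"

def triage(comments):
    """Group comments by priority so critical revisions surface first."""
    buckets = {"critical": [], "non-critical": [], "more-info": []}
    for c in comments:
        buckets[c.priority].append(c)
    return buckets

comments = [
    ReviewComment("SME", "screen 4", "Velocity formula is wrong", "critical"),
    ReviewComment("6th-grade teacher", "screen 9",
                  "Vocabulary may be too advanced", "more-info"),
]
for priority, items in triage(comments).items():
    print(priority, "->", [c.note for c in items])
```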

17
Learner Validation
  • Try instruction with representative learners to
    see how well they learn and what problems arise
    as they engage with the instruction
  • One-to-One Evaluation
  • Small-Group Evaluation
  • Field Trials

18
Learner Validation: One-to-One Evaluation
  • Present materials to one learner at a time
  • Typical problems that might arise are
  • Typographical errors
  • Unclear sentences
  • Poor or missing directions
  • Inappropriate examples
  • Unfamiliar vocabulary
  • Mislabeled pages or illustrations
  • Make revisions to instruction
  • Conduct more evaluations if necessary

19
Learner Validation: One-to-One Evaluation Process
  • Present materials to student
  • Watch student interact with material
  • Employ Read-Think-Aloud method
  • Continually query students about problems they
    face and what they are thinking
  • Assure student that problems in the instruction
    are not their fault
  • Tape record or take notes during session
  • Reward participation

20
Learner Validation: Small-Group Evaluation
  • Present materials to 8-12 learners
  • Administer a questionnaire to obtain general
    demographic data and attitudes or experiences
  • Problems that might arise are
  • Students have stronger or weaker entry-level skills than anticipated
  • Course was too long or too short
  • Learners react negatively to the instruction
  • Make revisions to instruction
  • Conduct more evaluations if necessary

21
Learner Validation: Small-Group Evaluation Process
  • Administer entry-level tests and pretests to students
  • Present instruction to students in natural
    setting
  • Observe students interacting with materials
  • Take notes and/or videotape session
  • Only intervene when instruction cannot proceed
    without assistance
  • Administer posttest
  • Administer an attitude survey or hold a discussion (a survey-summary sketch follows this list)
  • Reward participation
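
Attitude-survey responses are commonly summarized as per-item means. A minimal sketch follows; the items and 1-5 Likert ratings are hypothetical, not taken from the presentation:

```python
# Hypothetical 1-5 Likert ratings, one list of responses per survey item.
survey = {
    "The directions were clear": [4, 5, 3, 4, 4, 5, 4, 3],
    "The examples were helpful": [3, 4, 4, 2, 3, 4, 3, 3],
    "The pace was about right":  [5, 4, 4, 5, 4, 3, 4, 4],
}

for item, ratings in survey.items():
    mean = sum(ratings) / len(ratings)
    print(f"{item}: mean = {mean:.1f} (n = {len(ratings)})")
```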

22
Learner Validation: Field Trials Evaluation
  • Administer instruction to 30 or more students
  • Problems that might arise
  • Instruction is not implemented as designed
  • Students have stronger or weaker entry-level skills than anticipated
  • Assessments are too easy or too difficult (a quick difficulty check is sketched after this list)
  • Course is too long or too short
  • Students react negatively to instruction
  • Make revisions
  • Conduct more field trials if necessary
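
One common way to flag assessments that are too easy or too hard is the classical item-difficulty index: the proportion of students who answer an item correctly. The data below are hypothetical, and the 0.9/0.3 cutoffs are illustrative rules of thumb, not values from the presentation:

```python
# difficulty = proportion of students who answered the item correctly.
def item_difficulty(responses):
    """responses: list of 0/1 scores for one item across all students."""
    return sum(responses) / len(responses)

item_scores = {
    "item_1": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],  # hypothetical data
    "item_2": [0, 0, 1, 0, 0, 1, 0, 0, 0, 0],
}
for item, scores in item_scores.items():
    p = item_difficulty(scores)
    flag = "too easy" if p > 0.9 else "too hard" if p < 0.3 else "ok"
    print(f"{item}: difficulty = {p:.2f} ({flag})")
```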

23
Learner Validation: Field Trials Evaluation Process
  • Administer instruction to students in their normal settings, in various regions, and with varying socioeconomic status
  • Collect and analyze data from pretests and
    posttests
  • Conduct follow-up interviews if necessary
  • Administer a questionnaire to the instructors who deliver the training

24
Formative Evaluation: Ongoing Evaluation
  • Continue to collect and analyze data
  • Collect all comments/changes made by teachers who
    deliver the instruction
  • Keep track of changes in learner population
  • Revise instruction or produce new material to
    accompany instruction as needed

25
Formative Evaluation Summary
  • Conduct design reviews after each stage of design
    including goals, environment and learner
    analysis, task analysis and assessment
    specifications
  • Conduct expert reviews with content,
    instructional design, content-specific educator
    and learner experts
  • Conduct one-to-one evaluations with students
  • Conduct small-group evaluations with 8-12
    students
  • Conduct field trials with 30 or more students
  • Conduct ongoing evaluations

26
Summative Evaluation
  • Occurs after implementation (after the program has completed a full cycle)
  • Determines the effectiveness, appeal, and
    efficiency of instruction
  • Assesses whether the instruction adequately
    solves the problem that was identified in the
    needs assessment

27
Summative Evaluation Phases
  • Determine Goals
  • Select Orientation
  • Select Design
  • Design/Select Evaluation Measures
  • Collect Data
  • Analyze Data
  • Report Results
28
Summative Evaluation: Determine Goals
  • Identify questions that should be answered as a
    result of the evaluation
  • Does implementation of the instruction solve the problem identified in the needs assessment?
  • Do the learners achieve the goals of the
    instruction?
  • How do the learners feel about the instruction?
  • What are the costs of the instruction, and what is the return on investment (ROI)? (a worked example follows this list)
  • How much time does it take for learners to
    complete the instruction?
  • Is the instruction implemented as designed?
  • What unexpected outcomes result from the
    instruction?
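
The usual training-ROI formula expresses net program benefits as a percentage of program costs. The figures below are hypothetical, purely to show the arithmetic:

```python
# Hypothetical figures, purely illustrative.
program_costs = 50_000      # development, delivery, and materials
program_benefits = 80_000   # e.g., estimated value of reduced errors

net_benefits = program_benefits - program_costs
roi_percent = net_benefits / program_costs * 100  # standard ROI formula

print(f"ROI = ({program_benefits} - {program_costs}) / {program_costs} * 100"
      f" = {roi_percent:.0f}%")  # -> ROI = 60%
```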

29
Summative Evaluation: Determine Goals
  • Select indicators of success
  • If the program is successful, where will we observe evidence of it?
  • Instructional materials?
  • Learners' activities?
  • Teachers' knowledge, practice, and attitudes?
  • Learners' understanding, processes, skills, and attitudes?

30
Summative Evaluation: Select Orientation
  • Come to an agreement with the client on the most appropriate orientation for the evaluation
  • Objectivism: observation and quantitative data collected to determine the degree to which the goals of the instruction have been met
  • Subjectivism: expert judgment and qualitative data not based on instructional goals

31
Summative Evaluation: Select Evaluation Design
  • What data will be collected, when, and under what conditions? Common designs include (a worked gain computation follows this list)
  • Instruction, Posttest
  • Pretest, Instruction, Posttest
  • Pretest, Instruction, Posttest, Posttest, Posttest
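
For the pretest/posttest designs, learning is often summarized as the raw gain (posttest minus pretest) and the normalized gain (the fraction of the possible improvement actually achieved). The scores below are hypothetical, and normalized gain is a common convention rather than something specified in the presentation:

```python
# Hypothetical percent-correct scores for a pretest-posttest design.
pre  = [40, 55, 62, 35, 70]
post = [65, 80, 78, 60, 85]

raw_gains  = [b - a for a, b in zip(pre, post)]
# Normalized gain: improvement divided by the room left to improve.
norm_gains = [(b - a) / (100 - a) for a, b in zip(pre, post) if a < 100]

print("mean raw gain:", sum(raw_gains) / len(raw_gains))
print("mean normalized gain:", sum(norm_gains) / len(norm_gains))
```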

32
Summative Evaluation: Design or Select Evaluation Measures
  • Payoff Outcomes - Review statistics that may have
    changed after instruction was implemented
  • Learning Outcomes - Measure for an increase in
    test scores
  • Attitudes - Conduct interviews, questionnaires,
    and observations
  • Level of Implementation - Compare design of
    program to how it is implemented
  • Costs - Examine the costs to implement and continue the program: personnel, facilities, equipment, and materials

33
Summative Evaluation: Collect Data
  • Devise a plan for the collection of data that
    includes a schedule of data collection periods

34
Summative Evaluation: Analyze Data
  • Analyze the data so the client can easily see how the instructional program affected the problem identified in the needs assessment (a minimal analysis sketch follows)
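
As one illustration of such an analysis (an assumption, not a method named in the presentation), paired pretest/posttest scores can be compared with a paired-samples t-test using scipy. The data here are hypothetical:

```python
from scipy import stats

# Hypothetical matched pretest/posttest scores, one pair per learner.
pre  = [40, 55, 62, 35, 70, 48, 58]
post = [65, 80, 78, 60, 85, 66, 71]

t_stat, p_value = stats.ttest_rel(post, pre)  # paired-samples t-test
mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)

print(f"mean gain: {mean_gain:.1f} points, t = {t_stat:.2f}, p = {p_value:.4f}")
```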

35
Summative Evaluation: Report Results
  • Prepare a report of the summative evaluation
    findings that includes
  • Summary
  • Background
  • Description of Evaluation Study
  • Results
  • Discussion
  • Conclusion and Recommendations

36
Summative Evaluation Summary
  • Determine the goals of the evaluation
  • Select objective or subjective orientation
  • Select design of evaluation plan
  • Design or select evaluation measures
  • Collect the data
  • Analyze the data
  • Report the results