Title: Introduction to Instructional Design: Designing & Conducting Formative Evaluations
1. Introduction to Instructional Design: Designing & Conducting Formative Evaluations
- Dr. Lloyd Rieber
- The University of Georgia
- Department of Instructional Technology
- Athens, Georgia USA
2. Objectives
- Describe the purposes for and various stages of formative evaluation.
- Describe the instruments used in a formative evaluation.
- Develop an appropriate formative evaluation plan.
- Collect data according to the formative evaluation plan.
- Compare and contrast formative evaluation to summative evaluation.
3. The Dick & Carey Model
(Diagram of the model's steps:)
- Assess Needs to Identify Goal(s)
- Conduct Instructional Analysis
- Analyze Learners and Contexts
- Write Performance Objectives
- Develop Assessment Instruments
- Develop Instructional Strategy
- Develop and Select Instructional Materials
- Design and Conduct Formative Evaluation
- Revise Instruction
- Design and Conduct Summative Evaluation
4. The Concept of Formative Evaluation
- Definition: the collection of data and information during the development of instruction that can be used to improve the effectiveness of the instruction.
- Purpose: to obtain data that can be used to revise the instruction to make it more efficient and effective.
5. Formative Evaluation Helps to Answer the Following Questions
- How effective is this instruction at this stage of development?
- What has been learned?
- How usable is the instruction?
- How easy is it for students to use the media I've developed?
- How motivational is the instruction?
- In what ways can it be improved?
- Improvement is the goal of formative evaluation. After all, your instruction is at a very formative stage, is it not?
6. What Data Should I Collect?
- Be very open to collecting any data that will help you answer the questions on the previous slide.
- Don't be defensive as a designer; expect improvements to be needed.
- The sooner you begin the evaluation process, the less costly the revisions will be.
- Imagine trying to persuade the most skeptical person of your lesson's effectiveness.
- Be your own worst critic.
7. Evaluation and Research Use Similar Methods
- A variety of data: quantitative and qualitative
- Triangulation: do all data point to the same interpretations? (A small sketch follows this list.)
- Quantitative (based on numbers): carefully designed instruments that can be scored; more focus on "what" and "when" questions
- Qualitative (based on words): YOU are the instrument! Careful observation; more focus on "why" questions
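As a rough illustration only, here is a minimal Python sketch of triangulating quantitative scores with qualitative learner comments. The objective names, numbers, 80% mastery criterion, and comment threshold are all hypothetical, not taken from the Dick & Carey model.

```python
from statistics import mean

# Hypothetical formative-evaluation data for three objectives (illustration only).
posttest_pct = {"objective_1": 92, "objective_2": 58, "objective_3": 71}    # quantitative: % correct
negative_comments = {"objective_1": 0, "objective_2": 5, "objective_3": 1}  # qualitative: tagged learner comments

print(f"Mean posttest score: {mean(posttest_pct.values()):.1f}%")

# Triangulation: flag objectives where both data sources point to the same problem.
for obj, score in posttest_pct.items():
    low_score = score < 80                         # assumed mastery criterion
    many_complaints = negative_comments[obj] >= 3  # assumed threshold for concern
    if low_score and many_complaints:
        print(f"{obj}: both data sources suggest revision is needed")
    elif low_score or many_complaints:
        print(f"{obj}: the data sources disagree -- gather more information before revising")
```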
8. Quantitative Designs: Consider the Pros and Cons of the Following
9. The types of data collected
- Test data collected on entry-behavior tests, pretests, posttests, and in the performance context
- Comments or notations made by learners
- Data collected on attitude questionnaires and debriefing comments
- The time required
- Reactions of subject-matter specialists
- Reactions of a manager or supervisor (one way to record these data types together is sketched below)
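The sketch below is only an illustration of keeping the data types listed above together for later analysis; the class and field names are hypothetical, not drawn from the source.

```python
from dataclasses import dataclass, field

# Illustrative record of the data types listed above, for one learner in one trial.
@dataclass
class FormativeTrialRecord:
    learner_id: str
    entry_behavior_score: float            # entry-behavior test
    pretest_score: float
    posttest_score: float
    time_required_minutes: float
    learner_comments: list[str] = field(default_factory=list)
    attitude_responses: list[int] = field(default_factory=list)   # e.g., 1-5 Likert ratings
    sme_reactions: list[str] = field(default_factory=list)        # subject-matter specialist notes
    supervisor_reactions: list[str] = field(default_factory=list)

record = FormativeTrialRecord("L01", 60.0, 45.0, 85.0, 32.5,
                              learner_comments=["The example in step 3 was unclear"])
print(record.posttest_score - record.pretest_score)   # simple gain score
```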
10. The role of subject-matter, learning, and learner specialists
- It's important to have the instruction reviewed by specialists.
- A subject-matter expert (SME) may be able to comment on the accuracy and currency of the instruction.
- A learning specialist may be able to critique your instruction in light of what is known about enhancing that particular type of learning.
- A learner specialist may be able to provide insights into the appropriateness of the material for the eventual performance context.
11. The three phases of formative evaluation
- One-to-One Evaluation
- Small-Group Evaluation
- Field Trial
12. One-to-One Evaluation
- Purpose
- To identify and remove the most obvious errors in the instruction
- To obtain initial performance indications and learners' reactions to the content
- Criteria
- Clarity
- Impact
- Feasibility
13. One-to-One Evaluation
- Selecting learners
- Select a few learners who are representative of the target population.
- The elements to be evaluated
- The instruction
- The posttest and attitude questionnaire
- The utility of the evaluation instruments
- The reliability of your judgments
- Scoring strategy
14. One-to-One Evaluation
- One-to-one formative evaluation is very dependent on the ability of the designer to establish rapport with individual learners and then to interact effectively.
- Without the learner, there is no formative evaluation!
15. Small-Group Evaluation
- Purposes
- To determine the effectiveness of changes made following the one-to-one evaluation.
- To identify any remaining learning problems that learners may have.
- To determine whether learners can use the instruction without interacting with the instructor.
16. Small-Group Evaluation
- Selecting learners
- Select a group of approximately eight to twenty learners.
- Data
- Quantitative data consist of test scores as well as time requirements and cost projections (summarized in the sketch below).
- Descriptive information consists of comments collected from attitude questionnaires, interviews, or the evaluator's notes written during the trial.
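A minimal sketch of summarizing small-group quantitative data follows; the score lists, times, and the 80% criterion are invented purely for illustration.

```python
from statistics import mean

# Hypothetical small-group results (eight to twenty learners in practice; ten shown here).
pretest  = [40, 55, 38, 60, 47, 52, 45, 58, 50, 43]
posttest = [78, 90, 70, 95, 82, 88, 74, 92, 85, 76]
minutes  = [35, 28, 42, 30, 33, 37, 45, 29, 31, 38]

gains = [post - pre for pre, post in zip(pretest, posttest)]
print(f"Mean gain: {mean(gains):.1f} points")
print(f"Mean time required: {mean(minutes):.1f} minutes")
print(f"Learners at or above 80% on the posttest: {sum(s >= 80 for s in posttest)} of {len(posttest)}")
```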
17. Small-Group Evaluation
- Begin by explaining that the materials are in a formative stage of development and that it is necessary to obtain feedback on how they may be improved.
- The instructor should intervene as little as possible in the process.
18. Field Trial
- Purpose
- To determine whether the changes/revisions made in the instruction after the small-group stage were effective.
- To see whether the instruction can be used in the context for which it was intended.
19. Field Trial
- The elements to be evaluated
- The adequacy of learner performance
- The feasibility of delivery
- Learner achievement and attitudes
- Instructor procedures and attitudes
- Resources such as time, cost, space, and equipment
20. Field Trial
- Selecting learners
- Identify a group of about thirty individuals who are representative of the target population.
- Observation of the instruction in use and interviews with learners and the instructor will be very valuable.
21. Formative Evaluation in the Performance Context
- Questions
- Did the skills transfer?
- How are the skills used?
- What physical, social, and managerial factors enhanced transfer and use of the skills?
- What physical, social, and managerial factors inhibited transfer and use of the skills?
- Does using the skills help resolve the original need? How? What is the evidence?
- How might training be refined or improved?
- Data sources
- Learners
- Colleagues/peers of learners
- Subordinates of learners
- Supervisors
- Customers
- Company records
- Methods
- Interviews
- Questionnaires
- Observations
- Records analysis
(A small tally sketch follows this list.)
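As one illustration of pulling these data sources together, here is a short sketch that tallies questionnaire responses to the transfer question by respondent group; every name and number below is hypothetical.

```python
# Hypothetical "Did the skills transfer?" questionnaire responses, grouped by data source.
responses = {
    "learners":    ["yes", "yes", "partly", "yes", "no"],
    "supervisors": ["yes", "partly", "yes"],
    "customers":   ["yes", "yes"],
}

for source, answers in responses.items():
    share_yes = answers.count("yes") / len(answers)
    print(f"{source}: {share_yes:.0%} answered 'yes' ({len(answers)} respondents)")
```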
22. Formative Evaluation in the Performance Context
- Outcomes
- The strengths and weaknesses in the instruction
- Areas where transfer and use of the skills can be better supported
- Suggestions for revising the instruction to remove any instructional barriers to implementing the new skills in the work setting
23. Formative Evaluation of Selected Materials
- In this circumstance the instructor should proceed directly to a field trial with a group of learners.
- Purpose
- To determine whether the materials are effective with a particular population and in a specific setting.
- To identify ways in which additions to and/or deletions from the materials, or changes in instructional procedures, facilitate learning.
- The instructor should certainly take the time following the field evaluation to thoroughly debrief the learners on their reactions to the instruction.
24. Concerns Influencing Formative Evaluation
- Context concerns
- To ensure that any technical equipment is operating effectively.
- Concerns about learners
- To verify that they are actually members of the target population.
- Concerns about formative evaluation outcomes
- Be prepared to obtain information indicating that your materials are not as effective as you thought.
- Concerns with implementing formative evaluation
- To answer questions of when, where, and how.
25. Assessing Physics Understanding
- Pretend there is no friction or gravity. If a ball is moving to the right and its acceleration is also to the right, which of the following is true?
- The ball's speed is not changing.
- The ball's speed is increasing.
- The ball's speed is decreasing.
- The ball's speed increases at first, and then decreases.
- None of the above are true.
26. Assessing Physics Understanding
- If the speedometer needle of a car moved at a steady rate from the 30 mph mark to the 40 mph mark over a stretch of flat, straight road, which of the following is true?
- Acceleration was nonzero, in the direction opposite to the car's motion.
- Acceleration was 0.
- Acceleration was nonzero, in the direction the car was moving.
- Acceleration was nonzero but decreasing.
- Acceleration was nonzero and increasing.
27. Assessing Physics Understanding
(Diagram: the ball's path with points A through E marked along it.)
- Imagine that you threw a ball up into the air and it just left your hand at point A. Describe the motion of the ball and all the forces acting on it at each point.
28. Assessing Physics Understanding
29. Formative vs. Summative Evaluation
- The purpose of formative evaluation is to improve the instruction by gathering data for revisions.
- The purpose of summative evaluation is to prove the worth of the instruction, given that it will not be revised.
30. Summative Evaluation Is Similar To ...
31. Closing
- Formative evaluation of instructional materials is conducted to determine the effectiveness of the materials and to revise them where needed.
- Formative evaluation is an iterative process containing at least three cycles of data collection, analysis, and revision:
- One-to-one
- Small group
- Field trial
- Formative evaluation aims to improve the instruction, whereas summative evaluation aims to prove the worth of the instruction.