Carl Wieman - PowerPoint PPT Presentation


Transcript and Presenter's Notes

1
Measuring Impact in STEM Ed: Are they thinking like experts?
Carl Wieman, Assoc. Director for Science, White House Office of Science and Technology Policy
2
  • The White House perspective

"Maintaining our leadership in research and technology is crucial to America's success. But if we want to win the future, if we want innovation to produce jobs in America and not overseas, then we also have to win the race to educate our kids." -- B. Obama

Major policy questions: What is effective teaching, particularly in STEM? Can it be developed? How? How can we achieve better learning? (evidence!)
3
Switching hats: to science education researcher.

What is the broad goal of your project? → how to measure it?

What is the learning that matters to you? (30 s)

A bunch of facts and solution techniques? Maybe useful, but students use only a tiny fraction of what they learn in school, and in their careers need vastly more than they learned in school.

"I want them to understand _____!" (DNA, relativity, pH)

What does "understand" mean? How to measure whether it is achieved?

→ Think about and use ____ like a scientist/engineer.
4
Think like a scientist/engineer.

I. What does that mean? Expert thinking (cog. psych.)
II. Development of expert thinking
III. More details on expert thinking
IV. Measuring -- developing tools
5
Major advances of the past 1-2 decades give a consistent picture → achieving learning:

  • brain research
  • science classroom studies
  • cognitive psychology

→ Principles of learning help design experiments and make sense of results. Understand both what works and why.
6
Expert competence research: historians, scientists, chess players, doctors, ...

  • Expert competence =
  • factual knowledge
  • mental organizational framework → retrieval and application (patterns, relationships, scientific concepts)
  • ability to monitor one's own thinking and learning ("Do I understand this? How can I check?")

New ways of thinking require MANY hours of intense practice to develop.

The Cambridge Handbook of Expertise and Expert Performance
7
Significantly changing the brain, not just adding bits of knowledge: building proteins, growing neurons → enhanced neuron connections, ...

A brief digression on research on the development of expertise.
8
  • Essential element of developing expertise:
  • deliberate practice (A. Ericsson)
  • a task at a challenging but achievable level that requires explicit expert-like thinking; intensely engaged
  • reflection and guidance on the result
  • repeat, repeat, repeat, ...
  • 10,000 hours later -- very high-level expertise

A different brain, developed with exercise.

CEW interpretation: formative assessment, constructivism, and self-regulated learning are all contained in the deliberate-practice framework.

Accurate, readable summary in Talent Is Overrated, by Colvin.
9
Think like a scientist/engineer.
I. What does that mean? Expert thinking (cog. psych.)
II. Development of expert thinking
III. More details on expert thinking
IV. Measuring -- developing tools
10
How experts solve a problem: cognitive task analysis (and how it differs from non-experts)

What are the features in your discipline? (1 min)

  • concepts and mental models (analogies)
  • testing these and recognizing when they apply or not
  • distinguishing relevant vs. irrelevant information
  • established criteria for checking the suitability of a solution method or final answer (sense-making and self-checking)

"How Scientists Think in the Real World: Implications for Science Education", K. Dunbar, Journal of Applied Developmental Psychology 21(1), 49-58 (2000)
11
Lots of complex pattern recognition. What features and relationships are important? Which are not? (surface features vs. underlying structure)

Often heard: "Novice problem solvers just do pattern matching; experts use more sophisticated concept-based strategies."

CEW unproven claim (not official WH position): it is all pattern matching; experts just look for and recognize different patterns.
12
Non-cognitive elements of thinking like a scientist: perceptions/attitudes/beliefs (important, but changed more quickly; an essential precursor to deliberate practice)
13
Perceptions about science (how it is learned and used)

Novice -- Content: isolated pieces of information to be memorized, handed down by an authority, unrelated to the world. Problem solving: simple matching to memorized recipes.

Expert -- Content: a coherent structure of concepts; describes nature, established by experiment. Problem solving: systematic, concept-based strategies, widely applicable.

Consistent views across scientists in a discipline (physics, chem, bio).

Adapted from D. Hammer
14
Student Perceptions/Beliefs (Kathy Perkins, M. Gratny)

[Figure: histograms of CLASS Overall Score, measured at the start of the 1st term of college physics; x-axis 0-100 with Novice and Expert ends marked, y-axis: Percent of Students. A second panel separates actual physics majors who were originally intended physics majors from those who were NOT.]
15
Student Beliefs

[Figure: same CLASS Overall Score histogram (measured at the start of the 1st term of college physics; Novice to Expert), separating actual physics majors who were originally intended physics majors from those who were NOT; y-axis: Percent of Students.]
16
Course Grade in Phys I or Phys II (beliefs a more important factor than grades)

[Figure: distribution of grades (DFW, C, B, A) in the 1st term of college physics; y-axis: Percent of Students.]
17
Creating tests to measure expert thinking, as distinct from non-expert thinking (technical details). A. Cognitive

Must understand student thinking! There is no substitute for interviews. Cognitive: a "think-aloud" solution of a task. Look for consistent features that appear. Code the interviews, and have independent coding to make it objective (BEWARE CONFIRMATION BIAS!); one common agreement check is sketched below.

  • Things to look for:
  • What mental models?
  • How are decisions made?
  • What resources are called upon (or not)?
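The deck does not name a specific statistic for checking that independent coding is objective; a standard choice is an inter-rater agreement measure such as Cohen's kappa. A minimal sketch in Python (the category labels here are invented):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    # Agreement between two raters beyond what chance alone would give.
    # codes_a, codes_b: parallel lists of category labels, one per segment.
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # Chance agreement: probability both raters independently pick the same label.
    expected = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two raters coding the same 8 think-aloud segments (hypothetical labels).
rater1 = ["model", "recipe", "model", "check", "recipe", "model", "check", "model"]
rater2 = ["model", "recipe", "model", "recipe", "recipe", "model", "check", "model"]
print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")  # 0.80: substantial agreement
```

Kappa near 0 means the coders agree no more than chance would predict, which is exactly the confirmation-bias warning sign the slide cautions against.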

18
Creating tests to measure expert thinking, as distinct from non-expert thinking

Example: testing use of an expert mental model.

Troubleshooting: "Your laser suddenly put out only half as much light as it had been before. What change may have produced this result?"

Redesign: "What are all the ways you could double the power coming out of your laser?"

"You would like to (e.g.) build a bridge across this river. What information do you need to solve this problem?"
19
Steps in test development:

1. Interview faculty.
2. Interview students -- understand student thinking.
3. Open-ended survey questions to probe further.
4. Create a multiple-choice test -- answer choices reflect actual student thinking.
5. Validation interviews on the test, with experts and a sample population.
6. Administer to classes -- run statistical tests on the results (a sketch of typical item statistics follows).

Often iterate and/or skip steps, then refine. Reasonable data is much better than no data!
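The deck does not specify which statistical tests step 6 involves; two standard item statistics for a concept inventory are difficulty (fraction correct) and discrimination (how well an item tracks the rest of the test). A minimal sketch, assuming responses already scored 0/1:

```python
import numpy as np

def item_stats(scores):
    # scores: (n_students, n_items) array of 0/1 responses.
    scores = np.asarray(scores, dtype=float)
    difficulty = scores.mean(axis=0)  # fraction of students answering correctly
    discrimination = np.empty(scores.shape[1])
    for j in range(scores.shape[1]):
        rest = scores.sum(axis=1) - scores[:, j]  # total score excluding item j
        # Point-biserial: does getting item j right go with doing well overall?
        discrimination[j] = np.corrcoef(scores[:, j], rest)[0, 1]
    return difficulty, discrimination

# Toy class: 6 students, 4 items (invented data).
scores = [[1, 1, 0, 1],
          [1, 0, 0, 1],
          [0, 0, 1, 0],
          [1, 1, 0, 1],
          [0, 0, 0, 0],
          [1, 1, 1, 1]]
difficulty, discrimination = item_stats(scores)
print("difficulty:    ", difficulty)
print("discrimination:", discrimination)
```

Items that nearly everyone answers correctly, or that correlate negatively with the rest of the test (like item 3 in this toy data), are candidates for the iterate-and-refine step.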
20
Measuring perceptions: same basic approach. Interview students and capture perceptions in their own words, then survey the level of agreement.

40 statements, strongly agree to strongly disagree, e.g.:

"Understanding physics basically means being able to recall something you've read or been shown."

"I do not expect physics equations to help my understanding of the ideas; they are just for doing calculations."
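Such surveys are typically reported as the percentage of statements on which a student sides with the expert consensus. A simplified sketch (the real CLASS scoring rules are more detailed, and the expert keys below are illustrative):

```python
# Collapse the 5-point scale: 4-5 = agree, 3 = neutral, 1-2 = disagree.
# A response is "favorable" when it matches the expert consensus; neutral
# responses count as neither favorable nor unfavorable.
expert_agrees = [False, False]  # experts disagree with both example statements

def overall_score(responses, expert_agrees):
    # responses: one student's 1-5 Likert answers, parallel to expert_agrees.
    favorable = unfavorable = 0
    for r, expert in zip(responses, expert_agrees):
        if r == 3:
            continue
        if (r >= 4) == expert:
            favorable += 1
        else:
            unfavorable += 1
    scored = favorable + unfavorable
    return 100.0 * favorable / scored if scored else 0.0

# Student disagrees with statement 1 but agrees with statement 2:
print(overall_score([2, 4], expert_agrees))  # 50.0 (expert-like on 1 of 2)
```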
21
Conclusion: Thinking like a scientist is an important educational goal. It requires careful analysis to make it explicit and to distinguish it from the thinking of non-experts. There is a straightforward process for creating tests that measure it, more sensitive and meaningful than typical exams.

"Development and validation of instruments to measure learning of expert-like thinking", W. Adams and C. Wieman, Int. J. Sci. Ed. (in press). Covers the last part of the talk and the technical details.
22
Tips for developing assessment tools:

1. Interview the largest possible range of people; patterns and expert-novice differences become more obvious.
2. 100-student classes at a large university don't vary year-to-year -- a good way to get test-retest reliability and find out whether you can measure changes (see the sketch below).
3. The best questions (a) measure an important aspect of student thinking and learning, and (b) measure an aspect that instructors care about, so they are shocked by a poor result.
4. It is hard, and not so useful, to measure expert-like thinking on everything. Sample as a proxy.
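One simple way to quantify tip 2's year-to-year stability is to correlate matched per-item results from two successive offerings; a sketch with invented numbers:

```python
import numpy as np

# Per-item fraction correct from the same course in two successive years
# (invented data). With large, stable classes r should be close to 1, so a
# real shift after a course change stands out from year-to-year noise.
year1 = np.array([0.42, 0.65, 0.31, 0.78, 0.55, 0.60])
year2 = np.array([0.45, 0.62, 0.35, 0.75, 0.58, 0.57])
r = np.corrcoef(year1, year2)[0, 1]
print(f"test-retest r = {r:.2f}")
```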
23
  • Key elements of a good concept inventory:
  • created by physicists; targets key concepts where student failure is shocking (not probed by standard exams)
  • easy to administer as a pre/post exam: "learning from this course"
  • a set of hard-to-learn topics (not everything); a proxy for broader learning (mastery and application of concepts)
  • suitable for use with a wide range of institutions and students

24
How to administer?

Attitude surveys -- online, 1st and last week of class; a small bonus mark for completion. 80-98% completion.

Concept inventories -- Pre: in class, 1st week, on paper (Scantron); students do not keep the test. Post: in class, last week (serves as a guide to in-class review and study for the final exam). No effect on course mark; occasional question on the final. 90% completion.
25
Summary
  • Data to drive educational improvement
  • Requirements:
  • measures value added (pre/post)
  • easy to use (more important than perfection)
  • tests expert thinking of obvious value to the instructor
  • validated (measures what is claimed)
  • need many such instruments to use across the curriculum (collaborate)

Instruments and research papers: class.colorado.edu, CWSEI.ubc.ca
26
Measuring conceptual mastery
  • Force Concept Inventory -- basic concepts of force and motion, 1st semester of university physics. Simple real-world applications.

Ask at the start and end of the semester -- what was learned? (100s of courses; improved methods)

On average, students learn <30% of the concepts they did not already know. Lecturer quality, class size, institution, ... doesn't matter! Similar data for conceptual learning in other courses.

R. Hake, "A six-thousand-student survey...", AJP 66, 64-74 (1998).
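The figure of merit behind that "<30%" claim is Hake's average normalized gain, g = (%post - %pre) / (100 - %pre): the fraction of the concepts a class did not already know that it actually learned. A minimal sketch (the scores below are invented, but g near 0.23 is typical of the traditionally taught courses in Hake's survey):

```python
def normalized_gain(pre, post):
    # Hake's normalized gain: fraction of the available improvement achieved.
    # pre, post: class-average scores in percent.
    return (post - pre) / (100.0 - pre)

# Invented class averages on the Force Concept Inventory:
print(f"g = {normalized_gain(pre=45.0, post=58.0):.2f}")  # 0.24 -> learned <30%
```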
27
In nearly all intro classes, the average shifts to be 5-10% less like a scientist. Explicit connection with real life → 0 change. Emphasize process (modeling) → +10%!! (new result)
28
What every teacher should know: components of effective teaching/learning apply at all levels, in all settings.

1. Motivation (lots of research)
2. Connect with prior thinking
3. Apply what is known about memory:
   a. short-term limitations (relevant to you)
   b. achieving long-term retention: retrieval and application, repeated, spaced in time
4. Explicit, authentic practice of expert thinking. Extended and strenuous.

Basic cognitive and emotional psychology, diversity
29
(No Transcript)
30
Design principles for classroom instruction:

1. Move simple information transfer out of class. Save class time for active thinking and feedback.
2. Cognitive task analysis -- how does an expert think about these problems?
3. Fill class time with problems and questions that call for explicit expert thinking, address novice difficulties, are challenging but doable, and are motivating.
4. Frequent, specific feedback to guide thinking.

= deliberate practice (DP)