Title: Assessing for Deep Learning
1 Assessing for Deep Learning
- Presented at the California Assessment Institute
- September 30, 2002
- Peggy Maki
- Director of Assessment, AAHE
2 Areas of Focus
- Origin of our commitment to learn about student learning
- Focus of our assessment efforts
- Approaches to Learning
- Alignment of Teaching, Learning, Assessment
- Collective development of outcomes and rubrics
- Evidence of student learning/development
- Tell the story/answer the question
3 Origin of Our Commitment to Learn about Student Learning
Internal
4 Focus of Our Assessment Efforts
- What do you expect your students to know and be able to do by the end of their education at your institution?
- What do the curricula and other educational experiences add up to?
- What do you do in your classes or in your programs to promote the kinds of learning or development that the institution seeks?
5 Questions (cont'd)
- Which students benefit from which classroom teaching strategies or educational experiences?
- What educational processes are responsible for the intended student outcomes the institution seeks?
- How can you help students make connections between classroom learning and experiences outside of the classroom?
- What pedagogies/educational experiences develop knowledge, abilities, habits of mind, and ways of knowing/problem solving?
6 Questions (cont'd)
- How are curricula and pedagogy designed to develop knowledge, abilities, habits of mind, and ways of knowing?
- What methods of assessment capture desired student learning (methods that align with pedagogy, content, and curricular design)?
- How do you intentionally build upon what each of you teaches or fosters to achieve programmatic and institutional objectives?
7 Approaches to Learning
- Surface Learning
- Deep Learning
8 How the Learner Learns
9 When a Student Becomes a Biologist, Psychologist, Writer...
10
- Every assessment is also based on a set of beliefs about the kinds of tasks or situations that will prompt students to say, do, or create something that demonstrates important knowledge and skills. The tasks to which students are asked to respond on an assessment are not arbitrary. They must be carefully designed to provide evidence that is linked to the cognitive model of learning and to support the kinds of inferences and decisions that will be based on the assessment results.
- National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press, 2001, p. 47.
11 Aligning Teaching, Learning and Assessment
12 Assessing for Learning
Assessment Task Designed to Ascertain How Well Students Achieve Expected Outcome
13 Aligning Teaching, Learning, Assessment
- What do you expect students to know, understand, and be able to do as a result of your teaching?
- What methods develop/foster those outcomes?
- What assumptions underlie your methods?
14 Alignment of Our Outcomes
15 When Do You Seek Evidence?
- Formative: along the way?
- For example, to ascertain progress or development
- Summative: at the end?
- For example, to ascertain mastery level of achievement
16 What Tasks Elicit the Learning You Desire?
- Tasks that require students to select among possible answers (multiple-choice tests)?
- Tasks that require students to construct answers (students' problem-solving and thinking abilities)?
17 Write Outcome Statements
- Develop statements that describe what students should know, understand, and be able to do with what they have learned.
- The information literate student evaluates information and its sources critically and incorporates selected information into his or her knowledge and value system.
- ONE OUTCOME: The student examines and compares information from various sources in order to evaluate validity, reliability, accuracy, timeliness, and point of view or bias.
- Source: ACRL
18 Develop Rubrics to Assess Work
- Levels of achievement
- Criteria that distinguish good work from poor work
- Descriptions of criteria at each level of achievement
- For example, mastery levels
19 Evidence of Student Learning and Development
- Student work samples
- Collections of student work (e.g., portfolios)
- Capstone projects
- Course-embedded assessment
- Other observations of student behavior
- Internal juried review of student projects
20 Evidence (cont'd)
- External juried review of student projects
- Externally reviewed internship
- Performance on a case study/problem
- Performance on problem and analysis (student explains how he or she solved a problem)
- Performance on national licensure examinations
- Locally developed tests
- Standardized tests
- Pre- and post-tests
- Essay tests blind scored across units
21 Tell the Story/Answer the Question
- Disaggregate the data.
- Report results using graphics and comparative formats. (Show trends over time and differences based on your demographics.)
- Publish short, issue-specific reports or research briefs. (Organize presentation of results around issues of interest, not the format of the data.)
- Combine outcomes information with other data. (When relevant, incorporate statistics about course enrollments, FTE students, credit hours earned, and number of majors.)
- M. Kinnick, "Four Strategies for Effective Data Reporting."
22 Meaningful Use of Data
- Collect data from different sources to make a meaningful point (for example, classroom samples and samples from the librarians).
- Collect data you believe will be useful in answering the question you have raised.
- Organize reports around issues, not solely data.
- Interpret your data so that it informs pedagogy, budgeting, planning, decision-making, or policies.
23 Principles of Assessment
- Assessment is not a one-time activity; rather, it is evolutionary, ongoing, and incremental.
- Over time, assessment efforts should become more comprehensive, systematic, integrative, and organic.
- Successful assessment efforts are compatible with the institution's mission and its available resources.
24
- Assessment's primary focus is on the teaching/learning experience.
- Institutional autonomy should be preserved and innovation encouraged to provide the highest quality education.
25
"What and how students learn depends to a major extent on how they think they will be assessed."
John Biggs, Teaching for Quality Learning at University: What the Student Does. Society for Research into Higher Education & Open University Press, 1999, p. 141.