Title: CAESL And The Future Of Science Assessment
1. CAESL And The Future Of Science Assessment
- Rich Shavelson
- Stanford Education Assessment Lab
2. The Talk's Charge
- An address that summarizes our conference theme of building a community to improve assessment. We hope you can help us all see where CAESL fits into the complicated national assessment landscape and inspire us to go forth and fix things!
3. Overview
- The national landscape: NCLB, science assessment, CAESL
- The CAESL landscape: Building communities to improve assessment
- The conclusion: Go forth and fix things
4. The National Landscape: No Child Left Behind
5. NCLB Accountability Mechanisms
- States must:
- Test all 3rd-8th graders in the nation in mathematics and reading in 2001-2014 and science in 2007-08
- Develop achievement (proficiency) standards on the test, at least for Basic, Proficient, and Advanced
- Track adequate yearly progress toward the goal of 100% of students proficient in reading and math within 12 years
- Achieve the goal by subgroup, including race/ethnicity, poverty level, disability, limited English proficiency, migrant, and homeless
6. NCLB Science Assessment 2006-07 Requirements
- In 2005-06, states must have challenging academic content standards in science that may be:
- Grade specific
- Covering more than one grade
- Course specific in high school
- In 2007-08, states must administer up-to-date science assessments:
- Aligned with science standards, including higher-order thinking skills and understanding
- At least once each in grades 3-5, 6-9, and 10-12
7. Some Responses To Science Assessment Mandate
- U.S. Department of Education
- CAESL assessment system
- NRC Committee on Test Design for K-12 Science Achievement (CAESL's Wilson and Herman)
- Delaware audit-assessment project
8. CAESL Assessment System (Or MB Aquarium Chocolate Mousse)
9. NRC Committee Models: May 2004 Workshop
- Partnerships among test publishers, research organizations, and scientific industries (Rich Patz)
- Application of assessment advice offered by the National Commission on Instructionally Supportive Assessment (Jim Popham)
- Models for multi-level state science assessment systems (Edys Quellmalz)
- Classroom-based assessment system for science: a model (Barbara Plake)
http://www7.nationalacademies.org/bota/Science%20Assessment%20Workshop%20Agenda.html
10. Delaware Audit-Assessment Partnership Between State and Researchers
- Science assessment in Delaware
- Delaware Student Testing Program: Science
- NAEP science assessment
- Audit assessment built around the state science framework and SEAL's/CAESL's knowledge types
- Model of collaboration between assessment researchers and a state
11. More Detail On Responses To NCLB Criteria For Assessment Systems
- Comprehensive: Measures the broad science content/knowledge domain using multiple indicators
- Coherent: Links external, summative assessment for accountability with internal (classroom) formative assessment for improving learning and teaching
- Continuous: Tracks students' performance throughout their school years
12. Comprehensive: Expanded SEAL/CAESL Targeted Goals
- What does it mean to achieve in K-12 science?
- Declarative knowledge: knowing "that" (facts and concepts in the domain)
- Procedural knowledge: routine procedures and some aspects of problem solving
- Schematic (analytic) knowledge: conceptual models of how the natural world works
- Strategic (transfer) knowledge: knowing when, where, and how knowledge applies
- Epistemic knowledge: knowing how we know, that is, how scientific knowledge is built and justified
- Communication (social skills): ability to communicate ideas clearly and concisely in the genre of science
- Motivation: commitment to learning, knowing, and using science
- Manner: habit of mind to inquire, observe, bring observation to bear in knowledge claims, and reach moral/ethical judgments
13. Coherent: Research on Multi-Level Assessment Systems
- CAESL's model into practice
- CLAS multi-level system
- Washington State coherent assessment system of embedded assessments
- NRC multi-level models (Quellmalz)
http://www7.nationalacademies.org/bota/Science%20Assessment%20Moody.pdf
14. Continuous Tracking Of Student Performance
- BEAR/CAESL continuous assessment of key progress variables
- Multi-dimensional vertical equating (Patz et al.) of SEAL/CAESL knowledge types?
Patz: http://www7.nationalacademies.org/bota/Science%20Assessment%20Patz.pdf
15. Technology Needed To Realize Criteria For Assessment System
- Comprehensive assessment must capitalize on technology to:
- Deliver performance assessments, concept maps, POEs, Bayesian-network assessments, etc., cost-effectively
- Construct, present, store, and score assessments (e.g., e-rater, automated scoring)
- Coherent assessment must capitalize on technology to:
- Store and analyze data
- Integrate multiple levels of assessment
- Present and interpret assessment information
- Provide quick feedback on performance
- Continuous assessment must capitalize on technology to:
- Store and analyze data over time
- Integrate multiple levels
- Present and interpret assessment information over time
16. The CAESL Landscape: Building Communities To Improve Assessment
17. Some Directions for Research
- Continue to develop assessments linked to knowledge types, focusing on:
- Knowledge types not yet measured in the CAESL assessment system
- Streamlining the assessment-development process
- Validity of second-language learners' assessment scores
- Accommodations for special-needs students
- Continue to explore the use of progress variables, solving practical (scoring) constraints for teachers
- Initiate case studies of emerging multi-level assessment systems (e.g., WA, Maine)
- Initiate studies of the use of computer technology for collecting, scoring, and modeling student performance on assessments
18. ...And Grad Student Training
- Sponsor:
- An annual research conference with both student and researcher presentations
- A class meeting at the end of one of the quarters for students to present work completed in class (rotate annually)
- Create practical opportunities for students to apply knowledge holistically to science assessment development problems (e.g., BASEE MSP project)
19. Missing (In) Action! Build Communities
- CAESL knows a great deal about science assessment, BUT it needs to:
- Communicate this knowledge to educators, policy makers, and the public
- Link this knowledge to enhancing teachers' assessment practices and science curricula
- Increase its research on practical applications of what it knows
- CAESL can address these and other matters by continuing to build community within the Center
20. Enhance Public Understanding Of Assessment
- Focus on practical questions:
- What is science assessment?
- What kinds of science knowledge and understanding should we be concerned about and measure to meet NCLB requirements, and why?
- How can we track progress to improve student learning, curriculum, and teaching, and why?
- Communicate what we know and need to know:
- To specific public and practitioner audiences (e.g., op-ed pieces)
- At practitioner conferences such as NSTA and CSTA
- Through an improved web site making assessment information and materials accessible to practitioners
- Other
21. Questions For Research On Teacher Practice
- How can we make assessments easy for teachers to construct and use in their classrooms?
- How can we enhance teachers' formative assessment practices to improve student learning of science and student performance on accountability assessments?
- How can we help teachers use summative assessment information collected under NCLB to improve their teaching, curriculum, and students' learning?
- What is an appropriate conceptual framework for this applied research?
- Teacher conceptual change?
- Teacher inquiry/questioning skills?
- Other?
22. Linking CAESL's Strands To Address Practical Research Agenda
Enhance community among all 5 CAESL strands to carry out the research agenda:
- Strand 1: Focus student apprenticeships on the research agenda related to the other 4 strands
- Strand 2: Link the research agenda with practical challenges of teacher enhancement
- Strand 3: Link the research agenda with practical challenges of pre-service teacher preparation
- Strand 4: Re-allocate part of scarce research resources to the broader practical research agenda
- Strand 5: Link the research agenda on public understanding to public outreach
23. In Conclusion: Go Forth And Fix Things
- The practical realities and demands of NCLB make CAESL's mission critical to the quality of science education in this nation
- A Center devoted to science assessment alone is incomplete: curriculum, teaching, learning, and assessment cannot be disentangled
- This fact gives rise to a broadened practical research agenda for an Assessment Center
24. Concluding Comments (Continued)
- CAESL will have to plan for and address:
- The need for practical as well as path-breaking research on formative and summative assessment and its links to curriculum, teaching, and learning (even if not highly valued in academia)
- The realization that the next-generation research and development agenda needs to integrate the views of academics, practitioners, and graduate students
- The trade-off, with scarce CAESL resources, in balancing assessment and practical research needs, which will be very difficult but important
25. NRC Committee Charge
- Provide guidance and make recommendations useful to states in designing and developing quality science assessments to meet the NCLB 2007-2008 implementation requirement, and
- Foster communication and collaboration between the NRC committee and key stakeholders in the states and in schools, so that the guidance provided by the NRC committee's report is both responsive and can be practically implemented in states and schools
http://www7.nationalacademies.org/bota/Test_Design_K-12_Science.html