Title: SUNY GENERAL EDUCATION ASSESSMENT CONFERENCE
1. SUNY GENERAL EDUCATION ASSESSMENT CONFERENCE
- Guidelines for and Implementation of Strengthened
Campus-Based Assessment
2. Presenters
- Patricia Francis, Melanie Vainder, and Tina Good
- GEAR Co-Chairs
3. Session Objectives
- To enable participants to return to their institutions with a clear idea of how to begin the process of revising their campus's existing assessment plan to meet the new GEAR guidelines
- To begin a dialogue among ourselves focusing on best assessment practices as we move toward implementing Strengthened Campus-Based Assessment
4. Specific Topics to Be Covered
- Clarification of how the process will work, with special emphasis on issues of concern raised by campuses
- Using nationally-normed measures and correlating a local measure to a nationally-normed measure: issues to consider and advantages/disadvantages
- Using scoring rubrics and standards: issues to consider and advantages/disadvantages
5. How the Process Will Work: The New Guidelines
- Patricia Francis, Assistant Provost for
University Assessment and Academic Initiatives
6. Strengthened Campus-Based Assessment: Major Implications
- One general education assessment process, overseen by GEAR
- Utilization of externally referenced measures for Basic Communication (Written), Critical Thinking (Reasoning), and Mathematics, effective Fall 2006
- Measure of campus academic environment
- Option of using a value-added approach
- Cost to be covered by System Administration (with sample size limitations consistent with existing GEAR guidelines)
7. GEAR's #1 Operating Principle
- Require as few changes as possible in campuses' existing general education assessment plans (and, therefore, minimal new information)
8. Campus Responses to Draft GEAR Guidelines
9. Funding
- System Administration will bear the cost of all three measurement options, based upon a sample size equal to at least 20% of total students enrolled in a learning outcomes area at the time of the assessment
- System Administration will also fund the administration of the NSSE, CCSSE, or other measure of academic environment
10. Mathematics Learning Outcomes
- For Strengthened Campus-Based Assessment, campuses will develop plans that focus on the new math outcomes approved by ACGE and the Provost
- These outcomes can be found in your registration packet
11. Mapping of Existing Nationally-Normed Measures to SUNY Learning Outcomes
- GEAR concluded there was inadequate mapping during Fall 2004
- In meetings between System Administration staff and testing company representatives, we emphasized the importance of adapting measures to meet SUNY's needs
12. Course-Embedded Assessment as an Assessment Strategy
- GEAR has always encouraged campuses to use
course-embedded assessment, and will continue to
do so (though campuses are certainly free to
propose and use alternative approaches)
13. Integrating New Campus Plans Into Existing Campus-Based Plans
- Campuses already have GEAR-approved plans, and much of what is included in those plans need not be changed
- In particular, campuses should feel free to adhere to their existing assessment schedule
- The major change: effective Fall 2006, campuses must use externally referenced measures as approved by GEAR to assess Writing, Critical Thinking, and Mathematics
14. Options 1 and 2: Using Nationally-Normed Measures and Correlating a Local Measure to a Nationally-Normed Measure
- Melanie Vainder, Professor of English and
Technical Communications, Farmingdale State
University
15. GEAR Research: Existing Nationally-Normed Measures
- ACT CAAP
- ACADEMIC PROFILE
- CALIFORNIA CRITICAL THINKING SKILLS TEST
- QUANT Q
- CRITICAL REASONING APPRAISAL
- GRE
- ACCUPLACER (Including WritePlacer)
16. Using Nationally-Normed Measures: Advantages
- Less labor intensive with respect to test development and scoring (particularly in the area of writing), and reliability of scoring is assured
- Provides opportunity for campuses to compare results with those obtained at peer institutions
- Reporting capacity provided by companies, allowing campuses to examine overall program effectiveness, success of individual courses, and the relationship between student variables and performance
17. Using Nationally-Normed Measures: Advantages (cont.)
- Relative ease of using a pre- and post-test approach in order to determine value added, if desired
- Ability for campuses to choose from among available modules in the areas of Writing, Mathematics, and Critical Thinking (i.e., it is not an all-or-nothing approach)
- Possibility of using measures in a course-embedded fashion, completed within a single class session
18. Using Nationally-Normed Measures: Disadvantages
- Problems with student motivation in stand-alone testing
- Existing measures do not map adequately to the SUNY Learning Outcomes for Writing, Mathematics, and Critical Thinking
- Existing measures do not yield separate sub-scores for each of the Learning Outcomes for Writing, Mathematics, and Critical Thinking
19. Correlating a Local Measure to a Nationally-Normed Measure: Issues to Consider
- Does the local measure directly assess student learning, and does it measure the learning outcome(s) it is intended to measure?
- Is it characterized by adequate inter-observer reliability?
- Has it been demonstrated to correlate statistically with a nationally-normed measure of the same learning outcome(s)? (A minimal correlation check is sketched after this list.)
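The statistical correlation referenced above would typically be documented with a simple paired-scores analysis from a pilot in which the same students complete both the local measure and the nationally-normed measure. The sketch below is illustrative only and is not part of the GEAR guidelines; the score scales and numbers are hypothetical.

```python
# Illustrative sketch: Pearson correlation between a local measure and a
# nationally-normed measure, using hypothetical paired scores from one pilot.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical pilot data: (local writing score, nationally-normed writing score)
local_scores = [72, 85, 64, 90, 78, 69, 88, 75]
national_scores = [61, 70, 55, 74, 66, 58, 72, 63]

print(f"r = {pearson_r(local_scores, national_scores):.2f}")
```

In practice, a campus would presumably also report the pilot sample size and the strength of the observed correlation, and rely on psychometric expertise to judge whether it adequately supports the equivalence claim.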
20. Correlating a Local Measure to a Nationally-Normed Measure: Advantages
- Closer alignment between locally-developed measures and the curriculum
- Local measure can be specifically developed to meet all SUNY Learning Outcomes
- Possibility that campuses may continue to use previously-used measures (and therefore be able to make direct comparisons between student performance on the same measure)
21. Correlating a Local Measure to a Nationally-Normed Measure: Disadvantages
- Duplicate testing will be needed at the outset to demonstrate correlations between local and nationally-normed measures
- Very time- and labor-intensive
- Student motivation factor
- Extensive psychometric expertise required with this approach
22. Using Scoring Rubrics and Standards
- Tina Good, Assistant Professor of English, Suffolk County Community College
23. Option 3: Using Scoring Rubrics and Standards
- Discipline-Specific Panels are working to create rubrics and standards for:
- Written Communication
- Mathematics
- Critical Thinking
- Process of rubric design will be transparent
- Drafts of rubrics will be posted online
- Minutes and membership are posted online
24. Using Scoring Rubrics and Standards: Options
- Use the actual rubrics and standards created by the Discipline-Specific Panels
- Show how your campus's rubrics correlate to the rubrics designed by the panels
- Mix and match
25. Using Scoring Rubrics and Standards: Advantages
- Provides an opportunity to re-submit already developed rubrics and demonstrate correlations with those designed by the panels
- Provides for faculty involvement in the creation of rubrics and standards for their own programs
- Allows for revision of rubrics as innovations, philosophies, and pedagogies evolve in the discipline
26. Using Scoring Rubrics and Standards: Advantages (cont.)
- Provides for faculty involvement in the assessment process (i.e., through application of the rubrics)
- Rubrics can be specifically developed to meet all SUNY Learning Outcomes
- Provides for collaboration on multiple levels throughout the assessment process
27. Using Scoring Rubrics and Standards: Disadvantages
- Assessment process is more cumbersome to implement than a nationally-normed measure
- The level of faculty involvement required could also be a disadvantage, especially for those programs that have few faculty available to serve on assessment committees
- Establishing the validity and reliability of the process can be time-consuming (a minimal inter-rater reliability check is sketched below)
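One common way to document the reliability mentioned in the last bullet is to have two faculty readers score the same set of student work with the shared rubric and compute an agreement statistic such as Cohen's kappa. The sketch below is illustrative only; the rubric levels and ratings are hypothetical and are not drawn from the Discipline-Specific Panels' work.

```python
# Illustrative sketch: Cohen's kappa for two raters applying the same rubric
# to the same set of papers, using hypothetical categorical ratings.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical rubric levels."""
    n = len(rater_a)
    # Observed proportion of exact agreement between the two raters
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric levels: "exceeds", "meets", "approaches", "does not meet"
rater_a = ["meets", "exceeds", "meets", "approaches", "meets", "does not meet"]
rater_b = ["meets", "meets", "meets", "approaches", "exceeds", "does not meet"]

print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")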
28. Implementing Strengthened Campus-Based Assessment: Resources
- The GEAR Group and Web site (www.cortland.edu/gear)
- SUNY System's Office of Academic Affairs
- Sister campuses: many best practices are already out there!
- Discipline-Specific Panels
- Other ideas?
29. SUNY GENERAL EDUCATION ASSESSMENT CONFERENCE
- Guidelines for and Implementation of Strengthened
Campus-Based Assessment