Title: Assessing Developmental English
1. Assessing Developmental English
- Lucia Lachmayr, Ariel Vigo, Karen Wong
- Strengthening Student Success Conference
- San Jose, CA
- October 3-5, 2007
2. Workshop SLOs
- Describe the Skyline College English Department's approach to assessing developmental English.
- Identify the problems we encountered to anticipate problems that may come up in your own assessment process.
- Determine what you may want to adapt for your own course-level assessment.
3. QUESTIONS
- How can we keep this process manageable yet still worthwhile?
- How can we integrate assessment into our daily classroom practices?
- How can we balance academic freedom with requiring faculty to be consistent in their curriculum?
- How can we convince our colleagues to get involved?
- How can we improve teaching and learning through assessment?
4. Assessment Plan Guiding Principles
- Triangulation: three different assignments or activities to gather data on student achievement.
5. Assessment Plan Guiding Principles
- Direct and Indirect Measures
- Direct Assessment requires students to display their knowledge or skills (essays, exams, homework, etc.).
- Indirect Assessment requires students (or others) to reflect on their learning (number of student hours spent on homework or in conference, opinions and perceptions from surveys, retention and success data from the department).
6. Assessment Plan Guiding Principles
- Formative and Summative Measures
- Formative Assessment generates feedback for improvement in working towards a final performance (in-class assignments, outlines, discussion, etc.).
- Summative Assessment is a final determination of knowledge or skills: any evaluation that is not created to provide feedback for improvement but is used only for final judgments (final exams or essays).
7. Assessment Plan Guiding Principles
- Quantitative vs. Qualitative
- Assessment does not always have to be quantitative (numerical scales or rubrics); sometimes performances or narratives may be better expressions of student learning.
8. Assessment Plan Implementation: Lessons Learned
9. Identifying the Gaps in Assessment
- Example: In attempting to come up with valuable forms of indirect assessment, we realized we had never surveyed students.
10. Importance of Piloting
- Running through the process on a smaller scale allows us to troubleshoot, revise, and see what is manageable before we begin assessing on a larger scale.
11. Simplifying the Process
- In an effort to simplify the process and cut back on the labor involved, we have attempted to use one source of data to examine multiple SLOs where possible.
12. Purpose of the Survey
- Assess all of our SLOs.
- Provide an indirect assessment measure.
- Generate data for our program as a whole and at each level.
- Enable students to assess their learning.
- Provide data for Program Review.
13. Administering the Survey
- Articulate the survey statements in language that is clear and easily accessible to students.
- Administer to at least 33% of the total number of students taking a core composition course, and at least 33% of each level, with at least 100 respondents.
- Work closely with the Institutional Research Office.
- Establish a benchmark that 70% of respondents will agree or strongly agree with the statements (a sketch of this check appears below).
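To make the sampling and benchmark criteria concrete, here is a minimal sketch of how the checks might be run on survey counts. It is not part of the original workshop materials; the function names and the example numbers are hypothetical illustrations, assuming the 33% sample, 100-respondent minimum, and 70% agree/strongly-agree benchmark described above.

```python
# Hypothetical sketch: checking the survey sampling and benchmark criteria.
# Function names and numbers are illustrative, not actual survey data.

def meets_sampling_criteria(respondents: int, enrolled: int,
                            min_fraction: float = 0.33,
                            min_respondents: int = 100) -> bool:
    """True if at least 33% of enrolled students responded and
    the total number of respondents is at least 100."""
    return respondents >= min_respondents and respondents / enrolled >= min_fraction

def meets_benchmark(agree: int, strongly_agree: int, total: int,
                    benchmark: float = 0.70) -> bool:
    """True if at least 70% of respondents agree or strongly agree."""
    return (agree + strongly_agree) / total >= benchmark

# Illustrative numbers only (e.g., for a survey item such as Q11).
enrolled, respondents = 600, 210
agree, strongly_agree = 95, 62

print(meets_sampling_criteria(respondents, enrolled))       # True: 210 >= 100 and 210/600 = 35%
print(meets_benchmark(agree, strongly_agree, respondents))  # True: (95 + 62)/210 is about 74.8%
```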
14. What is your current education goal?
15. Overall essay unity/thesis: Write focused, coherent, well-developed, largely text-based essays appropriate to the developmental level, organized into effective paragraphs with major and minor supporting details, which support a clear thesis statement, and demonstrate competence in standard English grammar and usage.
16. Critical reading/writing/thinking: Demonstrate critical reading, writing, and thinking skills through analysis, synthesis, and evaluation of important ideas from multiple points of view.
17. Critical reading/writing/thinking: Apply basic research and documentation skills.
18. Metacognition: Perceive themselves as improved writers and thinkers engaging in academic discourse in cross-disciplinary contexts.
19. Q11: "I am better able to support my opinions with evidence as a result of this class."
20. Assessment Plan: Student Essays
- Direct measure
- Summative assessment
- Collection of text-based essays from one of the last two persuasive essays of the semester
- Benchmark criteria: 70% will achieve a score of 2 or higher
21. Purpose of the Rubric
- Assess the first three of our SLOs.
- Generate data for our program as a whole and at each level.
- Enable faculty to gauge, as well as reflect on, their assessments.
- Allow for more consistency in assessment throughout the department.
- Create a set of criteria that is transparent to students.
- Provide data for Program Review.
22. Holistic Scoring and Rubrics
- Initial qualms from faculty:
- Feared scrutiny (either as contributors or as scorers of the essays).
- Wondered if the scoring of their essays would be a source of critique of their classes.
- Brought up issues of academic freedom.
23. Challenges of the Rubric
- Whether or not to weave in new ideas such as voice.
- Whether critical thinking was adequately addressed.
- Making the rubric accessible to students.
- Difficulties in using the rubric with particular prompts (e.g., a fact-based research essay, overly rhetorically challenging assignments).
24. Methods for Holistic Scoring
- Used a 4-point Likert scale in the rubric: 4 = excellent, 3 = good, 2 = adequate, 1 = not passing (see the scoring sketch after this list).
- Normed the data at the ends of the scale first, the most successful essays (4s) and the not-passing essays (1s), to create a clear distinction between high and low scores.
- Assessment subcommittee chose sample 1s, 2s, 3s, and 4s.
- Normed the sample essays with faculty.
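As an aside on mechanics, here is a minimal sketch of how scores from a 4-point rubric like this one might be tallied against the plan's 70% benchmark. It is not from the original presentation; the helper name `summarize_scores` and the listed scores are hypothetical.

```python
from collections import Counter

def summarize_scores(scores, passing=2, benchmark=0.70):
    """Tally 4-point rubric scores (1-4), report the distribution and mean,
    and check whether the share of essays at or above `passing` meets the
    benchmark. A hypothetical helper, not from the actual assessment plan."""
    counts = Counter(scores)
    mean = sum(scores) / len(scores)
    pass_rate = sum(1 for s in scores if s >= passing) / len(scores)
    return {
        "distribution": dict(sorted(counts.items())),
        "mean": round(mean, 2),
        "pass_rate": round(pass_rate, 2),
        "meets_benchmark": pass_rate >= benchmark,
    }

# Illustrative scores for 30 essays (invented, not actual data).
sample_scores = [4, 3, 3, 2, 2, 3, 4, 2, 1, 3, 2, 3, 4, 2, 3,
                 2, 1, 3, 3, 2, 4, 3, 2, 3, 2, 3, 1, 2, 3, 4]
print(summarize_scores(sample_scores))
```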
25. Problems with Scoring
- Difficulty in differentiating between 2s and 3s (in committee and as an entire faculty).
- Discrepancy between the scores in the middle.
- Instructors' urge to add pluses and minuses to the scores skewed results slightly.
- Some of the research essays did not lend themselves well to the rubric and had to be taken out of the sample.
- Norming was supposed to precede scoring, but we ended up solely norming, so we had to divide up scoring amongst fewer faculty afterwards.
26. Data
- The first round of norming had general consensus; the second round fell apart because essays deviated too much from the assessment plan criteria.
- We had approximately 30 essays left after norming, to be scored by 10 different faculty members.
- These remaining essays had a generally much higher pass rate (an average score of approximately 2.7).
- This preliminary data suggests that, given appropriate assignments, the rubric can be used as a reliable means by which to determine success in achieving the SLOs for the English 836 writing course.
27. Comparing Data Scores
28. High Scores vs. Low Scores
29. Results of Data
- Most students scored above a 2.
- The average median score given was a 2.
- The student success rate varied depending on the assignment.
- Approximately half of the assignments contributed were not appropriate to the level.
- Rates of success were higher (an average score of 2.7) when the students were given an assignment appropriate to the level.
- This raises the question of whether students are not passing their courses due to assignments inappropriate to the level.
30. Lessons Learned
- Finding sufficient samples from a variety of instructors was difficult, though select faculty were kind enough to contribute.
- Problems with essay prompts and readings that are not appropriate to the level.
- Took out several of the essays that were derived from an inappropriate prompt (e.g., one was overly rhetorically challenging, one was a research paper with insufficient analysis).
- We were left with two sources and thus did not have a comparative set.
- The importance of piloting the essays first, so we can work out incongruities on a smaller scale.
31. Assessment Plan Wrap-Up
- Ties back in to support SLOAC purposes.
- Increases faculty dialogue in the area of assessment.
- Helps to better align curricula within the department.
- Ultimately, helps students to more successfully navigate each subsequent level in their English courses at Skyline.
32. Future Plans
- Expand the assessment to include more faculty with a wider variety of essay assignments so that we can get statistically significant data.
- Perhaps use random essays from all sections to have a greater range of faculty assignments without actually assessing all essays within all courses.
- Begin the cycle to include all levels of core transfer and pre-transfer English courses in assessment.
- Eventually involve all core courses from across the disciplines.