Title: Assessment Techniques for Curricular Improvement
1. Assessment Techniques for Curricular Improvement
- Roxanne Canosa, Rajendra K. Raj
- Department of Computer Science
- Rochester Institute of Technology
2. Overview
- What is Assessment?
- Analytic vs. Holistic Approaches
- Assessment vs. Grading?
- Terminology
- Assessment vs. Accreditation
- Outcomes vs. Objectives
- Performance Criteria
- Direct vs. Indirect
- Evaluation and Continuous Improvement
3. What is Assessment?
- Assessment is one or more processes that identify, collect, and prepare data to evaluate the achievement of program outcomes and educational objectives
  - 2006-2007 Criteria for Accrediting Computing Programs, Appendix A (Proposed Changes)
  - From Section II.D.1 of the ABET Accreditation Policy and Procedure Manual
4. Analytic vs. Holistic Approaches
- Analytic approach
  - All students/courses analyzed to diagnose areas in need of improvement
- Holistic approach
  - Focus on overall performance of the program
  - Input from employers, alumni, advisory board
- Develop efficient and effective processes
  - Lean, mean assessment machine
- "Don't commit random acts of assessment" - Gloria Rogers
5. What is Your Assessment Goal?
- Assessing all students or specific groups of students?
- Assessing students, department, or program?
- Assessing for short-term improvement or long-term effect?
- Assessing for formative or summative purposes?
6. Grading vs. Assessing
- Grading
  - Measures extent to which a student meets faculty requirements and expectations for a course
  - Can we infer students' achievement of an outcome from grades?
  - Factors
    - Student knowledge
    - Work ethic
    - Faculty variance in course content, grading components, beliefs, bias, ...
- Assessing
  - Measures extent to which a student achieves each course (program) outcome
  - Can we leverage grading components for assessment?
  - Use rubrics, which are pre-announced performance criteria
7. Assessment vs. Accreditation
- Institutional accreditors (Middle States, SACS, etc.) are increasingly requiring direct assessment of program objectives and outcomes
- Jargon may be different, but the essential ideas are the same
8. Terminology (Jargon)
From ABET perspective
9. Terminology Lessons
- Use terminology for your situation
  - Sometimes dictated by institutional accreditation (SACS, Middle States)
  - Sometimes dictated by program accreditation (ABET)
- Keep a glossary of terms handy for any external evaluators
- Stick to your terminology
  - Terms cannot be swapped without causing too much grief
10. Proposed Changes to ABET Criteria for Computing
- Old criteria
  - Intents and Standards
- New criteria (2008-2009 cycle)
  - General
  - Program Specific
11. New ABET Criteria
- 8 General Criteria
  - Students
  - Program Educational Objectives
  - Program Outcomes (a) through (i)
  - Assessment and Evaluation
  - Curriculum
  - Faculty
  - Facilities
  - Support
- CS Program Specific Criteria
  - Outcomes and Assessment (a) and (b)
  - Faculty Qualifications
  - Curriculum (a), (b), and (c)
- IT/IS Program Specific Criteria
12. Program Audit Concern
- Concern
  - A criterion is currently satisfied; however, the potential exists for this situation to change in the near future such that the criterion may not be satisfied. Positive action is required to ensure full compliance with the Criteria.
13. Program Audit Weakness
- Weakness
  - A criterion is currently satisfied but lacks strength of compliance that assures that the quality of the program will not be compromised prior to the next general review. Remedial action is required to strengthen compliance with the Criteria.
14. Program Audit Deficiency
- Deficiency
  - A criterion is not satisfied. Therefore, the program is not in compliance with the Criteria and immediate action is required.
15. Program Objectives
- Program educational objectives are broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve.
- Long-term goals
- Should be distinct to your program
- Should be publicly available
- Must be measurable!
16. Program Outcomes
- Program outcomes are narrower statements that describe what students are expected to know and be able to do by the time of graduation. These relate to the skills, knowledge, and behaviors that students acquire in their matriculation through the program.
- Should be publicly available
- Must be measurable!
17. Objectives vs. Outcomes
- Example objective
  - Graduates will exhibit effective communication skills
- Example outcomes
  - By the time of graduation, students will
    - demonstrate effective written communication skills
    - demonstrate effective oral communication skills
- Examples from Gloria Rogers
18. Performance Criteria
- Define and describe progression toward meeting important components of work being completed, critiqued, or assessed
  - Student provides adequate detail to support his/her solution/argument
  - Student uses language and appropriate word choice for the audience
  - Student work demonstrates an organizational pattern that is logical and conveys completeness
  - Student uses the rules of standard English
- Provide solid evidence of progression (a rubric sketch follows this slide)
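One way to make performance criteria like these operational is to encode the rubric as data, so scores aggregate per criterion rather than per student. The Python sketch below is a minimal illustration: the criterion wording comes from this slide, but the 1-4 scale, function names, and sample scores are assumptions for illustration only.

```python
# Minimal rubric-as-data sketch (hypothetical 1-4 scale and fabricated scores).
CRITERIA = [
    "Adequate detail supports the solution/argument",
    "Language and word choice fit the audience",
    "Organization is logical and conveys completeness",
    "Follows the rules of standard English",
]

SCALE = {1: "Unsatisfactory", 2: "Developing", 3: "Satisfactory", 4: "Exemplary"}

def average_by_criterion(scores):
    """Average each criterion's score across all scored student artifacts."""
    return {c: sum(s[c] for s in scores) / len(scores) for c in CRITERIA}

if __name__ == "__main__":
    # Illustrative scores for three student reports (fabricated for the sketch)
    scores = [
        {CRITERIA[0]: 3, CRITERIA[1]: 4, CRITERIA[2]: 2, CRITERIA[3]: 3},
        {CRITERIA[0]: 2, CRITERIA[1]: 3, CRITERIA[2]: 2, CRITERIA[3]: 4},
        {CRITERIA[0]: 4, CRITERIA[1]: 3, CRITERIA[2]: 3, CRITERIA[3]: 3},
    ]
    for criterion, avg in average_by_criterion(scores).items():
        print(f"{avg:.2f} ({SCALE[round(avg)]})  {criterion}")
```

A low per-criterion average (here, organization) points at a specific area in need of improvement, which is exactly the diagnostic use the analytic approach calls for.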
19. What is Solid Evidence?
- Direct Evidence
  - Easier to measure
  - Familiar to most faculty: exam or project grades, presentation skills, etc.
- Indirect Evidence
  - Difficult to measure
  - Attitudes or perceptions
  - For example, a desired outcome of a course may include improving students' appreciation of teamwork
20. Direct vs. Indirect Assessment
- The assessment process should include both indirect and direct measurement techniques
- A variety of sources should be used
  - Employers, students, alumni, etc.
- Converging evidence from multiple sources can reduce the effect of any inherent bias in the data (a triangulation sketch follows this slide)
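As a concrete illustration of converging evidence, the sketch below compares a direct and an indirect estimate of achievement for each outcome and flags large gaps for closer inspection. All labels, numbers, and the 0.2 threshold are hypothetical.

```python
# Triangulation sketch: flag outcomes where direct and indirect evidence
# disagree (all data and the threshold are fabricated for illustration).
direct = {"PO-a": 0.80, "PO-b": 0.55, "PO-f": 0.70}    # e.g., rubric-scored exams
indirect = {"PO-a": 0.75, "PO-b": 0.85, "PO-f": 0.65}  # e.g., alumni survey

THRESHOLD = 0.2  # hypothetical disagreement cutoff

for po in direct:
    gap = abs(direct[po] - indirect[po])
    note = "  <-- sources disagree; check for bias" if gap > THRESHOLD else ""
    print(f"{po}: direct={direct[po]:.2f} indirect={indirect[po]:.2f}{note}")
```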
21. Direct Assessment
- Direct examination or observation of student knowledge or skills using stated, measurable outcomes
- Faculty typically assess student learning throughout a course using exams/quizzes, demonstrations, and reports
- Sample what students know or can do
- Provide evidence of student learning
22. Direct Assessment of PEOs
- Employment statistics
- Promotions and career advancement of graduates
- Job titles, advanced degrees earned, additional course work taken after graduation, etc.
- PEOs must be assessed separately from POs
23. Direct Assessment of POs
- Common final exams
- Locally developed exit exams
- Standardized regional or national exit exams
- External examiner
- Co-op reports from employers
- Portfolios of student work
24. Indirect Assessment
- Indirect assessment of student learning ascertains the perceived extent or value of learning experiences
- Assess opinions or thoughts about student knowledge or skills
- Provides information about student perception of their learning and how this learning is valued by different constituencies
25. Indirect Assessment Measures
- Exit and other kinds of interviews
- Archival data
- Focus groups
- Written surveys and questionnaires
- Industrial advisory boards
- Employers
- Job fair recruiters
- Faculty at other schools
26. Survey of Assessment Methods
27. Direct and Indirect
- Some instruments have a dual nature, e.g., an exit interview
- Indirect
  - Survey of opinions about the perceived value of the program components
- Direct
  - If the person asking the questions uses the interview as a way of assessing students' skills (e.g., oral communication), then it is being used as a direct measure of the achievement of that outcome
28. Evaluation
- Evaluation is one or more processes for interpreting the data and evidence accumulated through assessment practices. Evaluation determines the extent to which program outcomes or program educational objectives are being achieved, and results in decisions and actions to improve the program.
29. Continuous Improvement
- Accreditation boards are moving towards outcomes-based assessment of CS, IS, and IT programs
- Programs must have an established outcomes-based assessment plan in place (or at least be making progress in that direction)
- Process must be documented
- Process must show continuous improvement (both quantitatively and qualitatively)
30. Faculty Responsibility
- All faculty must have a commitment to and be directly involved in the evaluation of program educational objectives and program outcomes, as well as the process for continuous improvement of the program
31. Need for Faculty and Staff Buy-In
- What makes most academics tick?
- Rewards
- Money?
- Fun?
- Appreciation?
- Recognition?
- How to encourage involvement?
- We all resent any extra work!
32. Where to Begin?
- Define your Mission Statement
- Define your Program Educational Objectives (PEOs)
- Define your Program Outcomes (POs)
- Define Course Outcomes (COs)
  - Include specific course outcomes on each course syllabus
- Make publicly available
33. Then What?
- Show how course outcomes map to program outcomes (see the mapping sketch after this list)
- Show how program outcomes map to program educational objectives
- Choose measurement tools, both direct and indirect
- Collect data
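Encoding the outcome mappings as data makes coverage checkable. The Python sketch below is a minimal illustration; the course names, outcome labels, and helper function are hypothetical, not from the slides.

```python
# Course-outcome-to-program-outcome mapping sketch (hypothetical labels).
# Each course outcome lists the program outcomes (POs) it supports.
CO_TO_PO = {
    "CS1.CO1: write small programs from a specification": ["PO-a"],
    "CS1.CO2: document code clearly": ["PO-a", "PO-f"],
    "SE1.CO1: present a design to a technical audience": ["PO-f"],
    "SE1.CO2: work effectively on a project team": ["PO-d"],
}

PROGRAM_OUTCOMES = ["PO-a", "PO-d", "PO-f", "PO-i"]

def uncovered_outcomes():
    """Return program outcomes that no course outcome maps to."""
    covered = {po for pos in CO_TO_PO.values() for po in pos}
    return [po for po in PROGRAM_OUTCOMES if po not in covered]

if __name__ == "__main__":
    missing = uncovered_outcomes()
    if missing:
        print("No course outcome maps to:", ", ".join(missing))  # here: PO-i
    else:
        print("Every program outcome is covered by at least one course outcome")
```

The same shape of table works one level up, mapping program outcomes to program educational objectives.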
34. Finally
- Present data to faculty in an easily digestible form (a charting sketch follows this slide)
  - Charts, graphs, tables, etc.
- Faculty evaluate the data
  - Are students actually learning the material that the faculty believe (and claim) they are learning?
- Faculty make recommendations for improvement as necessary
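One digestible form is a per-outcome bar chart against a target line. The sketch below assumes matplotlib; the outcome labels, averages, and the 3.0 target are illustrative only.

```python
# Chart sketch: per-outcome average rubric scores vs. a target (fabricated data).
import matplotlib.pyplot as plt

outcome_averages = {"PO-a": 3.2, "PO-b": 2.6, "PO-d": 3.5, "PO-f": 2.9}
TARGET = 3.0  # hypothetical performance target on a 1-4 rubric scale

fig, ax = plt.subplots()
ax.bar(list(outcome_averages), list(outcome_averages.values()))
ax.axhline(TARGET, linestyle="--", label=f"Target ({TARGET})")
ax.set_ylabel("Average rubric score (1-4)")
ax.set_title("Program outcome achievement")
ax.legend()
fig.savefig("outcome_achievement.png")  # hand this to faculty for evaluation
```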
35. The Big Picture
- [Flowchart: the Mission Statement and input from stakeholders (students, alumni, employers, faculty, ...) drive Program Objectives, Program Outcomes, Course Outcomes, and Performance Criteria; these shape Educational Practices/Strategies, which are assessed (collect and analyze evidence), evaluated (interpret evidence, take action), and revised in a continuous loop]
36. The Big Picture
- Show relationship between mission statement, objectives, and outcomes
- Assess and evaluate objectives and outcomes independently
- Map program outcomes to program objectives
- Map course outcomes to program outcomes
- Identify weaknesses and implement focused improvements in targeted areas
37. Issues
- All assessment methods have their limitations and contain some bias
- Meaningful analysis requires both direct and indirect measures from a variety of sources
  - Students, alumni, faculty, employers, etc.
- Multiple assessment methods provide converging evidence of student learning
38. Assessment Lessons
- Cannot do everything at once
  - Try an approach for the first round; learn and refine
- Having data isn't all there is to it!
  - Easy to generate lots of bad data
- One size fits all? NOT!
  - Programs, courses, instructors all differ
- Be ready to compromise
  - Perfection is neither possible nor desirable
- Faculty evaluation and promotion
  - Do not tie to data generated from assessment
39. Resources
- http://www.cs.rit.edu/rlc/Assessment/
- http://www.abet.org/assessment.shtml