1 Assessment Review of Graduate Programs - Doctoral
- Duane K. Larick and Michael P. Carter
- North Carolina State University
- Council of Graduate Schools Pre-Meeting Workshop
- December 2006
2 Assessment and Review
- Outline of Presentation
- Why review/assess graduate programs
- A review process incorporating periodic external reviews and continuous program assessment
3 Marilyn J. Baker; revised and updated by Margaret King, Duane Larick, and Michael Carter, NC State University
4 Background Information About Our Audience
- How many of you are responsible for graduate program review at your institutions?
- How many of you have this as a new responsibility?
- How many of you have recently changed (or are considering changing) your procedure?
5 Why Review/Assess Graduate Programs?
- The primary purpose should be to improve the quality of graduate education on our campuses
- By creating a structured, scheduled opportunity for a program to be examined, program review provides a strategy for improvement that is well-reasoned, far-reaching, and as apolitical as possible
6 Why Review/Assess Graduate Programs?
- External Considerations
- To help satisfy calls for accountability
- Especially at the state level
- Requirement for regional accreditation, licensure, etc.
7 SACS Principles of Accreditation
- Core Requirement 5: The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.
8 Why Review/Assess Graduate Programs?
- Internal Considerations
- Meet long-term (strategic) College and Institutional goals
- Creation of new degree programs
- Elimination of existing programs
- Funding allocation/reallocation
- Advance understanding of graduate education and the factors influencing it
- Aids in identification of common programmatic needs
9 Why Review/Assess Graduate Programs?
- Internal Considerations
- Creates an opportunity to focus on key issues impacting graduate education
- Causes of retention/attrition among students and faculty
- Meet short-term (tactical) objectives or targets at the program level
- Documents achievements of faculty and students
- Indicates the degree to which program outcomes have been achieved
- Suggests areas for improvement
- Helps chart new programmatic directions
10 So the Questions We Need to Ask Ourselves Are
- What are we currently doing?
- Why are we currently doing it?
- Is what we are currently doing accomplishing the external goals described above?
- Is what we are currently doing accomplishing the internal goals described above?
- Is there a better way?
11 Graduate Program Review: A Two-Phase Process
- Periodic formal review of graduate programs (external review)
- Outcomes-based assessment (internal review that is a continuous and ongoing process)
12 Key Features of Formal Reviews
- Evaluative, not just descriptive
- Forward-looking: focus on improvement of the program, not just its current status
- Based on the program's academic strengths and weaknesses, not just its ability to attract funding
- Objective
- Independent; stands on its own
- Action-oriented: clear, concrete recommendations to be implemented
13 Questions Answered by Formal Review
- Is the program advancing the state of the discipline or profession?
- Is its teaching and training of students effective?
- Does it meet institutional goals?
- Does it respond to the profession's needs?
- How is it assessed by experts in the field?
14 Issues to Be Resolved Before Beginning
- Locus of control
- Graduate-only or comprehensive program review
- Counting (and paying) the costs
- Master's and doctoral programs
- Coordination with accreditation reviews
- Scheduling the reviews
- Multidisciplinary and interdisciplinary programs
15 Key Elements of a Successful Program Review
- Clear, Consistent Guidelines
- The purpose of graduate program review
- The process to be followed
- Guidelines for materials to be included in each phase
- A generic agenda for the review
- The use to which results will be put
16 Key Elements of a Successful Program Review
- Administrative Support
- Departmental resources: time, funding, secretarial help, etc.
- Central administrative support for the larger review process
- Adequate and accurate institutional data, consistent across programs
17 Key Elements of a Successful Program Review
- Program Self-Study
- Engage the program faculty in a thoughtful evaluation of
- The program's purpose(s)
- The program's effectiveness in achieving these purposes
- The program's overall quality
- The faculty's vision for the program
18 Key Elements of a Successful Program Review
- Surveys/Questionnaires
- Surveys of current students, faculty, alumni, and employers
- Factors to be considered
- Time and expense to develop, distribute, and collect responses
- Likely response rate
- Additional burden on respondents
- Uniqueness of information to be gained
19 Key Elements of a Successful Program Review
- Student Participation
- Complete confidential questionnaires
- Provide input into the self-study
- Be interviewed collectively and individually by the review team
- Serve on review teams and standing committees
20 Key Elements of a Successful Program Review
- Review Committee
- On-Campus Representation
- A representative of the Graduate School
- An internal reviewer from a field that gives him/her some understanding of the program(s) being reviewed
- External Reviewer(s)
- Number of reviewers depends on the scope and kind of review
- Selection process can vary; programs can have input but should not make the final decision
21 Key Elements of a Successful Program Review
- Final Report by Review Team
- Brief overview of program
- Strengths of program
- Areas for improvement
- Recommendations for improvement
22 Key Elements of a Successful Program Review
- Program Faculty's Response to Report
- Clear up errors or misunderstandings
- Respond to the recommendations (have implemented, will implement, will consider implementing, cannot implement and why)
23 Key Elements of a Successful Program Review
- Implementation
- One or more meetings of key administrators (department, college, graduate school, and university) to discuss recommendations
- An action plan or memorandum of understanding drawn up and agreed on by all participants
- Discussion of the recommendations with program faculty for implementation
- Integration of the action plan into the institution's long-range planning and budget process
24 Key Elements of a Successful Program Review
- Follow-Up
- An initial report on progress toward implementation of the action plan (1 or 2 years out)
- Follow-up reports until the action plan is implemented or priorities change
- Discussion of recommendations and implementation in the self-study for the next review
25 Questions Relative to External Program Review?
26 What is Outcomes-Based Assessment?
- It is a process that engages program faculty in asking three questions about their programs
- What are our expectations for the program?
- To what extent is our program meeting our expectations?
- How can we improve our program to better meet our expectations?
- It is a process that provides program faculty the means to answer these questions
- By creating objectives and outcomes for their program
- By gathering and analyzing data to determine how well the program is meeting the objectives and outcomes
- By applying the results of their assessment toward improving their program
27 What is Outcomes-Based Assessment? (continued)
- It entails a shift in emphasis from inputs to outcomes
- It is continuous rather than periodic
- It involves regular reports of program assessment to the institution
- Its results are used by the program and institution for gauging improvement and for planning
28 What is Outcomes-Based Assessment? (continued)
- Faculty generate program objectives and outcomes
- Faculty decide how outcomes will be assessed
- Faculty assess outcomes
- Faculty use assessment findings to identify ways
of improving their programs
29 Benefits of Outcomes Assessment
- It provides the groundwork for increased responsiveness and agility in meeting program needs
- It gives faculty a greater sense of ownership of their programs
- It provides stakeholders a clearer picture of the expectations of programs
- It helps institutions meet accreditation requirements
30 SACS Criterion for Accreditation
- Section 3, Comprehensive Standards, 16: The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.
31 Drive Toward Greater Accountability on Our Campus
- Professional accreditation agencies (e.g., engineering, social work, business)
- Undergraduate assessment
- Assessment of general education
32 Outcomes Assessment: A Process
- Phase I: Identifying Objectives and Outcomes
- Phase II: Creating Assessment Plans
- Phase III: Implementing Assessment Plans
- Phase IV: Reporting Assessment Results
33 A Procedure for Implementing Outcomes Assessment
- Identify pilot programs to create assessment materials for each phase
- Use pilot materials as a basis for DGP workshops for each phase
- Offer individual support to DGPs as they create materials and assess programs
- Create online tools to aid DGPs
34 Phase I: Identifying Objectives and Outcomes
35 What Are Objectives?
Program objectives are the general goals that define what it means to be an effective program.
36 Three Common Objectives
- Developing students as successful professionals in the field
- Developing students as effective researchers in the field
- Maintaining/enhancing the overall quality of the program
37 What Are Outcomes?
Program outcomes are specific faculty expectations for each objective that define what the program needs to achieve in order to meet the objectives.
38 Example for Objective 1: Professional Development
- 1. To enable students to develop as successful professionals for highly competitive positions in industry, government, and academic departments, the program aims to provide a variety of experiences that help students to
- a. achieve the highest level of expertise in XXXX, mastery of the knowledge in their fields, and the ability to apply associated technologies to novel and emerging problems
- b. present research to local, regional, national, and international audiences through publications in professional journals and conference papers given in a range of venues, from graduate seminars to professional meetings
- c. participate in professional organizations, becoming members and attending meetings
- d. broaden their professional foundations through activities such as teaching, internships, fellowships, and grant applications
39 Example for Objective 2: Effective Researchers
- 2. To prepare students to conduct research effectively in XXXX in a collaborative environment, the program aims to offer a variety of educational experiences that are designed to develop in students the ability to
- a. read and review the literature in an area of study in a way that reveals a comprehensive understanding of the literature
- b. identify research questions/problems that are pertinent to a field of study and provide a focus for making a significant contribution to the field
- c. gather, organize, analyze, and report data using a conceptual framework appropriate to the research question and the field of study
- d. interpret research results in a way that adds to the understanding of the field of study and relates the findings to teaching and learning in science
- Etc.
40 Example for Objective 3: Quality of Program
- 3. To maintain and improve the program's leadership position nationally and internationally, the program aims to
- a. continue to be nationally competitive by attracting high-quality students
- b. provide effective mentoring that encourages students to graduate in a timely manner
- c. place graduates in positions in industry and academia
- d. maintain a nationally recognized faculty that is large enough and appropriately distributed across XXXX disciplines to offer students a wide range of fields of expertise
41 Phase II: Creating Assessment Plans
42 Four Questions for Creating an Assessment Plan
- What types of data should we gather for assessing outcomes?
- What are the sources of the data?
- How often are the data to be collected?
- When do we analyze and report the data?
43 Types of Data Used
- 1. Take advantage of what you are already doing
- Preliminary exams
- Proposals
- Theses and dissertations
- Defenses
- Student progress reports
- Student course evaluations
- Faculty activity reports
- Student exit interviews
44 Types of Data Used
- 2. Use resources of the Graduate School and institutional analysis unit
- Enrollment statistics
- Time-to-degree statistics
- Student exit data
- Ten-year profile reports
- Alumni surveys
45 Types of Data Used
- 3. Use your imagination to find other types of data
- Dollar amount of support for faculty
- Student activity reports
- Faculty surveys
46 Data: Two Standards to Use in Identifying Data
- Meaningful: Data should provide information that is suitable for assessing the outcome
- Manageable: Data should be reasonable to obtain (time, effort, ability, availability, resources)
47 Four Questions for Creating an Assessment Plan
- What data should we gather for assessing outcomes?
- What are the sources of the data?
- How often are the data to be collected?
- When do we analyze and report the data?
48 Sources of Data
- Students
- Faculty
- Graduate School
- Graduate Program Directors
- Department Heads
- Registration and Records
- Advisory Boards
- University Planning and Analysis
49 Four Questions for Creating an Assessment Plan
- What data should we gather for assessing outcomes?
- What are the sources of the data?
- How often are the data to be collected?
- When do we analyze and report the data?
50 Frequency of Data Collection
- Every semester
- Annually
- Biennially
- When available from individual graduate students
- At the preliminary exam
- At the defense
- At graduation
51 Four Questions for Creating an Assessment Plan
- What data should we gather for assessing outcomes?
- What are the sources of the data?
- How often are the data to be collected?
- When do we analyze the data?
52 Creating a Timeline for Analyzing Assessment Data
- By objective: year 1, objective 1; year 2, objective 2; year 3, objective 3; year 4, objective 1; etc. (3-year cycle)
- More pressing outcomes earlier and less pressing ones later
- Outcomes that are easier to assess earlier, and outcomes requiring more complex data gathering and analysis later
- Approximately the same workload each year of the assessment cycle
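The objective-per-year rotation described above is simple modular arithmetic. A minimal sketch of how such a schedule could be computed (the objective names are shortened from the deck's three common objectives; the function name and cycle-start convention are assumptions):

```python
# Rotate through objectives on a 3-year cycle, as in the example:
# year 1 -> objective 1, year 2 -> objective 2, year 3 -> objective 3,
# year 4 -> objective 1 again, etc.
OBJECTIVES = [
    "1. Professional development",
    "2. Effective researchers",
    "3. Quality of program",
]

def objective_for_year(year, cycle_start=1):
    """Return the objective scheduled for assessment in a given year.

    Year `cycle_start` maps to objective 1, the next year to objective 2,
    and so on, wrapping every len(OBJECTIVES) years.
    """
    return OBJECTIVES[(year - cycle_start) % len(OBJECTIVES)]

# Reproduce the deck's example schedule for years 1-4:
for year in range(1, 5):
    print(f"year {year}: {objective_for_year(year)}")
```

The same helper works for the other timeline strategies on this slide; only the ordering of the `OBJECTIVES` list changes.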
53 Four Questions for Creating an Assessment Plan
- What data should we gather for assessing outcomes?
- What are the sources of the data?
- How often are the data to be collected?
- When do we analyze and report the data?
54 Assessment Plan
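An assessment plan of the kind this slide presents can be pictured as a small table with one row per outcome, answering the four planning questions. A hypothetical sketch (the outcome labels, sources, and frequencies are illustrative placeholders, not an actual NC State plan):

```python
from dataclasses import dataclass

@dataclass
class PlanRow:
    outcome: str        # which outcome is being assessed
    data_types: list    # what data to gather
    sources: list       # where the data come from
    frequency: str      # how often data are collected
    analysis_year: int  # which year of the assessment cycle the data are analyzed

plan = [
    PlanRow("1b. Present research to professional audiences",
            ["student activity reports/CVs"], ["students"], "annually", 1),
    PlanRow("2a. Review literature comprehensively",
            ["prelim rubric scores"], ["faculty committees"],
            "at the preliminary exam", 2),
    PlanRow("3b. Timely graduation",
            ["time-to-degree statistics"], ["Graduate School"], "annually", 3),
]

def due_in_year(plan, year):
    """Outcomes scheduled for analysis in a given cycle year."""
    return [row.outcome for row in plan if row.analysis_year == year]

print(due_in_year(plan, 2))
```

Storing the plan in a structured form like this makes the later phases (implementing and reporting) a matter of filtering rows by cycle year.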
55 Phase III: Implementing Assessment Plans
- Collecting, Analyzing, and Evaluating Data and Improving the Program
56 Collecting Data
Goal: To have the data readily accessible when it is time to analyze them.
57 Typical Modes of Data Collection
- Rubrics for prelims and defenses
- Student Activity Reports/CVs
- Statistics provided by Graduate School
- Faculty Activity Reports
- Student exit surveys or interviews
58 Suggestions for Collecting Data
- Identify the kinds of data you need to collect, who is responsible for collecting them, and when they are to be collected.
- Determine where the data are to be stored, and check periodically to be sure the data are up to date.
- Make data collection and storage as much a departmental routine as possible.
59 Analyzing Data
Goal: To put data into a form that will allow faculty to use them to evaluate the program.
60 Spreadsheet for Rubrics for Prelims and Defenses
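A spreadsheet of rubric scores like the one this slide shows can be summarized in a few lines of code. A minimal sketch, assuming a 1-5 rubric scale and three illustrative outcome criteria (none of these names or numbers come from the deck):

```python
from statistics import mean

# Each row: one student's prelim/defense rubric scores, keyed by outcome.
rubric_rows = [
    {"literature review": 4, "research question": 5, "data analysis": 3},
    {"literature review": 5, "research question": 4, "data analysis": 4},
    {"literature review": 3, "research question": 4, "data analysis": 5},
]

def summarize(rows):
    """Average each outcome's scores across all scored students."""
    return {outcome: round(mean(row[outcome] for row in rows), 2)
            for outcome in rows[0]}

print(summarize(rubric_rows))
# {'literature review': 4.0, 'research question': 4.33, 'data analysis': 4.0}
```

Per-outcome averages like these are the kind of summary faculty can then evaluate against their expectations (the next step in this phase).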
61 Graphs from Graduate School Statistics
62 Evaluating Data
Goal: To use the data to judge the extent to which the program is meeting faculty expectations.
63 Suggestions for Evaluating Data
- In most cases, the primary criterion for evaluation is faculty expectations. Allow faculty to discuss their expectations as a way of defining criteria for evaluation.
- Guide faculty discussion by asking them to identify strengths of the program and areas of concern.
- Evaluation is typically a judgment call; encourage faculty to trust their judgments.
64 Making Decisions for Improving the Program
Goal: To apply what has been learned in evaluating the data to identifying actions that address areas of concern.
65 Suggestions for Making Decisions for Improving Programs
- Lead faculty in brainstorming; try to elicit multiple suggestions for actions.
- All suggestions should be evaluated for feasibility and validity (do they offer a good chance of affecting the area of concern?).
- It's OK to conclude that change is not yet warranted and more data need to be collected.
- Also encourage faculty to address the need for changes in assessment procedures.
66 Phase IV: Reporting Assessment Results
67 Reporting Assessment Results
Goal: To submit a report every two years in which you summarize your assessment process and findings.
68 Creating a Timeline for Reporting Assessment Data
- Standard practice appears to call for an annual or biennial assessment report
- Longer cycles undercut the continuous and ongoing nature of assessment
- When possible, correlate with a pre-existing external review program
69 Two Purposes of Assessment Reports
- Primary: To maintain a record of assessment and improvements for you and subsequent DGPs, to be used for self-studies, accreditation agencies, boards of advisors, etc.
- Secondary: To provide evidence of a process of accountability for the university.
70 Questions to Guide Reports
- 1. What outcomes were you scheduled to assess during the present biennial reporting period? What outcomes did you assess?
- 2. What data did you collect? Summarize your findings for these data.
- 3. What did you and your faculty learn about your program and/or your students from the analysis of the data? What areas of concern have emerged?
71 Questions to Guide Reports
- 4. As a result of your assessment, what changes, if any, have you and your faculty implemented or considered implementing to address areas of concern?
- 5. What outcomes are you planning to assess for the upcoming biennial reporting period?
73 What We Have Learned
- The process of change takes time
- Communication is the key to success
- It is important to pilot assessment processes before taking them to all graduate programs
74 What We Have Learned (continued)
- This kind of review process must be ground-up (faculty), not top-down (administration)
- This kind of review process requires significant human resources
- Training, data collection, analysis, interpretation, etc.
- A key to our success is how much of this can be institutionalized
75 Managerial Tools Created for Program Review - Website
76 Assessment and Review - Connecting the Two
- Both must be owned by the faculty
- The self-study required for formal program review must have input from the entire faculty
- The resulting action plan must also be agreed on by the faculty in the program
- The objectives, outcomes, and assessment plan for outcomes-based assessment must have buy-in and participation from all faculty
77 Assessment and Review - Connecting the Two
- Continuous and ongoing review should inform and enhance formal program review
- The formal review self-study should include a summary of the assessment findings and changes implemented
- Ideally, these incremental improvements will have resulted in a stronger program and fewer surprises at the time of the formal review
78 Assessment and Review - Connecting the Two
- The formal review process may suggest additional or revised program outcomes and assessment measures
- The formal review self-study should include an outline of the program outcomes and assessment plan for reviewer comment
80-82 Managerial Tools Created for Program Review - Website
83-85 Managerial Tools Created for Program Review - Review Document Management