1
Assessing the Mission of Doctoral Research
Universities
  • J. Joseph Hoey, Georgia Tech
  • Lorne Kuffel, College of William and Mary
  • North Carolina State University Workshop
  • October 30-31, 2003

2
Guidelines for This Presentation
  • Please turn off or silence your cell phones
  • Please feel free to raise questions at any time
    during the presentation; we will also leave time
    at the end for general discussion.
  • We are very interested in your participation

3
Agenda
  • Introduction and Objectives
  • Reasons for Graduate Assessment
  • Comparative Data Sources
  • Developing Faculty Expectations for Graduate
    Students
  • Principles of Graduate Assessment
  • Physics Case Study
  • Taking Assessment Online
  • Summary and Discussion

4
Objectives
  • Articulate motivations for undertaking graduate
    assessment
  • Increase awareness of comparative data sources
  • Identify program linkage points for graduate
    assessment
  • Hands-on: develop faculty expectations for
    student competence; use diverse data sources to
    evaluate a graduate program's first assessment
    efforts

5
Why Assess Graduate Programs?
  • We are all interested in the quality and
    improvement of graduate education
  • To help satisfy calls for accountability
  • Accreditation requirements: SACS accreditation
    imperatives
  • "To change or improve an invisible system, one
    must first make it visible" (Schilling and
    Schilling, 1993, p. 172)

6
Common Internal Reasons for Graduate Assessment
  • Program marketing
  • Meet short-term (tactical) objectives or targets
  • Meet long-term (strategic)
    institutional/departmental goals
  • Funded project evaluation (GAANN, IGERT)
  • Understand sources of retention/attrition among
    students and faculty

7
SACS Principles of Accreditation
  • Core Requirement 5: The institution engages in
    ongoing, integrated, and institution-wide
    research-based planning and evaluation processes
    that incorporate a systematic review of programs
    and services that (a) results in continuing
    improvement and (b) demonstrates that the
    institution is effectively accomplishing its
    mission.

8
SACS Principles of Accreditation
  • Section 3 (Comprehensive Standards): Institutional
    Mission, Governance, and Institutional
    Effectiveness
  • 16. The institution identifies outcomes for its
    educational programs and its administrative and
    educational support services; assesses whether it
    achieves these outcomes; and provides evidence of
    improvement based on analysis of those results.

9
SACS Principles of Accreditation
  • Section 3 (Comprehensive Standards): Standards
    for All Educational Programs
  • 12. The institution places primary
    responsibility for the content, quality, and
    effectiveness of its curriculum with the faculty.
  • 18. The institution ensures that its graduate
    instruction and resources foster independent
    learning, enabling the graduate to contribute to
    a profession or field of study.

10
SACS Accreditation
  • The intent of the SACS procedures is to stimulate
    institutions to create an environment of planned
    change for improving the educational process.

11
Language
  • Much of the assessment literature employs a fair
    amount of industrial or business speak
  • Feel free to develop and use your own
  • Keep it consistent across the institution
  • Produce and maintain a glossary of terms

12
So What Do We Need to Do?
  • Do our departments have a clear mission
    statement?
  • Do we have departmental plans to evaluate the
    effectiveness of our degree programs?
  • Do our degree programs have clearly defined
    faculty expectations for students?
  • Are they published and are they measurable or
    observable?
  • Do we obtain data to assess the achievement of
    faculty expectations for students?
  • Do we document that assessment results are used
    to change or sustain the excellence of program
    activities and further student gains in
    professional and attitudinal skills and
    experiences?

13
So What Do We Need to Do? (Cont.)
  • Based on assessment results, do we reevaluate the
    appropriateness of departmental missions as well
    as the expectations we hold for student
    competence?
  • The amount of work needed to satisfy
    accreditation requirements is proportional to the
    number of "No" responses to the above questions.

14
IE (Institutional Effectiveness) Chart
15
Needed to Succeed
  • The department should want to engage in this
    process
  • The department must use the information collected
  • The institution must use the information
    collected
  • Use participation in the process as part of
    faculty reviews

16
Focusing Efforts
  • It is important to achieve a strategic focus for
    the program: decide what knowledge, skills,
    abilities, and experiences should characterize
    students who graduate from our program

17
What is Important to Measure?
  • To decide this, it is first vital to ask:
  • What are our strong areas?
  • What are our limitations?
  • What do we want to accomplish in:
  • Education of students?
  • Research?
  • Service?

18
Purpose Statement (sample)
  • The Anthropology Department serves the
    institution by offering courses and scholarly
    experiences that contribute to the liberal
    education of undergraduates and the scholarly
    accomplishments of graduate students. Program
    faculty members offer courses, seminars, directed
    readings, and directed research studies that
    promote social scientific understandings of human
    cultures. The Department offers a bachelor's
    degree major and minor, an M.A. degree, and a
    Ph.D.

19
Developing a Plan to Evaluate Degree Programs
  • How to start a departmental plan: top down or
    bottom up (Palomba and Palomba, 2001)
  • Top Down: As a group of scholars, decide what
    the important goals or objectives for the
    program are.
  • Bottom Up: Identify the primary faculty
    expectations for student competence in core
    courses in the program and use this list to
    develop overarching expectations for student
    competence.

20
Develop an Assessment Plan
  • Desirable characteristics for assessment plans
    (Palomba and Palomba, 1999):
  • Identify assessment procedures to address faculty
    expectations for student competence
  • Use procedures such as sampling student work and
    drawing on institutional data where appropriate
  • Include multiple measures
  • Describe the people, committees, and processes
    involved; and
  • Contain plans for using assessment information
    (see the sketch after this list).
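
As a presenter's sketch of how these characteristics might be captured in practice, here is a minimal Python example of one plan entry. The class, field names, and sample values are hypothetical illustrations, not part of Palomba and Palomba's framework.

```python
# Hypothetical sketch: one entry in a departmental assessment plan,
# capturing the desirable characteristics listed above.
from dataclasses import dataclass

@dataclass
class AssessmentPlanEntry:
    expectation: str     # faculty expectation for student competence
    procedures: list     # e.g., sampling student work, institutional data
    measures: list       # multiple measures per expectation
    responsible: list    # people, committees, and processes involved
    use_of_results: str  # plan for using the assessment information

entry = AssessmentPlanEntry(
    expectation="Graduates can conduct independent research",
    procedures=["Sample dissertation chapters", "Advisor ratings"],
    measures=["Dissertation rubric score", "Conference papers presented"],
    responsible=["Graduate committee", "Director of Graduate Studies"],
    use_of_results="Reviewed annually to adjust the core course sequence",
)
print(entry.expectation)
```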

21
Words to Remember When Starting an Assessment
Plan
  • It may be best to tackle the modest objectives
    first.
  • Assessment plans should recognize that students
    are active participants and share responsibility
    for their learning experience along with the
    faculty and administration.
  • It takes a long time to do assessment well. So
    be patient and be flexible.
  • The overriding goal is to improve educational
    programs, not to fill out reports or demonstrate
    accountability.

22
Use a Program Profile to Get Started
  • Related to Operational Objectives

23
Data for Profiles
  • Admissions: applications, acceptance rates, and
    yield rates (see the worked example after this
    list)
  • Standardized test scores:
  • Graduate Record Examination (GRE)
    http://www.gre.org/edindex.html
  • Graduate Management Admission Test (GMAT)
    http://www.gmac.com/
  • Law School Admission Test (LSAT)
    http://www.lsac.org/
  • Undergraduate GPA
  • Headcount or Major Enrollments (Full/Part-Time)
  • Degrees Awarded
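
A quick worked example of the admissions ratios above, using invented numbers: the acceptance rate is admits divided by applicants, and the yield rate is enrolled divided by admits.

```python
# Hypothetical cohort: acceptance rate = admits / applicants,
# yield rate = enrolled / admits.
applicants, admits, enrolled = 240, 60, 27

acceptance_rate = admits / applicants  # 60/240 = 25%
yield_rate = enrolled / admits         # 27/60 = 45%

print(f"Acceptance rate: {acceptance_rate:.0%}, yield rate: {yield_rate:.0%}")
```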

24
Profiles (Cont.)
  • Formula Funding Elements, when appropriate
  • Time-to-Degree and/or Graduation/Retention Rates
    (see the sketch after this list)
  • Support for Students (Type of Assistance)
  • Faculty Headcount (Full/Part, Tenure Status)
  • Faculty Salaries
  • Faculty Productivity or Workload Compliance
  • Research Proposals Submitted/Awarded
  • Research Award/Expenditure Dollars
  • Instructional and Research Facility Space
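
A minimal sketch of one common time-to-degree calculation: elapsed calendar time from matriculation to degree conferral, summarized by the median. The dates are invented; a real profile would draw them from the student records system.

```python
# Hypothetical records: (matriculation date, degree conferral date).
from datetime import date
from statistics import median

cohort = [
    (date(1996, 8, 15), date(2002, 5, 10)),
    (date(1995, 8, 15), date(2003, 12, 12)),
    (date(1997, 8, 15), date(2002, 12, 13)),
]

# Elapsed time in years for each graduate.
years = [(grad - start).days / 365.25 for start, grad in cohort]
print(f"Median time-to-degree: {median(years):.1f} years")
```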

25
Comparative Data
  • Survey of Earned Doctorates (SED)
  • National Center for Education Statistics (NCES)
    Integrated Postsecondary Education Data
    System (IPEDS)
  • National Research Council (NRC) Reports
  • Higher Education Data Sharing Consortium (HEDS)
    Graduate Student Survey (GSS)
  • American Association of University Professors
    (AAUP) or College and University Professional
    Association (CUPA) Faculty Salary Surveys

26
SED Data
  • Administered annually, with a very high response
    rate
  • Doctoral degrees awarded by broad field and
    subfield by gender, racial/ethnic group, and
    citizenship.
  • Institutional ranking by number of doctorate
    awards (top 20) by broad field and by
    racial/ethnic group
  • Time-to-Degree (three measures) by broad field,
    gender, racial/ethnic group, and citizenship

27
SED Data (Cont.)
  • Financial resources for student support by broad
    field, gender, racial/ethnic group, and
    citizenship
  • Postgraduate plans, employment, and location by
    broad field, gender, racial/ethnic group, and
    citizenship
  • Reports are available at
    http://www.norc.uchicago.edu/issues/docdata.htm

28
IPEDS Data
  • Fall enrollments by major field (2-digit CIP
    code) of study, race/ethnicity and citizenship,
    gender, attendance status (full/part-time), and
    level of student (undergraduate, graduate, and
    first professional)
  • The discipline-field data are reported in
    even-numbered years only.
  • Annual degrees conferred by program (6-digit CIP
    code) or major discipline (2-digit CIP code),
    award level (associate, baccalaureate, master's,
    doctoral, and first professional), race/ethnicity
    and citizenship, and gender (see the rollup
    sketch after this list)
  • Reported annually
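
Because the 2-digit discipline code is simply the prefix of the 6-digit program code, degree counts can be rolled up as in this sketch. The codes and counts are illustrative (in the CIP taxonomy, 40 is Physical Sciences and 40.0801 is Physics, General).

```python
# Illustrative degree counts by 6-digit CIP program code.
from collections import defaultdict

degrees = [
    ("40.0801", 12),  # Physics, General
    ("40.0501", 20),  # Chemistry, General
    ("54.0101", 8),   # History, General
]

# Roll up to the 2-digit discipline by keeping the code's prefix.
by_discipline = defaultdict(int)
for cip, count in degrees:
    by_discipline[cip.split(".")[0]] += count

print(dict(by_discipline))  # {'40': 32, '54': 8}
```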

29
IPEDS Data (Cont.)
  • Useful for identifying peer institutions
  • Available at the IPEDS Peer Analysis System:
    http://nces.ed.gov/Ipeds/
  • These data are also published in the National
    Center for Education Statistics (NCES) Digest of
    Education Statistics

30
National Research Council
  • Research-Doctorate Programs in the United States
  • This information is dated (1982 and 1993) with a
    new study scheduled for 2004 (?).
  • Benefit: rankings of programs. However, some
    critics suggest reputational rankings cannot
    accurately reflect the quality of graduate
    programs (Graham and Diamond, 1999).
  • The National Survey of Graduate Faculty
  • Scholarly quality of program faculty
  • Effectiveness of program in educating research
    scholars/scientists
  • Change in program quality in last five years

31
Profile Comparison for History and Physics NRC
Ranking
  • History department ranked 46.5
  • Physics department ranked 63
  • (Goldberger, Maher, and Flattau, 1995)

32
Profile Comparison for History and Physics -
Faculty
33
Profile Comparison for History and Physics -
Admissions
34
Profile Comparison for History and Physics -
Students
35
Profile Comparison for History and Physics -
Productivity
36
Describing Faculty Expectations for Students

37
Why Describe Faculty Expectations for Students?
  • To sustain program excellence and productivity
  • To give faculty feedback and the ability to make
    modifications based on measurable indicators, not
    anecdotes
  • To inform and motivate students
  • To meet external standards for accountability

38
What Are Our Real Expectations?
  • Read each question thoroughly. Answer all
    questions. Time limit: four hours. Begin
    immediately.
  • MUSIC: Write a piano concerto. Orchestrate it and
    perform it with flute and drum. You will find a
    piano under your seat.
  • MATHEMATICS: Give today's date, in metric.
  • CHEMISTRY: Transform lead into gold. You will
    find a beaker and three lead sinkers under your
    seat. Show all work, including Feynman diagrams
    and quantum functions for all steps.
  • ECONOMICS: Develop a realistic plan for
    refinancing the national debt. Run for Congress.
    Build a political power base. Successfully pass
    your plan and implement it.

39
Steps to Describing Expectations - 1
  • Write down the result or desired end state as it
    relates to the program.
  • Jot down, in words and phrases, the performances
    that, if achieved, would cause us to agree that
    the expectation has been met.
  • Phrase these in terms of results achieved rather
    than activities undertaken.

40
Steps to Describing Expectations - 2
  • Sort out the words and phrases. Delete
    duplications and unwanted items.
  • Repeat first two steps for any remaining
    abstractions (unobservable results) considered
    important.
  • Write a complete statement for each performance,
    describing the nature, quality, or amount we
    consider acceptable.
  • Consider the point in the program where it would
    make the most sense for students to demonstrate
    this performance.

41
Steps to Describing Expectations - 3
  • Again, remember to distinguish results from
    activities.
  • Test the statements by asking, "If someone
    achieved or demonstrated each of these
    performances, would we be willing to say the
    student has met the expectation?"
  • When we can answer yes, the analysis is finished.

42
Steps to Describing Expectations - 4
  • Decide how to measure the meeting of an
    expectation: can we measure it directly?
    Indirectly, through indicators?
  • In general, the more direct the measurement, the
    more content-valid it is.
  • For more complex, higher-order expectations, we
    may need to use indicators of an unobservable
    result.

43
Steps to Describing Expectations - 5
  • Decide upon a preferred measurement tool or
    student task.
  • Describe the expectation in terms that measure
    student competence and yield useful feedback.

44
Try it!
  • What faculty expectation? Our sample is this:
    "Graduates will be lifelong learners."
  • Decide: Under what condition? When and where will
    students demonstrate skills?
  • Decide: How well? What will we use as criteria?

45
Try it!
  • Under what condition?
  • Condition: Students will give evidence of having
    the ability and the propensity to engage in
    lifelong learning prior to graduation from the
    program.

46
Try it!
  • How well? Specify performance criteria for the
    extent to which students:
  • Display knowledge of current professional
    journals in the discipline and can critique them
  • Are able to access sources of disciplinary
    knowledge
  • Seek opportunities to engage in further
    professional development activities
  • Other?

47
Principles of Graduate Assessment
  • Clearly differentiate master's- and
    doctoral-level expectations
  • Assessment must be responsive to the more
    individualized nature of graduate programs
  • Assessment of real student work is preferable
  • Students already create the products we can use
    for assessment!

48
Principles of Graduate Assessment (continued)
  • Use assessment both as a self-reflection tool and
    an evaluative tool
  • Build in feedback to the student and checkpoints
  • Use natural points of contact with administrative
    processes

49
Common Faculty Expectations at the Graduate Level
  • Students will demonstrate professional and
    attitudinal skills, including:
  • Oral, written, and mathematical communication
    skills
  • Knowledge of concepts in the discipline
  • Critical and reflective thinking skills
  • Knowledge of the social, cultural, and economic
    contexts of the discipline
  • Ability to apply theory to professional practice
  • Ability to conduct independent research

50
Common Faculty Expectations at the Graduate Level
(continued)
  • Students will demonstrate professional and
    attitudinal skills, including:
  • Ability to use appropriate technologies
  • Ability to work with others, especially in teams
  • Ability to teach others; and
  • Demonstration of professional attitudes and
    values such as workplace ethics and lifelong
    learning.

51
Areas and Linkage Points to Consider in Graduate
Assessment
  • Deciding on what is important to measure
  • Pre-program assessment
  • In-program assessment
  • Assessment at program completion
  • Long-term assessment
  • Educational process assessment
  • Comprehensive assessment (program review)

52
Use Natural Linkage Points
  • Admission: use a diagnostic exam or GRE subject
    test
  • Annual advising appointment/progress check
  • Qualifying/comprehensive exams: embed items
    relevant to program objectives
  • Thesis and dissertation: develop rubrics to rate
    multiple areas relevant to program objectives
    (see the sketch after this list)
  • Exit: exit interview or exit survey at thesis
    appointment, check-out, or commencement
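
A minimal sketch of what such a thesis/dissertation rubric might look like as a scored checklist. The areas and the 1-4 scale are invented for illustration, not prescribed by any program or accreditor.

```python
# Hypothetical rubric areas tied to program objectives; each rated 1-4.
RUBRIC_AREAS = [
    "Significance of the research question",
    "Command of the relevant literature",
    "Soundness of methodology",
    "Quality of written communication",
]

def score_dissertation(ratings):
    """Average rating across areas, where 1 = weak and 4 = strong."""
    missing = set(RUBRIC_AREAS) - set(ratings)
    if missing:
        raise ValueError(f"Unrated areas: {missing}")
    return sum(ratings[a] for a in RUBRIC_AREAS) / len(RUBRIC_AREAS)

ratings = {area: 3 for area in RUBRIC_AREAS}
print(score_dissertation(ratings))  # 3.0
```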

53
Pre-Program Assessment
  • Re-Thinking Admissions Criteria (Hagedorn and
    Nora, 1997)
  • Problem: graduate persistence.
  • The GRE is designed to predict only first-year
    performance.
  • Undergraduate GPA and GRE scores are not measures
    of professional and attitudinal competency.
  • A variety of skills, talents, and experiences are
    necessary for success but are not usually
    included in admissions criteria.
  • Evaluating the fit between the program and the
    student is important.

54
Other Pre-Program Assessment Tools
  • Portfolio and/or structured interviews featuring:
  • Research interests and previous products
  • Critique of a report or research paper
  • Plan for a research project
  • Prior out-of-class experiences
  • Inventories to assess motivation, personality,
    and fit to the program

55
In-Program Assessment of Student Learning
  • Based on faculty expectations
  • Methods may include assessment of:
  • Case studies, term papers, projects
  • Oral seminar presentations
  • Preliminary exams, knowledge in field
  • Research and grant proposals
  • Portfolios
  • Problem-Based Learning or Team projects
  • Input from advisors, graduate internship director

56
Assessment at Program Completion
  • Allows demonstration of synthesis of the
    knowledge, skills, and attitudes learned
  • Ideal comprehensive assessment point, but a
    sense of where the student began is desirable to
    assess change, growth, and value added
  • Qualitative analysis may be appropriate
  • Portfolio of research, scholarly products

57
Assessment at Program Completion (continued)
  • Methods may include assessment of:
  • Thesis/dissertation oral defense
  • Professional registration or licensure exam
  • Published works, conference papers
  • Portfolio
  • Exit interview
  • Exit survey

58
Long-Term Assessment
  • Common sentiment: graduates can adequately
    self-assess the outcomes of their program only
    after they have been applying their skills for
    several years following graduation.
  • Pursuing long-term assessment, based on
    identified learning objectives, is an important
    component of a graduate assessment program.

59
Long-Term Assessment (continued)
  • AAU (1998): it is important to track graduates of
    post-baccalaureate programs
  • to gain information on expectations vs. learning
    experiences
  • to gain data on outcomes and placement.
  • Other reasons: to keep them involved in the life
    of the school; to bring them back as speakers,
    mentors, advisory board members, and donors.

60
Long-Term Assessment (continued)
  • May include assessment of:
  • Job placement and linkage to degree
  • Career success
  • Production of scholarly work
  • Evidence of lifelong learning
  • Awards and recognition gained
  • Participation in professional societies
  • Satisfaction with knowledge gained

61
Long-Term Assessment (continued)
  • Common Assessment Methods
  • Follow-up interviews, surveys, or focus groups
  • Journal publications
  • Citation indices
  • Membership lists and papers presented in
    professional/disciplinary associations

62
Value of Assessing the Educational Process
  • Widely viewed as key to graduate retention
  • Helps identify strengths and needed improvements
    in graduate coursework, research experience,
    teaching experience, advising, and support
    services.
  • Environment and process assessment: see the Golde
    and Dore (2001) survey for the Pew Charitable
    Trusts.

63
Ways of Assessing the Educational Process
  • Graduate student advisory groups
  • Surveys of students, focus groups
  • Peer review of teaching
  • Institutional data: time to degree, graduation
    rate
  • Advising process
  • Mentoring process

64
Assessing the Mentoring Process
  • A primary graduate learning and professional
    enculturation process
  • Mentoring at UC Berkeley (Nerad and Miller,
    1996):
  • All faculty advise individuals, but mentoring is
    the shared responsibility of all members of the
    department.
  • Individual faculty mentors to students
  • Departmental seminars and workshops

65
Comprehensive Assessment: Program Review
  • The combination of an internal self-study and an
    external review of the program by qualified
    faculty peers forms a very powerful and
    comprehensive assessment device.
  • Program review encompasses an examination of
    resources, processes, and student learning
    outcomes.

66
Program Review: Examples of Areas to Evaluate
  • Achievement of Faculty Expectations
  • communication skills appropriate to the
    discipline, professional and attitudinal
    competency, ability to conduct independent
    research, etc.
  • Processes
  • coursework, research opportunities, teaching,
    internships, comprehensive exams, theses, and
    time in residence
  • Resources (Profile)
  • faculty, students, library, instructional and lab
    space, financial support, extramural support, etc.

67
Putting the Pieces Together
  • Adapted from Baird (1996): a matrix of faculty
    expectations, linkage points to use in conducting
    assessment, and some possible methods to use.
  • Adapt it for use in each department by inserting
    the appropriate faculty expectations for each
    program.

68
Case Study
  • See case study handout
  • Doctoral program in Physics at Muggy Research
    University (MRU)
  • First time through their assessment process
  • Data in hand. What now?
  • You are the consultants!

69
Case Study Debriefing Questions
  • What do you see in the results?
  • What do you recommend?
  • What actions do they need to take?
  • In light of their mission, what should they do
    next time?

70
Taking Assessment Online
  • Georgia Tech's approach: the Online Assessment
    Tracking System (OATS)

71
OATS: Purpose
  • Annual Assessment Updates are a key piece in
    Tech's efforts to demonstrate compliance with the
    SACS Principles of Accreditation.
  • The Annual Assessment Update concept was
    generated by GT unit coordinators in 1998 as a
    way of documenting Tech's responsiveness to SACS
    recommendations regarding assessment practices.
  • Many people have requested that the process be
    moved to an online environment.
  • The online process provides structure, formalizes
    best practices in assessment of student learning,
    and thus facilitates demonstration of compliance.
  • The SACS 2005 review will be an electronic,
    remote review.

72
Annual Assessment Update
  • Diagram contrasting the previous and new
    methods, organized around four questions leading
    to results: What Did You Look At? How Did You
    Look At It? What Did You Find? What Did You Do?
73
Feature Comparison
  • Old System
  • Many different formats
  • Hard copy only
  • Difficult to track progress over time
  • Flexibility (but no consistency across the
    Institute)
  • Difficult to provide feedback internally and to
    facilitate institutional sharing of good practices
  • OATS
  • Consistent format
  • Database storage
  • Ability to track progress over time
  • Flexibility maintained
  • Process facilitates accreditation e-review
  • Easier to provide feedback; facilitates
    institutional sharing

74
OATS Application
  • Includes user ID/password logon
  • Web accessible from any location
  • Defined format structure: Objectives, Methods,
    Results, and Actions/Impact (a sketch follows
    this list)
  • Allows posting of formatted text (tables, charts,
    etc.)
  • Allows notes and written feedback
  • Review at the school/unit and college level keeps
    everyone in the loop
  • OATS production date: October 1
  • Assessment Updates due December 1 this year
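
The defined format suggests a simple record structure. The sketch below is a hypothetical reconstruction, not the actual OATS code, showing how each annual update and its review status might be stored.

```python
# Hypothetical sketch of an OATS-style record: Objectives, Methods,
# Results, and Actions/Impact, plus review status and feedback notes.
from dataclasses import dataclass, field

@dataclass
class AnnualUpdate:
    program: str
    year: int
    objectives: str
    methods: str
    results: str
    actions_impact: str
    status: str = "draft"                         # e.g., draft, sent to college
    feedback: list = field(default_factory=list)  # reviewer notes

update = AnnualUpdate(
    program="Ph.D. in Physics",
    year=2003,
    objectives="Graduates can conduct independent research",
    methods="Dissertation rubric; exit survey",
    results="Most dissertations rated strong on independence",
    actions_impact="Added a second-year research milestone",
)
update.feedback.append("College reviewer: clarify the survey sample")
update.status = "sent to college"
```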

75
Main Menu: Current Year and History (screenshot)
76
College Level: Ivan Allen College
(example screenshot; update shown as "Sent to College")
77
School Level: History, Technology, and Society
(example screenshot; updates shown as "Sent to College")
78
Degree Program Level: BS in HTS (example screenshot)
79
Summary
  • SACS requires assessment of graduate programs,
    research, and public service
  • Make it relevant to the program
  • Keep it simple and focused
  • Consider different assessments for each stage of
    student progress
  • Start now; it takes several years to fine-tune

80
References
  • See references in back of handout

81
Session Evaluation
  • What one aspect was the most useful to you?
  • What one aspect most needs improvement, and what
    kind of improvement?
  • Other suggestions?

82
Thank You!
  • Questions? Contact us!
  • Joseph.hoey@oars.gatech.edu
  • Lorne@wm.edu