Understanding Middle States

1
Understanding Middle States Expectations for Assessment
AIRPO Buffalo, June 12, 2009
  • Linda Suskie, Vice President
  • Middle States Commission on Higher Education
  • 3624 Market Street, Philadelphia PA 19104
  • Web: www.msche.org  E-mail: LSuskie@msche.org

2
Today
  1. Understanding Standard 7: Institutional Assessment
  2. Sharing assessment results
  3. Using assessment results
  4. Telling your story to Middle States
  5. Questions an MSCHE reviewer might ask

3
Understanding Standard 7: Institutional Assessment
4
Planning Assessment as a Four-Step Cycle
  1. Goals
  2. Programs, Services, and Initiatives
  3. Assessment/Evaluation
  4. Using Results
5
What Goals Are We Talking About?
  • Institutional goals (mission and strategic plan)
  • Administrative goals
    • Division goals
    • Administrative unit goals
  • Student learning goals
    • Institutional
    • Gen Ed curriculum
    • Academic programs
    • Student development programs
    • Support programs

6
The 14 Middle States Standards
  1. Mission and Goals
  2. Planning
  3. Resources
  4. Leadership/Governance
  5. Administration
  6. Integrity
  7. Institutional Assessment
  8. Admissions
  9. Student Support Services
  10. Faculty
  11. Educational Offerings
  12. General Education
  13. Related Educational Activities
  14. Assessment of Student Learning
7
Institutional Effectiveness: Are We Achieving Our Mission and Goals (Standard 7)?
  • Student learning (Standard 14)
  • Community service
  • Scholarship
  • Diversity
  • Productivity/efficiency
  • Access
  • Revenue generation
8
Strategies to Assess Institutional Goals
9
Assessments of student learning
  • Direct evidence (clear, convincing)
    • Tests and examinations
    • Assignments, papers, projects
    • Portfolios
    • Field experience evaluations
  • Indirect evidence
    • Retention, graduation, placement rates
    • Surveys of students and alumni
    • Grades

10
Performance indicators
  • Measures that are monitored in order to determine the health,
    effectiveness, and efficiency of an institution (Michael Dolence
    and Donald Norris)
  • Also known as:
    • Key performance indicators (KPIs)
    • Key quality indicators (KQIs)
    • Performance measures
    • Performance metrics
    • Balanced scorecard
    • Dashboard indicators

11
Popular performance indicators
  • Student retention and graduation rates
  • Job placement rates
  • Racial/ethnic enrollment breakdowns
  • Dollar value of sponsored research grants
  • Licensure and certification exam pass rates
  • Faculty workload
  • Student/faculty ratio
  • Average credit enrollment per FTE faculty

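Two of the indicators above, retention and graduation rates, reduce to
simple cohort arithmetic: count the entering cohort, count the outcome,
divide. A minimal Python sketch, using invented cohort counts (not
figures from this presentation):

    # Hypothetical cohort counts -- illustrative numbers only.
    cohort_size = 1200          # first-time, full-time entrants, fall 2007
    returned_next_fall = 948    # still enrolled in fall 2008
    graduated_in_six = 612      # completed a degree by 2013

    # First-year retention rate: returners / entering cohort.
    retention_rate = returned_next_fall / cohort_size

    # Six-year graduation rate: completers / entering cohort.
    graduation_rate = graduated_in_six / cohort_size

    print(f"Retention rate:  {retention_rate:.0%}")   # 79%
    print(f"Graduation rate: {graduation_rate:.0%}")  # 51%

The same pattern covers job placement rates and licensure exam pass
rates as well.
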
12
Common state performance indicators (National Center for Public Policy
and Higher Education)
  • Preparation
    • Number and quality of teachers graduating in critical fields
  • Participation
    • Enrollment by race, gender, income
  • Affordability
    • Discounted tuition and fees as a proportion of median income
  • Completion
    • Actual vs. predicted graduation rates based on student
      preparation and aptitude
  • Benefits
    • Degrees awarded in critical fields
    • Sponsored research and publications
  • Learning

13
Other examples
  • Participation rates (e.g., in student activities, cultural events)
  • Expenditures per FTE student
  • Counts of contacts, inquiries, etc.
    • Questions to library information desk
    • Referrals to counseling center

14
Program reviews (academic and other)
  • Common criteria for academic program reviews:
    • Quality (resources, activities, outcomes, etc.)
    • Need (demand for the program; competing programs)
    • Centrality to mission
    • Cost and cost-effectiveness

15
Baldrige National Quality Program
  1. Leadership
  2. Strategic planning
  3. Student, stakeholder, and market focus
  4. Measurement, analysis, and knowledge management
  5. Faculty and staff focus
  6. Process management
  7. Organizational performance results

16
Other assessment strategies
  • Surveys, interviews, focus groups
  • Secret shoppers
  • Observations of students, meetings, activities
  • Document reviews
    • Meeting minutes, transcript analyses, e-mails, online discussions
  • Online institutional portfolios
  • Quality improvement tools
    • Run charts, histograms, Pareto analyses, Six Sigma analyses
  • Activity-based costing: compare outcomes against costs

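Of the quality improvement tools just listed, a Pareto analysis is the
easiest to sketch: rank problem categories by frequency and track the
cumulative share so the "vital few" causes stand out. A small Python
illustration with invented complaint counts (not data from this
presentation):

    # Hypothetical counts of student complaints by category.
    complaints = {
        "Registration delays": 120,
        "Advising availability": 80,
        "Billing errors": 40,
        "Parking": 30,
        "Food service": 20,
        "Other": 10,
    }

    total = sum(complaints.values())
    cumulative = 0

    # Sort descending, then report each category's share and the
    # running cumulative share -- the core of a Pareto analysis.
    for category, count in sorted(complaints.items(),
                                  key=lambda kv: kv[1], reverse=True):
        cumulative += count
        print(f"{category:22s} {count:4d} "
              f"{count / total:5.0%}  cumulative {cumulative / total:5.0%}")

Here the top two categories account for two-thirds of all complaints,
which is exactly the kind of "big news" a summary should surface.
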
17
Your assessment strategy must align with a goal
to be useful.
18
Sharing Assessment Results
19
Why are you assessing the program or curriculum?
  • Validate it to others (accountability)
  • Make sure it isn't slipping
  • Improve it

20
Keep assessment summaries useful to you and your
colleagues.
  • Who needs to see the results?
  • Why? What decisions will they make?
  • What do they need to see to make those decisions?

21
What decisions might the assessment help with?
  • Learning goals
    • Are our learning goals sufficiently clear and focused?
  • Curriculum
    • What is the value of service learning?
    • Should our courses have more uniformity across sections?
  • Teaching methods
    • Is online instruction as effective as traditional instruction?
    • Is collaborative learning more effective than lectures?
    • Are we developing a community of scholars?
  • Assessments
    • Have our assessments been useful?
  • Resource allocations
    • Where should we commit our resources first?

22
Keep assessment summaries short and simple.
  • Fast and easy to read and understand
  • Use short, simple charts, graphs, and lists.
  • Use PowerPoint presentations.
  • Avoid narrative text.
  • First aggregate (sum up) data, then drill down
    into details as needed.
  • Round results.
  • Sort results from highest to lowest.
  • Percentages may be more meaningful than averages.
  • Avoid complex statistics.
  • As you collect results over time, show trends.

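The "aggregate, round, sort" advice above is mechanical enough to show
in a few lines. A Python sketch that turns raw survey ratings (invented
here) into the kind of one-glance summary the slide describes, using
percentages rather than averages:

    # Hypothetical 1-5 survey ratings by learning goal -- invented data.
    responses = {
        "Critical thinking": [4, 5, 3, 4, 4, 5, 2, 4],
        "Written communication": [3, 3, 4, 2, 3, 4, 3, 3],
        "Quantitative reasoning": [5, 4, 5, 4, 5, 3, 4, 5],
    }

    # Aggregate: percent of respondents rating each goal 4 or higher.
    summary = {
        goal: sum(1 for r in ratings if r >= 4) / len(ratings)
        for goal, ratings in responses.items()
    }

    # Sort from highest to lowest and round; drill into the raw
    # ratings later only if a result looks surprising.
    for goal, pct in sorted(summary.items(),
                            key=lambda kv: kv[1], reverse=True):
        print(f"{goal:24s} {pct:4.0%} rated 4 or 5")

Rounding to whole percentages and sorting the weakest goal to the
bottom makes the follow-up question obvious at a glance.
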
23
Tell a story.
  • Key questions to address (Doug Eder):
    • What have you learned about your students' learning and other
      institutional goals?
    • What are you going to do about what you have learned?
    • When, where, and how are you going to do it?
  • Focus on big news.
  • Identify meaningful vs. insignificant differences.
  • Find someone skilled at finding the stories in reams of data.

24
Using Assessment Results
When Assessment Results Are Good
  • Celebrate!
  • Publicize!

25
When assessment results are disappointing: an example with student
retention results
  • Goals
    • Set a special target for male students.
  • Program (curriculum)
    • Make the advisement program mandatory.
  • Implementation (pedagogy)
    • Increase professional development for advisors.
  • Assessments
    • Identify student goals upon entry and upon exit.
  • Resource allocations
    • Fund professional development for advisors.

26
Telling Your Story to Middle States
27
(No Transcript)
28
What Should Institutions Document?
  • Clear statements of goals
  • Organized, sustained assessment process
    • Principles, guidelines, support
    • What assessments are already underway
    • What assessments are planned, when, and how
  • Assessment results documenting progress toward
    accomplishing goals
  • How results have been used for improvement

29
How Might Institutions Document This?
  • Need not be a fancy bound document!
  • Need not be in a consistent format or single
    repository
  • An overview in the report to MSCHE
  • A chart or roadmap in the report to MSCHE or an
    appendix
  • More thorough information in the on-site
    resource room, online, and/or burned onto CD
  • A few samples of student work
    • Exemplary, adequate, inadequate

30
Do you need special assessment software?
  • What are your needs?
  • How will you use the software?
  • Are faculty and staff ready to use it?
  • Do you have IT support?
  • Ask vendors for references.
  • What are the real costs?
  • What is the cost-benefit balance?
  • Don't rush; involve faculty in deciding.

31
Questions an MSCHE Reviewer Might Ask
32
Is the Institution Engaged in Good Assessment?
  • Clear and important goals
  • Reasonably accurate and truthful results
  • Used
  • Valued
  • Cost-effective
33
For Each Goal
  • How is the goal being assessed?
  • What are the results of those assessments?
  • How have those results been used for improvement?

Goals → Assessments → Improvements
34
Do Institutional Leaders Support and Value a
Culture of Assessment?
  • Is there adequate support for assessment?
    • Overall guidance and coordination
  • Are assessment efforts recognized and valued?
  • Are efforts to improve teaching recognized and valued?

35
How Much Has Been Implemented?
  • Are there any significant gaps?

36
What Do Assessment Results Tell Us?
  • Do results demonstrate:
    • Achievement of mission and goals?
    • Sufficient academic rigor?

37
Have Assessment Results Been Used?
  • Have they been appropriately shared and discussed?
  • Have they led to appropriate decisions?
    • Curricula and pedagogy
    • Programs and services
    • Resource allocation
    • Institutional goals and plans

38
Is the Process Sustainable?
  • Simple
  • Practical
  • Detailed
  • Ownership
  • Appropriate timelines

39
Where is the Institution Going with Assessment?
  • Will momentum slow after this review?
  • What Commission action will most help the
    institution keep moving?

40
Middle States' Five Rules for Assessment
  1. Keep it useful.
  2. Tie assessments to important goals.
  3. For student learning, include some direct
    evidence.
  4. Use multiple measures.
  5. Keep doing something everywhere, every year.

41
Bottom Line on Moving Ahead
  • Keep assessment useful.
  • Keep things simple.
    • Especially in terms of time
  • Don't create unnecessary rules.
  • Value assessment.
  • Just do it!

42
Volunteer for Middle States Evaluation Teams!
  • Go to our web site (www.msche.org).
  • Click on "Evaluators."