Faculty Evaluation Systems: Student Evaluations of Faculty

Transcript and Presenter's Notes
1
Faculty Evaluation Systems: Student Evaluations
of Faculty
  • What are we measuring?
  • "The evaluation of teachers is a mark of a good
    college." (Ernest Boyer, 1987, p. 155)

2
Introduction: Community College Faculty Evaluation
  • Community College Teaching Mission
  • Diverse mission of a comprehensive community
    college
  • As a Community College, Teaching is at the
    Center of our Purpose.
  • RSCC: a Teaching Institution, First and Foremost
  • Can we Measure (Evaluate) Teaching Excellence?

3
Defining Teaching Excellence: Three Major
Emphases
  • 1) Input
  • Student, teacher and course characteristics
  • 2) Process
  • Classroom atmosphere, teacher behavior, student
    learning activities, course organization,
    evaluation procedures
  • 3) Product
  • End-of-course learning, affective/cognitive
    change, skills acquisition
  • Long-term learning, affective/cognitive change,
    skills acquisition

4
Presentation Outline
  • Part 1: Faculty Evaluation Systems (Purpose)
  • Part 2: Faculty Evaluation System - Student
    Evaluations of Faculty (Student Ratings as One
    Piece)

5
Part 1: Faculty Evaluation - What's the Purpose?
  • Overall institutional purpose is key to proper
    evaluation.
  • Formative and Summative
  • Improve Pedagogical Methods and Student Learning
  • Faculty, Course, and Curriculum Development
  • Excellence in Teaching
  • Tenure and Promotion

6
Faculty Evaluation Purpose: Accountability
  • Renewed attention to undergraduate instruction
  • State Legislature, Coordinating and Governing
    Boards
  • Accrediting Bodies
  • SACS
  • "The institution regularly evaluates the
    effectiveness of each faculty member in accord
    with published criteria, regardless of
    contractual or tenured status."
  • "Standards for all educational programs include
    all on-campus, off-campus, and distance learning
    programs."
  • Source: The new SACS Comprehensive Standards
    governing Educational Programs.

7
Evaluation of College Faculty
  • The first student ratings took place in the late
    1800s (Sioux City).
  • Over the last two decades, evaluation of college
    faculty has increasingly become a way of life for
    those in higher education.
  • Today, virtually all American colleges assess
    faculty performance using student evaluations
    along with other instruments.

8
Faculty Evaluation Systems
  • Composed of multiple measures
  • Student Evaluations of Faculty
  • Peer Evaluations
  • Self Evaluation
  • Supervisor Evaluation
  • Exit Interviews and Alumni Surveys
  • Teaching Portfolios/Dossier
  • Classroom observation

9
Part 2: Student Evaluation of Faculty (Ratings)
  • Most widely used structured method of evaluating
    faculty in higher education
  • Integral to most systems is the instrument by
    means of which student opinions and observations
    are collected.
  • Student appraisal of instruction seems
    appropriate because students are the only ones
    who observe teaching for an entire course.

10
Student Evaluations of Faculty (Ratings)
  • Do these ratings accurately portray teaching
    effectiveness?

11
Student Evaluations of Faculty: Quality of
Ratings? Legitimacy?
  • Higher education tends to question the quality
    and legitimacy of the information collected
    through student ratings of faculty.

12
Concerns about Evaluations (student ratings)
  • No surprise: a significant number of studies has
    focused on the objectivity and validity of such
    evaluations (2,000 published studies)
  • Concerns include:
  • Low validity and reliability
  • Appropriateness of items (instrument)
  • Correlation between grades and evaluations
  • Measure how good an entertainer the instructor is
    rather than how good his/her teaching skills are
  • Popularity contests that reward classroom
    entertainers and easy graders
  • Students don't take these seriously

13
Concerns Continued
  • Student opinion can be affected by variables
    beyond the faculty member's control.
  • Class size
  • Course content

14
Student Evaluations of Faculty: Grades and Ratings
  • Research findings support the notion that
    instructors CANNOT purchase FAVORABLE student
    ratings through easy grading.
  • Research also indicates that the relationship
    between grades and ratings is a function of the
    better-achieving students' greater interest and
    motivation.

15
Despite reservations
  • More than half of the higher education
    institutions in the nation continue to use
    student appraisals of faculty within their
    faculty evaluation systems.
  • Student perceptions of end-of-course evaluations
    indicate that they do take these seriously, but
    they need to understand how the results are used.

16
The Quality of Evaluative Information
  • Reliability: Freedom from error (computation and
    measurement yield accurate and consistent results)
  • Validity: Meaning (Are you measuring what you
    think you are?)
  • Generalizability: Representativeness
  • (Whose opinions are reflected by the data? Does
    the sample of information portray the totality of
    the person's teaching?)
  • Utility (What purposes can the data serve?)

17
The Instrument: Asking the Right Questions
  • Definition of teaching quality (input, process,
    and product)
  • Purpose of evaluation
  • Usefulness of evaluative information: it should
    be detailed, diagnostic, and focused on specific
    teaching behaviors and course characteristics to
    derive the maximum benefit from the information.

18
The Instrument: Evaluative Questions
  • Two types of evaluative questions are found in
    most student evaluation instruments:
  • 1) Student reactions to instructor traits or
    behaviors, characteristics of course material,
    and the social and physical environment
  • 2) Student outcomes (progress toward general or
    specific educational goals)
  • 3) You might find a third type: Demographics

19
The Instrument: Format of Questions
  • Whether outcome or trait questions
  • Multiple Choice/Scaled Responses
  • (The best number of points for a particular scale
    is an empirical question)
  • Some are better than others
  • Open-ended
  • Augment multiple-choice items
  • Helpful, constructive recommendations for
    improvement

20
Areas of Measurement: Grouping the Questions
  • Global items require high inference (making
    judgments)
  • Instructor
  • Course
  • General (instructional climate)
  • Specific Items (descriptive and diagnostic)
  • Course management
  • Instructor characteristics and style
  • Reading material/assignments
  • Exams
  • Student behavior (effort, involvement, etc.)
  • Student outcomes of instruction
  • Open-ended

21
High Inference Questions vs. Specific Questions
  • Specific multiple choice/open-ended probing
    questions
  • Diagnostic information
  • Most useful for course/instructor improvement
  • Very general or high inference rating scale items
  • "How would you rate this instructor's overall
    teaching ability?"
  • Little helpful diagnostic information
  • Most useful for administrative decisions,
    advisement of students

22
RSCC Faculty Evaluation Instrument - completed by
students
  • Examples:
  • Question 9: The instructor's examinations and/or
    assignments reflected the content and emphasis of
    the course.
  • Question 10: The instructor returned
    examinations and/or assignments within a
    reasonable time and with appropriate
    explanations.
  • Interpretation?
  • Self or course improvement?
  • Exam or assignment improvement?
  • Question 9 reworded: The instructor's
    examinations reflected the course content.

23
Flexibility of Instrument
  • Many institutions have tried a single
    institutional instrument
  • Criticized by faculty and students as irrelevant
    to the specific needs of particular programs
    and/or courses
  • Solutions:
  • Leave room on the questionnaire for the
    instructor to add his/her own items
  • Module approach: different instruments for
    different programs/courses (labs, distance
    learning courses, clinicals, small or large
    lectures, etc.)

24
Flexibility of Instrument
  • Does one size fit all?
  • Different instruments for different types of
    courses?
  • Lab
  • Clinical
  • On-line
  • IDEA Room
  • Small lecture or large lecture
  • Seminar
  • Other

25
Utility of Assessment Information Depends On
  • Quality of instrument
  • Reporting
  • Relies on faculty perceptions of the evaluation
    process (a quality issue)
  • Student perceptions of the evaluation process

26
Summary
  • In sum, then, provided that data are gathered
    carefully, reported appropriately, and
    interpreted judiciously, student evaluations
    appear to make a useful contribution to course
    improvement, faculty/self development, personnel
    decisions, and possibly to student learning.

27
Remember
  • Research tells us that students' early
    evaluations are quite stable; instructors should
    be aware that the first few days of class may be
    very important in determining the eventual image
    that students evaluate at the end of the course.

28
Questions?
  • Can we measure teaching excellence?
  • Future direction?
  • (Planning/Implementation Time)