1
GT-EnvE Ph.D. Comprehensive Exam Survey
Committee Members: Saritha Vishwanathan, Santosh
Chandru, Eunhyea Chung, Soonchul Kwon, K.J.
Liao, Scott Rogers; AEES Faculty Advisor: Dr.
James Mulholland
  • AEES DAE Committee
  • March 28, 2007

2
Motivations & Purposes for Study
  • Numerous student concerns voiced anecdotally
  • Recent faculty interest in diagnosing any exam
    concerns & deficiencies
  • Need for a formal assessment of student experiences
    in taking the Ph.D. comprehensive exam
  • Address specific concerns of students & faculty
    in all important aspects of the exam experience
  • Represent all student subgroups sufficiently
  • Ability to apply previous DAEC assessment work to
    this problem

3
Exam: Recent History
  • Major change in exam format
  • Before 2004
  • written component: select any 6 questions from
    those available (each professor submitted one
    question)
  • written & oral components both taken before
    faculty evaluation of performance occurs
  • allowed 2 opportunities to pass
  • 2004 to present
  • written component: 3 sections (biological,
    chemical, physical) that must be passed before
    moving on to the oral component
  • can take either all or only part of the written
    component at one time
  • unlimited written component attempts
  • allowed 2 opportunities to pass the oral component
  • Cumulative summary of fates of written
    component examinees (after 2004 change)
  • students who took all 3 sections at least once:
    25
  • passed all sections on 1st try: 12 (48%)
  • passed all sections after multiple tries: 20
    (20/25 = 80% total passing to date)
  • others have not yet passed, or never passed, all
    sections despite at least one attempt

4
Assessment Approach (1)
  • Survey Design
  • September 2006 – January 2007
  • Collected question ideas from students & faculty
  • Question selection using established DAEC
    criteria
  • To normalize perspectives, requested data
    regarding first exam attempt
  • Employed documented survey design principles
  • Sensitivity Concerns
  • emotional subject for many students
  • respondent comfort in a smaller sample size
  • lack of hindsight perspective, especially for
    2006 examinees
  • DAEC involvement in survey implementation
  • Quality Assurance
  • critical review by GT Office of Assessment

5
Assessment Approach (2)
  • Distribution
  • February 2007
  • Hand delivery & hand collection increase
    response
  • Sample pool: eligible & accessible
  • those who had already taken the exam
  • current Ph.D. students & Ph.D. alumni
  • on-campus & off-campus (in Atlanta)
  • What about the inaccessible?
  • time & expense to send and retrieve the survey
  • potential confidentiality breaches
  • logistics involved
  • most took exam before major 2004 change
  • if more respondents are needed, they can be
    included later

6
Assessment Approach (3)
  • Analysis
  • Univariate Analysis
  • descriptive statistics regarding individual
    variables
  • smaller sample size -- caution in normal
    distribution assumptions
  • Multivariate Analysis
  • test possible variable relationships
  • smaller subgroups -- not as rigorous as done
    previously
  • t-tests replaced with mean ± SD computations
    (see the sketch after this list)
  • correlation analysis prone to more error
  • Sensitivity Analysis
  • gauge potential bias caused by DAEC member
    involvement
  • with a small subgroup of members, small potential
    for serious bias
  • omitted for brevity in this presentation
  • can be obtained from DAEC files
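
A minimal sketch of the subgroup comparison this slide describes, replacing formal t-tests with mean ± SD summaries; the data values and group labels below are hypothetical illustrations, not survey data:

    # Hedged sketch (hypothetical data): subgroup comparison by mean +/- SD,
    # used in place of formal t-tests because the subgroups are small.
    from statistics import mean, stdev

    group_a = [3, 4, 3, 2, 4, 3, 3, 4]        # hypothetical 5-point ratings
    group_b = [3, 2, 3, 3, 4, 2, 3, 3, 2, 4]  # hypothetical 5-point ratings

    for label, group in [("group A", group_a), ("group B", group_b)]:
        m, s = mean(group), stdev(group)
        print(f"{label}: mean = {m:.2f}, SD = {s:.2f}, n = {len(group)}")

    # Overlapping mean +/- SD ranges are read informally as "no clear
    # difference" rather than as a significance-test result.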

7
Respondent Breakdown (1)
  • Number of Respondents: 28
  • Response Rate
  • hard to count exactly the eligible, accessible
    possible respondents
  • estimate: 35 - 40 were eligible & accessible
  • response rate: 70 - 80% (28/40 = 70%; 28/35 = 80%)
  • underrepresentation of current off-campus
    students

8
Respondent Breakdown (2)
  • Demographics
  • When respondent first attempted exam
  • Respondent first language
  • native English speakers: 8 (28.6%)
  • non-native speakers: 20 (71.4%)
  • Year of study tenure
  • DAEC members: 5 students (17.9%)

9
Results: Exam Timing
  • Best time of calendar year
  • Best time of study tenure
  • comments
  • those indicating "later"
  • time to take all courses
  • more time to formulate research plan
  • time for knowledge to sink in
  • those indicating "earlier"
  • sufficient time to take courses
  • get it out of the way
  • knowledge from classes fresh
  • time to retake if failed
  • more free time to study
  • Best amount of time between written and oral
    components

10
Results: Preparation – Written & Oral (1)
  • Overall Preparation
  • Written
  • 95% CI: 3.07 ± 0.30 (28 respondents)
  • Oral
  • 95% CI: 2.93 ± 0.33 (27 respondents)

For five-point scale questions, 5 corresponds
to the extreme presence/positive end of the scale
(e.g., "extremely prepared"), and 1 corresponds
to the extreme absence/negative end of the scale
(e.g., "not prepared"). A sketch of the interval
computation follows.
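
The slides do not document the exact CI procedure; a common small-sample approach, sketched here under that assumption, uses the t distribution:

    # Hedged sketch: t-based 95% CI for the mean of a 5-point item.
    # Whether the survey used exactly this formula is an assumption.
    from math import sqrt
    from statistics import mean, stdev
    from scipy.stats import t

    def ci95(responses):
        n = len(responses)
        m, s = mean(responses), stdev(responses)
        half = t.ppf(0.975, n - 1) * s / sqrt(n)  # CI half-width
        return m, half

    # Consistency check: a reported half-width of 0.30 with n = 28 implies
    # a sample SD of about 0.30 * sqrt(28) / t.ppf(0.975, 27), i.e. ~0.77.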
11
Results: Preparation – Written & Oral (2)
  • Time allotted for exam study
  • When respondent began study
  • Study time per week

12
Results: Preparation – Written & Oral (3)
  • Coursework
  • Helpfulness: 95% CI 3.93 ± 0.30 (28
    respondents)
  • Able to take all needed classes?
  • Yes: 18 (64.3%); No: 10 (35.7%)
  • Reasons given for not being able to take classes
  • not enough time before exam
  • course availability
  • first-year courses too demanding to take enough
    key courses

13
Results: Preparation – Written & Oral (4)
  • Coursework (continued)
  • Particularly helpful courses
  • chemical principles (17 respondents), microbial
    principles (13), process principles (12), fate of
    contaminants (8), air pollution (3)
  • key reasons
  • course material mirrors comprehensive exam
    questions
  • teach important, fundamental concepts
  • Suggestions
  • instruction should better connect coursework to
    the Ph.D. exam & to general Ph.D. applicability
  • challenge of courses and exam should be comparable

14
Results: Preparation – Written & Oral (5)
  • Old written comprehensive exams
  • Helpfulness: 95% CI 3.44 ± 0.47 (27
    respondents)

15
Results: Preparation -- AEES Mock Exam
  • Mock exam participation
  • Yes: 22 (78.6%); No: 5 (17.9%)
  • Helpfulness to written component preparation
  • Helpfulness to oral component preparation
  • Suggestions
  • more preparation by students participating as
    examiners
  • use standardized question list
  • more focus on questions outside of student
    research area
  • faculty involvement
  • simulation of written component

16
Results: Testing – Written (1)
  • Unexpectedness of Content
  • 95% CI: 2.25 ± 0.37 (note: scale measures
    unexpectedness)
  • Particular content found unexpected: varied comments
  • Unexpectedness vs. overall preparation: r = -0.47
    (see the sketch after this list)
  • unexpectedness inversely related to preparation
    for many
  • Aspects associated with specificity/broadness
  • Different interpretations of "specific" & "broad"
  • Some thought exam overall too specific, others
    thought too broad (breadth vs. depth?)
  • Biological & physical questions/sections noted as
    too specific by some
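
A brief sketch of the Pearson correlation behind results like r = -0.47; the paired ratings below are hypothetical, chosen only to illustrate why a negative r indicates an inverse relationship:

    # Hedged sketch (hypothetical data): Pearson r between two 5-point items.
    from scipy.stats import pearsonr

    unexpectedness = [2, 1, 3, 2, 4, 1, 3, 2]  # hypothetical ratings
    preparation = [4, 4, 2, 3, 1, 5, 2, 3]     # hypothetical ratings

    r, p = pearsonr(unexpectedness, preparation)
    print(f"r = {r:.2f}")  # negative r: higher unexpectedness pairs with
                           # lower reported preparation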

17
Results: Testing – Written (2)
  • Difficulty relative to expected: 95% CI 3.21
    ± 0.19
  • Ampleness of time to answer questions
  • 95% CI: 2.57 ± 0.32
  • Particular sections/questions consuming too much
    time
  • chemical principles questions/section (multi-step,
    requiring much calculation time)
  • biological questions/section
  • Balance of knowledge tested
  • 95% CI: 2.86 ± 0.31
  • Comments
  • 3 sections provide balance
  • questions/sections biased toward particular
    courses
  • numbers & types of faculty not balanced, so exam
    not balanced
  • imbalance in professor styles of asking questions

18
Results: Testing – Oral (1)
  • Unexpectedness of Content
  • 95% CI: 2.11 ± 0.44 (27 respondents) (note:
    scale measures unexpectedness)
  • Aspects associated with specificity/broadness
  • Different interpretations of "specific" & "broad"
  • Some thought exam overall too specific, others
    thought too broad (breadth vs. depth?)
  • Comments
  • questions irrelevant to research & too specific
  • questions focused mainly on research
  • specific to weaker areas of written component
  • questions asked in specific ways by professors
    hindered understanding

19
Results: Testing – Oral (2)
  • Difficulty relative to expected
  • 95% CI: 3.26 ± 0.28 (27 respondents)
  • Difficulty vs. unexpectedness: r = 0.44
  • Difficulty directly related to exam surprises for
    some?
  • Balance of knowledge tested
  • 95% CI: 2.74 ± 0.36 (27 respondents)
  • Difficulty vs. knowledge balance: r = -0.42
  • Difficulty result of imbalance for some?

20
Results: Communication
  • Understanding of grading criteria
  • Communication of evaluation of written exam
  • Other communication issues
  • awareness of exam format & conditions
  • connection of coursework to exam
  • varied instances of content unexpectedness
  • styles of questioning
  • others not yet discussed?

21
Results: Outcomes
  • Written exam performance vs. expected
  • Research applicability of knowledge tested by
    written exam
  • Some did not answer question (missed
    accidentally or skipped intentionally?)
  • 95% CI: 2.46 ± 0.35
  • How applicable should knowledge be?
  • Proper goal of exam
  • screen out unqualified
  • test EnvE fundamentals
  • test abilities in specific research area

22
Results: Suggestions (1)
  • Exam satisfaction

23
Results: Suggestions (2)
  • Suggestions for specific changes
  • better means of testing ability to perform
    research
  • written component should test EnvE fundamentals;
    oral should focus solely on research topic
  • ability to select problems on written component
    according to research area
  • written component should assess problem-solving
    approaches over knowledge recall
  • student should be able to pass exam by making a
    "B" or higher in core courses
  • standardize faculty participating in oral exam
  • remove "hazing" approach to oral component

24
Survey Evaluation
  • Survey coverage of concerns
  • Some did not answer question (missed
    accidentally or skipped intentionally?)
  • Coverage vs. exam satisfaction: r = 0.33
  • Some students who were dissatisfied with the exam
    may have other concerns to address?
  • Comments about survey
  • Pessimism seems to exist for some about the
    ability of assessment to lead to meaningful changes

25
Concluding Remarks
  • Key findings
  • Student-faculty communication deficiencies
  • Balance issues
  • Ampleness of time to answer questions on written
    exam
  • Applicability of knowledge tested to Ph.D. work
  • Adequacy of time to prepare for exam
  • Feasibility of change
  • Continuing dialogue
  • among students
  • among faculty
  • between students & faculty
  • Testing effectiveness of changes to exam and to
    associated procedures

26
Thank you for your time!
  • Please feel free to contact the DAEC to
    contribute additional input or to ask any
    questions.