Title: ECU Center for Faculty Excellence
1 ECU Center for Faculty Excellence
- Using a Program Outcome Model to Develop an Assessment Program of Faculty Development Services
- Presented to the 2009 UNC TLTC Conference
- March 2009
- by Dorothy Muller, Kevin Gross, and Joyce Joines Newman
2 Objectives of this Session
- Share background information about the ECU Center for Faculty Excellence
- Describe our recent program review and recommendations for development
- Share our uses of technology in meeting the CFE mission
- Explain our plan and first steps in moving to outcomes assessment, beginning with a logic model
3 CFE Reorganization
- ECU Center for Faculty Development established in 1995
- Reorganization and creation of the ECU Center for Faculty Excellence in 2006 (overview)
- New/larger facilities, including conference rooms
- Two new positions: instructional consultant and statistics and research consultant
- Expanded programming
4 CFE Mission
- The mission of the ECU Center for Faculty Excellence, a unit within the Division of Academic and Student Affairs, is to provide faculty (including tenured, tenure-track, adjunct, and emerging faculty) with resources and services that foster and support their success at the university in teaching, research, and service, and to work with other units and offices to accomplish that mission. The CFE is committed to teaching and learning principles and initiatives designed to promote scholarly teaching, recognize and reward outstanding teaching, provide assessment of and growth in teaching and learning, nurture research, and invite peer collaborations and review.
5 CFE Program Review
- We were charged to provide more programming for new, continuing, and emerging faculty. During 2007-2008, we offered 65 programs (sessions, Faculty Interest Groups), in addition to individual consultations and new faculty orientation (a week-long program prior to the beginning of school in August). During fall 2008, approximately 60 sessions/workshops were conducted through the center.
- In summer 2008, two external reviewers who had read our self-study spent a day meeting with various campus constituencies and talking with center staff.
- Based upon that review and our SACS preparation, we determined to enhance our assessment by supplementing satisfaction surveys with outcomes assessment.
6 Technology in the CFE
- Technology is important in helping the CFE meet its mission.
- Website with program information and links to resources
- Blackboard site for New Faculty Orientation support
- MediaSite recording and posting of many of our programs (to accommodate DE faculty as well as on-campus faculty)
- Development of a Second Life site for online office hours
- Online registration and evaluation surveys using database registration and Perseus surveys
7 Website
- Repository for documents, such as modified peer observation instruments
8 Blackboard
- The Pirates Aboard New Faculty Orientation Blackboard site provides information and an example of Blackboard as a teaching platform.
9 MediaSite
- Where possible and with permission, MediaSite recordings are made...
10 MediaSite
- ...and placed on the server for faculty viewing. Currently, 136 recordings are available.
11 Second Life
- We have received space in Second Life and are beginning to create our virtual CFE with office hours.
12 Online Registration
- At present we are using an online registration system developed by an IT consultant. We are also investigating using SharePoint.
13 Online Registration
- The online registration system allows for email reminders and workshop rosters, as sketched below.
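As an illustration only (not the actual CFE system), here is a minimal Python sketch of how a registration database can drive rosters and reminder lists; the records, field names, and addresses are hypothetical.

  from datetime import date, timedelta

  # Hypothetical registration records; field names are illustrative only.
  registrations = [
      {"name": "A. Smith", "email": "asmith@example.edu", "workshop": "SPSS Basics", "date": date(2009, 3, 20)},
      {"name": "B. Jones", "email": "bjones@example.edu", "workshop": "SPSS Basics", "date": date(2009, 3, 20)},
      {"name": "C. Lee", "email": "clee@example.edu", "workshop": "Grant Writing", "date": date(2009, 4, 3)},
  ]

  def roster(workshop):
      """List everyone registered for a given workshop."""
      return [r["name"] for r in registrations if r["workshop"] == workshop]

  def reminders_due(today, days_ahead=2):
      """Email addresses for workshops starting within days_ahead days."""
      window = today + timedelta(days=days_ahead)
      return [r["email"] for r in registrations if today <= r["date"] <= window]

  print(roster("SPSS Basics"))             # ['A. Smith', 'B. Jones']
  print(reminders_due(date(2009, 3, 19)))  # addresses for upcoming sessions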
14 Surveys of Satisfaction
- After each session, participants are asked to complete a short Perseus survey, which we use in planning.
15 Surveys of Satisfaction
- These formative surveys measure faculty satisfaction and provide suggestions for future programming. A sketch of the kind of summary they yield follows.
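A minimal sketch of summarizing such formative data, assuming 5-point Likert items; the items and responses below are hypothetical, not actual Perseus output.

  from statistics import mean

  # Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree).
  responses = {
      "The session met my expectations": [5, 4, 4, 5, 3],
      "I can apply what I learned": [4, 4, 5, 3, 4],
  }

  for item, scores in responses.items():
      pct_agree = 100 * sum(s >= 4 for s in scores) / len(scores)
      print(f"{item}: mean {mean(scores):.1f}, {pct_agree:.0f}% agree")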
16 Outcomes Assessment
- But surveys of satisfaction do not tell us whether we have accomplished our goals of developing and enhancing competence and community to foster and support faculty success at the university.
- Therefore, we are developing a program outcomes assessment methodology to evaluate the effectiveness of center services, using Measuring Program Outcomes: A Practical Approach (United Way of America, 1996) as a guide.
17 Traditional Measurements
- Inputs: resources dedicated to or consumed by a program (money, volunteers, facilities, equipment, supplies, staff time, training, constraints on the program such as regulations)
- Activities: what the program does with the inputs to fulfill its mission (strategies, techniques, types of activities)
- Outputs: the direct products of program activities, usually measured in terms of volume of work accomplished (number of classes taught, sessions conducted, materials distributed, participants served)
18 The Program Outcome Model
- (Figure: the program outcome model, from Measuring Program Outcomes: A Practical Approach, p. 18)
19 What is an outcome?
- A benefit to a participant of a program
- May occur during or after the program
- May be initial, intermediate, or long-term
20 What is NOT an outcome?
- Operations such as recruiting or training staff or volunteers, purchasing or upgrading equipment, and various support and maintenance activities
- Number of participants served
- Participant satisfaction (often measured by evaluations)
- These examples do not represent benefits to or changes in participants and thus are not outcomes.
21 Outcome Measurement
- Outcomes: benefits or changes for individuals or populations during or after participating in program activities; what participants know, think, or can do, how they behave, or what their condition is (related to behavior, skills, knowledge, attitudes, values, condition, or other attributes)
- Outcomes can be confused with outcome indicators (specific items of data that are tracked to measure how well a program is achieving an outcome) or with outcome targets (the objectives for a program's level of achievement). A small example distinguishing the three follows.
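To keep the three terms straight, here is an illustrative sketch; the outcome, indicator, and target below are hypothetical examples, not actual CFE measures.

  # Hypothetical example distinguishing outcome, indicator, and target.
  outcome = "Faculty gain skills in statistical analysis"  # the benefit itself
  indicator = "Consultees who run their own analyses within 3 months"  # tracked data
  target = 0.75  # objective: 75% of consultees reach the indicator

  consultees, achieved = 20, 16
  rate = achieved / consultees
  print(f"Indicator rate: {rate:.0%}; target {'met' if rate >= target else 'not met'}")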
22 Why measure outcomes?
- To see if programs really make a difference in the lives of people!
- To provide clearer evidence of actual benefits for people
- To help programs improve services, adapt, and become more effective
- To give managers and staff a clearer picture of their purpose, leading to better service delivery
- To show both where services are effective and where they are not performing as expected
23 Results of Outcome Measurement
- Outcome data can
  - Strengthen existing services
  - Target effective services for expansion
  - Identify staff and volunteer training needs
  - Develop and justify budgets
  - Prepare long-range plans
  - Focus administrators' attention on programmatic issues
  - Assure potential participants and funders that programs produce results
24 Limitations and Potential Problems of Outcomes Measurement
- Outcome findings may show that participants are not experiencing intended benefits, but they do not show where the problem lies or how to fix it. To correct problems, an organization probably still needs to collect traditional data. Outcomes measurement is in addition to existing data collection efforts, not an alternative.
- Outcomes measurement does not prove that a program, or it alone, caused the outcomes. Only program impact research can separate a program's influence from other factors that might affect participants.
- Outcomes measurement doesn't reveal whether the outcomes being measured are the right ones for a particular program, that is, the ones that reflect meaningful change in participants.
- There are no established, readily available indicators and measurement methods for the outcomes of some programs.
25 Deciding Where to Start
- It is best to start with just one or two programs.
- A program is a set of related activities and outputs having common or closely related purposes to which resources are assigned.
- The ECU CFE has started with the statistics and research program.
26 CFE Inputs (Research/Stats)
- Resources dedicated to or consumed by the program
- Consultant position
- Consultant training
- Facilities: office space
- Equipment and supplies: computers, software, statistical/research resources
- Staff support
27 CFE Activities (Research/Stats)
- What the program does with the inputs to fulfill its mission
- Meeting with individual faculty to discuss/work on research
- Research and planning for individual consultations and follow-up tasks
- Planning CFE workshops
- Teaching CFE workshops
- Networking with faculty and staff resource people
- Cultivating collaborations
28 CFE Outputs (Research/Stats)
- The direct products of program activities
- Number of faculty assisted
- Number of meetings
- Hours in meetings
- Follow-up hours
- Number of workshops
- Number of workshop participants
29 CFE Outcomes (Research/Stats)
- Benefits for participants during and after program activities
- Submit an article for publication/have an article accepted
- Present research at a conference
- Submit a grant proposal/have research funded
- Gain knowledge and skills in statistical analyses
- Gain knowledge and skills in research design
- Gain confidence in abilities as a researcher
- Network with other researchers at ECU
- Learn ECU's research-related infrastructure
- Earn tenure or promotion
- Contribute to scholarship in the field
- Establish a research agenda
30 8 Steps to Outcome Measurement
- The CFE is working on Step 2.
- (Figure: the eight-step process, from Measuring Program Outcomes: A Practical Approach, p. 6)
31 Step 1. Get Ready
- Assemble and orient an outcome measurement work group (which can seek additional input, feedback, and expertise as needed)
- Decide which program to measure
- Develop a timeline
- Share your game plan with key players
32 Step 2. Choose the Outcomes You Want to Measure
- Gather ideas for program outcomes
- Construct a logic model for your program
- Select the outcomes that are important to measure
- Get feedback
33 Brainstorming Your Outcomes
- Use a worksheet to brainstorm about the inputs, activities, outputs, and outcomes.
- Don't worry about measurement at this time; that comes later.
- Avoid thinking too big: think at the individual (i.e., person) level, not the institutional level.
34 Constructing a Logic Model
- Use a worksheet to brainstorm about the logic model.
- A logic model is a description of how the program theoretically works to achieve benefits for participants.
- It is the if-then sequence of changes that the program intends to set in motion through its inputs, activities, and outputs (a sketch of this chain follows the list below).
- Levels of outcomes:
  - Initial: most closely related to and influenced by the program's outputs. Most direct program influence. Rarely represent major change; closer to outputs.
  - Intermediate: link a program's initial outcomes to longer-term outcomes. Often are changes in behavior that result from participants' new knowledge, attitudes, or skills.
  - Longer-term: the ultimate outcomes a program desires to achieve for its participants. Less direct program influence over achievement; greater likelihood of intervening forces. Should not go beyond the program's purpose or the scope of the target audience.
- Construct your program logic model using a diagram.
- CFE research/stats logic model
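As a sketch only, here is one way to express the if-then chain for the research/stats program in code; the entries are condensed from the slides above, and the code structure is our illustration, not part of the United Way model.

  # Illustrative if-then chain for the research/stats program.
  logic_model = {
      "inputs": ["consultant position", "training", "facilities", "equipment"],
      "activities": ["individual consultations", "CFE workshops", "networking"],
      "outputs": ["faculty assisted", "meetings held", "workshops taught"],
      "outcomes": {
          "initial": ["gain statistics and research-design knowledge and skills"],
          "intermediate": ["apply skills: submit articles and grant proposals"],
          "longer_term": ["establish a research agenda", "earn tenure or promotion"],
      },
  }

  # The if-then reading: IF inputs support the activities, and the activities
  # produce the outputs, THEN the outcome chain should follow.
  for level, outcomes in logic_model["outcomes"].items():
      print(f"{level}: {'; '.join(outcomes)}")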
35 Next Steps
- Steps 3 through 8 identify how the process continues.
- Refining your measurement system may require completing Steps 5 through 7 more than once.
36 Step 3. Specify Indicators for Your Outcomes
- Specify one or more indicators for each outcome: the specific items of information that track a program's success
- Decide what factors could influence participant outcomes
- Use indicators you can influence
37 Step 4. Prepare to Collect Data on Your Indicators
- Identify data sources for your indicators
- Design data collection methods
- Pretest your data collection instruments and procedures
38 Step 5. Try Out Your Outcome Measurement System
- Develop a trial strategy
- Prepare the data collectors
- Track and collect outcome data
- Monitor the process
39 Step 6. Analyze and Report Your Findings
- Enter the data and check for errors
- Tabulate the data
- Analyze the data
- Provide explanatory information related to your findings
- Present your data in clear, understandable terms (a sketch of the enter/check/tabulate steps follows)
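A minimal sketch of entering, checking, and tabulating indicator data, assuming simple per-participant records; the records and field names here are hypothetical.

  # Hypothetical indicator records, one per consultee.
  records = [
      {"id": 1, "workshops_attended": 3, "article_submitted": True},
      {"id": 2, "workshops_attended": -1, "article_submitted": False},  # entry error
      {"id": 3, "workshops_attended": 2, "article_submitted": True},
  ]

  # Check for errors: flag out-of-range values before tabulating.
  valid = [r for r in records if r["workshops_attended"] >= 0]
  flagged = [r["id"] for r in records if r["workshops_attended"] < 0]

  # Tabulate: simple counts and rates on the clean data.
  n = len(valid)
  submitted = sum(r["article_submitted"] for r in valid)
  print(f"Flagged records: {flagged}")
  print(f"Article submission rate: {submitted}/{n} = {submitted / n:.0%}")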
40 Step 7. Improve Your Outcome Measurement System
- Review your trial-run experience
- Make necessary adjustments
- Start full-scale implementation
- Monitor and review your system periodically
41 Step 8. Use Your Findings Internally
- Provide direction for staff
- Identify training and technical assistance needs
- Identify program improvement needs and strategies
- Support annual and long-range planning
- Guide budgets and justify resource allocation
- Suggest outcome targets
- Focus attention on policy and programmatic issues
42 Step 8. Use Your Findings Externally
- Recruit talented staff and volunteers
- Promote your program to participants and referral sources
- Identify partners for collaborations
- Enhance your program's public image
43 Questions for Discussion Board
- Does the logic model for the CFE research/stats program
  - Include all the activities and outcomes that are important for the CFE?
  - Make the appropriate connections between the CFE's inputs, activities, outputs, and outcomes?
- Are the outcomes identified as important to measure
  - Relevant to the mission/objectives of the CFE?
  - Outcomes for which the CFE should be held accountable?
  - Likely to be effective in communicating the benefits of what the CFE does for ECU faculty?
44 Contact for Additional Information
- Center for Faculty Excellence website
- www.ecu.edu/cfe