Title: Evaluating Supported Employment Programs
1. Evaluating Supported Employment Programs
Anthony (Tony) Plotner, MS, CVE, CRC (plotner_at_uiuc.edu)
Kathleen (Kat) Oertle, MS, CVT, CRC (oertle_at_uiuc.edu)
- October 31, 2006
- Rehabilitation Services Administration (RSA) Region V
- Community Rehabilitation Providers Rehabilitation Continuing Education Program (CRP-RCEP) at the University of Illinois, Urbana-Champaign
- www.ed.uiuc.edu/illinoisrcep/
2. Objectives
- At the completion of the presentation you will know:
  - what evaluation is,
  - the reasons to conduct an evaluation,
  - how to plan and implement an evaluation,
  - and what to do with the results.
3. Outline
- What
- Evaluation and Research
- Challenges and Myths
- Why
- How
- What's next
- Field Experience
4. What is Evaluation?
- Program evaluation is the process of carefully collecting information about a program in order to improve the program, to make informed decisions about the program, and to make judgments about program quality.
5. Evaluation ≠ Research
- Documenting the impact of a program is not the same as research
- Program evaluation findings are contextual information
- Evaluation can be a part of a research project
6. Views About Evaluation: Challenges and Myths
- There is not enough time to do evaluations
- Evaluation is too difficult
- Evaluation is very expensive
- Evaluation is forced by outsiders
- Intuition is enough
- Evaluation may be threatening
- It takes away from doing my job
7. A Few of Our Evaluation Assumptions
- Stakeholder involvement and participatory evaluation
  - a process
  - the ideal
  - important for utilization to occur
- Assessing program quality for program improvement
8. Supported Employment Program Evaluation
[Diagram: the ongoing program evaluation cycle, carried out by the evaluation team, with these components]
- Evaluation planning and designing
- Data collection
  - Observations: employment sites, SEP staff meetings, work area (the office)
  - Document analysis: consumer files, SEP staff training, SEP meeting minutes, instructional materials
  - Interviews: SEP staff, employers, consumers
- Data analysis
- Key findings and recommendations
- Reporting of results
- Action planning and implementation
- Reassess evaluation questions and plan
Key: current evaluation activity vs. future evaluation activity
9. Crafting the Framework
- What
- Why
- Who
- Where
- How
- What
10. Getting Started
Evaluation Planning and Designing
- What is the program to be evaluated?
- What is the issue?
- What is the behavior of interest?
- What are the goals?
- What are the community concerns?
- What is the intention of the evaluation?
Planning and Designing
11. Why Do Evaluation?
- To clarify program goals
- To determine whether the program is working as planned
- To answer crucial organizational questions and make decisions
- To comply with funders' requirements
- To examine a program's impact on participants
- To verify that resources are used to meet unmet needs
- To maintain and improve quality
Planning and Designing
12. Audience
- You
- Consumers
- Funders
- Administration
- Community
- Professionals
Planning and Designing
13. Forming the Evaluation Questions
[Diagram: question marks linking What, Why, and Who]
Planning and Designing
14. Evaluation Questions: Some Examples
- Is the SEP staff providing quality services to its customers (e.g., supported employees, employers, and staff)?
- Are the needs of SEP customers being met? How are staff anticipating customers' needs and evolving to meet those needs?
- Are the job development and marketing efforts effective, and how can this area be improved?
- Does the SEP staff develop and maintain relationships with the community and current employers?
15. Information Sources
- Consumers
- Staff
- Employers
- Community Members
- Records
- Professional Literature
- Field Standards
- Your Perceptions and Experiences
Data Collection
16. Some Methods of Data Collection (one way to organize these by evaluation question is sketched after this list)
- Survey
- Interviews
- Observation
- Focus Group
- Review of Program Reports
- Records Review
Data Collection
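A practical way to connect these methods to the evaluation questions and information sources from the previous slides is an evaluation matrix. The sketch below is a minimal, illustrative example in Python; the specific questions, sources, and methods are placeholders drawn loosely from the slides, not a prescribed instrument.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationQuestion:
    """One row of an evaluation matrix: a question, where to get the data, and how."""
    question: str
    sources: List[str] = field(default_factory=list)   # e.g., consumers, staff, employers, records
    methods: List[str] = field(default_factory=list)   # e.g., survey, interviews, observation

# Illustrative placeholder entries only.
evaluation_matrix = [
    EvaluationQuestion(
        question="Is the SEP staff providing quality services to its customers?",
        sources=["consumers", "staff", "employers"],
        methods=["interviews", "observation", "survey"],
    ),
    EvaluationQuestion(
        question="Are the job development and marketing efforts effective?",
        sources=["records", "employers", "field standards"],
        methods=["records review", "focus group"],
    ),
]

for row in evaluation_matrix:
    print(row.question)
    print("  sources:", ", ".join(row.sources))
    print("  methods:", ", ".join(row.methods))
```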
17. Data Quality Criteria (a small reliability-check sketch follows this list)
- Triangulation
- Instrument Evaluation
- Valid (measuring what you intend to measure)
- Reliable (measuring consistently)
- Metaevaluation
- Field Standards
- Checklist
- Peer Review
Data Collection
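For the "reliable (measuring consistently)" criterion, one common check on a survey instrument is an internal-consistency statistic such as Cronbach's alpha. The sketch below is a minimal illustration assuming Likert-scale item scores; the score matrix is made-up example data, not results from an actual SEP evaluation.

```python
from statistics import variance

def cronbach_alpha(scores):
    """Internal-consistency estimate for a set of survey items.

    scores: list of respondent rows, each row a list of item scores.
    """
    k = len(scores[0])                                          # number of items
    item_vars = [variance(column) for column in zip(*scores)]   # per-item variance
    total_var = variance([sum(row) for row in scores])          # variance of respondents' total scores
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Made-up example data: 5 respondents x 4 Likert-scale items.
example_scores = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
print(f"Cronbach's alpha: {cronbach_alpha(example_scores):.2f}")
```

A commonly cited rule of thumb treats values around 0.7 or higher as acceptable consistency, though the appropriate threshold depends on how the instrument will be used.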
18. SEP Evaluation Plan: Guiding Document
19. Example SEP Evaluation Plan
20. Example SEP Evaluation Plan
21. Example SEP Evaluation Plan
22. Example SEP Evaluation Plan
23. Management Plan (a structured-data sketch follows this list)
- Timeline
- Responsibilities
- Product
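A management plan like this can be kept as simple structured data so the timeline, responsibilities, and products stay visible in one place. The sketch below is hypothetical; the activities, dates, and products are placeholders, not the plan shown on the example slides that follow.

```python
from dataclasses import dataclass

@dataclass
class PlannedActivity:
    """One line of a management plan: what happens, when, by whom, producing what."""
    activity: str
    timeline: str          # when the activity occurs
    responsibilities: str  # who carries it out
    product: str           # what the activity produces

# Hypothetical placeholder activities only.
management_plan = [
    PlannedActivity("Interviews with SEP staff", "Months 1-2", "Evaluation team", "Interview summaries"),
    PlannedActivity("Review of consumer files", "Month 2", "Evaluation team", "Document-analysis notes"),
    PlannedActivity("Report of key findings", "Month 4", "Evaluation team lead", "Written report for stakeholders"),
]

for item in management_plan:
    print(f"{item.timeline:<10} {item.activity:<30} -> {item.product} ({item.responsibilities})")
```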
24. Evaluation Activity Implementation Plan
25. Example Evaluation Activity Implementation Plan
26. Making Sense of the Data (a rubric sketch follows this list)
- Criteria for Judging Program Quality
- Program Mission
- Program Objectives and Goals
- Best Practice and Field Standards
Data Analysis
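During data analysis, the judgment criteria above can be applied as a simple rubric: each key finding is rated against the program mission, the program objectives and goals, and best practice and field standards. The sketch below is only an illustrative assumption about how such ratings might be recorded; the findings and ratings are invented examples.

```python
# Criteria taken from the slide above; findings and ratings are invented examples.
CRITERIA = [
    "program mission",
    "program objectives and goals",
    "best practice and field standards",
]

RATINGS = ("meets", "partially meets", "does not meet")

findings = {
    "Job development and marketing efforts": {
        "program mission": "meets",
        "program objectives and goals": "partially meets",
        "best practice and field standards": "does not meet",
    },
    "SEP staff training": {
        "program mission": "partially meets",
        "program objectives and goals": "does not meet",
        "best practice and field standards": "does not meet",
    },
}

for finding, ratings in findings.items():
    print(finding)
    for criterion in CRITERIA:
        assert ratings[criterion] in RATINGS   # guard against stray labels
        print(f"  {criterion}: {ratings[criterion]}")
```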
27. Criteria for Making Judgments of Program Quality
Data Analysis
28. Example Criteria for Making Judgments of Program Quality
29. Now What?
Key Findings And Recommendations
- Recommendations
- Reporting
- Action Planning and Implementation
- On-going Evaluation
30. Findings From Our Field Experience
- Untapped Potential
- Lack of SEP Training
- Limited Outcomes
- Nonexistent and One-dimensional Relationships
33. Conclusion
- View evaluation as learning
- Integrate evaluation into the way we work
- Build evaluation in upfront
- Ask tough questions
- Make measurement meaningful
- Be accountable
34. Some Sources for More Information
- The Joint Committee on Standards for Educational Evaluation is incorporated as a private nonprofit organization. In addition to setting standards in evaluation, it is also involved in reviewing and updating its published standards (every five years), training policymakers, evaluators, and educators in the use of the standards, and serving as a clearinghouse on evaluation standards literature. http://www.wmich.edu/evalctr/jc/
- The Evaluation Center: Program Evaluations Metaevaluation Checklist (based on The Program Evaluation Standards). The Evaluation Center's mission is to advance the theory, practice, and utilization of evaluation. The Center's principal activities are research, development, dissemination, service, instruction, and national and international leadership in evaluation. http://www.wmich.edu/evalctr/checklists/program_metaeval.htm
35. References
- Brooks-Lane, N., Hutcheson, S., & Revell, G. (2005). Supporting consumer directed employment outcomes. Journal of Vocational Rehabilitation, 23, 123-134.
- DiLeo, D., & Langton, D. (1996). Facing the future: Best practices in supported employment. St. Augustine, FL: Training Resource Network, Inc.
- Lavin, D. (2000). Reach for the stars: Achieving high performance as a community rehabilitation professional. Spring Lake Park, MN: Rise, Inc.
- Leung, P. (2006, January). Evaluation training. Paper presented at the meeting of the Association of Community Rehabilitation Educators, San Antonio, TX.
- Luecking, R. G., Fabian, E. S., & Tilson, G. P. (2004). Working relationships: Creating career opportunities for job seekers with disabilities through employer partnerships. Baltimore, MD: Paul H. Brookes Publishing Co.
36. References
- Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage. (Chapters 1-4)
- Plotner, A. J., Oertle, K. M., & Trach, J. S. (in preparation). Community rehabilitation provider success: Where we are at and how we can improve. University of Illinois, Urbana-Champaign.
- The Joint Committee on Standards for Educational Evaluation. The Program Evaluation Standards. Retrieved February 19, 2005, from http://www.wmich.edu/evalctr/jc/
- Schwandt, T. A. (2002). Notes on being an evaluator. In Evaluation practice reconsidered (Chapter 11, pp. 187-194). New York: Peter Lang.
- Stufflebeam, D. L. (1999). Program evaluations metaevaluation checklist (based on The Program Evaluation Standards). Retrieved February 19, 2005, from http://www.wmich.edu/evalctr/checklists/program_metaeval.htm