Title: Evaluating HRD Programs
Learning Objectives
- Define evaluation and explain its role and purpose in HRD.
- Compare different models of evaluation.
- Discuss the various methods of data collection for HRD evaluation.
- Explain the role of research design in HRD evaluation.
- Describe the ethical issues involved in conducting HRD evaluation.
- Identify and explain the choices available for translating evaluation results into dollar terms.
Effectiveness
- The degree to which a training (or other HRD) program achieves its intended purpose
- Measures are relative to some starting point
- Measures how well the desired goal is achieved
Evaluation
HRD Evaluation
- The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.
In Other Words
- Are we training
- the right people
- the right stuff
- the right way
- with the right materials
- at the right time?
Evaluation Needs
- Descriptive and judgmental information needed
- Objective and subjective data
- Information gathered according to a plan and in a desired format
- Gathered to provide decision-making information
Purposes of Evaluation
- Determine whether the program is meeting the intended objectives
- Identify strengths and weaknesses
- Determine cost-benefit ratio
- Identify who benefited most or least
- Determine future participants
- Provide information for improving HRD programs
Purposes of Evaluation (continued)
- Reinforce major points to be made
- Gather marketing information
- Determine if training program is appropriate
- Establish management database
Evaluation Bottom Line
- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are benefits of HRD readily evident to all?
How Often Are HRD Evaluations Conducted?
- Not often enough!!!
- Frequently, only end-of-course participant reactions are collected
- Transfer to the workplace is evaluated less frequently
Why HRD Evaluations Are Rare
- Reluctance to have HRD programs evaluated
- Evaluation needs expertise and resources
- Factors other than HRD cause performance improvements, e.g.:
- Economy
- Equipment
- Policies, etc.
Need for HRD Evaluation
- Shows the value of HRD
- Provides metrics for HRD efficiency
- Demonstrates value-added approach for HRD
- Demonstrates accountability for HRD activities
Make or Buy Evaluation
- "I bought it, therefore it is good."
- "Since it's good, I don't need to post-test."
- Who says it's
- Appropriate?
- Effective?
- Timely?
- Transferable to the workplace?
Models and Frameworks of Evaluation
- Table 7-1 lists six frameworks for evaluation
- The most popular is that of D. Kirkpatrick
- Reaction
- Learning
- Job Behavior
- Results
Kirkpatrick's Four Levels
- Reaction
- Focus on trainees' reactions
- Learning
- Did they learn what they were supposed to?
- Job Behavior
- Was it used on the job?
- Results
- Did it improve the organization's effectiveness?
Issues Concerning Kirkpatrick's Framework
- Most organizations don't evaluate at all four levels
- Focuses only on post-training
- Doesn't treat inter-stage improvements
- WHAT ARE YOUR THOUGHTS?
Data Collection for HRD Evaluation
- Possible methods
- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/Performance tests
- Archival performance information
Interviews
- Advantages
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact
- Limitations
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained interviewers needed
Questionnaires
- Advantages
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options
- Limitations
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate
Direct Observation
- Advantages
- Nonthreatening
- Excellent way to measure behavior change
- Limitations
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Need trained observers
Written Tests
- Advantages
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible
- Limitations
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias
Simulation/Performance Tests
- Advantages
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor, and affective domains
- Limitations
- Time consuming
- Simulations often difficult to create
- High cost to develop and use
Archival Performance Data
- Advantages
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects
- Limitations
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes
Choosing Data Collection Methods
- Reliability
- Consistency of results, and freedom from collection-method bias and error
- Validity
- Does the device measure what we want to measure?
- Practicality
- Does it make sense in terms of the resources used to get the data?
Types of Data Used/Needed
- Individual performance
- Systemwide performance
- Economic
Individual Performance Data
- Individual knowledge
- Individual behaviors
- Examples
- Test scores
- Performance quantity, quality, and timeliness
- Attendance records
- Attitudes
Systemwide Performance Data
- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates
Economic Data
- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations
Use of Self-Report Data
- Most common method
- Pre-training and post-training data
- Problems
- Mono-method bias
- Desire to be consistent between tests
- Socially desirable responses
- Response shift bias
- Trainees adjust their internal standards in response to the training, skewing pre/post comparisons
Research Design
- Specifies in advance
- the expected results of the study
- the methods of data collection to be used
- how the data will be analyzed
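A widely used design for HRD evaluation is the pretest-posttest control-group design. The sketch below shows how a training effect would be estimated from such data; the scores are hypothetical, chosen only to illustrate the arithmetic:

```python
from statistics import mean

# Hypothetical pretest/posttest scores (illustrative data only).
trained_pre, trained_post = [60, 55, 70, 65], [80, 75, 85, 82]
control_pre, control_post = [62, 58, 68, 66], [64, 60, 70, 67]

trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)

# The estimated training effect is the trained group's gain beyond
# what the untrained (control) group shows over the same period.
effect = trained_gain - control_gain
print(f"Estimated training effect: {effect:.2f} points")
```

Subtracting the control group's gain screens out improvements caused by factors other than training (economy, equipment, policies, and so on).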
Assessing the Impact of HRD
- Money is the language of business.
- You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.
HRD Program Assessment
- HRD programs and training are investments
- Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers
- You must prove your worth to the organization
- Or you'll have to find another organization
Two Basic Methods for Assessing Financial Impact
- Evaluation of training costs
- Utility analysis
Evaluation of Training Costs
- Cost-benefit analysis
- Compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
- Cost-effectiveness analysis
- Focuses on increases in quality, reduction in scrap/rework, productivity, etc.
Return on Investment
- Return on investment = Results / Costs
Calculating Training Return on Investment

| Operational Results Area | How Measured | Results Before Training | Results After Training | Differences (+ or -) | Expressed in $ |
| --- | --- | --- | --- | --- | --- |
| Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels per day) | $172,800 per year |
| Housekeeping | Visual inspection using 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $ |
| Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year | |
| | Direct cost of each accident | $144,000 per year | $96,000 per year | $48,000 | $48,000 per year |

Total savings: $220,800

Return on Investment = Operational Results / Training Costs
ROI = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
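The ROI figure from the Robinson & Robinson example can be verified with a short calculation (a sketch; the variable names are ours, the dollar amounts come from the example):

```python
# Annual savings from the two dollar-measurable results areas.
quality_savings = 172_800            # fewer rejected panels, $ per year
accident_savings = 144_000 - 96_000  # reduced accident costs, $ per year

total_savings = quality_savings + accident_savings  # $220,800
training_costs = 32_564

roi = total_savings / training_costs
print(f"Total savings: ${total_savings:,}")  # $220,800
print(f"ROI: {roi:.1f}")                     # 6.8
```

Note that the housekeeping improvement contributes nothing to the ROI figure because it could not be expressed in dollars, even though it was a real result.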
Measuring Benefits
- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
- ROI = Benefits / Training costs
Ways to Improve HRD Assessment
- Walk the walk, talk the talk: MONEY
- Involve HRD in strategic planning
- Involve management in HRD planning and estimation efforts
- Gain mutual ownership
- Use credible and conservative estimates
- Share credit for successes and blame for failures
HRD Evaluation Steps
- Analyze needs.
- Determine explicit evaluation strategy.
- Insist on specific and measurable training objectives.
- Obtain participant reactions.
- Develop criterion measures/instruments to measure results.
- Plan and execute evaluation strategy.
Summary
- Training results must be measured against costs
- Training must contribute to the bottom line
- HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster