Title: NRS Data Monitoring for Program Improvement
1. NRS Data Monitoring for Program Improvement
2. Objectives: Day 1
- Describe the importance of getting involved with and using data
- Identify four models for setting performance standards, as well as the policy strategies, advantages, and disadvantages of each model
- Determine when and how to adjust standards for local conditions
- Set policy for rewards and sanctions for local programs
- Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention.
3. Agenda: Day 1
- Welcome, Introduction, Objectives, Agenda Review
- The Power of Data
- Why Get Engaged with Data? Exercise
- The Data-driven Program Improvement Model
- Setting Performance Standards
- Adjusting Standards for Local Conditions
- Establishing a Policy for Rewards and Sanctions
- Getting Under the Data
- Data Pyramids
- Data Carousel
- Evaluation and Wrap-up for Day 1
4. Objectives: Day 2
- Distinguish between the uses of desk reviews and on-site monitoring of local programs
- Identify steps for monitoring local programs
- Identify and apply key elements of a change model
- Work with local programs to plan for and implement changes that will enhance program performance and quality.
5. Agenda: Day 2
- Agenda Review
- Planning for and Implementing Program Monitoring
- Desk Reviews Versus On-site Reviews
- Data Sources (small group work)
- Steps and Guidelines for Monitoring Local Programs
- Planning for and Implementing Program Improvement
- A Model of the Program Improvement Process
- State Action Planning
- Closing and Evaluation
6. STOP! Why Get Engaged with Data?
7. Question for Consideration
- Why is it important to be able to produce
evidence of what your state (or local) adult
education program achieves for its students?
8. The Motivation Continuum
- Intrinsic ↔ Extrinsic
- Which is the more powerful force for change?
9. NRS Data-driven Program Improvement (Cyclical Model)
- STEPS
- Set performance standards
- Examine program elements underlying the data
- Monitor program data, policy, and procedures
- Plan and implement program improvement
- Evaluate progress and revise, as necessary, and
recycle
10. What's Under Your Data? The Powerful Ps
- Performance (Data)
- Program
- Policies
- Procedures
- Processes
- Products
11. NRS Data-driven Program Improvement Model
(Cycle diagram with NRS data at the center: Set Performance Standards → Examine Program Elements Underlying the Data → Monitor Program Data, Policy, Procedures → Plan and Implement Program Improvement → Evaluate Improvement)
12. Educational Gains for ESL Levels and Performance Standards
(Exhibit 1-2)
13. Questions Raised by Exhibit 1-2
- How were performance standards set? Based on past performance?
- Are standards too low at the higher levels?
- Is the performance pattern similar to that of previous years? If not, why not?
- What are programs' assessment and placement procedures? Same assessments for high and low ESL?
- How do curriculum and instruction differ by level?
- What are student retention patterns by level?
14. The Power of Data: Setting Performance Standards
15. Essential Elements of Accountability Systems
- Goals
- Measures
- Performance Standards
- Sanctions and Rewards
16. National Adult Education Goals
- Reflected in NRS Outcome Measures of
- educational gain,
- GED credential attainment,
- entry into postsecondary education, and
- employment.
17. Performance Standards
- Similar to a sales quota: how well are you going to perform this year?
- Should be realistic and attainable, but should stretch you toward improvement
- Set by each state in collaboration with ED
- Each state's performance is a reflection of the aggregate performance of all the programs it funds
18. Standards-setting Models
- Continuous Improvement
- Relative Ranking
- External Criteria
- Return on Investment (ROI)
19. Continuous Improvement
- Standard based on past performance
- Designed to make all programs improve compared to themselves
- Works well when there is stability and a history of performance on which to base the standard
- Ceiling reached over time, resulting in little additional improvement
20. Relative Ranking
- Standard is mean or median performance of all programs
- Programs ranked relative to each other
- Works for stable systems where median performance is acceptable
- Improvement focus mainly on low-performing programs
- Little incentive for high-performing programs to improve
21. External Criteria
- Set by formula or external policy
- Promotes a policy goal to achieve a higher standard
- Used when large-scale improvements are called for, over the long term
- No consideration of past performance; may be unrealistic or unattainable
22. Return on Investment
- Value of program ÷ Cost of program
- A business model; answers the question, "Are services or programs worth the investment?"
- Can be a powerful tool for garnering funding (high ROI) or for losing funding (low ROI)
- May ignore other benefits of the program
(The four standard-setting models are illustrated in the sketch below.)
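For illustration only, here is a minimal Python sketch of how the four standard-setting models might translate into numbers. All program names, rates, and dollar figures are invented, and the formulas are simplified assumptions rather than NRS definitions.

```python
from statistics import median

# Last year's educational-gain rates for five hypothetical local programs
past_performance = {"A": 0.38, "B": 0.45, "C": 0.52, "D": 0.41, "E": 0.60}

# Continuous improvement: each program must beat its own past performance
# (here by an assumed 3 percentage points)
continuous = {p: round(rate + 0.03, 2) for p, rate in past_performance.items()}

# Relative ranking: the standard is the median performance of all programs
relative = median(past_performance.values())

# External criteria: the standard is set by policy, ignoring past performance
external = 0.55

# Return on investment: value of program divided by cost of program
value_of_program = 250_000   # invented estimate of program benefits
cost_of_program = 100_000
roi = value_of_program / cost_of_program   # 2.5 -> worth the investment?

print(continuous)               # per-program targets
print(relative, external, roi)  # 0.45, 0.55, 2.5
```

Note how only the continuous-improvement model produces a different target for every program; the other three apply one number to all.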
23. Decision Time for State Teams
- Which model(s) do you favor for setting standards for/with locals?
- Is it appropriate to use one statewide model or different models for different programs?
- How will you involve the locals in setting the standards they will be held to?
24. Question for Consideration
- How do the standard-setting model(s) that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs?
25. Adjusting Standards for Local Conditions
- Research suggests that standards often need to be adjusted for local conditions before locals can work to improve program quality.
- WHY IS THIS SO?
26. Factors that May Require Adjustment of Standards
- Student Characteristics
- An especially challenging group
- Students at lower end of level
- Influx of different types of students
- Local Program Elements
- External Conditions
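As one hypothetical illustration of how an adjustment for student characteristics might be computed, the sketch below lowers a program's target when it serves an unusually high share of students at the lowest level. The function name, percentages, and adjustment rate are all invented assumptions, not a prescribed method.

```python
def adjust_standard(base_standard: float,
                    pct_lowest_level: float,
                    typical_share: float = 0.20,
                    adjustment_rate: float = 0.25) -> float:
    """Lower the standard in proportion to how far this program's share of
    lowest-level students exceeds a typical share (all parameters invented)."""
    excess = max(0.0, pct_lowest_level - typical_share)
    return round(base_standard - excess * adjustment_rate, 3)

# A program where 40% of students enter at the lowest ESL level
print(adjust_standard(0.50, pct_lowest_level=0.40))  # 0.45
```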
27. Shared Accountability
- State and locals share responsibility to meet accountability requirements
- State provides tools and environment for improved performance
- Locals agree to work toward improving performance
28. Locals should know
- The purpose of the performance standards
- The policy and programmatic goals the standards are meant to accomplish
- The standard-setting model that the state adopts, and
- That state guidance and support are available to locals in effecting change.
29. Shared Accountability
- Which state-initiated efforts have been easy to implement at the local level?
- Which have not?
- What factors contributed to locals successfully and willingly embracing the effort?
- What factors contributed to a failed effort?
30. Shared Accountability
31. What About Setting Rewards and Sanctions?
- Which is the more powerful motivator: rewards or sanctions?
- List all the different possible reward structures you can think of for local programs.
- How might sanctioning be counter-productive?
- List sanctioning methods that will not destroy locals' motivation to improve or adversely affect relationships with the state office.
32. Variations on a Theme: Exercise
- (Refer to H-10.) Brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards.
- Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards.
- Select a recorder for your group to write one reward per Post-it Note and one sanction per Post-it Note.
- When you have finished, wait for further instructions from the facilitator.
33. Summary of Local Performance Standard-setting Process

Procedure | Goal
Select standard-setting model | Reflect state policies; promote program improvement
Set rewards and sanctions policy | Create incentives; avoid unintended effects
Make local adjustments | Ensure standards are fair and realistic for all programs
Provide T/A | Create atmosphere of shared accountability
Monitor often | Identify and avoid potential problems
34. Getting Under the Data
- NRS data, as measured and reported by states, represent the product of underlying programmatic and instructional decisions and procedures.
35. Four Sets of Measures
- Educational gain
- NRS Follow-up Measures
- Obtained a secondary credential
- Entered and retained employment
- Entered postsecondary education
- Retention
- Enrollment
36. Educational Gain
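As a purely illustrative aside, the sketch below computes one simplified educational-gain rate: the share of students whose exit level is higher than their entry level. Real NRS reporting has more detailed rules (assessment procedures, post-testing requirements), so treat this as a toy example with invented data.

```python
# Invented student records: entry and exit educational functioning levels (EFLs)
students = [
    {"entry_efl": 1, "exit_efl": 2},   # gained a level
    {"entry_efl": 3, "exit_efl": 3},   # no gain
    {"entry_efl": 2, "exit_efl": 4},   # gained two levels
    {"entry_efl": 1, "exit_efl": 1},   # no gain
]

gains = sum(s["exit_efl"] > s["entry_efl"] for s in students)
print(f"Educational gain rate: {gains / len(students):.0%}")  # 50%
```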
37. Follow-up Measures
38. Retention
39. Enrollment
(Pyramid diagram: elements underlying enrollment include community characteristics, class schedules and locations, recruitment, instruction, and professional development)
40. Data Carousel
41. Question for Consideration
- How might it benefit local programs if the state office were to initiate and maintain a regular monitoring schedule to compare local program performance against performance standards?
42. Regular Monitoring of Performance Compared with Standards
- Keeps locals focused on outcomes and processes
- Highlights issues of importance
- Increases staff involvement in the process
- Helps refine data collection processes and products
- Identifies areas for program improvement
- Identifies promising practices
- Yields information for decision-making
- Enhances program accountability.
(A minimal comparison sketch follows this list.)
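Here is a minimal desk-review-style sketch of that comparison. Program names, rates, and targets are invented for illustration; a state would substitute its own NRS data and standards.

```python
# Invented local-program performance vs. invented standards
performance = {"Program A": 0.42, "Program B": 0.55, "Program C": 0.31}
standards   = {"Program A": 0.45, "Program B": 0.50, "Program C": 0.40}

for program, actual in performance.items():
    target = standards[program]
    status = "meets standard" if actual >= target else "below standard: follow up"
    print(f"{program}: {actual:.0%} vs. target {target:.0%} -> {status}")
```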
43. BUT...
- How can states possibly monitor performance of all local programs?
- Don't we have enough to do already??
- Where will we find staff to conduct the reviews?
- You're kidding, right??
44. Not!
45. So... Let's Find Some Answers
- How can you monitor performance of locals without overburdening state staff?
- What successful models are already out there?
- How does your state office currently ensure local compliance with state requirements?
- Can you build on existing structures?
46. Approaches to Monitoring
- Desk Reviews
- Ongoing process
- Useful for quantitative data
- Proposals
- Performance measures
- Program improvement plans
- Staffing patterns
- Budgets
- On-site Reviews
- Single event, lasting 1-3 days
- Useful for qualitative data
- Review of processes and program quality
- Input from diverse stakeholders
47. Advantages and Disadvantages of Desk Reviews

Advantages | Disadvantages
Data, reports, proposals, etc., already in state office | Assumes accurate data that reflect reality
Review can be built into staff's regular workload | Local staff and stakeholders not heard
Data are quantitative and can be compared to previous years | Static view of data; no interaction in context
No travel time or costs required | No team perspective
48. Advantages and Disadvantages of On-site Reviews

Advantages | Disadvantages
Data are qualitative; review of processes and program quality | Stressful for local program and team
Input from perspectives of diverse stakeholders | Arranging site visits and team is time-intensive for both locals and state
State works with locals to explore options for improvement; provides T/A | Requires time out of the office
Opportunity to recognize strengths, offer praise, and identify best practices | Incurs travel costs
49. Data Collection Strategies for Monitoring
- Program Self-Reviews (PSRs)
- Document Reviews
- Observations
- Interviews
50. Program Self-Reviews
- Conducted by local program staff
- Review indicators of program quality
- Completed in advance of monitoring visit; can help focus the on-site review
- Results can guide the program improvement process
51. Document Reviews
- Can review from a distance
- Proposals
- Qualitative and quantitative reports
- Improvement plans
- Can review on-site
- Student files
- Attendance records
- Entry and update records
- Course evaluations
52. Qualitative and Quantitative Data
53. Observations
- Interactions
- During meetings
- At intake and orientation
- In hallways and on grounds
- In the classroom
- Link what is observed to
- Indicators of quality
- Activities in the program plan
- Professional development workshops
54. Interviews
- Help clarify or explore ambiguous findings
- Provide information regarding stakeholders' opinions, knowledge, and needs
- Administrative, instructional, and support staff
- Community partners
- Community agencies (e.g., employment, social services)
- Learners
55. Fill in the Boxes: Monitoring with Indicators of Program Quality
- In teams of 4-5, and using H-12, fill in the data sources you would expect to use, the questions you would ask locals, and the strategies you would use in conducting a desk review versus an on-site review.
56. Steps for Monitoring Local Programs
- Identify state policy for monitoring; gather support from stakeholders.
- Consider past practices when specifying scope of work for monitoring.
- Identify persons to lead and participate in monitoring.
- Identify resources available for monitoring locals.
- Determine process for collecting data, with clearly defined criteria for rating; conduct monitoring.
- Report findings and recommendations.
- Follow up on results.
57. Data Help
- Measure student progress
- Measure program effectiveness
- Assess instructional effectiveness
- Guide curriculum development
- Allocate resources wisely
- Promote accountability
- Report to funders and to the community
- Meet state and federal reporting requirements
- Show trends
58. BUT...
- Data do not help
- If the data are not valid and reliable,
- If the appropriate questions are not asked after reviewing the data, or
- If data analysis is not used for making wise decisions.
59. A Word about the Change Process
- Factors that allow us to accept change:
- There is a compelling reason to do so,
- We have a sense of ownership of the change,
- Our leaders model that they are serious about supporting the change,
- We have a clear picture of what the change will look like, and
- We have organizational support for lasting systemic change.
60. Stages of Change
- Maintenance of the old system
- Awareness of new possibilities
- Exploration of those new possibilities
- Transition to some of those possibilities or changes
- Emergence of a new infrastructure
- Predominance of the new system
61. A Word of Caution
- Start small; don't overwhelm locals with a data dump.
- Begin with the core issues, such as educational gain.
- Listen to what the data tell about the big picture; don't get lost in too many details.
- Work to create trust and build support by laying data on the table without fear of recrimination.
- Provide training opportunities for staff on how to use data.
- Be patient, working with what is possible in the local program.
- Source: Spokane, WA School Superintendent Brian Benzel
62. Planning and Implementing Program Improvement
- Stages of the Program Improvement Process
- Planning
- Implementing
- Evaluating, and
- Documenting Lessons Learned and Making Adjustments, as needed
63. Planning Questions
- Who should be included on your program improvement team?
- How will you prioritize areas needing improvement?
- How will you identify and select strategies for effecting improvement?
64. Guiding Questions for Strategies
- Is the strategy
- Clear and understandable to all users?
- One specific action or activity, or dependent on other activities? (If so, describe the sequence of actions.)
- An activity that will lead to accomplishing the goal?
- Observable and measurable?
- Assignable to specific persons?
- Based on best practices?
- One that all team members endorse?
- Doable: one that can be implemented?
65. Implementation Questions
- Who will be responsible for taking the lead on ensuring that the change is implemented?
- Who will be members of the change team, and what will be their roles?
- How will expectations for the change be promoted and nurtured?
- How will the change be monitored?
66. Evaluation Questions
- How will the changes that are implemented be evaluated?
- How will the team ensure that both short- and long-term effects are measured?
- Who will interpret the results?
- Who will be on the lookout for unintended consequences?
67. Possible Evaluation Results
- Significant improvement with no significant unintended consequences → Stay the course.
- Little or no improvement → Stay the course OR scrap the changes?
- A deterioration in outcomes → Scrap the changes.
(One way to encode these rules is sketched below.)
68. Documenting the Process
- Document
- what worked and what didn't,
- lessons learned, and
- logical next steps or changes to the plan.
- Use as guide for future action.
69. State Planning Time
- In your state teams, consider the questions on H-14 and begin planning.
- Consider the stakeholders you want to include in your planning for data monitoring and program improvement.
- Consider the problems you anticipate facing and propose solutions to those problems.
- Complete H-14 to the best of your ability and be prepared to report on your plan in one hour.
70. Thank you
- Great Audience!
- Great Participation!
- Great Ideas!
- Live Long and Prosper!
- Good Luck!!